Personalized models and optimization in federated learning
Dadras, Ali
Umeå University, Faculty of Science and Technology, Department of Mathematics and Mathematical Statistics.
ORCID iD: 0000-0003-1134-2615
2025 (English). Doctoral thesis, comprehensive summary (Other academic).
Alternative title: Personanpassade modeller och optimering inom federerad inlärning (Swedish)
Abstract [en]

The rapid increase in data generation, combined with the impracticality of centralizing large-scale datasets and the growing complexity of machine learning tasks, has driven the development of distributed learning techniques. Among these, Federated Learning (FL) has gained significant attention due to its privacy-preserving approach, where multiple clients collaboratively train a global model without sharing their local data. However, FL faces several key challenges, including data heterogeneity, high computational costs, and communication inefficiencies. These issues become more pronounced in real-world scenarios where client data distributions are non-IID, computational resources are limited, and communication is constrained.
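
To make the collaborative setup concrete, the sketch below runs one round of federated averaging in the style of FedAvg, the canonical FL scheme: each client trains on its own data and the server aggregates only model parameters, never raw examples. The function names and the least-squares loss are illustrative assumptions, not details taken from the thesis.

    import numpy as np

    def local_sgd(w, X, y, lr=0.1, epochs=1):
        """A few local gradient steps on one client's data (least-squares loss)."""
        for _ in range(epochs):
            grad = X.T @ (X @ w - y) / len(y)  # gradient of (1/2n)||Xw - y||^2
            w = w - lr * grad
        return w

    def fedavg_round(w_global, clients):
        """One communication round: clients train locally on their (X, y);
        the server averages the returned models, weighted by data size."""
        sizes = np.array([len(y) for _, y in clients], dtype=float)
        updates = [local_sgd(w_global.copy(), X, y) for X, y in clients]
        return sum(p * w for p, w in zip(sizes / sizes.sum(), updates))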

This thesis addresses these challenges through the development of efficient algorithms for Personalized Federated Learning (pFL) and Constrained Federated Learning. The proposed approaches are designed to handle heterogeneous data, minimize computational overhead, and reduce communication costs while maintaining strong theoretical guarantees.

Specifically, the thesis introduces three key contributions: (1) pFLMF, a novel pFL formulation based on low-rank matrix optimization, leveraging Burer-Monteiro factorization to enable personalization without relying on predefined distance metrics. (2) PerMFL, an algorithm for multi-tier pFL that introduces personalized decision variables for both teams and individual devices, enabling efficient optimization in scenarios with hierarchical client structures. (3) FedFW, a projection-free algorithm for constrained FL, which emphasizes low computational cost, privacy preservation, and communication efficiency through sparse signal exchanges.
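
As a rough illustration of contribution (1), one plausible way to write a low-rank pFL objective under Burer-Monteiro factorization is sketched below in LaTeX; the thesis's exact formulation may differ, so read this as an assumption rather than the definition used in the papers.

    % Stack the n personalized models as rows of W = U V^T (rank <= r by construction).
    \min_{U \in \mathbb{R}^{n \times r},\, V \in \mathbb{R}^{d \times r}}
      \sum_{i=1}^{n} f_i(V u_i), \qquad w_i = V u_i

Here f_i is client i's local loss and u_i is the i-th row of U; the factorization enforces the low-rank coupling between clients directly, so no projection onto low-rank matrices and no predefined distance metric between client models is needed.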

By addressing critical issues in FL, such as data heterogeneity, computation costs, and communication bottlenecks, the proposed algorithms advance the field of Federated Learning, providing robust and scalable solutions for real-world applications. 

Place, publisher, year, edition, pages
Umeå: Umeå University, 2025, p. 30
Series
Research report in mathematical statistics, ISSN 1653-0829
Keywords [en]
Federated Learning, Machine Learning, Optimization
National Category
Computational Mathematics
Identifiers
URN: urn:nbn:se:umu:diva-234464
ISBN: 9789180706001 (electronic)
ISBN: 9789180705998 (print)
OAI: oai:DiVA.org:umu-234464
DiVA, id: diva2:1930455
Public defence
2025-02-12, UB.A.220, Samhällsvetarhuset, 09:00 (English)
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Available from: 2025-01-29 Created: 2025-01-23 Last updated: 2025-01-27
Bibliographically approved
List of papers
1. Personalized federated learning via low-rank matrix factorization
2024 (English). In: OPT 2024: optimization for machine learning, 2024, article id 130. Conference paper, Published paper (Refereed)
Abstract [en]

Personalized Federated Learning (pFL) has gained attention for building a suite of models tailored to different clients. In pFL, the challenge lies in balancing the reliance on local datasets, which may lack representativeness, against the diversity of other clients' models, whose quality and relevance are uncertain. Focusing on the clustered FL scenario, where devices are grouped based on similarities in their data distributions without prior knowledge of cluster memberships, we develop a mathematical model for pFL using low-rank matrix optimization. Building on this formulation, we propose a pFL approach leveraging the Burer-Monteiro factorization technique. We examine the convergence guarantees of the proposed method and present numerical experiments on training deep neural networks, demonstrating the empirical performance of the proposed method in scenarios where personalization is crucial.
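
A minimal sketch of how such a factorized objective could be optimized, assuming the form min over U, V of the sum of f_i(V u_i) from the formulation given earlier; the alternating gradient scheme and all names here are illustrative assumptions, not the paper's actual algorithm.

    import numpy as np

    def pfl_factorized_round(U, V, client_grads, lr=0.01):
        """One synchronous round on min_{U,V} sum_i f_i(V @ U[i]).
        Client i keeps its personal factor U[i]; the shared factor V is
        aggregated by the server. client_grads[i](w) returns grad f_i(w)."""
        n = U.shape[0]
        V_grad = np.zeros_like(V)
        for i in range(n):
            g = client_grads[i](V @ U[i])  # grad of f_i at w_i = V u_i
            V_grad += np.outer(g, U[i])    # chain rule: d f_i / d V = g u_i^T
            U[i] -= lr * (V.T @ g)         # chain rule: d f_i / d u_i = V^T g
        V -= lr * V_grad / n               # server averages the V-gradients
        return U, V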

Keywords
Personalized Federated Learning, Machine Learning, Optimization
National Category
Computer Sciences
Identifiers
urn:nbn:se:umu:diva-234480 (URN)
Conference
NeurIPS 2024 Workshop, OPT2024: 16th Annual Workshop on Optimization for Machine Learning, Vancouver, Canada, December 15, 2024
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Swedish Research Council, 2023-05476
Available from: 2025-01-23 Created: 2025-01-23 Last updated: 2025-01-23
Bibliographically approved
2. Personalized multi-tier federated learning
2024 (English). Conference paper, Published paper (Refereed)
Abstract [en]

The key challenge of personalized federated learning (PerFL) is to capture the statistical heterogeneity of client data with inexpensive communication while delivering customized performance for participating devices. To address these challenges, we introduce personalized federated learning in a multi-tier architecture (PerMFL), which obtains optimized and personalized local models when there are known team structures across devices. We provide theoretical guarantees for PerMFL, which offers linear convergence rates for smooth strongly convex problems and sub-linear convergence rates for smooth non-convex problems. We conduct numerical experiments demonstrating the robust empirical performance of PerMFL, which outperforms the state of the art in multiple personalized federated learning tasks.
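
The abstract does not spell out the multi-tier objective, but a natural way to give both teams and devices their own decision variables is a nested proximal coupling of the following form, written in LaTeX; this is an assumption for illustration, not PerMFL's stated formulation.

    % Devices x_{t,i} track their team model theta_t; teams track the global w.
    \min_{w,\ \{\theta_t\},\ \{x_{t,i}\}}
      \sum_{t=1}^{T} \sum_{i \in \mathcal{S}_t}
        \Bigl( f_{t,i}(x_{t,i}) + \tfrac{\lambda}{2}\,\lVert x_{t,i} - \theta_t \rVert^2 \Bigr)
      + \tfrac{\mu}{2} \sum_{t=1}^{T} \lVert \theta_t - w \rVert^2

Here x_{t,i} is device i's personalized model in team t, \theta_t is the team's decision variable, w is the global model, and \lambda, \mu > 0 set how strongly devices track their team and teams track the global model.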

National Category
Computer Sciences
Identifiers
urn:nbn:se:umu:diva-228859 (URN)
Conference
ICONIP 2024, 31st International Conference on Neural Information Processing, Auckland, New Zealand, December 2-6, 2024
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Available from: 2024-08-27 Created: 2024-08-27 Last updated: 2025-01-23
3. Federated Frank-Wolfe algorithm
2024 (English). In: Machine learning and knowledge discovery in databases. Research track: European Conference, ECML PKDD 2024, Vilnius, Lithuania, September 9–13, 2024, proceedings, part III / [ed] Albert Bifet; Jesse Davis; Tomas Krilavičius; Meelis Kull; Eirini Ntoutsi; Indrė Žliobaitė, Springer Nature, 2024, p. 58-75. Conference paper, Published paper (Refereed)
Abstract [en]

Federated learning (FL) has gained a lot of attention in recent years for building privacy-preserving collaborative learning systems. However, FL algorithms for constrained machine learning problems are still limited, particularly when the projection step is costly. To this end, we propose a Federated Frank-Wolfe Algorithm (FedFW). FedFW features data privacy, low per-iteration cost, and communication of sparse signals. In the deterministic setting, FedFW achieves an ε-suboptimal solution within O(ε⁻²) iterations for smooth and convex objectives, and O(ε⁻³) iterations for smooth but non-convex objectives. Furthermore, we present a stochastic variant of FedFW and show that it finds a solution within O(ε⁻³) iterations in the convex setting. We demonstrate the empirical performance of FedFW on several machine learning tasks.
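
To see why a projection-free method can exchange sparse signals, the sketch below runs one Frank-Wolfe round over an ℓ1-ball, whose linear minimization oracle (LMO) returns a 1-sparse vertex. Averaging the client LMO outputs is a simplification for illustration, not FedFW's exact update rule, and all names are assumptions.

    import numpy as np

    def lmo_l1(grad, radius=1.0):
        """LMO over {x : ||x||_1 <= radius}: argmin_s <grad, s> is a 1-sparse vertex."""
        k = np.argmax(np.abs(grad))
        s = np.zeros_like(grad)
        s[k] = -radius * np.sign(grad[k])
        return s

    def fedfw_round(x, client_grad_fns, t, radius=1.0):
        """One round: each client solves its local LMO and sends the sparse
        vertex; the server averages them and takes a step of size 2/(t+2)."""
        s_avg = np.mean([lmo_l1(g(x), radius) for g in client_grad_fns], axis=0)
        gamma = 2.0 / (t + 2.0)
        return (1.0 - gamma) * x + gamma * s_avg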

Place, publisher, year, edition, pages
Springer Nature, 2024
Series
Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349 ; 14943
Keywords
federated learning, Frank-Wolfe, conditional gradient method, projection-free, distributed optimization
National Category
Computer Sciences
Identifiers
urn:nbn:se:umu:diva-228614 (URN)
10.1007/978-3-031-70352-2_4 (DOI)
978-3-031-70351-5 (ISBN)
978-3-031-70352-2 (ISBN)
Conference
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD 2024), Vilnius, Lithuania, September 9-13, 2024
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Swedish Research Council, 2023-05476
Note

Also part of the book sub series: Lecture Notes in Artificial Intelligence (LNAI). 

Available from: 2024-08-19 Created: 2024-08-19 Last updated: 2025-01-23
Bibliographically approved

Open Access in DiVA

spikblad (89 kB)
File name: SPIKBLAD01.pdf
Checksum (SHA-512): c860c9feba1c9aded72d1bbddd5c32abbba53769cb0a46fcc3b4c54ef96f056b4612acdd2d3d89b13c7bee869aea50ea6e816857550314c3a78ce0bc6661d95a
Type: spikblad. Mimetype: application/pdf

fulltext (670 kB)
File name: FULLTEXT02.pdf
Checksum (SHA-512): 528accbbb468a58eafa14ffc10991b816cf137d3ee4d4e1138aa0697d8742532bcc16b2f29341f288b757ce09bbd7d3960830113b928823d6e44f4c7a199611f
Type: fulltext. Mimetype: application/pdf
