Personalized federated learning via low-rank matrix factorization
Umeå University, Faculty of Science and Technology, Department of Mathematics and Mathematical Statistics. ORCID iD: 0000-0003-1134-2615
CISPA Helmholtz Center, Germany.
Umeå University, Faculty of Science and Technology, Department of Mathematics and Mathematical Statistics. ORCID iD: 0000-0001-7320-1506
2024 (English). In: OPT 2024: Optimization for Machine Learning, 2024, article id 130. Conference paper, published paper (refereed)
Abstract [en]

Personalized Federated Learning (pFL) has gained attention for building a suite of models tailored to different clients. In pFL, the challenge lies in balancing the reliance on local datasets, which may lack representativeness, against the diversity of other clients' models, whose quality and relevance are uncertain. Focusing on the clustered FL scenario, where devices are grouped based on similarities in their data distributions without prior knowledge of cluster memberships, we develop a mathematical model for pFL using low-rank matrix optimization. Building on this formulation, we propose a pFL approach leveraging the Burer-Monteiro factorization technique. We examine the convergence guarantees of the proposed method and present numerical experiments on training deep neural networks, demonstrating its empirical performance in scenarios where personalization is crucial.
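The Burer-Monteiro idea referenced in the abstract can be sketched in a few lines: rather than optimizing a low-rank matrix of stacked client models directly, one parameterizes it as a product of two thin factors and runs gradient descent on the factors. The sizes, synthetic cluster data, and plain least-squares objective below are illustrative assumptions for this sketch, not the paper's actual formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper): 6 clients whose d-dimensional
# linear models stack into a 6 x d matrix W, assumed (approximately) rank r
# because clients fall into r latent clusters.
n_clients, d, r = 6, 5, 2

# Synthetic ground truth: each client's model equals one of r cluster centroids.
centroids = rng.normal(size=(r, d))
membership = rng.integers(0, r, size=n_clients)
W_true = centroids[membership]

# Burer-Monteiro: parameterize W = U @ V with thin factors instead of
# enforcing a rank constraint on the full matrix.
U = 0.1 * rng.normal(size=(n_clients, r))
V = 0.1 * rng.normal(size=(r, d))

lr = 0.05
for _ in range(5000):
    # Gradient of the toy objective 0.5 * ||U V - W_true||_F^2
    # with respect to each factor.
    residual = U @ V - W_true
    U, V = U - lr * (residual @ V.T), V - lr * (U.T @ residual)

print(np.linalg.norm(U @ V - W_true))  # small reconstruction error
```

The factorization makes the low-rank structure free to enforce: any product `U @ V` has rank at most `r`, so the iterates never leave the feasible set.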

Place, publisher, year, edition, pages
2024. article id 130
Keywords [en]
Personalized Federated Learning, Machine Learning, Optimization
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:umu:diva-234480
OAI: oai:DiVA.org:umu-234480
DiVA, id: diva2:1930432
Conference
NeurIPS 2024 Workshop, OPT2024: 16th Annual Workshop on Optimization for Machine Learning, Vancouver, Canada, December 15, 2024
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Swedish Research Council, 2023-05476
Available from: 2025-01-23 Created: 2025-01-23 Last updated: 2025-01-23 Bibliographically approved
In thesis
1. Personalized models and optimization in federated learning
2025 (English). Doctoral thesis, comprehensive summary (Other academic)
Alternative title [sv]
Personanpassade modeller och optimering inom federerad inlärning
Abstract [en]

The rapid increase in data generation, combined with the impracticality of centralizing large-scale datasets and the growing complexity of machine learning tasks, has driven the development of distributed learning techniques. Among these, Federated Learning (FL) has gained significant attention due to its privacy-preserving approach, where multiple clients collaboratively train a global model without sharing their local data. However, FL faces several key challenges, including data heterogeneity, high computational costs, and communication inefficiencies. These issues become more pronounced in real-world scenarios where client data distributions are non-IID, computational resources are limited, and communication is constrained.

This thesis addresses these challenges through the development of efficient algorithms for Personalized Federated Learning (pFL) and Constrained Federated Learning. The proposed approaches are designed to handle heterogeneous data, minimize computational overhead, and reduce communication costs while maintaining strong theoretical guarantees.

Specifically, the thesis introduces three key contributions: (1) pFLMF, a novel pFL formulation based on low-rank matrix optimization, leveraging Burer-Monteiro factorization to enable personalization without relying on predefined distance metrics. (2) PerMFL, an algorithm for multi-tier pFL that introduces personalized decision variables for both teams and individual devices, enabling efficient optimization in scenarios with hierarchical client structures. (3) FedFW, a projection-free algorithm for constrained FL, which emphasizes low computational cost, privacy preservation, and communication efficiency through sparse signal exchanges.

By addressing critical issues in FL, such as data heterogeneity, computation costs, and communication bottlenecks, the proposed algorithms advance the field of Federated Learning, providing robust and scalable solutions for real-world applications. 
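The projection-free idea behind the third contribution (FedFW) builds on the Frank-Wolfe method. As a minimal, centralized sketch, assuming a toy least-squares objective over an l1 ball (the matrix `A`, vector `b`, and radius `tau` below are illustrative, not the thesis's federated objective), each step replaces a projection with a linear minimization oracle whose answer is a single signed vertex:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy constrained problem: minimize f(x) = 0.5 * ||A x - b||^2
# over the l1 ball {x : ||x||_1 <= tau}.
A = rng.normal(size=(20, 10))
b = rng.normal(size=20)
tau = 2.0

x = np.zeros(10)  # feasible starting point
for t in range(200):
    grad = A.T @ (A @ x - b)
    # Linear minimization oracle for the l1 ball: the minimizer of <grad, s>
    # over the ball is a signed vertex. No projection is ever computed, and
    # the direction is 1-sparse (hence cheap to communicate).
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(x)
    s[i] = -tau * np.sign(grad[i])
    gamma = 2.0 / (t + 2)            # standard open-loop Frank-Wolfe step size
    x = (1 - gamma) * x + gamma * s  # convex combination stays in the ball

print(np.abs(x).sum())  # <= tau: the iterate remains feasible throughout
```

Feasibility is maintained by construction: every iterate is a convex combination of points inside the ball, which is the property that lets Frank-Wolfe-type methods avoid projections entirely.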

Place, publisher, year, edition, pages
Umeå: Umeå University, 2025. p. 30
Series
Research report in mathematical statistics, ISSN 1653-0829
Keywords
Federated Learning, Machine Learning, Optimization
National Category
Computational Mathematics
Identifiers
URN: urn:nbn:se:umu:diva-234464
ISBN: 9789180706001
ISBN: 9789180705998
Public defence
2025-02-12, UB.A.220, Samhällsvetarhuset, 09:00 (English)
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Available from: 2025-01-29 Created: 2025-01-23 Last updated: 2025-01-27 Bibliographically approved

Open Access in DiVA

No full text in DiVA


Authority records

Dadras, Ali; Yurtsever, Alp
