Personalized models and optimization in federated learning
Umeå University, Faculty of Science and Technology, Department of Mathematics and Mathematical Statistics. ORCID iD: 0000-0003-1134-2615
2025 (English). Doctoral thesis, comprehensive summary (Other academic).
Alternative title: Personanpassade modeller och optimering inom federerad inlärning (Swedish)
Abstract [en]

The rapid increase in data generation, combined with the impracticality of centralizing large-scale datasets and the growing complexity of machine learning tasks, has driven the development of distributed learning techniques. Among these, Federated Learning (FL) has gained significant attention due to its privacy-preserving approach, where multiple clients collaboratively train a global model without sharing their local data. However, FL faces several key challenges, including data heterogeneity, high computational costs, and communication inefficiencies. These issues become more pronounced in real-world scenarios where client data distributions are non-IID, computational resources are limited, and communication is constrained.

This thesis addresses these challenges through the development of efficient algorithms for Personalized Federated Learning (pFL) and Constrained Federated Learning. The proposed approaches are designed to handle heterogeneous data, minimize computational overhead, and reduce communication costs while maintaining strong theoretical guarantees.

Specifically, the thesis introduces three key contributions: (1) pFLMF, a novel pFL formulation based on low-rank matrix optimization, leveraging Burer-Monteiro factorization to enable personalization without relying on predefined distance metrics. (2) PerMFL, an algorithm for multi-tier pFL that introduces personalized decision variables for both teams and individual devices, enabling efficient optimization in scenarios with hierarchical client structures. (3) FedFW, a projection-free algorithm for constrained FL, which emphasizes low computational cost, privacy preservation, and communication efficiency through sparse signal exchanges.

By addressing critical issues in FL, such as data heterogeneity, computation costs, and communication bottlenecks, the proposed algorithms advance the field of Federated Learning, providing robust and scalable solutions for real-world applications. 
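For orientation, the collaborative training pattern these methods build on, federated averaging, can be sketched in a few lines. This is a generic illustration under assumed quadratic local losses and synthetic data, not an algorithm from the thesis:

```python
import numpy as np

def local_step(w, X, y, lr=0.1):
    # One local gradient step on a client's least-squares loss
    # f(w) = 0.5 * ||X w - y||^2 / n; raw data never leaves the client.
    grad = X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def fedavg(clients, rounds, dim):
    # Server broadcasts the global model, clients update it locally,
    # and the server averages the returned models.
    w = np.zeros(dim)
    for _ in range(rounds):
        w = np.mean([local_step(w, X, y) for X, y in clients], axis=0)
    return w

# Synthetic homogeneous clients sharing one true model.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
clients = [(X, X @ w_true) for X in rng.normal(size=(4, 40, 3))]
w = fedavg(clients, rounds=200, dim=3)
```

With non-IID client data, this plain averaging degrades, which is the gap the personalized formulations above target.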

Place, publisher, year, edition, pages
Umeå: Umeå University, 2025. p. 30
Series
Research report in mathematical statistics, ISSN 1653-0829
Keywords [en]
Federated Learning, Machine Learning, Optimization
National subject category
Computational Mathematics
Identifiers
URN: urn:nbn:se:umu:diva-234464
ISBN: 9789180706001 (digital)
ISBN: 9789180705998 (print)
OAI: oai:DiVA.org:umu-234464
DiVA, id: diva2:1930455
Public defence
2025-02-12, UB.A.220, Samhällsvetarhuset, 09:00 (English)
Opponent
Supervisor
Research funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Available from: 2025-01-29 Created: 2025-01-23 Last updated: 2025-01-27 Bibliographically approved
List of papers
1. Personalized federated learning via low-rank matrix factorization
2024 (English). In: OPT 2024: optimization for machine learning, 2024, article id 130. Conference paper, published paper (Refereed)
Abstract [en]

Personalized Federated Learning (pFL) has gained attention for building a suite of models tailored to different clients. In pFL, the challenge lies in balancing the reliance on local datasets, which may lack representativeness, against the diversity of other clients' models, whose quality and relevance are uncertain. Focusing on the clustered FL scenario, where devices are grouped based on similarities in their data distributions without prior knowledge of cluster memberships, we develop a mathematical model for pFL using low-rank matrix optimization. Building on this formulation, we propose a pFL approach leveraging the Burer-Monteiro factorization technique. We examine the convergence guarantees of the proposed method, and present numerical experiments on training deep neural networks, demonstrating the empirical performance of the proposed method in scenarios where personalization is crucial.
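To make the low-rank idea concrete, here is a minimal, self-contained sketch of Burer-Monteiro-style factorization on a synthetic clustered problem: the stacked client models form an approximately low-rank matrix that is fit through its factors. All names and the quadratic loss are illustrative assumptions, not the method proposed in this paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n_clients, dim, rank = 6, 5, 2

# Clients fall into two hidden clusters: each client's target model is a
# cluster centre plus small noise, so the stacked target matrix is ~rank-2.
centres = rng.normal(size=(2, dim))
targets = np.array([centres[i % 2] + 0.01 * rng.normal(size=dim)
                    for i in range(n_clients)])

# Burer-Monteiro: parametrize the client-model matrix W (n_clients x dim)
# as U @ V and run gradient descent on the factors, never forming a
# rank-constrained projection.
U = 0.1 * rng.normal(size=(n_clients, rank))
V = 0.1 * rng.normal(size=(rank, dim))
lr = 0.05
for _ in range(2000):
    R = U @ V - targets          # residual of the quadratic fit
    gU, gV = R @ V.T, U.T @ R    # gradients w.r.t. the two factors
    U -= lr * gU
    V -= lr * gV

rel_err = np.linalg.norm(U @ V - targets) / np.linalg.norm(targets)
```

Row i of U acts as client i's mixing weights over the shared directions in V, which is how a factorization can encode cluster structure without a predefined distance metric.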

Keywords
Personalized Federated Learning, Machine Learning, Optimization
National subject category
Computer Sciences
Identifiers
urn:nbn:se:umu:diva-234480 (URN)
Conference
NeurIPS 2024 Workshop, OPT2024: 16th Annual Workshop on Optimization for Machine Learning, Vancouver, Canada, December 15, 2024
Research funder
Wallenberg AI, Autonomous Systems and Software Program (WASP); Vetenskapsrådet (Swedish Research Council), 2023-05476
Available from: 2025-01-23 Created: 2025-01-23 Last updated: 2025-01-23 Bibliographically approved
2. Personalized multi-tier federated learning
2025 (English). In: Neural information processing: 31st International Conference, ICONIP 2024, Auckland, New Zealand, December 2–6, 2024, proceedings, part II / [ed] Mufti Mahmud; Maryam Doborjeh; Kevin Wong; Andrew Chi Sing Leung; Zohreh Doborjeh; M. Tanveer, Springer Nature, 2025, p. 192-207. Conference paper, published paper (Refereed)
Abstract [en]

The key challenge of personalized federated learning (PerFL) is to capture the statistical heterogeneity of client data with inexpensive communication while delivering customized performance to the participating devices. To address these challenges, we introduce personalized federated learning in a multi-tier architecture (PerMFL), which obtains optimized, personalized local models when there are known team structures across devices. We provide theoretical guarantees for PerMFL, which offers linear convergence rates for smooth strongly convex problems and sub-linear convergence rates for smooth non-convex problems. We conduct numerical experiments demonstrating the robust empirical performance of PerMFL, which outperforms the state of the art in multiple personalized federated learning tasks.
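The multi-tier idea, personal device models coupled to team models coupled to a global model, can be illustrated with a toy quadratic objective solved by alternating exact minimization. Variable names, coupling weights, and the loop structure are assumptions for illustration, not the PerMFL algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)
# Two teams of three devices; device i's local loss is the quadratic
# 0.5 * ||theta - b[i]||^2 with team-clustered optima b[i].
teams = {0: [0, 1, 2], 1: [3, 4, 5]}
b = np.vstack([rng.normal(loc=+1.0, size=(3, 2)),
               rng.normal(loc=-1.0, size=(3, 2))])

lam, mu = 1.0, 0.5        # device-team and team-global coupling strengths
theta = np.zeros((6, 2))  # personalized device models
z = np.zeros((2, 2))      # team models
w = np.zeros(2)           # global model

# Alternating minimization of
#   sum_i [ 0.5||theta_i - b_i||^2 + (lam/2)||theta_i - z_{t(i)}||^2 ]
#   + sum_t (mu/2)||z_t - w||^2,
# where each block subproblem has a closed-form solution.
for _ in range(100):
    for t, members in teams.items():
        z[t] = (lam * theta[members].sum(axis=0) + mu * w) / (lam * len(members) + mu)
    w = z.mean(axis=0)
    for t, members in teams.items():
        for i in members:
            theta[i] = (b[i] + lam * z[t]) / (1 + lam)
```

Each device model ends up a convex combination of its own local optimum and its team model, so personalization and collaboration are traded off by lam and mu.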

Place, publisher, year, edition, pages
Springer Nature, 2025
Series
Communications in Computer and Information Science, ISSN 1865-0929, E-ISSN 1865-0937; 2283
National subject category
Computer Sciences
Identifiers
urn:nbn:se:umu:diva-228859 (URN)
10.1007/978-981-96-6951-6_14 (DOI)
2-s2.0-105010821830 (Scopus ID)
978-981-96-6950-9 (ISBN)
978-981-96-6951-6 (ISBN)
Conference
ICONIP 2024, 31st International Conference on Neural Information Processing, Auckland, New Zealand, December 2-6, 2024
Research funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Available from: 2024-08-27 Created: 2024-08-27 Last updated: 2025-07-28 Bibliographically approved
3. Federated Frank-Wolfe algorithm
2024 (English). In: Machine learning and knowledge discovery in databases. Research track: European Conference, ECML PKDD 2024, Vilnius, Lithuania, September 9–13, 2024, proceedings, part III / [ed] Albert Bifet; Jesse Davis; Tomas Krilavičius; Meelis Kull; Eirini Ntoutsi; Indrė Žliobaitė, Springer Nature, 2024, p. 58-75. Conference paper, published paper (Refereed)
Abstract [en]

Federated learning (FL) has gained considerable attention in recent years for building privacy-preserving collaborative learning systems. However, FL algorithms for constrained machine learning problems remain limited, particularly when the projection step is costly. To this end, we propose a Federated Frank-Wolfe algorithm (FedFW). FedFW features data privacy, low per-iteration cost, and communication of sparse signals. In the deterministic setting, FedFW achieves an ε-suboptimal solution within O(ε⁻²) iterations for smooth and convex objectives, and O(ε⁻³) iterations for smooth but non-convex objectives. Furthermore, we present a stochastic variant of FedFW and show that it finds a solution within O(ε⁻³) iterations in the convex setting. We demonstrate the empirical performance of FedFW on several machine learning tasks.
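A projection-free method replaces the (possibly expensive) projection with a linear minimization oracle (LMO), whose output over an ℓ1 ball is a 1-sparse vertex and hence cheap to communicate. The sketch below is a toy, centralized-gradient caricature of that idea, not the FedFW algorithm; the function names, ℓ1-ball constraint, and synthetic data are assumptions:

```python
import numpy as np

def lmo_l1(grad, radius):
    # Linear minimization oracle over the l1 ball: argmin <grad, s> is a
    # signed vertex on the coordinate with the largest |gradient| entry.
    s = np.zeros_like(grad)
    i = np.argmax(np.abs(grad))
    s[i] = -radius * np.sign(grad[i])
    return s

def frank_wolfe_fl(clients, dim, rounds, radius):
    # Each round: average the clients' local gradients at the shared
    # iterate, call the LMO instead of projecting, and move by the
    # classical step size gamma_t = 2 / (t + 2).
    x = np.zeros(dim)
    for t in range(rounds):
        grad = np.mean([X.T @ (X @ x - y) / len(y) for X, y in clients], axis=0)
        gamma = 2.0 / (t + 2)
        x = (1 - gamma) * x + gamma * lmo_l1(grad, radius)
    return x

# Sparse ground truth strictly inside the constraint set ||x||_1 <= 2.
rng = np.random.default_rng(3)
w_true = np.array([1.0, -0.5, 0.0])
clients = [(X, X @ w_true) for X in rng.normal(size=(3, 50, 3))]
x = frank_wolfe_fl(clients, dim=3, rounds=5000, radius=2.0)
loss = np.mean([0.5 * np.sum((X @ x - y) ** 2) / len(y) for X, y in clients])
```

Every iterate stays feasible by construction, as a convex combination of ℓ1-ball vertices, which is exactly what removes the projection step.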

Place, publisher, year, edition, pages
Springer Nature, 2024
Series
Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349; 14943
Keywords
federated learning, frank wolfe, conditional gradient method, projection-free, distributed optimization
National subject category
Computer Sciences
Identifiers
urn:nbn:se:umu:diva-228614 (URN)
10.1007/978-3-031-70352-2_4 (DOI)
001308375900004 ()
978-3-031-70351-5 (ISBN)
978-3-031-70352-2 (ISBN)
Conference
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD 2024), Vilnius, Lithuania, September 9-13, 2024
Research funder
Wallenberg AI, Autonomous Systems and Software Program (WASP); Vetenskapsrådet (Swedish Research Council), 2023-05476
Note
Also part of the book sub series: Lecture Notes in Artificial Intelligence (LNAI).
Available from: 2024-08-19 Created: 2024-08-19 Last updated: 2025-04-24 Bibliographically approved

Open Access in DiVA

spikblad (89 kB, pdf)
fulltext (670 kB, pdf)

Author

Dadras, Ali