Publications (6 of 6)
Dadras, A. (2025). Personalized models and optimization in federated learning. (Doctoral dissertation). Umeå: Umeå University
Personalized models and optimization in federated learning
2025 (English) Doctoral thesis, comprehensive summary (Other academic)
Alternative title [sv]
Personanpassade modeller och optimering inom federerad inlärning (Personalized models and optimization in federated learning)
Abstract [en]

The rapid increase in data generation, combined with the impracticality of centralizing large-scale datasets and the growing complexity of machine learning tasks, has driven the development of distributed learning techniques. Among these, Federated Learning (FL) has gained significant attention due to its privacy-preserving approach, where multiple clients collaboratively train a global model without sharing their local data. However, FL faces several key challenges, including data heterogeneity, high computational costs, and communication inefficiencies. These issues become more pronounced in real-world scenarios where client data distributions are non-IID, computational resources are limited, and communication is constrained.
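
To make the collaborative setup concrete, here is a minimal sketch of one FL round in the FedAvg style. This is an illustration only, not an algorithm from the thesis; the synthetic clients, least-squares model, and step counts are hypothetical. Each client updates the model on its private data, and the server aggregates parameters rather than data:

    import numpy as np

    def local_sgd(w, X, y, lr=0.1, steps=5):
        # Each client refines the current global model on its own data;
        # the raw data never leaves the client.
        w = w.copy()
        for _ in range(steps):
            grad = X.T @ (X @ w - y) / len(y)  # least-squares gradient
            w -= lr * grad
        return w

    rng = np.random.default_rng(0)
    # Hypothetical non-IID clients: each draws features from a shifted distribution.
    clients = [(rng.normal(loc=c, size=(50, 3)), rng.normal(size=50)) for c in range(4)]

    w_global = np.zeros(3)
    for _ in range(10):
        local_models = [local_sgd(w_global, X, y) for X, y in clients]
        w_global = np.mean(local_models, axis=0)  # the server sees parameters only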

This thesis addresses these challenges through the development of efficient algorithms for Personalized Federated Learning (pFL) and Constrained Federated Learning. The proposed approaches are designed to handle heterogeneous data, minimize computational overhead, and reduce communication costs while maintaining strong theoretical guarantees.

Specifically, the thesis introduces three key contributions: (1) pFLMF, a novel pFL formulation based on low-rank matrix optimization, leveraging Burer-Monteiro factorization to enable personalization without relying on predefined distance metrics. (2) PerMFL, an algorithm for multi-tier pFL that introduces personalized decision variables for both teams and individual devices, enabling efficient optimization in scenarios with hierarchical client structures. (3) FedFW, a projection-free algorithm for constrained FL, which emphasizes low computational cost, privacy preservation, and communication efficiency through sparse signal exchanges.

By addressing critical issues in FL, such as data heterogeneity, computation costs, and communication bottlenecks, the proposed algorithms advance the field of Federated Learning, providing robust and scalable solutions for real-world applications. 

Place, publisher, year, edition, pages
Umeå: Umeå University, 2025. p. 30
Series
Research report in mathematical statistics, ISSN 1653-0829
Keywords
Federated Learning, Machine Learning, Optimization
National Category
Computational Mathematics
Identifiers
urn:nbn:se:umu:diva-234464 (URN); 978-91-8070-600-1 (ISBN); 978-91-8070-599-8 (ISBN)
Public defence
2025-02-12, UB.A.220, Samhällsvetarhuset, 09:00 (English)
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Available from: 2025-01-29 Created: 2025-01-23 Last updated: 2025-01-27. Bibliographically approved
Dadras, A., Leffler, K. & Yu, J. (2024). A ridgelet approach to Poisson denoising.
A ridgelet approach to Poisson denoising
2024 (English) Manuscript (preprint) (Other academic)
Abstract [en]

This paper introduces a novel ridgelet transform-based method for Poisson image denoising. Our work focuses on harnessing the Poisson noise's unique non-additive and signal-dependent properties, distinguishing it from Gaussian noise. The core of our approach is a new thresholding scheme informed by theoretical insights into the ridgelet coefficients of Poisson-distributed images and adaptive thresholding guided by Stein's method. We verify our theoretical model through numerical experiments and demonstrate the potential of ridgelet thresholding across assorted scenarios. Our findings represent a significant step in enhancing the understanding of Poisson noise and offer an effective denoising method for images corrupted with it.
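
As a rough illustration of the core idea of transform-domain thresholding with a signal-dependent threshold (the defining feature of the Poisson setting), consider the sketch below. It uses a one-level Haar transform as a stand-in; the paper's actual method relies on the ridgelet transform and a Stein-informed adaptive threshold, neither of which is reproduced here:

    import numpy as np

    rng = np.random.default_rng(1)
    clean = np.repeat([4.0, 16.0, 4.0, 25.0], 32)   # piecewise-constant intensity
    noisy = rng.poisson(clean).astype(float)        # Poisson: variance equals mean

    # One-level Haar transform as a stand-in for the ridgelet transform.
    avg = (noisy[0::2] + noisy[1::2]) / np.sqrt(2)
    det = (noisy[0::2] - noisy[1::2]) / np.sqrt(2)

    # Signal-dependent threshold: under Poisson statistics the variance of a
    # detail coefficient scales with the local intensity (approximated here by
    # avg / sqrt(2)), so the threshold grows like sqrt of the local mean.
    tau = 2.0 * np.sqrt(np.maximum(avg / np.sqrt(2), 1e-8))
    det = np.sign(det) * np.maximum(np.abs(det) - tau, 0.0)   # soft threshold

    # Invert the Haar transform to obtain the denoised signal.
    denoised = np.empty_like(noisy)
    denoised[0::2] = (avg + det) / np.sqrt(2)
    denoised[1::2] = (avg - det) / np.sqrt(2)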

Keywords
sparse signal processing, compressed sensing, positron emission tomography, denoising, inpainting
National Category
Probability Theory and Statistics; Signal Processing
Research subject
Mathematical Statistics
Identifiers
urn:nbn:se:umu:diva-220205 (URN); 10.48550/arXiv.2401.16099 (DOI); 978-91-8070-279-9 (ISBN); 978-91-8070-280-5 (ISBN)
Funder
Swedish Research Council, 340-2013-5342
Available from: 2024-02-05 Created: 2024-02-05 Last updated: 2024-02-06. Bibliographically approved
Dadras, A., Banerjee, S., Prakhya, K. & Yurtsever, A. (2024). Federated Frank-Wolfe algorithm. In: Albert Bifet; Jesse Davis; Tomas Krilavičius; Meelis Kull; Eirini Ntoutsi; Indrė Žliobaitė (Ed.), Machine learning and knowledge discovery in databases. Research track: European Conference, ECML PKDD 2024, Vilnius, Lithuania, September 9–13, 2024, proceedings, part III. Paper presented at European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD 2024), Vilnius, Lithuania, September 9-13, 2024 (pp. 58-75). Springer Nature
Federated Frank-Wolfe algorithm
2024 (English) In: Machine learning and knowledge discovery in databases. Research track: European Conference, ECML PKDD 2024, Vilnius, Lithuania, September 9–13, 2024, proceedings, part III / [ed] Albert Bifet; Jesse Davis; Tomas Krilavičius; Meelis Kull; Eirini Ntoutsi; Indrė Žliobaitė, Springer Nature, 2024, p. 58-75. Conference paper, Published paper (Refereed)
Abstract [en]

Federated learning (FL) has gained a lot of attention in recent years for building privacy-preserving collaborative learning systems. However, FL algorithms for constrained machine learning problems are still limited, particularly when the projection step is costly. To this end, we propose a Federated Frank-Wolfe Algorithm (FedFW). FedFW features data privacy, low per-iteration cost, and communication of sparse signals. In the deterministic setting, FedFW achieves an ε-suboptimal solution within O(ε⁻²) iterations for smooth and convex objectives, and O(ε⁻³) iterations for smooth but non-convex objectives. Furthermore, we present a stochastic variant of FedFW and show that it finds a solution within O(ε⁻³) iterations in the convex setting. We demonstrate the empirical performance of FedFW on several machine learning tasks.
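
To convey the flavor of the method, the sketch below runs Frank-Wolfe over an ℓ1-ball with gradients averaged across hypothetical clients. The real FedFW differs (clients take local Frank-Wolfe steps and exchange sparse signals), so this is only an illustration, under assumed data and parameters, of why the linear minimization oracle (LMO) keeps iterations cheap:

    import numpy as np

    def lmo_l1(grad, radius=1.0):
        # LMO over the l1 ball: returns a 1-sparse vertex. This is what makes
        # Frank-Wolfe steps cheap and the exchanged signals sparse.
        i = np.argmax(np.abs(grad))
        s = np.zeros_like(grad)
        s[i] = -radius * np.sign(grad[i])
        return s

    rng = np.random.default_rng(2)
    # Hypothetical clients, each holding a private least-squares problem.
    clients = [(rng.normal(size=(40, 10)), rng.normal(size=40)) for _ in range(5)]

    x = np.zeros(10)
    for t in range(50):
        gamma = 2.0 / (t + 2)                       # classical FW step size
        # Each client computes only its local gradient; the server averages them.
        grads = [A.T @ (A @ x - b) / len(b) for A, b in clients]
        s = lmo_l1(np.mean(grads, axis=0))
        x = (1 - gamma) * x + gamma * s             # convex combination stays feasible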

Place, publisher, year, edition, pages
Springer Nature, 2024
Series
Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349 ; 14943
Keywords
federated learning, frank wolfe, conditional gradient method, projection-free, distributed optimization
National Category
Computer Sciences
Identifiers
urn:nbn:se:umu:diva-228614 (URN); 10.1007/978-3-031-70352-2_4 (DOI); 001308375900004 (ISI); 978-3-031-70351-5 (ISBN); 978-3-031-70352-2 (ISBN)
Conference
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD 2024), Vilnius, Lithuania, September 9-13, 2024
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP); Swedish Research Council, 2023-05476
Note

Also part of the book subseries: Lecture Notes in Artificial Intelligence (LNAI).

Available from: 2024-08-19 Created: 2024-08-19 Last updated: 2025-04-24. Bibliographically approved
Dadras, A., Stich, S. U. & Yurtsever, A. (2024). Personalized federated learning via low-rank matrix factorization. In: OPT 2024: optimization for machine learning. Paper presented at NeurIPS 2024 Workshop, OPT2024: 16th Annual Workshop on Optimization for Machine Learning, Vancouver, Canada, December 15, 2024, Article ID 130.
Personalized federated learning via low-rank matrix factorization
2024 (English) In: OPT 2024: optimization for machine learning, 2024, article id 130. Conference paper, Published paper (Refereed)
Abstract [en]

Personalized Federated Learning (pFL) has gained attention for building a suite of models tailored to different clients. In pFL, the challenge lies in balancing the reliance on local datasets, which may lack representativeness, against the diversity of other clients’ models, whose quality and relevance are uncertain. Focusing on the clustered FL scenario, where devices are grouped based on similarities in their data distributions without prior knowledge of cluster memberships, we develop a mathematical model for pFL using low-rank matrix optimization. Building on this formulation, we propose a pFL approach leveraging the Burer-Monteiro factorization technique. We examine the convergence guarantees of the proposed method, and present numerical experiments on training deep neural networks, demonstrating the empirical performance of the proposed method in scenarios where personalization is crucial.
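
A minimal sketch of the low-rank idea, under stated assumptions (the synthetic clustered clients, least-squares loss, and plain gradient descent are our stand-ins, not the paper's setup): stack the client models as rows of a matrix W and optimize the Burer-Monteiro factors U and V directly, so clients in the same cluster can share structure without any predefined distance metric:

    import numpy as np

    rng = np.random.default_rng(3)
    n_clients, dim, rank = 6, 8, 2

    # Hypothetical clustered clients: two groups with distinct ground-truth models.
    truth = np.vstack([np.tile(rng.normal(size=dim), (3, 1)),
                       np.tile(rng.normal(size=dim), (3, 1))])
    Xs = [rng.normal(size=(30, dim)) for _ in range(n_clients)]
    data = [(X, X @ truth[i] + 0.1 * rng.normal(size=30)) for i, X in enumerate(Xs)]

    # Burer-Monteiro style parametrization: client i's model is U[i] @ V, so
    # personalization lives in the low-rank matrix W = U V.
    U = 0.1 * rng.normal(size=(n_clients, rank))
    V = 0.1 * rng.normal(size=(rank, dim))

    lr = 0.05
    for _ in range(500):
        gU, gV = np.zeros_like(U), np.zeros_like(V)
        for i, (X, y) in enumerate(data):
            r_i = X @ (U[i] @ V) - y            # residual of client i's model
            g_w = X.T @ r_i / len(y)            # gradient w.r.t. client model w_i
            gU[i] = g_w @ V.T                   # chain rule through w_i = U[i] V
            gV += np.outer(U[i], g_w) / n_clients
        U -= lr * gU
        V -= lr * gV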

Keywords
Personalized Federated Learning, Machine Learning, Optimization
National Category
Computer Sciences
Identifiers
urn:nbn:se:umu:diva-234480 (URN)
Conference
NeurIPS 2024 Workshop, OPT2024: 16th Annual Workshop on Optimization for Machine Learning, Vancouver, Canada, December 15, 2024
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP); Swedish Research Council, 2023-05476
Available from: 2025-01-23 Created: 2025-01-23 Last updated: 2025-01-23. Bibliographically approved
Banerjee, S., Dadras, A., Yurtsever, A. & Bhuyan, M. H. (2024). Personalized multi-tier federated learning. Paper presented at ICONIP 2024, 31st International Conference on Neural Information Processing, Auckland, New Zealand, December 2-6, 2024.
Personalized multi-tier federated learning
2024 (English) Conference paper, Published paper (Refereed)
Abstract [en]

The key challenge of personalized federated learning (PerFL) is to capture the statistical heterogeneity of client data with inexpensive communication while delivering customized performance to the participating devices. To address this, we introduce personalized federated learning in a multi-tier architecture (PerMFL), which obtains optimized and personalized local models when there are known team structures across devices. We provide theoretical guarantees for PerMFL, which offers linear convergence rates for smooth strongly convex problems and sub-linear convergence rates for smooth non-convex problems. We conduct numerical experiments demonstrating the robust empirical performance of PerMFL, which outperforms the state of the art in multiple personalized federated learning tasks.
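
A hedged sketch of the two-tier idea, with hypothetical teams, a least-squares loss, and simple penalty couplings standing in for the authors' actual PerMFL updates: each device is pulled toward its team model, each team model balances its devices against the global model, and the top tier averages over teams:

    import numpy as np

    rng = np.random.default_rng(4)
    dim = 5
    # Hypothetical two-tier structure: 2 teams, 3 devices each.
    teams = [[(rng.normal(size=(20, dim)), rng.normal(size=20)) for _ in range(3)]
             for _ in range(2)]

    w_global = np.zeros(dim)
    w_team = [np.zeros(dim) for _ in teams]
    w_dev = [[np.zeros(dim) for _ in team] for team in teams]

    lr, lam, mu = 0.1, 1.0, 1.0   # lam couples device to team, mu couples team to global
    for _ in range(100):
        for t, team in enumerate(teams):
            for d, (X, y) in enumerate(team):
                # Device step: local loss gradient plus a pull toward the team model.
                grad = X.T @ (X @ w_dev[t][d] - y) / len(y) \
                       + lam * (w_dev[t][d] - w_team[t])
                w_dev[t][d] -= lr * grad
            # Team step: average of its devices, pulled toward the global model.
            dev_avg = np.mean(w_dev[t], axis=0)
            w_team[t] = (lam * dev_avg + mu * w_global) / (lam + mu)
        w_global = np.mean(w_team, axis=0)   # top tier aggregates team models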

National Category
Computer Sciences
Identifiers
urn:nbn:se:umu:diva-228859 (URN)
Conference
ICONIP 2024, 31st International Conference on Neural Information Processing, Auckland, New Zealand, December 2-6, 2024
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Available from: 2024-08-27 Created: 2024-08-27 Last updated: 2025-01-23
Dadras, A., Prakhya, K. & Yurtsever, A. (2022). Federated Frank-Wolfe Algorithm. Paper presented at FL-NeurIPS'22, International Workshop on Federated Learning: Recent Advances and New Challenges in Conjunction with NeurIPS 2022, New Orleans, LA, USA, December 2, 2022.
Federated Frank-Wolfe Algorithm
2022 (English) Conference paper, Poster (with or without abstract) (Refereed)
Abstract [en]

Federated learning (FL) has gained much attention in recent years for building privacy-preserving collaborative learning systems. However, FL algorithms for constrained machine learning problems are still very limited, particularly when the projection step is costly. To this end, we propose a Federated Frank-Wolfe Algorithm (FedFW). FedFW provably finds an ε-suboptimal solution of the constrained empirical risk-minimization problem after O(ε⁻²) iterations if the objective function is convex. The rate becomes O(ε⁻³) if the objective is non-convex. The method enjoys data privacy, low per-iteration cost, and communication of sparse signals. We demonstrate the empirical performance of the FedFW algorithm on several machine learning tasks.
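
One way to see why a costly projection motivates a projection-free method: over a nuclear-norm ball, Euclidean projection requires a full SVD, whereas the Frank-Wolfe linear minimization oracle needs only the leading singular pair. A generic sketch (not code from the paper):

    import numpy as np

    def top_singular_pair(G, iters=50):
        # Power iteration approximates the leading singular vectors, which is
        # all the LMO needs; a projection would require a full SVD instead.
        v = np.ones(G.shape[1]) / np.sqrt(G.shape[1])
        for _ in range(iters):
            u = G @ v
            u /= np.linalg.norm(u)
            v = G.T @ u
            v /= np.linalg.norm(v)
        return u, v

    rng = np.random.default_rng(5)
    G = rng.normal(size=(200, 100))   # stand-in for a gradient matrix
    u, v = top_singular_pair(G)
    # LMO output over the unit nuclear-norm ball: a rank-one matrix,
    # which is also cheap (sparse) to communicate.
    S = -1.0 * np.outer(u, v)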

Keywords
federated learning, frank wolfe, conditional gradient method, projection-free, distributed optimization
National Category
Computer Sciences
Identifiers
urn:nbn:se:umu:diva-205126 (URN)
Conference
FL-NeurIPS'22, International Workshop on Federated Learning: Recent Advances and New Challenges in Conjunction with NeurIPS 2022, New Orleans, LA, USA, December 2, 2022
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Available from: 2023-02-23 Created: 2023-02-23 Last updated: 2025-01-23. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0003-1134-2615
