Personalized multi-tier federated learning
Umeå University, Faculty of Science and Technology, Department of Computing Science.ORCID iD: 0000-0002-3451-2851
Umeå University, Faculty of Science and Technology, Department of Mathematics and Mathematical Statistics.
Umeå University, Faculty of Science and Technology, Department of Mathematics and Mathematical Statistics.ORCID iD: 0000-0001-7320-1506
Umeå University, Faculty of Science and Technology, Department of Computing Science.ORCID iD: 0000-0002-9842-7840
2024 (English). Conference paper, Published paper (Refereed)
Abstract [en]

The key challenge of personalized federated learning (PerFL) is to capture the statistical heterogeneity of device data with inexpensive communication while delivering customized performance for the participating devices. To address these challenges, we introduce personalized federated learning in a multi-tier architecture (PerMFL), which obtains optimized and personalized local models when there are known team structures across devices. We provide theoretical guarantees for PerMFL: linear convergence rates for smooth strongly convex problems and sub-linear convergence rates for smooth non-convex problems. Numerical experiments demonstrate the robust empirical performance of PerMFL, which outperforms the state-of-the-art in multiple personalized federated learning tasks.
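The paper's actual algorithm is not reproduced in this record. As a rough illustration of the multi-tier idea only, the sketch below (hypothetical function names and update rule, not the PerMFL method itself) averages device models within each known team, averages team models at the global tier, and nudges each device's personalized model toward its team model with a proximal-style step.

```python
import numpy as np

def tier_average(models):
    # Plain averaging, used here at both the team tier and the global tier.
    return np.mean(models, axis=0)

def personalized_step(w_local, w_team, grad, lr=0.1, lam=0.5):
    # Hypothetical personalization rule: follow the local loss gradient
    # while a proximal penalty pulls the device model toward its team model.
    return w_local - lr * (grad + lam * (w_local - w_team))

# Two known teams of two devices each; scalar "models" for illustration.
teams = [[1.0, 3.0], [5.0, 7.0]]
team_models = [tier_average(t) for t in teams]  # team tier
global_model = tier_average(team_models)        # global tier
w_new = personalized_step(w_local=3.0, w_team=team_models[0], grad=1.0)
```

The proximal penalty `lam * (w_local - w_team)` is one common way to couple a personalized model to a shared one; the convergence rates quoted in the abstract apply to the paper's algorithm, not to this sketch.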

Place, publisher, year, edition, pages
2024.
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:umu:diva-228859
OAI: oai:DiVA.org:umu-228859
DiVA id: diva2:1892627
Conference
ICONIP 2024, 31st International Conference on Neural Information Processing, Auckland, New Zealand, December 2-6, 2024
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Available from: 2024-08-27 Created: 2024-08-27 Last updated: 2024-08-27
In thesis
1. Advancing federated learning: algorithms and use-cases
2024 (English)Doctoral thesis, comprehensive summary (Other academic)
Alternative title [sv]
Förbättrad federerad maskininlärning: algoritmer och tillämpningar (in English: Improved federated machine learning: algorithms and applications)
Abstract [en]

Federated Learning (FL) is a distributed machine learning paradigm that enables the training of models across numerous clients or organizations without requiring the transfer of local data. This approach addresses concerns about data privacy and ownership by keeping raw data on the clients and sharing only model updates with a central server. Despite its benefits, federated learning faces unique challenges, such as data heterogeneity, computation and communication overheads, and the need for personalized models, which result in reduced model performance, lower efficiency, and longer training times.
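The model-update sharing described above is commonly implemented as a weighted average of client updates, FedAvg-style. A minimal server-side sketch, with illustrative names, assuming each client is weighted by its local dataset size:

```python
import numpy as np

def fedavg(client_updates, client_sizes):
    # Server-side aggregation: weight each client's model update by the
    # fraction of the total training data it holds, then sum. Raw data
    # never leaves the clients; only these updates are communicated.
    updates = np.asarray(client_updates, dtype=float)
    sizes = np.asarray(client_sizes, dtype=float)
    weights = sizes / sizes.sum()
    return (weights[:, None] * updates).sum(axis=0)

# Three clients with heterogeneous data volumes; 2-dimensional "models".
agg = fedavg([[1.0, 0.0], [3.0, 2.0], [5.0, 4.0]], client_sizes=[1, 1, 2])
```

With heterogeneous (non-IID) client data, this plain weighted average is exactly where the performance degradation mentioned above arises, which motivates the thesis's personalized and clustered variants.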

This thesis investigates these issues from theoretical, empirical, and practical application perspectives through four contributions: federated feature selection, adaptive client selection, model personalization, and socio-cognitive applications. First, we address the data heterogeneity problems of federated feature selection in horizontal FL by developing algorithms based on mutual information and multi-objective optimization. Second, we tackle system heterogeneity, i.e., variations in computation, storage, and communication capabilities among clients, by proposing a solution that ranks clients with multi-objective optimization for efficient, fair, and adaptive participation in model training. Third, we address client drift caused by data heterogeneity in hierarchical federated learning with a personalized federated learning approach. Lastly, we focus on two key applications that benefit from the FL framework but suffer from data heterogeneity. The first predicts the level of autobiographical memory recall of events associated with lifelog images by developing clustered personalized FL algorithms, which help select effective lifelog image cues for cognitive interventions for the clients. The second is a personal image privacy advisor for each client; alongside data heterogeneity, the privacy advisor also faces data scarcity. For it, we develop a daisy-chain-enabled clustered personalized FL algorithm that predicts whether an image should be shared, kept private, or recommended for sharing by a third party.

Our findings reveal that the proposed methods significantly outperform current state-of-the-art FL algorithms, delivering superior performance, earlier convergence, and greater training efficiency.

Place, publisher, year, edition, pages
Umeå: Umeå University, 2024. p. 84
Series
Report / UMINF, ISSN 0348-0542 ; 24.09
Keywords
Federated Learning, Federated Feature Selection, Statistical Heterogeneity, System Heterogeneity, Model Personalization, Socio-Cognitive Applications
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:umu:diva-228863
ISBN: 978-91-8070-463-2, 978-91-8070-464-9
Public defence
2024-09-23, Hörsal HUM.D.210, Humanisthuset, Umeå, 13:00 (English)
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Available from: 2024-09-02 Created: 2024-08-27 Last updated: 2024-08-28. Bibliographically approved

Open Access in DiVA

No full text in DiVA

Authority records

Banerjee, Sourasekhar; Dadras, Ali; Yurtsever, Alp; Bhuyan, Monowar H.
