Hierarchical federated transfer learning: a multi-cluster approach on the computing continuum
Vienna University of Technology, Vienna, Austria.
Umeå University, Faculty of Science and Technology, Department of Computing Science. University of Vienna, Vienna, Austria. ORCID iD: 0000-0002-2281-8183
2023 (English) In: 2023 International Conference on Machine Learning and Applications (ICMLA), IEEE, 2023, p. 1163-1168. Conference paper, Published paper (Refereed)
Abstract [en]

Federated Learning (FL) involves training models over a set of geographically distributed users. We address the problem where a single global model is not enough to meet the needs of geographically distributed, heterogeneous clients. This setup captures settings where different groups of users have their own objectives; however, users can be grouped by geographical location or task similarity, and by sharing inter-cluster knowledge they can leverage strength in numbers and better generalization to perform FL more efficiently. We introduce a Hierarchical Multi-Cluster Computing Continuum for Federated Learning Personalization (HC3FL) to cluster similar clients and train one edge model per cluster. HC3FL incorporates federated transfer learning to enhance the performance of edge models by leveraging a global model that captures collective knowledge from all edge models. Furthermore, we introduce dynamic clustering based on task similarity to handle client drift and to dynamically recluster mobile (non-stationary) clients. We evaluate the HC3FL approach through extensive experiments on real-world datasets. The results demonstrate that our approach effectively improves the performance of edge models compared to traditional FL approaches.
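To make the hierarchy described in the abstract concrete, the sketch below shows a generic client → cluster (edge) → global aggregation loop with a simple task-similarity reclustering step. It is not the authors' HC3FL algorithm: the linear-regression clients, the cosine k-means style reclustering, the 50/50 blend of edge and global weights standing in for the federated transfer step, and all function names are illustrative assumptions.

```python
# Illustrative sketch only; HC3FL's actual method is defined in the paper.
# All names (client_update, task_signature, the blend factor) are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def client_update(start_weights, local_data, lr=0.1, epochs=1):
    """Local training: full-batch gradient descent on a client's linear-regression task."""
    w = start_weights.copy()
    X, y = local_data
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def aggregate(weight_list):
    """Plain FedAvg-style aggregation: unweighted mean of model parameters."""
    return np.mean(weight_list, axis=0)

def task_signature(local_data):
    """Cheap proxy for a client's task: direction of its least-squares solution."""
    X, y = local_data
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w / (np.linalg.norm(w) + 1e-12)

def recluster(clients, n_clusters):
    """Group clients with similar task signatures (cosine-similarity k-means)."""
    sigs = np.stack([task_signature(d) for d in clients])
    centroids = sigs[rng.choice(len(clients), n_clusters, replace=False)]
    for _ in range(10):
        labels = np.argmax(sigs @ centroids.T, axis=1)
        centroids = np.stack([
            sigs[labels == k].mean(axis=0) if np.any(labels == k) else centroids[k]
            for k in range(n_clusters)
        ])
    return labels

# Synthetic non-IID clients: two underlying tasks, several clients each.
dim, n_clients, n_clusters = 5, 8, 2
true_tasks = [rng.normal(size=dim), rng.normal(size=dim)]
clients = []
for i in range(n_clients):
    X = rng.normal(size=(50, dim))
    y = X @ true_tasks[i % 2] + 0.1 * rng.normal(size=50)
    clients.append((X, y))

global_w = np.zeros(dim)
edge_w = [global_w.copy() for _ in range(n_clusters)]

for rnd in range(20):
    labels = recluster(clients, n_clusters)          # dynamic clustering each round
    for k in range(n_clusters):
        members = [clients[i] for i in range(n_clients) if labels[i] == k]
        if not members:
            continue
        # Edge model starts from a blend of its own state and the global model,
        # a simple stand-in for transferring global knowledge to each cluster.
        start = 0.5 * edge_w[k] + 0.5 * global_w
        edge_w[k] = aggregate([client_update(start, d) for d in members])
    global_w = aggregate(edge_w)                     # cloud-level aggregation

print("global model:", np.round(global_w, 2))
```

The sketch omits neural models, the computing-continuum deployment, and the paper's actual transfer-learning and drift-handling mechanisms; it only illustrates the two-level aggregation and similarity-based reclustering pattern the abstract names.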

Place, publisher, year, edition, pages
IEEE, 2023. p. 1163-1168
Series
International Conference on Emerging Technologies and Factory Automation proceedings, ISSN 1946-0740, E-ISSN 1946-0759
Keywords [en]
dynamic clustering, federated transfer learning, hierarchical collaborative learning
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:umu:diva-223681
DOI: 10.1109/ICMLA58977.2023.00174
Scopus ID: 2-s2.0-85190111400
ISBN: 9798350345346 (electronic)
ISBN: 9798350318913 (print)
OAI: oai:DiVA.org:umu-223681
DiVA, id: diva2:1853851
Conference
22nd IEEE International Conference on Machine Learning and Applications, ICMLA 2023, Jacksonville, USA, December 15-17, 2023
Available from: 2024-04-23. Created: 2024-04-23. Last updated: 2024-07-02. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Aral, Atakan
