Accelerating convergence in wireless federated learning by sharing marginal data
Umeå University, Faculty of Science and Technology, Department of Computing Science. ORCID iD: 0000-0003-2514-3043
Sungkyunkwan University, Computer Science and Engineering Department, Suwon, South Korea.
Umeå University, Faculty of Science and Technology, Department of Computing Science. ORCID iD: 0000-0002-2633-6798
2023 (English). In: 2023 International Conference on Information Networking (ICOIN), IEEE, 2023, p. 122-127. Conference paper, Published paper (Refereed)
Abstract [en]

Deploying federated learning (FL) over wireless mobile networks can be expensive because of the cost of wireless communication resources. Efforts have been made to reduce communication costs by accelerating model convergence, leading to the development of model-driven methods based on feature extraction, model-integrated algorithms, and client selection. However, the resulting performance gains are limited by the dependence of neural network convergence on input data quality. This work, therefore, investigates the use of marginal shared data (e.g., a single data entry) to accelerate model convergence and thereby reduce communication costs in FL. Experimental results show that sharing even a single piece of data can improve performance by 14.6% and reduce communication costs by 61.13% when using the federated averaging algorithm (FedAvg). Marginal data sharing could therefore be an attractive and practical solution in privacy-flexible environments or collaborative operational systems such as fog robotics and vehicles. Moreover, by assigning new labels to the shared data, it is possible to extend the number of classifying labels of an FL model even when the initial input datasets lack the labels in question.
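The mechanism the abstract describes — appending a marginal amount of globally shared data (as little as a single entry) to every client's local dataset before running FedAvg — can be illustrated with a toy sketch. This is not the authors' implementation: the logistic-regression model, the IID synthetic data, and all function names below are illustrative assumptions (real FL settings are typically non-IID, which is where shared data helps most).

```python
import numpy as np

# Toy FedAvg sketch with "marginal data sharing": one globally shared
# sample is appended to each client's local data before local training.
# All names and the synthetic setup are illustrative assumptions.

rng = np.random.default_rng(0)
d = 5  # feature dimension

def make_client_data(n=50):
    """Synthetic client data: labels from a common linear rule."""
    X = rng.normal(size=(n, d))
    y = (X @ np.ones(d) > 0).astype(float)
    return X, y

def local_train(w, X, y, lr=0.1, epochs=5):
    """A few epochs of gradient descent on one client's data."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))    # sigmoid predictions
        w = w - lr * X.T @ (p - y) / len(y)   # logistic-loss gradient
    return w

def fedavg_round(w_global, clients, shared=None):
    """One FedAvg round; optionally append shared data to every client."""
    updates, sizes = [], []
    for X, y in clients:
        if shared is not None:                # marginal data sharing
            Xs, ys = shared
            X, y = np.vstack([X, Xs]), np.concatenate([y, ys])
        updates.append(local_train(w_global.copy(), X, y))
        sizes.append(len(y))
    # Standard FedAvg: weight client updates by local dataset size.
    return np.average(updates, axis=0, weights=np.array(sizes, float))

clients = [make_client_data() for _ in range(4)]
Xs = rng.normal(size=(1, d))                  # a single shared data entry
shared = (Xs, (Xs @ np.ones(d) > 0).astype(float))

w = np.zeros(d)
for _ in range(20):
    w = fedavg_round(w, clients, shared=shared)
```

In the paper's setting the shared entry would be chosen (and possibly relabeled, as the abstract notes for extending the label set) rather than random, and clients would hold non-IID partitions; the sketch only shows where the shared data enters the FedAvg loop.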

Place, publisher, year, edition, pages
IEEE, 2023. p. 122-127
Series
International conference on information networking, ISSN 1976-7684
Keywords [en]
data sharing, edge computing, federated learning, wireless mobile network
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:umu:diva-205646
DOI: 10.1109/ICOIN56518.2023.10048937
ISI: 000981938900023
Scopus ID: 2-s2.0-85149182136
ISBN: 9781665462686 (electronic)
OAI: oai:DiVA.org:umu-205646
DiVA, id: diva2:1742963
Conference
37th International Conference on Information Networking, ICOIN 2023, January 11-14, 2023
Available from: 2023-03-13. Created: 2023-03-13. Last updated: 2023-09-05. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text · Scopus

Authority records

Seo, Eunil; Elmroth, Erik

