Accelerating convergence in wireless federated learning by sharing marginal data
Umeå University, Faculty of Science and Technology, Department of Computing Science. ORCID iD: 0000-0003-2514-3043
Sungkyunkwan University, Computer Science and Engineering Department, Suwon, South Korea.
Umeå University, Faculty of Science and Technology, Department of Computing Science. ORCID iD: 0000-0002-2633-6798
2023 (English). In: 2023 International Conference on Information Networking (ICOIN), IEEE, 2023, pp. 122-127. Conference paper, published paper (refereed)
Abstract [en]

Deploying federated learning (FL) over wireless mobile networks can be expensive because of the cost of wireless communication resources. Efforts have been made to reduce communication costs by accelerating model convergence, leading to the development of model-driven methods based on feature extraction, model-integrated algorithms, and client selection. However, the resulting performance gains are limited by the dependence of neural network convergence on input data quality. This work, therefore, investigates the use of marginal shared data (e.g., a single data entry) to accelerate model convergence and thereby reduce communication costs in FL. Experimental results show that sharing even a single piece of data can improve performance by 14.6% and reduce communication costs by 61.13% when using the federated averaging algorithm (FedAvg). Marginal data sharing could therefore be an attractive and practical solution in privacy-flexible environments or collaborative operational systems such as fog robotics and vehicles. Moreover, by assigning new labels to the shared data, it is possible to extend the set of classification labels of an FL model even when the initial input datasets lack the labels in question.
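To make the idea concrete, below is a minimal sketch of federated averaging in which every client appends one globally shared sample to its local dataset before local training, which is the kind of marginal data sharing the abstract describes. The toy model (logistic regression trained with plain gradient descent), the client setup, and all names such as local_sgd and fedavg are illustrative assumptions, not the authors' implementation or experimental setup.

# Minimal FedAvg sketch with one globally shared ("marginal") sample per client.
# The toy task and all names are illustrative assumptions, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

def local_sgd(w, X, y, lr=0.1, epochs=5):
    """Run a few epochs of full-batch gradient descent for logistic loss on one client."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # sigmoid predictions
        w = w - lr * X.T @ (p - y) / len(y)    # gradient step on the logistic loss
    return w

def fedavg(client_data, shared_x=None, shared_y=None, rounds=20, dim=10):
    """FedAvg; optionally append one shared sample to every client's local data."""
    w_global = np.zeros(dim)
    for _ in range(rounds):
        updates, sizes = [], []
        for X, y in client_data:
            if shared_x is not None:           # marginal data sharing
                X = np.vstack([X, shared_x])
                y = np.append(y, shared_y)
            updates.append(local_sgd(w_global.copy(), X, y))
            sizes.append(len(y))
        # FedAvg aggregation: average of client models weighted by local data size.
        w_global = np.average(updates, axis=0, weights=sizes)
    return w_global

# Toy setup: 5 clients, each holding 20 samples of a 10-dimensional linear task.
true_w = rng.normal(size=10)
def make_client(n=20):
    X = rng.normal(size=(n, 10))
    y = (X @ true_w > 0).astype(float)
    return X, y

clients = [make_client() for _ in range(5)]
shared_x, shared_y = rng.normal(size=(1, 10)), np.array([1.0])
w_with_sharing = fedavg(clients, shared_x, shared_y)
w_without_sharing = fedavg(clients)

Under this assumed setup, comparing the number of rounds needed to reach a target accuracy with and without the shared sample yields the kind of convergence and communication-cost comparison the abstract reports.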

Place, publisher, year, edition, pages
IEEE, 2023, pp. 122-127
Series
International Conference on Information Networking, ISSN 1976-7684
Keywords [en]
data sharing, edge computing, federated learning, wireless mobile network
National subject category
Computer Sciences
Identifiers
URN: urn:nbn:se:umu:diva-205646
DOI: 10.1109/ICOIN56518.2023.10048937
ISI: 000981938900023
Scopus ID: 2-s2.0-85149182136
ISBN: 9781665462686 (digital)
OAI: oai:DiVA.org:umu-205646
DiVA id: diva2:1742963
Conference
37th International Conference on Information Networking, ICOIN 2023, January 11-14, 2023
Available from: 2023-03-13 Created: 2023-03-13 Last updated: 2023-09-05 Bibliographically reviewed

Open Access in DiVA

Full text not available in DiVA

Other links

Publisher's full text: Scopus

Person

Seo, Eunil; Elmroth, Erik
