Federated Frank-Wolfe Algorithm
Dadras, Ali. Umeå University, Faculty of Science and Technology, Department of Mathematics and Mathematical Statistics.
Banerjee, Sourasekhar. Umeå University, Faculty of Science and Technology, Department of Computing Science. ORCID iD: 0000-0002-3451-2851
Prakhya, Karthik. Umeå University, Faculty of Science and Technology, Department of Mathematics and Mathematical Statistics.
Yurtsever, Alp. Umeå University, Faculty of Science and Technology, Department of Mathematics and Mathematical Statistics. ORCID iD: 0000-0001-7320-1506
2024 (English). In: Machine Learning and Knowledge Discovery in Databases. Research Track: European Conference, ECML PKDD 2024, Vilnius, Lithuania, September 9–13, 2024, Proceedings, Part III / [ed] Albert Bifet; Jesse Davis; Tomas Krilavičius; Meelis Kull; Eirini Ntoutsi; Indrė Žliobaitė, 2024. Conference paper, Published paper (Refereed).
Abstract [en]

Federated learning (FL) has gained a lot of attention in recent years for building privacy-preserving collaborative learning systems. However, FL algorithms for constrained machine learning problems are still limited, particularly when the projection step is costly. To this end, we propose a Federated Frank-Wolfe Algorithm (FedFW). FedFW features data privacy, low per-iteration cost, and communication of sparse signals. In the deterministic setting, FedFW achieves an ε-suboptimal solution within O(ε⁻²) iterations for smooth and convex objectives, and O(ε⁻³) iterations for smooth but non-convex objectives. Furthermore, we present a stochastic variant of FedFW and show that it finds a solution within O(ε⁻³) iterations in the convex setting. We demonstrate the empirical performance of FedFW on several machine learning tasks.
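
For intuition, below is a minimal Python sketch of a federated Frank-Wolfe round. It assumes an ℓ1-ball constraint, synthetic least-squares clients, and plain averaging of the clients' sparse Frank-Wolfe vertices; these are illustrative assumptions, not the paper's exact FedFW update.

# Generic federated Frank-Wolfe sketch over an l1-ball constraint.
# Illustrative only: the toy least-squares clients, the l1-ball radius,
# and the averaging of client directions are assumptions, not the
# paper's exact FedFW update.
import numpy as np

def lmo_l1_ball(grad, radius=1.0):
    """Linear minimization oracle for the l1 ball:
    returns a 1-sparse vertex minimizing <grad, s> over ||s||_1 <= radius."""
    s = np.zeros_like(grad)
    i = np.argmax(np.abs(grad))
    s[i] = -radius * np.sign(grad[i])
    return s

def client_direction(A, b, x, radius):
    """Each client evaluates the gradient of its local loss 0.5*||Ax - b||^2
    and returns the corresponding Frank-Wolfe vertex (a sparse signal)."""
    grad = A.T @ (A @ x - b)
    return lmo_l1_ball(grad, radius)

def federated_frank_wolfe(clients, dim, radius=1.0, num_rounds=100):
    x = np.zeros(dim)  # feasible start inside the l1 ball
    for t in range(num_rounds):
        gamma = 2.0 / (t + 2)  # standard Frank-Wolfe step size
        # Clients send sparse vertices; the server only averages them.
        s_avg = np.mean([client_direction(A, b, x, radius) for A, b in clients], axis=0)
        x = (1 - gamma) * x + gamma * s_avg  # convex combination: no projection needed
    return x

# Toy usage with a few synthetic least-squares clients.
rng = np.random.default_rng(0)
dim = 20
clients = [(rng.standard_normal((30, dim)), rng.standard_normal(30)) for _ in range(4)]
x_hat = federated_frank_wolfe(clients, dim, radius=5.0)
print("solution l1 norm:", np.abs(x_hat).sum())

Because each client transmits only a 1-sparse vertex of the ℓ1 ball and the iterate is updated by a convex combination rather than a projection, the round is projection-free and the communicated signals are sparse, which mirrors the properties highlighted in the abstract.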

Place, publisher, year, edition, pages
2024.
Series
Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349 ; 14943
Keywords [en]
federated learning, Frank-Wolfe, conditional gradient method, projection-free, distributed optimization
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:umu:diva-228614
OAI: oai:DiVA.org:umu-228614
DiVA, id: diva2:1890479
Conference
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD 2024), Vilnius, Lithuania, September 9-13, 2024
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Swedish Research Council, 2023-05476
Note

Also part of the book subseries: Lecture Notes in Artificial Intelligence (LNAI).

Available from: 2024-08-19 Created: 2024-08-19 Last updated: 2024-08-20

Open Access in DiVA

No full text in DiVA

Authority records

Dadras, Ali; Banerjee, Sourasekhar; Prakhya, Karthik; Yurtsever, Alp
