Federated Frank-Wolfe algorithm
Umeå University, Faculty of Science and Technology, Department of Mathematics and Mathematical Statistics.
Umeå University, Faculty of Science and Technology, Department of Computing Science. ORCID iD: 0000-0002-3451-2851
Umeå University, Faculty of Science and Technology, Department of Mathematics and Mathematical Statistics.
Umeå University, Faculty of Science and Technology, Department of Mathematics and Mathematical Statistics. ORCID iD: 0000-0001-7320-1506
2024 (English) In: Machine learning and knowledge discovery in databases. Research track: European Conference, ECML PKDD 2024, Vilnius, Lithuania, September 9–13, 2024, proceedings, part III / [ed] Albert Bifet; Jesse Davis; Tomas Krilavičius; Meelis Kull; Eirini Ntoutsi; Indrė Žliobaitė, Springer Nature, 2024, pp. 58-75. Conference paper, Published paper (Refereed)
Abstract [en]

Federated learning (FL) has gained a lot of attention in recent years for building privacy-preserving collaborative learning systems. However, FL algorithms for constrained machine learning problems are still limited, particularly when the projection step is costly. To this end, we propose a Federated Frank-Wolfe Algorithm (FedFW). FedFW features data privacy, low per-iteration cost, and communication of sparse signals. In the deterministic setting, FedFW achieves an ε-suboptimal solution within O(ε⁻²) iterations for smooth and convex objectives, and O(ε⁻³) iterations for smooth but non-convex objectives. Furthermore, we present a stochastic variant of FedFW and show that it finds a solution within O(ε⁻³) iterations in the convex setting. We demonstrate the empirical performance of FedFW on several machine learning tasks.
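
The abstract gives no pseudocode, but the two mechanisms it highlights, projection-free updates through a linear minimization oracle (LMO) and communication of sparse signals, can be illustrated with a generic sketch. The Python/NumPy snippet below is a minimal, hypothetical federated Frank-Wolfe-style round over an ℓ1-ball constraint; the constraint set, the classic 2/(t+2) step size, and plain server-side averaging are textbook choices assumed here for illustration, not the FedFW algorithm as specified in the paper.

import numpy as np

def lmo_l1_ball(grad, radius=1.0):
    # Linear minimization oracle over the l1 ball of the given radius:
    # argmin over ||s||_1 <= radius of <grad, s>, which is a signed,
    # 1-sparse vertex along the largest-magnitude gradient coordinate.
    s = np.zeros_like(grad)
    i = int(np.argmax(np.abs(grad)))
    s[i] = -radius * np.sign(grad[i])
    return s

def fw_round(client_grad_fns, x, t, radius=1.0):
    # One illustrative round: each client calls the LMO on its local
    # gradient (a 1-sparse vertex, hence a sparse signal to communicate),
    # the server averages the vertices and takes a Frank-Wolfe step
    # with the classic gamma = 2 / (t + 2) step size.
    vertices = [lmo_l1_ball(g(x), radius) for g in client_grad_fns]
    s_avg = np.mean(vertices, axis=0)
    gamma = 2.0 / (t + 2.0)
    return (1.0 - gamma) * x + gamma * s_avg

# Toy usage: two clients with quadratic losses f_i(x) = 0.5 * ||x - b_i||^2,
# minimized over the unit l1 ball, starting from the feasible point x = 0.
targets = [np.array([1.0, 0.0]), np.array([0.0, -1.0])]
client_grad_fns = [lambda x, b=b: x - b for b in targets]
x = np.zeros(2)
for t in range(50):
    x = fw_round(client_grad_fns, x, t)

Because each iterate is a convex combination of the previous iterate and averaged vertices of the constraint set, it stays feasible without any projection, which is the property the abstract emphasizes for problems with costly projections.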

Place, publisher, year, edition, pages
Springer Nature, 2024. pp. 58-75
Series
Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349 ; 14943
Keywords [en]
federated learning, Frank-Wolfe, conditional gradient method, projection-free, distributed optimization
National subject category
Computer Sciences
Identifiers
URN: urn:nbn:se:umu:diva-228614; DOI: 10.1007/978-3-031-70352-2_4; ISBN: 978-3-031-70351-5 (print); ISBN: 978-3-031-70352-2 (digital); OAI: oai:DiVA.org:umu-228614; DiVA id: diva2:1890479
Conference
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD 2024), Vilnius, Lithuania, September 9-13, 2024
Research funder
Wallenberg AI, Autonomous Systems and Software Program (WASP); Swedish Research Council (Vetenskapsrådet), 2023-05476
Note

Also part of the book sub series: Lecture Notes in Artificial Intelligence (LNAI). 

Available from: 2024-08-19 Created: 2024-08-19 Last updated: 2025-01-23 Bibliographically approved
In thesis
1. Personalized models and optimization in federated learning
2025 (English) Doctoral thesis, comprehensive summary (Other academic)
Alternative title [sv]
Personanpassade modeller och optimering inom federerad inlärning
Abstract [en]

The rapid increase in data generation, combined with the impracticality of centralizing large-scale datasets and the growing complexity of machine learning tasks, has driven the development of distributed learning techniques. Among these, Federated Learning (FL) has gained significant attention due to its privacy-preserving approach, where multiple clients collaboratively train a global model without sharing their local data. However, FL faces several key challenges, including data heterogeneity, high computational costs, and communication inefficiencies. These issues become more pronounced in real-world scenarios where client data distributions are non-IID, computational resources are limited, and communication is constrained.

This thesis addresses these challenges through the development of efficient algorithms for Personalized Federated Learning (pFL) and Constrained Federated Learning. The proposed approaches are designed to handle heterogeneous data, minimize computational overhead, and reduce communication costs while maintaining strong theoretical guarantees.

Specifically, the thesis introduces three key contributions: (1) pFLMF, a novel pFL formulation based on low-rank matrix optimization, leveraging Burer-Monteiro factorization to enable personalization without relying on predefined distance metrics. (2) PerMFL, an algorithm for multi-tier pFL that introduces personalized decision variables for both teams and individual devices, enabling efficient optimization in scenarios with hierarchical client structures. (3) FedFW, a projection-free algorithm for constrained FL, which emphasizes low computational cost, privacy preservation, and communication efficiency through sparse signal exchanges.
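
For contribution (1), the Burer-Monteiro idea can be illustrated independently of the pFLMF formulation: instead of optimizing a matrix W under a rank constraint, one optimizes factors U and V with W = U Vᵀ, so the low-rank structure holds by construction. The Python/NumPy sketch below is a generic, single-machine least-squares instance; the objective, dimensions, and hyperparameters are illustrative assumptions, not the thesis's method.

import numpy as np

def burer_monteiro_fit(M, rank=1, steps=500, lr=0.05, seed=0):
    # Generic Burer-Monteiro sketch: gradient descent on the factors of
    # W = U @ V.T for the objective 0.5 * ||U @ V.T - M||_F^2, so
    # rank(W) <= rank is enforced by the parameterization rather than
    # by a projection or an explicit rank constraint.
    n, m = M.shape
    rng = np.random.default_rng(seed)
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((m, rank))
    for _ in range(steps):
        R = U @ V.T - M                      # residual of the current fit
        U, V = U - lr * (R @ V), V - lr * (R.T @ U)
    return U, V

# Toy usage: fit a rank-1 approximation to a small matrix.
M = np.outer(np.array([1.0, 0.5, -0.5]), np.array([1.0, -1.0]))
U, V = burer_monteiro_fit(M, rank=1)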

By addressing critical issues in FL, such as data heterogeneity, computation costs, and communication bottlenecks, the proposed algorithms advance the field of Federated Learning, providing robust and scalable solutions for real-world applications. 

Place, publisher, year, edition, pages
Umeå: Umeå University, 2025. p. 30
Series
Research report in mathematical statistics, ISSN 1653-0829
Keywords
Federated Learning, Machine Learning, Optimization
National subject category
Computational Mathematics
Identifiers
urn:nbn:se:umu:diva-234464 (URN); 9789180706001 (ISBN); 9789180705998 (ISBN)
Public defence
2025-02-12, UB.A.220, Samhällsvetarhuset, 09:00 (English)
Opponent
Supervisors
Research funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Available from: 2025-01-29 Created: 2025-01-23 Last updated: 2025-01-27 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text

Authors

Dadras, Ali; Banerjee, Sourasekhar; Prakhya, Karthik; Yurtsever, Alp
