Umeå University: umu.se Publications
Faster One-Sample Stochastic Conditional Gradient Method for Composite Convex Minimization
2022 (English). In: Proceedings of The 25th International Conference on Artificial Intelligence and Statistics / [ed] Gustau Camps-Valls; Francisco J. R. Ruiz; Isabel Valera. PMLR, 2022. Conference paper, Published paper (Refereed)
Abstract [en]

We propose a stochastic conditional gradient method (CGM) for minimizing convex finite-sum objectives formed as a sum of smooth and non-smooth terms. Existing CGM variants for this template either suffer from slow convergence rates, or require carefully increasing the batch size over the course of the algorithm's execution, which leads to computing full gradients. In contrast, the proposed method, equipped with a stochastic average gradient (SAG) estimator, requires only one sample per iteration. Nevertheless, it guarantees fast convergence rates on par with more sophisticated variance reduction techniques. In applications we put special emphasis on problems with a large number of separable constraints. Such problems are prevalent among semidefinite programming (SDP) formulations arising in machine learning and theoretical computer science. We provide numerical experiments on matrix completion, unsupervised clustering, and sparsest-cut SDPs.
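To make the abstract's idea concrete, here is an illustrative sketch (not the paper's exact algorithm or guarantees): a Frank-Wolfe loop that draws one sample per iteration and maintains a SAG-style gradient table, applied to a toy least-squares finite sum over the unit simplex. The data (A, b), the simplex constraint, and the 2/(t+2) step-size rule are demo assumptions chosen for self-containment.

```python
import numpy as np

# Toy setup: minimize f(x) = (1/n) * sum_i 0.5*(a_i^T x - b_i)^2 over the
# unit simplex. With b = A[:, 0], the vertex e_0 attains objective 0.
rng = np.random.default_rng(0)
n, d = 50, 10
A = rng.standard_normal((n, d))
b = A[:, 0]

def objective(x):
    return 0.5 * np.mean((A @ x - b) ** 2)

x = np.ones(d) / d                 # feasible start: uniform point on the simplex
grad_table = np.zeros((n, d))      # SAG memory: last gradient seen for each f_i
grad_avg = grad_table.mean(axis=0)

for t in range(2000):
    j = rng.integers(n)                      # one sample per iteration
    g_new = A[j] * (A[j] @ x - b[j])         # fresh gradient of f_j at x
    grad_avg += (g_new - grad_table[j]) / n  # maintain the table average in O(d)
    grad_table[j] = g_new
    k = np.argmin(grad_avg)                  # LMO over the simplex: best vertex
    v = np.zeros(d)
    v[k] = 1.0
    gamma = 2.0 / (t + 2)                    # classical Frank-Wolfe step size
    x = (1 - gamma) * x + gamma * v          # convex combination stays feasible

print(objective(np.ones(d) / d), objective(x))
```

Note the contrast the abstract draws: the gradient estimator here touches a single summand per step (no growing batches, no full-gradient passes), while the averaged table plays the variance-reduction role.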

Place, publisher, year, edition, pages
PMLR, 2022.
Series
Proceedings of Machine Learning Research, ISSN 2640-3498; 151
Keywords [en]
conditional gradient method, Frank-Wolfe, convex optimization, composite optimization, stochastic optimization, stochastic constraints, variance reduction, stochastic average gradient, semidefinite programming
National Category
Computational Mathematics
Identifiers
URN: urn:nbn:se:umu:diva-192720
Scopus ID: 2-s2.0-85163122575
OAI: oai:DiVA.org:umu-192720
DiVA id: diva2:1641349
Conference
25th International Conference on Artificial Intelligence and Statistics (AISTATS) 2022, Virtual, March 28-30, 2022
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP)
Available from: 2022-03-01 Created: 2022-03-01 Last updated: 2024-07-02
Bibliographically approved

Open Access in DiVA

fulltext (1588 kB), 131 downloads
File information
File name: FULLTEXT01.pdf
File size: 1588 kB
Checksum: SHA-512
02a793b723f54e88320080d1f55a5d24bd422bab36ea53a4f1da51adfb8bff1a29621f4facdf59f40356685add475e2412bc81daafb0482c9fd3e3289a24b44c
Type: fulltext
Mimetype: application/pdf

Other links

Scopus
Full text

Authority records

Yurtsever, Alp

Search in DiVA

By author/editor
Yurtsever, Alp
By organisation
Department of Mathematics and Mathematical Statistics
Computational Mathematics

Search outside of DiVA

Google
Google Scholar
Total: 131 downloads
The number of downloads is the sum of all downloads of full texts. It may include, for example, previous versions that are no longer available.

Total: 423 hits