SwePub
Search the SwePub database


Result list for the search "WFRF:(Locatello Francesco)"

  • Results 1-2 of 2
1.
  • Dresdner, Gideon, et al. (authors)
  • Faster One-Sample Stochastic Conditional Gradient Method for Composite Convex Minimization
  • 2022
  • In: Proceedings of The 25th International Conference on Artificial Intelligence and Statistics. PMLR.
  • Conference paper (peer-reviewed), abstract:
    • We propose a stochastic conditional gradient method (CGM) for minimizing convex finite-sum objectives formed as a sum of smooth and non-smooth terms. Existing CGM variants for this template either suffer from slow convergence rates, or require carefully increasing the batch size over the course of the algorithm's execution, which leads to computing full gradients. In contrast, the proposed method, equipped with a stochastic average gradient (SAG) estimator, requires only one sample per iteration. Nevertheless, it guarantees fast convergence rates on par with more sophisticated variance reduction techniques. In applications we put special emphasis on problems with a large number of separable constraints. Such problems are prevalent among semidefinite programming (SDP) formulations arising in machine learning and theoretical computer science. We provide numerical experiments on matrix completion, unsupervised clustering, and sparsest-cut SDPs.
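The abstract above describes the method in words; the following is a minimal sketch of the core idea, assuming least-squares components f_i(x) = 0.5*(a_i . x - b_i)^2 and the unit simplex as the constraint set. The function names, step-size schedule, and problem instance are illustrative assumptions, not taken from the paper or its code.

```python
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle over the unit simplex: the minimizer
    of <grad, s> is the vertex at grad's smallest coordinate."""
    s = np.zeros_like(grad)
    s[np.argmin(grad)] = 1.0
    return s

def sag_cgm(A, b, iters=2000, seed=0):
    """One-sample stochastic CGM sketch with a SAG gradient estimator
    for f(x) = (1/n) * sum_i 0.5*(a_i . x - b_i)^2 over the simplex."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.full(d, 1.0 / d)              # feasible start: simplex center
    table = np.zeros((n, d))             # last stored gradient per sample
    g_avg = np.zeros(d)                  # running mean of the table
    for k in range(iters):
        i = rng.integers(n)              # one sample per iteration
        g_new = (A[i] @ x - b[i]) * A[i]    # grad of 0.5*(a_i . x - b_i)^2
        g_avg += (g_new - table[i]) / n     # SAG update of the mean
        table[i] = g_new
        gamma = 2.0 / (k + 2.0)          # classic Frank-Wolfe step size
        x += gamma * (lmo_simplex(g_avg) - x)   # stays in the simplex
    return x

# Example: a random least-squares instance.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 10))
b = rng.standard_normal(200)
x_hat = sag_cgm(A, b)
```

The SAG table is what lets one sample per iteration suffice: the direction-finding step always uses an estimate of the full gradient, while each iteration only pays O(d) to refresh that estimate.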
2.
  • Locatello, Francesco, et al. (authors)
  • Stochastic Frank-Wolfe for Composite Convex Minimization
  • 2019
  • In: Advances in Neural Information Processing Systems 32 (NeurIPS 2019).
  • Conference paper (peer-reviewed), abstract:
    • A broad class of convex optimization problems can be formulated as a semidefinite program (SDP): minimization of a convex function over the positive-semidefinite cone subject to some affine constraints. The majority of classical SDP solvers are designed for the deterministic setting where problem data is readily available. In this setting, generalized conditional gradient methods (aka Frank-Wolfe-type methods) provide scalable solutions by leveraging the so-called linear minimization oracle instead of the projection onto the semidefinite cone. Most problems in machine learning and modern engineering applications, however, contain some degree of stochasticity. In this work, we propose the first conditional-gradient-type method for solving stochastic optimization problems under affine constraints. Our method guarantees an O(k^{-1/3}) convergence rate in expectation on the objective residual and O(k^{-5/12}) on the feasibility gap.
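As an illustration of a conditional-gradient-type step in the stochastic setting, here is a minimal sketch of stochastic Frank-Wolfe over the spectrahedron {X psd, tr X = 1}, using the momentum-averaged gradient estimator standard in the stochastic FW literature. It deliberately omits the paper's treatment of the nonsmooth affine-constraint term, and the names, schedules, and example problem are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def lmo_spectrahedron(G):
    """LMO over {X >= 0, tr X = 1}: the minimizer of <G, X> is v v^T,
    where v is an eigenvector of the smallest eigenvalue of G."""
    w, V = np.linalg.eigh((G + G.T) / 2.0)   # symmetrize, then factor
    v = V[:, 0]                              # eigenvalues are ascending
    return np.outer(v, v)

def stochastic_fw(stoch_grad, d, iters=500, seed=0):
    """Stochastic Frank-Wolfe sketch with momentum-averaged gradients."""
    rng = np.random.default_rng(seed)
    X = np.eye(d) / d                        # feasible start
    D = np.zeros((d, d))                     # averaged gradient estimate
    for k in range(1, iters + 1):
        rho = 4.0 / (k + 8.0) ** (2.0 / 3.0)     # averaging weight in (0, 1]
        D = (1.0 - rho) * D + rho * stoch_grad(X, rng)
        gamma = 2.0 / (k + 1.0)              # diminishing step size
        X += gamma * (lmo_spectrahedron(D) - X)  # stays in the spectrahedron
    return X

# Example: f(X) = 0.5 * ||X - M||_F^2 observed through noisy gradients.
d = 20
M = np.eye(d) / d
noisy_grad = lambda X, rng: X - M + 0.01 * rng.standard_normal((d, d))
X_hat = stochastic_fw(noisy_grad, d)
```

Averaging the stochastic gradients tames their variance without growing batch sizes; rates of the O(k^{-1/3}) type quoted in the abstract are typical for this estimator in the stochastic FW literature.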
Type of publication
conference paper (2)
Type of content
peer-reviewed (2)
Author/editor
Cevher, Volkan (2)
Yurtsever, Alp (2)
Locatello, Francesco (2)
Rätsch, Gunnar (1)
Dresdner, Gideon (1)
Vladarean, Maria-Lui ... (1)
Fercoq, Olivier (1)
University
Umeå universitet (2)
Language
English (2)
Research subject (UKÄ/SCB)
Natural sciences (2)
