SAMOVAR
Telecom SudParis
9 rue Charles Fourier
91011 EVRY CEDEX

Fax: +33 (0)1 60 76 20 80

Sholom SCHECHTMAN

Maître de Conférences (Associate Professor)
SOP

sholom.schechtman@telecom-sudparis.eu

Journal articles

2023

Pascal Bianchi, Walid Hachem, Sholom Schechtman. Stochastic Subgradient Descent Escapes Active Strict Saddles on Weakly Convex Functions. Mathematics of Operations Research, in press. ⟨10.1287/moor.2021.0194⟩. ⟨hal-03442137v3⟩
Full text: https://hal.science/hal-03442137/file/tame.pdf

2022

Pascal Bianchi, Walid Hachem, Sholom Schechtman. Convergence of constant step stochastic gradient descent for non-smooth non-convex functions. Set-Valued and Variational Analysis, 2022, 30 (3), pp. 1117-1147. ⟨10.1007/s11228-022-00638-z⟩. ⟨hal-02564349v3⟩
Full text: https://hal.science/hal-02564349/file/clarke.pdf

Sholom Schechtman. Stochastic proximal subgradient descent oscillates in the vicinity of its accumulation set. Optimization Letters, 2022. ⟨10.1007/s11590-022-01884-8⟩. ⟨hal-03676675⟩
Full text: https://hal.science/hal-03676675/file/oscillations.pdf

2021

Anas Barakat, Pascal Bianchi, Walid Hachem, Sholom Schechtman. Stochastic optimization with momentum: convergence, fluctuations, and traps avoidance. Electronic Journal of Statistics, 2021, 15 (2), pp. 3892-3947. ⟨10.1214/21-EJS1880⟩. ⟨hal-03310455⟩
Full text: https://hal.science/hal-03310455/file/20adam.pdf

Conference papers

2023

Louis Leconte, Sholom Schechtman, Eric Moulines. ASkewSGD: an annealed interval-constrained optimisation method to train quantized neural networks. The 26th International Conference on Artificial Intelligence and Statistics (AISTATS), Apr 2023, Valencia, Spain. pp. 3644-3663. ⟨hal-04527690⟩
Full text: https://hal.science/hal-04527690/file/leconte23a.pdf

Louis Leconte, Sholom Schechtman, Eric Moulines. ASkewSGD: an annealed interval-constrained optimisation method to train quantized neural networks. 26th International Conference on Artificial Intelligence and Statistics, Apr 2023, Valencia, Spain. pp. 3644-3663. ⟨hal-04063706⟩
Full text: https://hal.science/hal-04063706/file/2211.03741.pdf
Sholom Schechtman, Daniil Tiapkin, Michael Muehlebach, Éric Moulines. Orthogonal directions constrained gradient method: from non-linear equality constraints to Stiefel manifold. The 36th Annual Conference on Learning Theory (COLT 2023), Jul 2023, Bangalore, India. pp. 1228-1258. ⟨hal-04273789⟩
Full text: https://arxiv.org/pdf/2303.09261

Thesis

2021

Sholom Schechtman. Some Problems in Nonconvex Stochastic Optimization. Numerical Analysis [math.NA]. Université Gustave Eiffel, 2021. English. ⟨NNT: 2021UEFL2031⟩. ⟨tel-03698454⟩
Full text: https://theses.hal.science/tel-03698454/file/TH2021UEFL2031.pdf

Preprints and working papers

2024

Sholom Schechtman. The gradient's limit of a definable family of functions is a conservative set-valued field. 2024. ⟨hal-04452981⟩
Full text: https://hal.science/hal-04452981/file/lim_consv3.pdf

2023

Evgenii Chzhen, Sholom Schechtman. SignSVRG: fixing signSGD via variance reduction. 2023. ⟨hal-04112556⟩
Full text: https://arxiv.org/pdf/2305.13187