My name is Stefano Spigler. I am a postdoc in the Laboratory of Physics of Complex Systems at the École Polytechnique Fédérale de Lausanne (EPFL, Lausanne, Switzerland). My funding comes from the Simons Collaboration on Cracking the Glass Problem.
My current research focuses on the loss landscape and the learning dynamics of deep neural networks, as well as the asymptotic behavior of the generalization error for large networks and large datasets.


Education & academic positions

Download the full CV (pdf)
2017-2020 Postdoc
Energy landscape and learning dynamics in deep learning
In collaboration with: Matthieu Wyart & Giulio Biroli
Physics of Complex Systems Laboratory
École Polytechnique Fédérale de Lausanne

2014-2017 PhD in Physics
Distribution of avalanches in disordered systems
Supervisor: Silvio Franz
Laboratoire de Physique Théorique et Modèles Statistiques
Université Paris Sud (Université Paris-Saclay)

Scholarship by the École Normale Supérieure
You can read my Ph.D. thesis here
2012-2014 Master in Physics of Complex Systems (link)
Politecnico di Torino, International School for Advanced Studies, International Centre for Theoretical Physics, Université Pierre et Marie Curie, Université Paris Diderot, Université Paris Sud, École Normale Supérieure de Cachan
Ranked 1st among all participants
You can read here my M.Sc. thesis, written at the Laboratoire de Physique Théorique et Modèles Statistiques under the supervision of Silvio Franz
2009-2012 Scuola Galileiana di Studi Superiori (link) (scholarship)
2009-2012 Laurea in Physics (BSc)
Università degli Studi di Padova

Skills

Languages: Italian
English
French
Swedish (beginner)
Informatics: Debian- and Red Hat-based Linux distributions
C/C++, Python and PyTorch
XML and derived languages (HTML), CSS, JavaScript (basic), PHP
SQL (basic), Pandas, R (basic)
Mathematica (basic), LaTeX, Office suite

Publications

2019 M. Geiger, S. Spigler, A. Jacot, M. Wyart
Scaling description of generalization with number of parameters in deep learning
submitted to conference (on arXiv soon)
2019 S. Spigler, M. Geiger, M. Wyart
Asymptotic learning curves of kernel methods: empirical data vs. Teacher-Student paradigm
submitted to conference (arXiv preprint)
2019 M. Geiger, A. Jacot, S. Spigler, F. Gabriel, L. Sagun, S. d'Ascoli, G. Biroli, C. Hongler, M. Wyart
Scaling description of generalization with number of parameters in deep learning
to be submitted (arXiv preprint)
2018 S. Spigler, M. Geiger, S. d'Ascoli, L. Sagun, M. Baity-Jesi, G. Biroli, M. Wyart
A jamming transition from under- to over-parametrization affects loss landscape and generalization
NeurIPS 2018 workshop "Integration of Deep Learning Theories" (arXiv preprint)
2018 M. Geiger, S. Spigler, S. d'Ascoli, L. Sagun, M. Baity-Jesi, G. Biroli, M. Wyart
The jamming transition as a paradigm to understand the loss landscape of deep neural networks
Phys. Rev. E, to be published (arXiv preprint)
2018 M. Baity-Jesi, L. Sagun, M. Geiger, S. Spigler, G.B. Arous, C. Cammarota, Y. LeCun, M. Wyart, G. Biroli
Comparing dynamics: deep neural networks versus glassy systems
ICML, PMLR 80:314-323 (PMLR)
2017 S. Franz, S. Spigler
Mean-field avalanches in jammed spheres
Phys. Rev. E 95(2), 022139 (PR E)
2016 S. Franz, G. Gradenigo, S. Spigler
Random-diluted triangular plaquette model: Study of phase transitions in a kinetically constrained model
Phys. Rev. E 93(3), 032601 (PR E)

Contact me

You can contact me via email.
