Post-Doctoral Research Visit (F/M): PAC-Bayesian Analysis of Dynamical Systems

The job description below is in English.

Contract type: Fixed-term contract (CDD)

Required qualification: PhD or equivalent

Role: Post-Doctoral Researcher

Context and assets of the position

The postdoc position is offered within the framework of the ERC Starting Grant DYNASTY (Dynamics-Aware Theory of Deep Learning).

The position may include travel to conferences to present papers. Travel expenses will be covered within the limits of the scale in force.

Assigned mission

Deep learning shows significant empirical success in a wide range of applications. However, understanding lags behind on the theoretical side: it is not clear in which situations deep learning models generalize well. The PAC-Bayesian theory offers promising perspectives for this type of model [DR17; Per+21]. However, existing bounds do not take the learning algorithm into account, which might be key to obtaining tight generalization bounds. In order to reduce the gap between theory and practice (and obtain tight bounds), the hyper-parameters of the learning algorithm could be integrated into the bounds. More generally, integrating the dynamics of the learning process might tighten the generalization bounds. In this context, the objective of the postdoc is (i) to develop new PAC-Bayesian bounds that take the learning process (and its dynamics) into account and (ii) to derive new learning algorithms based on the minimization of these new bounds [see e.g., Fre98]. For instance, the works of London [Lon17] and Rivasplata et al. [Riv+20] could be leveraged to obtain tight generalization bounds for models obtained through stochastic gradient descent with hyper-parameters sampled from a probability distribution.
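To give a flavor of the kind of bound discussed above (this sketch is illustrative and not part of the official posting), the classical McAllester-style PAC-Bayes bound states that, with probability at least 1 − δ over an i.i.d. sample of size n, the expected risk of a randomized predictor drawn from a posterior Q is bounded by its empirical risk plus a complexity term involving KL(Q‖P), the divergence from a data-independent prior P. The function name below is hypothetical:

```python
import math

def pac_bayes_bound(empirical_risk, kl, n, delta=0.05):
    """McAllester/Maurer-style PAC-Bayes bound (illustrative sketch).

    With probability >= 1 - delta, the expected risk under posterior Q
    is at most:
        empirical_risk + sqrt((KL(Q||P) + ln(2*sqrt(n)/delta)) / (2n))
    """
    complexity = math.sqrt((kl + math.log(2.0 * math.sqrt(n) / delta)) / (2.0 * n))
    return empirical_risk + complexity

# The complexity term shrinks as the sample size n grows, and grows
# with the divergence KL(Q||P) between posterior and prior.
print(pac_bayes_bound(empirical_risk=0.1, kl=5.0, n=10_000))
```

The research direction described in the mission is precisely to tighten such bounds by making the complexity term reflect the learning dynamics (e.g., the trajectory of stochastic gradient descent) rather than only a static prior/posterior pair.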


References

[DR17] Gintare Karolina Dziugaite and Daniel M. Roy. “Computing Nonvacuous Generalization Bounds for Deep (Stochastic) Neural Networks with Many More Parameters than Training Data”. In: Conference on Uncertainty in Artificial Intelligence (UAI). 2017.

[Fre98] Yoav Freund. “Self Bounding Learning Algorithms”. In: Conference on Learning Theory (COLT). 1998.

[Lon17] Ben London. “A PAC-Bayesian Analysis of Randomized Learning with Application to Stochastic Gradient Descent”. In: Advances in Neural Information Processing Systems (NIPS). 2017.

[Per+21] Maria Perez-Ortiz et al. “Tighter Risk Certificates for Neural Networks”. In: Journal of Machine Learning Research (2021).

[Riv+20] Omar Rivasplata et al. “PAC-Bayes Analysis Beyond the Usual Bounds”. In: Advances in Neural Information Processing Systems (NeurIPS). 2020.


Main activities

  • Conduct theoretical research
  • Conduct experiments for empirical verification
  • Write scientific articles
  • Disseminate the scientific work in appropriate venues

Skills

Technical skills and level required:
Languages: high level of professional/academic English
Coding skills: good level of coding in Python and related deep-learning libraries

Benefits

  • Subsidized meals
  • Partial reimbursement of public transport costs
  • Leave: 7 weeks of annual leave + 10 extra days off due to RTT (statutory reduction in working hours) + possibility of exceptional leave (sick children, moving home, etc.)
  • Possibility of teleworking and flexible organization of working hours (after 12 months of employment) 
  • Professional equipment available (videoconferencing, loan of computer equipment, etc.)
  • Social, cultural and sports events and activities
  • Access to vocational training
  • Social security coverage