PhD Position F/M: Neural Implicit Representations and Operator Learning for Multiscale Problems in Physics

Contract type : Fixed-term contract

Level of qualifications required : Graduate degree or equivalent

Function : PhD Position


Location: The PhD will be hosted at the University of Strasbourg, in the city center, and funded by the PDE-IA project of the PEPR IA. Since this is a co-supervision with UKAEA Culham, the student is expected to visit the Culham campus (near Oxford) for 3 to 6 weeks per year.

PhD director: Emmanuel Franck (INRIA, Unistra). PhD supervisors: V. Michel-Dansac (INRIA, Unistra) and S. Pamela (UKAEA Culham).

Potential Collaborators: As part of this project, the student will interact with a number of researchers, including Laurent Navoret (Unistra) and Joubine Aghili (Unistra), and will be immersed in a stimulating ecosystem of researchers and students within two large projects: PEPR PDE-IA and Numpex (exascale computing). Existing links between UKAEA, Nvidia and Caltech will also be relevant to the PhD project.



Main activities

In light of the significant successes achieved by deep learning methods in computer vision and language processing, new learning-based methods have emerged for the simulation and resolution of PDEs. We can mention PINN methods, which solve a PDE by replacing finite-element approximations with neural networks [WSWP23]-[SS22], and neural operators, which approximate the inverse operator of the PDE and allow for quickly predicting the solution from the source. For example, in [GPZ+23]-[CZP+24], the authors use a neural operator to predict the dynamics of a plasma in a simplified configuration in a relatively short time. Many realistic applications, such as plasma physics, require dealing with complicated geometries and multi-scale phenomena over long times. The challenge of this thesis is therefore to push these neural network-based approaches to a higher level for multi-scale problems. We would like to investigate approaches that maintain accuracy and stability over long times on general geometries.
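To make the PINN idea above concrete, here is a minimal sketch, using NumPy only, of the residual loss for the 1D Poisson problem -u''(x) = f(x) on [0, 1] with homogeneous Dirichlet boundary conditions. The one-hidden-layer network, the manufactured source term, and all names are illustrative, not part of the project description; a real PINN would use an autodiff framework and minimize this loss over the network parameters.

```python
import numpy as np

# One-hidden-layer tanh network u(x; theta) = sum_i a_i * tanh(w_i x + b_i).
rng = np.random.default_rng(0)
n_hidden = 16
w = rng.normal(size=n_hidden)
b = rng.normal(size=n_hidden)
a = rng.normal(size=n_hidden) / n_hidden

def u(x):
    # Network evaluation at points x, shape (n_points,) -> (n_points,).
    return np.tanh(np.outer(x, w) + b) @ a

def u_xx(x):
    # Analytic second derivative: (d^2/ds^2) tanh(s) = -2 tanh(s)(1 - tanh(s)^2).
    t = np.tanh(np.outer(x, w) + b)
    return (-2.0 * t * (1.0 - t**2)) @ (a * w**2)

def f(x):
    # Manufactured source so that the exact solution is u*(x) = sin(pi x).
    return np.pi**2 * np.sin(np.pi * x)

def pinn_loss(x_col):
    residual = u_xx(x_col) + f(x_col)     # -u'' = f  <=>  u'' + f = 0
    bc = u(np.array([0.0, 1.0]))          # enforce u(0) = u(1) = 0 by penalty
    return np.mean(residual**2) + np.mean(bc**2)

x_col = np.linspace(0.0, 1.0, 64)         # collocation points
print(pinn_loss(x_col))                   # scalar to minimize over (a, w, b)
```

The boundary penalty shown here is the simplest option; [SS22] instead imposes boundary conditions exactly via distance functions.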

A first approach will be to consider the Neural Galerkin method [BPVE24], which maintains an ODE structure in time but approximates the spatial part, as well as the parametric dependence of the PDE, with a neural network. This method exploits the good high-dimensional approximation properties of networks to reduce the number of degrees of freedom. We propose to couple this approach with recent PINN techniques to deal with general geometries. Secondly, we aim to study long-term stability, which is a critical problem, by incorporating the structure of the equations [Sun19], by using splitting schemes to preserve this structure, or by combining the scheme with "stabilization" methods [BP24]. One of the key points will be to determine robust neural network architectures.
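The Neural Galerkin principle can be sketched in a few lines: write the solution as u(x, t) ≈ U(x; θ(t)), then evolve the parameters by least-squares projection of the PDE dynamics onto the tangent space of the ansatz, solving J(θ) θ' ≈ f(U) at collocation points, where J is the Jacobian of U with respect to θ. The toy below, a hedged illustration and not the project's method, uses a two-parameter Gaussian in place of a neural network for the advection equation u_t + c u_x = 0, where the exact dynamics simply translate the profile.

```python
import numpy as np

# Neural-Galerkin-style time stepping for u_t + c u_x = 0, with the ansatz
# U(x; theta) = exp(-((x - m)/s)^2), theta = (m, s), standing in for a network.
c = 1.0
x = np.linspace(-5.0, 5.0, 400)          # collocation points

def U(theta):
    m, s = theta
    return np.exp(-((x - m) / s) ** 2)

def jacobian(theta):
    # Columns: dU/dm and dU/ds evaluated at the collocation points.
    m, s = theta
    g = U(theta)
    dU_dm = 2.0 * (x - m) / s**2 * g
    dU_ds = 2.0 * (x - m) ** 2 / s**3 * g
    return np.stack([dU_dm, dU_ds], axis=1)

def rhs(theta):
    # Spatial operator f(U) = -c * U_x, analytic for the Gaussian ansatz.
    m, s = theta
    return -c * (-2.0 * (x - m) / s**2) * U(theta)

theta = np.array([0.0, 1.0])             # initial center and width
dt, n_steps = 1e-3, 1000
for _ in range(n_steps):
    # Solve J(theta) theta_dot ≈ f(U(theta)) in the least-squares sense.
    theta_dot, *_ = np.linalg.lstsq(jacobian(theta), rhs(theta), rcond=None)
    theta = theta + dt * theta_dot       # explicit Euler in time

print(theta)  # center advects to ~ c * T = 1.0, width stays ~ 1.0
```

With a genuine network, the Jacobian comes from automatic differentiation and the least-squares system is assembled on sampled collocation points, as in [BPVE24]; the stability of this ODE system over long times is precisely what the thesis proposes to study.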

The second approach will focus on neural operators. Early results have shown that this is a promising direction; however, long-term stability issues remain significant. We wish to explore several methods to improve long-term approximations [MHSB23]-[LVP+23] and extend them to multi-scale configurations. In addition to these general approaches, we can also study how to incorporate the structure of the physical problem into the architecture of the operators. The resulting approaches will be coupled with methods capable of dealing with general geometries, such as [LKC+24]-[BET22], which use parameterized integral kernels in the physical domain.
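The long-rollout stability issue mentioned above can be illustrated with a deliberately simple model: emulate a learned one-step surrogate by a linear operator whose spectral radius slightly exceeds 1 (as can happen through training error), roll it out autoregressively, and watch errors grow exponentially. The spectral clipping below is a crude stand-in for the stabilization strategies of [MHSB23], chosen only because it is easy to verify; all names are illustrative.

```python
import numpy as np

# Toy autoregressive surrogate: u_{n+1} = A u_n, with A symmetric by
# construction so its spectrum is real and easy to manipulate.
rng = np.random.default_rng(1)
n = 32
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))
eigs = np.concatenate([rng.uniform(0.3, 0.95, size=n - 1), [1.01]])
A = Q @ np.diag(eigs) @ Q.T              # one unstable mode (eigenvalue 1.01)

def rollout(step, u0, n_steps):
    u = u0
    for _ in range(n_steps):
        u = step @ u
    return u

def stabilize(step):
    # Clip eigenvalues into [-1, 1] (valid here because `step` is symmetric).
    vals, vecs = np.linalg.eigh(step)
    return vecs @ np.diag(np.clip(vals, -1.0, 1.0)) @ vecs.T

u0 = rng.normal(size=n)
norm_raw = np.linalg.norm(rollout(A, u0, 2000))
norm_stab = np.linalg.norm(rollout(stabilize(A), u0, 2000))
print(norm_raw, norm_stab)  # raw rollout amplifies; stabilized one stays bounded
```

Real neural operators are nonlinear, so no such clean spectral fix exists; this is why architectural choices and training strategies such as [MHSB23]-[LVP+23] are needed.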

Purely neural methods will remain limited in precision. For this reason, we would ultimately like to couple them with more classical numerical approaches, to obtain algorithms that are both faster than traditional approaches and reliable. This type of coupling has already yielded very encouraging results [FMDN23].

In sum, this topic represents a stimulating opportunity to explore recent advances in deep learning and numerical simulation. We propose a balanced approach that combines traditional methods and innovative techniques to solve complex problems in the physical sciences. In particular, we will validate the approaches on fluid and MHD PDE systems with turbulence and convective mixing, with potential applications to engineering and fusion. Students who are interested in the intellectual challenges and practical applications of computational modeling are encouraged to apply.


[BET22] Nicolas Boullé, Christopher J Earls, and Alex Townsend. Data-driven discovery of Green's functions with human-understandable deep learning. Scientific Reports, 12(1):4824, 2022.

[BP24] Jules Berman and Benjamin Peherstorfer. Randomized sparse neural Galerkin schemes for solving evolution equations with deep networks. Advances in Neural Information Processing Systems, 36, 2024.

[BPVE24] Joan Bruna, Benjamin Peherstorfer, and Eric Vanden-Eijnden. Neural Galerkin schemes with active learning for high-dimensional evolution equations. Journal of Computational Physics, 496:112588, 2024.

[CZP+24] N Carey, L Zanisi, S Pamela, V Gopakumar, J Omotani, J Buchanan, and J Brandstetter. Data efficiency and long term prediction capabilities for neural operator surrogate models of core and edge plasma codes. arXiv preprint arXiv:2402.08561, 2024.

[FMDN23] Emmanuel Franck, Victor Michel-Dansac, and Laurent Navoret. Approximately well-balanced discontinuous Galerkin methods using bases enriched with physics-informed neural networks. arXiv preprint arXiv:2310.14754, 2023.

[GPZ+23] Vignesh Gopakumar, Stanislas Pamela, Lorenzo Zanisi, Zongyi Li, Ander Gray, Daniel Brennand, Nitesh Bhatia, Gregory Stathopoulos, Matt Kusner, Marc Peter Deisenroth, et al. Plasma surrogate modelling using Fourier neural operators. arXiv preprint arXiv:2311.05967, 2023.

[LKC+24] Zongyi Li, Nikola Kovachki, Chris Choy, Boyi Li, Jean Kossaifi, Shourya Otta, Mohammad Amin Nabian, Maximilian Stadler, Christian Hundt, Kamyar Azizzadenesheli, et al. Geometry-informed neural operator for large-scale 3D PDEs. Advances in Neural Information Processing Systems, 36, 2024.

[LVP+23] Phillip Lippe, Bastiaan S. Veeling, Paris Perdikaris, Richard E. Turner, and Johannes Brandstetter. PDE-Refiner: Achieving Accurate Long Rollouts with Temporal Neural PDE Solvers. In Thirty-seventh Conference on Neural Information Processing Systems, 2023.

[MHSB23] Michael McCabe, Peter Harrington, Shashank Subramanian, and Jed Brown. Towards stability of autoregressive neural operators. arXiv preprint arXiv:2306.10619, 2023.

[SS22] N. Sukumar and Ankit Srivastava. Exact imposition of boundary conditions with distance functions in physics-informed deep neural networks. Computer Methods in Applied Mechanics and Engineering, 389:114333, 2022.

[Sun19] Zhengjie Sun. A meshless symplectic method for two-dimensional nonlinear Schrödinger equations based on radial basis function approximation. Engineering Analysis with Boundary Elements, 2019.

[WSWP23] Sifan Wang, Shyam Sankaran, Hanwen Wang, and Paris Perdikaris. An expert’s guide to training physics-informed neural networks. arXiv preprint arXiv:2308.08468, 2023.

Benefits package

  • Subsidized meals
  • Partial reimbursement of public transport costs
  • Leave: 7 weeks of annual leave + 10 extra days off due to RTT (statutory reduction in working hours) + possibility of exceptional leave (sick children, moving home, etc.)
  • Possibility of teleworking (after 6 months of employment) and flexible organization of working hours
  • Professional equipment available (videoconferencing, loan of computer equipment, etc.)
  • Social, cultural and sports events and activities
  • Access to vocational training
  • Social security coverage


€2100 gross/month during the 1st year