Machine Learning in Computational Fluid Dynamics

When:
28/02/2022 – 01/03/2022 all-day

Offer related to the Action/Network: – — –/– — –

Laboratory/Company: Sorbonne Université – Machine Learning and Information Access (MLIA) team
Duration: 6 months
Contact: patrick.gallinari@sorbonne-universite.fr
Publication deadline: 2022-02-28

Context:
Numerical simulation of fluids plays an essential role in modeling complex physical phenomena in domains ranging from climate to aerodynamics. Fluid flows are well described by the Navier-Stokes equations, but solving these equations at all scales remains extremely complex in many situations; in practice, only an averaged solution supplemented by a turbulence model is simulated (Xiao and Cinnella, 2019). The increased availability of large amounts of high-fidelity data and the recent development and deployment of powerful machine learning methods have motivated a surge of recent work on using machine learning in the context of computational fluid dynamics (CFD) (Duraisamy et al., 2019). Combining powerful statistical techniques with model-based methods opens an entirely new perspective for modeling physical phenomena (Willard 2020). On the machine learning (ML) side, modeling complex dynamical systems and combining model-based and data-based approaches are the topics of active new research directions. This is the context of this project: our aim is to develop the interplay between Deep Learning (DL) and CFD in order to improve turbulence modeling and to challenge state-of-the-art ML techniques.

Subject:
Combining CFD models and Deep Learning

Our objective is to improve traditional CFD models, both in terms of computational complexity and of prediction accuracy, by adding ML components. Recent progress, and the generalized use of automatic differentiation both in differentiable solvers and in DL algorithms, has paved the way for integrating DL techniques with ODE/PDE solvers. In the ML community, a starting point for such investigations was the Neural ODE paper (Chen 2018), which promoted the use of ODE solvers for ML problems. For this research we advocate the use of DL modules to complement CFD solvers, in the spirit of (Yin 2021), who introduced a principled approach that is, however, still limited to basic PDEs. In this new context, our final objective is to analyze how to model the unclosed terms in the Reynolds-Averaged Navier-Stokes (RANS) equations. To simplify the problem, during the internship the approach will be developed on a scalar surrogate of the Navier-Stokes equations, namely the nonlinear Burgers' equation, which has been widely used in the literature as a simplified ansatz for Navier-Stokes. The whole system, DL modules and numerical solvers together, will be trained end to end on high-fidelity data.
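As a concrete illustration, the hybrid idea can be sketched as a coarse numerical solver for the 1D viscous Burgers' equation augmented with an additive learned correction term. This is a minimal sketch, not the project's actual architecture: the `correction` argument stands in for a trainable neural module (here a placeholder function), and all numerical parameters are illustrative.

```python
import numpy as np

def burgers_step(u, dx, dt, nu, correction=None):
    """One explicit finite-difference step of the 1D viscous Burgers' equation
    u_t + u u_x = nu * u_xx on a periodic domain.

    `correction` is a hypothetical learned closure term (in practice a neural
    network) returning an additive tendency; it defaults to no correction."""
    # Central differences with periodic boundaries via np.roll.
    u_x = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    u_xx = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    tendency = -u * u_x + nu * u_xx
    if correction is not None:
        tendency = tendency + correction(u)
    return u + dt * tendency

# Usage: coarse physical step plus a placeholder data-driven closure.
n = 128
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
u = np.sin(x)
dx = x[1] - x[0]
u_next = burgers_step(u, dx, dt=1e-3, nu=0.05,
                      correction=lambda u: np.zeros_like(u))
```

In the end-to-end setting described above, the step function would be written in a differentiable framework so that gradients flow through both the solver and the correction network during training.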

To be useful for CFD applications, a learned model must accurately simulate flows outside of its training distribution: operational conditions and environments may vary with different physical factors, requiring models to extrapolate to new conditions. To provide such capabilities, we will adopt a new perspective, learning dynamical models from multiple environments, and analyze the ability of this framework to extrapolate to unseen conditions.
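The multi-environment setting can be made concrete with a toy dataset: the same Burgers dynamics simulated under several viscosities (the "environments"), with a held-out viscosity outside the training range to probe extrapolation. This is a hedged sketch; the environment parameter, grid sizes, and viscosity values are illustrative assumptions, not the project's benchmark.

```python
import numpy as np

def simulate(nu, steps=50, n=64, dt=1e-3):
    """Roll out an explicit finite-difference solution of the 1D viscous
    Burgers' equation for a given viscosity `nu` (one 'environment')."""
    x = np.linspace(0, 2 * np.pi, n, endpoint=False)
    dx = x[1] - x[0]
    u = np.sin(x)
    traj = [u.copy()]
    for _ in range(steps):
        u_x = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
        u_xx = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
        u = u + dt * (-u * u_x + nu * u_xx)
        traj.append(u.copy())
    return np.stack(traj)  # shape: (steps + 1, n)

# Training environments share the dynamics but differ in viscosity;
# the test environment uses an out-of-distribution viscosity.
train_envs = {nu: simulate(nu) for nu in (0.01, 0.05, 0.1)}
test_env = simulate(0.2)
```

A model trained jointly on `train_envs` must disentangle what is shared across environments from what is environment-specific in order to generalize to `test_env`.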

Candidate profile:
Master's or engineering degree in engineering, computer science, or applied mathematics.

Required education and skills:
The candidate should have a strong scientific background with good technical skills in programming.

Employment address:
Machine Learning and Information Access team – MLIA – https://mlia.lip6.fr, Sorbonne University, 75005 Paris, France

Attached document: 202112141459_2021-12-MLIA-JLRA-Machine-Learning-Computational-Fluid-Dynamics.pdf