  • Yvonne Alama Bronsard

    Numerical approximations to nonlinear dispersive equations, from short to long times

    16 January 2025 - 14:00, Salle de conférences IRMA

    The first part of this talk deals with the numerical approximation of nonlinear dispersive equations, such as the prototypical nonlinear Schrödinger equation. We introduce novel integration techniques that allow for the construction of schemes which perform well in both smooth and non-smooth settings. We obtain symmetric low-regularity schemes with very good structure-preserving properties over long times. Higher-order extensions will be presented, following new techniques based on decorated tree series inspired by singular stochastic PDEs via the theory of regularity structures. In the second part, we introduce a new approach for designing and analyzing schemes for certain nonlinear and nonlocal integrable PDEs, including the well-known Benjamin-Ono equation. This work is heavily inspired by recent theoretical breakthroughs in the field of nonlinear integrable equations, and it opens the way to numerical approximations that are far more accurate and efficient for simulating integrable PDEs, from short up to long times.
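    To fix ideas about the splitting-type integrators this abstract builds on, here is a minimal sketch of a classical first-order Lie splitting step for the cubic defocusing Schrödinger equation i u_t = -u_xx + |u|²u on a periodic grid. This is only a textbook baseline, not the low-regularity scheme of the talk; the grid size, time step, and initial datum are arbitrary choices.

```python
import numpy as np

def lie_splitting_step(u, k, dt):
    # Linear substep: exact flow of i u_t = -u_xx, diagonal in Fourier space.
    u = np.fft.ifft(np.exp(-1j * k**2 * dt) * np.fft.fft(u))
    # Nonlinear substep: exact flow of i u_t = |u|^2 u, a pointwise phase
    # rotation, since |u| is constant along this sub-flow.
    return u * np.exp(-1j * np.abs(u)**2 * dt)

N = 256
x = 2 * np.pi * np.arange(N) / N
k = np.fft.fftfreq(N, d=1.0 / N)      # integer wavenumbers on [0, 2*pi)
u = np.exp(1j * x) / (2 + np.cos(x))  # smooth periodic initial datum
mass0 = np.sum(np.abs(u)**2)

for _ in range(100):
    u = lie_splitting_step(u, k, dt=1e-3)

mass = np.sum(np.abs(u)**2)  # discrete L^2 mass, preserved to roundoff
```

    Both substeps are solved exactly (a unitary Fourier multiplier and a pointwise phase rotation), so the discrete L² mass is conserved to machine precision; it is the composition of the two flows that introduces the splitting error.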
  • Philippe Helluy

    A random ALE scheme for compressible two-fluid flows, with application to the simulation of wave breaking

    4 February 2025 - 14:00, Salle de conférences IRMA

    The compressible two-fluid Euler model presents no additional theoretical difficulty compared with the single-fluid case, but its numerical resolution is notoriously harder because of the pressure-oscillation phenomenon at the interface between the fluids. We present an approach based on Glimm-type random sampling at the interface, which eliminates this defect. The resulting scheme is applicable to unstructured meshes and has excellent robustness and convergence properties. We apply it to wave-breaking test cases.
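    As an illustration of the Glimm-type random sampling idea (not the two-fluid ALE scheme of the talk, which couples it with a compressible Euler solver), here is a minimal sketch for the 1D transport of an interface indicator φ_t + a φ_x = 0 with periodic boundaries; the grid size, CFL number, and initial profile are arbitrary choices.

```python
import numpy as np

def glimm_step(phi, nu, rng):
    # Random-choice (Glimm) update for phi_t + a phi_x = 0 with a > 0:
    # instead of the diffusive upwind average, shift the whole grid by one
    # cell with probability nu = a*dt/dx (the CFL number, assumed <= 1).
    # The interface is sampled, never averaged, so it stays perfectly sharp.
    if rng.random() < nu:
        return np.roll(phi, 1)
    return phi

rng = np.random.default_rng(0)
phi = np.zeros(200)
phi[:50] = 1.0  # sharp interface between the two "fluids"

for _ in range(400):
    phi = glimm_step(phi, nu=0.4, rng=rng)

# phi still takes only the values 0 and 1: no numerical diffusion,
# and np.roll conserves the total "mass" of the indicator exactly.
sharp = set(np.unique(phi)) == {0.0, 1.0}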
  • Simon Schneider

    Estimatable Variation Neural Networks and their Application to Scalar Hyperbolic Conservation Laws

    25 February 2025 - 14:00, Salle 301

    In this talk we introduce a class of neural networks for which a computationally cheap local estimate of the BV norm is available. The architecture of these networks is motivated by a linear function space we denote BMV. This space is the natural analogue of the space BPV of functions with bounded pointwise variation in one dimension. Since the networks are elements of BMV, we are able to investigate the sharpness of the BV estimate. Furthermore, we prove a universal approximation theorem in BMV and discuss practical considerations concerning the implementation.

    We use these networks as ansatz functions to solve scalar hyperbolic conservation laws. The key advantage of the estimate on the BV norm is that it allows compactness in L¹ of sequences of networks to be enforced. For a loss function inspired by the finite volume method, we are able to show convergence of sequences of networks under the assumption that the training error vanishes. Moreover, we show the existence of sequences of loss-minimizing neural networks if the solution is an element of BMV. Several numerical test cases illustrate that it is possible to use standard techniques to minimize these loss functionals for networks with the proposed architecture.
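    As a generic illustration of the kind of variation bound at stake (not the cheap architecture-based estimate of the talk), here is a minimal sketch of a sampled lower bound on the pointwise variation of a 1D function, the quantity controlled by BPV/BMV-type norms.

```python
import numpy as np

def discrete_variation(f, a, b, n):
    # Sum of |f(x_{i+1}) - f(x_i)| over n + 1 equispaced samples of [a, b].
    # This is a lower bound on the pointwise variation of f and increases
    # monotonically under refinement of the sample grid.
    x = np.linspace(a, b, n + 1)
    return float(np.sum(np.abs(np.diff(f(x)))))

# For f(x) = |x| on [-1, 1] the exact variation is 2; the estimate is
# already exact once x = 0 lies on the sample grid.
tv = discrete_variation(np.abs, -1.0, 1.0, 100)
```

    A bound of this kind on a family of network outputs gives uniform BV control, which (by Helly-type compactness) is what yields the L¹ compactness of network sequences mentioned above.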