Conference on Deep Learning for Computational Physics, UCL, 4-6 July 2023

Summary of conference

The Maths4DL conference on Deep Learning for Computational Physics was hosted by University College London from 4-6 July 2023. The conference was open to the public and approximately 90 people attended in total. During the conference, participants heard talks from keynote and invited speakers as well as short lightning talks from early career researchers, and spent time networking during breaks. Slides from speakers who are happy to share them will be available to download from these pages (see the programme and lightning talk sections).

The first day was opened by Prof Simon Arridge (UCL), and after several talks a poster session and drinks reception were held. Prizes for the best posters, kindly donated by World Scientific Publishing and Springer, were awarded to Pablo Arratia, Amir Ehsan Khorashadizadeh, Agnese Pacifico, and Victor Wang. Congratulations to you all!

The second day included further talks and discussion, followed by the popular conference dinner at Drake & Morgan, Kings Cross. The final day’s talks concluded with a closing speech from Prof Chris Budd, University of Bath, Principal Investigator on the Maths4DL grant.

The overall feedback has been very positive, with over 90% of respondents rating both the conference and its organisation as excellent or very good. Participants included PhD students, early career researchers and academics, as well as representatives from industry, and truly reflected the international nature of the work being carried out in the Maths4DL programme. Attendees came from towns and cities in Canada, England, Estonia, Finland, Germany, Italy, Japan, Netherlands, Norway, Saudi Arabia, Scotland, Sweden, Switzerland, USA and Wales.

We are already looking forward to our next conference, so watch this space!

 

Introduction

Deep learning in physics is a very active and rapidly growing field of research, and this shift in approach has already brought many advances, which this meeting aims to highlight. Recent examples include PINNs, SINDy, symbolic regression, Fourier neural operators, meta-learning, and neural ODEs, to name a few. The applications span many disciplines across the scientific spectrum, from the medical sciences to computer vision to the physical sciences. We believe that the next steps for machine learning require a firm theoretical understanding, and we have organised this conference, taking place at UCL in central London, to bring together like-minded individuals to discuss current and future research in this area.

Included themes:

  • Data-driven approaches for forward and inverse problems
  • Discovery and solving of differential equations with machine learning
  • New methods for physical applications, such as CT, MRI, fluid flows, or climate systems
  • Learning improved experimental design or adaptive-mesh–like methods
  • Theoretical results on the capabilities of neural networks
  • Robustness and generalisability in machine learning
  • Methods for training neural networks
  • The links between deep neural networks and continuum models
  • Interpretability of learned approaches
  • Uncertainty quantification
  • Mesh-free or off-the-grid methods
  • Use of non-standard metrics, e.g. physics-based or on manifolds
  • Scaling machine learning to big data problems

Registration

Places at the conference are now full, but we are looking at ways to increase capacity. If you are still interested in registering, please email maths4dl@bath.ac.uk and we will keep you informed of our plans.

Please note, there are no longer places available at the conference dinner.

 

Programme - available talks are linked below

Tuesday 4th July

  • Arrival, registration, and refreshments
  • Welcome and introduction – Prof. Simon Arridge
  • Asst. Prof. Sophie Langer – Understanding dropout in the linear world
  • Coffee break
  • Prof. Jason McEwen – Geometric deep learning on the sphere for the physical sciences
  • Prof Andreas Hauptmann – Model-corrected learned primal-dual models for fast limited-view photoacoustic tomography
  • Lunch
  • Dr Steve Brunton – Machine Learning for Scientific Discovery, with Examples in Fluid Mechanics
  • Dr Zhi Zhou – Identification of Conductivity in Elliptic equations using Deep Neural Networks
  • Coffee break
  • Lightning talks
  • Dr Tatiana Bubba – Integrating data-driven techniques and theoretical guarantees for limited angle tomography
  • 16:30-18:00 – Poster session and drinks reception

Wednesday 5th July

  • Arrival, registration, and refreshments
  • Prof. Giovanni S. Alberti – Machine learning for infinite-dimensional inverse problems
  • Coffee break
  • Dr Julián Tachella – Learning to reconstruct images without ground-truth
  • Dr Benjamin Moseley – Scaling physics-informed neural networks to high frequency and multiscale problems using domain decomposition
  • Lunch
  • Prof Eldad Haber – PDE’s, ODE’s Graphs and Neural Networks
  • Dr Nicolas Boulle – Data-efficient PDE learning
  • Coffee break
  • Janek Gödeke – TorchPhysics: A Deep Learning Library for Solving Differential Equations
  • Prof. Michael Hintermüller – Learning-informed and PINN-based multi scale PDE models in optimization
  • 18:30-late – Conference dinner at Drake & Morgan, Kings Cross

Thursday 6th July

  • Arrival, registration, and refreshments
  • Prof. Elena Celledoni – A dynamical systems view to Deep learning: contractivity and structure preservation
  • Dr Marta Betcke – Complementary learning in photoacoustic tomography
  • Coffee break
  • Dr Cristopher Salvi – Large-width limits of neural ODE-type methods
  • Lunch
  • Dr Chris Rackauckas – Generalizing Scientific Machine Learning and Differentiable Simulation Beyond Continuous Models
  • Emanuel Ström – Acceleration of multiscale solvers via adjoint operator learning
  • Coffee break
  • Dr Patrick Kidger – Scientific machine learning in JAX
  • 16:30-16:40 – Summary and close, Prof. Chris Budd


Information for delegates

Dinner

The conference dinner will take place on Wednesday 5 July at Drake & Morgan, Kings Cross. You need to sign up and pay to attend the dinner when you register. You will be contacted nearer the time regarding your menu choices.

Travel

UCL is located in the Bloomsbury district at the very centre of London. There are easy connections to UCL from London’s global hub airports at Heathrow, Gatwick and Stansted, and you will find that London’s extensive public transport system is convenient and easy to use.

The conference venue is marked on this map.
Further information about getting to UCL can be found here.

Accommodation

London has a wide variety of accommodation to suit all tastes and budgets. There are lots of options in and around Bloomsbury, close to the conference.

Below are a few hotels located nearby:
The Tavistock Hotel
Bloomsbury Palace Hotel
Hub by Premier Inn – Goodge Street
Gower House Hotel
Travelodge Euston
Euston Square Hotel
Royal National Hotel

Please note:

Accommodation is not included in the registration fee; delegates are required to book their own accommodation. We encourage delegates to book as early as possible.

Lightning talks and posters

Posters were presented by the following people. Click on a presenter’s name for their slides and on the poster title for the poster PDF (where available).

  • Pablo Arratia
  • Jonathan Chirinos Rodriguez – A Supervised Learning Approach to Regularization of Inverse Problems
  • Alexander Denker – Invertible residual networks in the context of regularization theory for linear inverse problems
  • Simon Driscoll – Sensitivity Analysis and Machine Learning of Sea Ice Thermodynamics
  • Sebastian Götschel
  • Dawit Hailu
  • Amir Ehsan Khorashadizadeh – FunkNN: Neural Interpolation for Functional Generation
  • Takashi Matsubara – Deep Learning for Discrete-Time Physics
  • Derick Nganyu Tanyu – Deep Learning Methods for PDEs and Related Parameter Identification Problems
  • Agnese Pacifico – Online identification and control of PDEs via Reinforcement Learning methods
  • Danilo Riccio – Regularization of Inverse Problems: Deep Equilibrium Models versus Bilevel Learning
  • Alex Richardson
  • Antonio Stanziola
  • Ivan Sudakow – Statistical mechanics in climate emulation: Challenges and perspectives
  • Xiaoyu (Victor) Wang – Lifted Bregman Training of Neural Networks
  • Takaharu Yaguchi – Neural symplectic form and its variational principle