Seminars

 

MATHRISK WEEKLY SEMINAR

 

STOCHASTIC METHODS AND FINANCE 
ENPC – INRIA – UGE  

 
 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
 
 

RECENT PAST EVENTS

 
 
MathRisk / LPSM Seminar – 19 October 2023 at the INRIA Paris Centre,
Salle Jacques-Louis Lions
 

09:00 – 09:45: Gudmund PAMMER, ETH Zurich
09:45 – 10:30: Mehdi TALBI, LPSM

***** 10:30 – 11:00: Coffee break *****

11:00 – 11:45: Robert DENKERT, HU Berlin
11:45 – 12:30: Aurélien ALFONSI, MathRisk CERMICS/ENPC

 

 
 

UDINE CONFERENCE – 14–16 June 2023

The project-team MathRisk of INRIA Paris / École des Ponts ParisTech / Université Gustave Eiffel and the Department of Economics and Statistics of the University of Udine are pleased to announce the MathRisk Conference on Numerical Methods in Finance, celebrating the 25th anniversary of the project and of the numerical platform PREMIA (http://premia.fr).

The conference will be held in Udine on 14–16 June 2023, hosted by the Department of Economics and Statistics of the University of Udine (Italy).

The main topics are: neural networks and machine learning in computational finance, stochastic volatility and jump models, risk measures, systemic risk, stochastic control, (martingale) optimal transport, mean-field systems and games, green finance, and quantum computing in finance.

Plenary lectures: Christa Cuchiero (Vienna University), Antoine Jacquier (Imperial College London), Arnulf Jentzen (University of Münster & The Chinese University of Hong Kong, Shenzhen), Peter Tankov (ENSAE Paris).

Invited finance industry speakers: Michel Crouhy (Natixis) and Christophe Michel (CA-CIB).

Deadlines: abstract submission: 15 March 2023; notification of acceptance: 15 April 2023; registration (free but mandatory): 15 May 2023.

Conference website: https://mathrisk2023.sciencesconf.org/

 
 
 
 
Thursday 16 February 2023 at 2:00 pm at INRIA Paris
 
 
Salle Jacques-Louis Lions 2
2 rue Simone Iff, 75012 Paris
 
Zhenjie REN
CEREMADE, Université Paris-Dauphine
 
Title:
Regularized Mean Field Optimization with Application to Neural Networks
 
 
Abstract:
Our recent works on regularized mean field optimization aim to provide a theoretical foundation for analyzing the efficiency of neural network training, as well as to inspire new training algorithms. In this talk we shall see how different regularizers, such as relative entropy and Fisher information, lead to different gradient flows on the space of probability measures. Besides the gradient flows, we also propose and study alternative algorithms, such as entropic fictitious play, to search for the optimal weights of neural networks. Each of these algorithms is guaranteed to converge exponentially, and we shall highlight their performance in some simple numerical tests.
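As a rough illustration of the idea in the abstract (not the speaker's actual algorithm), relative-entropy regularization of a mean-field risk corresponds, at the particle level, to adding Gaussian noise to gradient descent on the neurons of a two-layer network (a Langevin-type scheme). The minimal sketch below trains such a network on a toy regression problem; all data, names, and parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn y = sin(3x) on [-1, 1]
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])

N = 200       # number of "particles" (hidden neurons)
sigma = 0.01  # entropy-regularization strength (temperature)
lr = 0.1      # step size
steps = 2000

# Each particle is a neuron (a, w, b); the network output is the
# empirical mean over particles, i.e. an integral against a measure
# on parameter space, approximated by N particles.
a = rng.normal(size=N)
w = rng.normal(size=N)
b = rng.normal(size=N)

def predict(a, w, b, X):
    h = np.tanh(np.outer(X[:, 0], w) + b)  # (n_samples, N)
    return h @ a / N

mse0 = np.mean((predict(a, w, b, X) - y) ** 2)  # error before training

for _ in range(steps):
    h = np.tanh(np.outer(X[:, 0], w) + b)
    resid = h @ a / N - y
    # Per-particle gradients of the quadratic risk (the 1/N factor of
    # the mean-field scaling is absorbed into the step size)
    ga = h.T @ resid / len(X)
    common = (resid[:, None] * (1 - h**2)) * a / len(X)
    gw = X[:, 0] @ common
    gb = common.sum(axis=0)
    # Langevin step: the Gaussian noise plays the role of the
    # relative-entropy regularizer in the mean-field gradient flow
    noise = np.sqrt(2 * sigma * lr)
    a += -lr * ga + noise * rng.normal(size=N)
    w += -lr * gw + noise * rng.normal(size=N)
    b += -lr * gb + noise * rng.normal(size=N)

mse = np.mean((predict(a, w, b, X) - y) ** 2)  # error after training
```

In the mean-field limit N → ∞, the empirical measure of the particles follows the entropy-regularized gradient flow on the space of probability measures; swapping the noise term for other schemes (e.g. fictitious-play-type updates) corresponds to the alternative algorithms mentioned in the abstract.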