
Meet our 2023 Keynote Speakers

Francis BACH

Inria, École Normale Supérieure

Information Theory with Kernel Methods

Estimating and computing entropies of probability distributions are key computational tasks throughout data science. In many situations, the underlying distributions are only known through the expectation of some feature vectors, which has led to a series of works within kernel methods. In this talk, I will explore the particular situation where the feature vector is a rank-one positive definite matrix, and show how the associated expectations (a covariance matrix) can be used with information divergences from quantum information theory to draw direct links with the classical notion of Shannon entropy.
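For readers who want to experiment, here is a minimal numerical sketch (an illustration of the quantity underlying this link, not the speaker's code): the von Neumann entropy of an empirical kernel covariance operator. With a rank-one feature map, the non-zero eigenvalues of the covariance operator coincide with those of the normalized kernel matrix; the kernel choice and bandwidth below are assumptions for the toy example.

```python
# Minimal sketch: von Neumann entropy -tr(S log S) of an empirical kernel
# covariance operator S. For phi(x) phi(x)^T features, the non-zero eigenvalues
# of S coincide with those of the normalized Gram matrix K / n.
import numpy as np

def von_neumann_entropy(X, bandwidth=1.0):
    """Entropy of the empirical covariance operator for a Gaussian kernel."""
    n = X.shape[0]
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / (2 * bandwidth**2))   # Gram matrix, trace n
    eigvals = np.linalg.eigvalsh(K / n)          # eigenvalues of S, trace 1
    eigvals = eigvals[eigvals > 1e-12]           # discard numerical zeros
    return -np.sum(eigvals * np.log(eigvals))    # -tr(S log S)

rng = np.random.default_rng(0)
print(von_neumann_entropy(rng.normal(size=(200, 2))))
```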

Eva MIRANDA

Polytechnic University of Catalonia

From Alan Turing to Contact geometry: towards a "Fluid computer"

Is hydrodynamics capable of performing computations? (Moore, 1991). Can a mechanical system (including a fluid flow) simulate a universal Turing machine? (Tao, 2016). Etnyre and Ghrist unveiled a mirror between contact geometry and fluid dynamics reflecting Reeb vector fields as Beltrami vector fields. With the aid of this mirror, we can answer the questions raised by Moore and Tao in the affirmative. This recent result combines techniques going back to Alan Turing with modern contact geometry to construct a "Fluid computer" in dimension 3. The construction shows, in particular, the existence of undecidable fluid paths. I will also explain applications of this mirror to the detection of escape trajectories in celestial mechanics (for which I will need to extend the mirror to a singular setup). This mirror allows us to construct a tunnel connecting problems in celestial mechanics and fluid dynamics.
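For the geometrically inclined reader, a schematic form of the mirror (my summary of the Etnyre–Ghrist correspondence, with precise hypotheses omitted):

```latex
% Schematic form of the Etnyre--Ghrist mirror (hypotheses omitted):
% R is the Reeb field of a contact form \alpha on a 3-manifold M if and only if,
% for some Riemannian metric g, R is a nonvanishing rotational Beltrami field.
\[
  \alpha(R) = 1, \quad \iota_R\, \mathrm{d}\alpha = 0
  \qquad \Longleftrightarrow \qquad
  \operatorname{curl}_g R = f\, R \ \text{ with } f > 0,\ R \neq 0 .
\]
```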

Mame Diarra FALL

Institut Denis Poisson, UMR CNRS, Université d'Orléans & Université de Tours

Statistical Methods for Medical Image Processing and Reconstruction

In this talk, we will see how statistical methods, from the simplest to the most advanced, can be used to address various problems in medical image processing and reconstruction across different imaging modalities. Image reconstruction produces the images themselves, while image processing (applied to the already reconstructed images) aims to extract information of interest. We will review several statistical methods, mainly Bayesian, for addressing problems of this type.
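As a toy illustration of the Bayesian viewpoint (a minimal sketch assuming a linear forward model and Gaussian noise and prior, not the specific methods of the talk), the MAP estimate reduces to a ridge-regularized least-squares problem:

```python
# Minimal Bayesian reconstruction sketch (assumed toy setup, not the talk's
# methods): linear forward model y = A x + noise, Gaussian likelihood and
# Gaussian prior, so the MAP estimate solves a ridge least-squares problem.
import numpy as np

rng = np.random.default_rng(1)
n_pixels, n_measurements = 64, 48
A = rng.normal(size=(n_measurements, n_pixels))   # forward operator (e.g. projections)
x_true = rng.normal(size=n_pixels)                # unknown "image"
y = A @ x_true + 0.1 * rng.normal(size=n_measurements)

sigma2, tau2 = 0.1**2, 1.0                        # noise and prior variances
# MAP estimate: argmin ||y - A x||^2 / sigma2 + ||x||^2 / tau2
x_map = np.linalg.solve(A.T @ A / sigma2 + np.eye(n_pixels) / tau2,
                        A.T @ y / sigma2)
print("relative error:", np.linalg.norm(x_map - x_true) / np.linalg.norm(x_true))
```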

Hervé SABOURIN

University of Poitiers

Transverse Poisson Structures to adjoint orbits in a complex semi-simple Lie algebra

The notion of transverse Poisson structure was introduced by Alan Weinstein, stating in his famous splitting theorem that any Poisson manifold $M$ is, in the neighborhood of each point $m$, the product of a symplectic manifold, the symplectic leaf $S$ at $m$, and a submanifold $N$ which can be endowed with a structure of Poisson manifold of rank 0 at $m$; $N$ is called a transverse slice of $M$ at $m$. When $M$ is the dual of a complex Lie algebra $\mathfrak{g}$ equipped with its standard Lie–Poisson structure, we know that the symplectic leaf through $x$ is the coadjoint orbit $G \cdot x$ of the adjoint group $G$ of $\mathfrak{g}$. Moreover, there is a natural way to describe the transverse slice to the coadjoint orbit and, using a canonical system of linear coordinates $(q_1, \dots, q_k)$, it follows that the coefficients of the transverse Poisson structure are rational functions of $(q_1, \dots, q_k)$.
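For background, the Lie–Poisson bracket and the shape of Weinstein's splitting (standard textbook definitions, not specific results of the talk):

```latex
% Standard background (not specific to the talk): the Lie--Poisson bracket on
% the dual g^* of a Lie algebra g,
\[
  \{f, g\}(x) \;=\; \bigl\langle x,\, [\,\mathrm{d}f_x,\ \mathrm{d}g_x\,] \bigr\rangle,
  \qquad x \in \mathfrak{g}^*,
\]
% whose symplectic leaves are the coadjoint orbits G . x, and Weinstein's
% local splitting near a point m,
\[
  (M, \pi) \;\simeq\; (S, \omega_S) \times (N, \pi_N),
  \qquad \pi_N(m) = 0 .
\]
```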

Bernd STURMFELS

MPI-MiS

Algebraic Statistics and Gibbs Manifolds

Gibbs manifolds are images of affine spaces of symmetric matrices under the exponential map. They arise in applications such as optimization, statistics, and quantum physics, where they extend the ubiquitous role of toric geometry. The Gibbs variety is the zero locus of all polynomials that vanish on the Gibbs manifold. This lecture provides an introduction to these objects from the perspective of Algebraic Statistics.
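A minimal numerical illustration of the definition (assumed toy matrices, not from the lecture): a point on a Gibbs manifold is the matrix exponential of a point in an affine space of symmetric matrices.

```python
# Minimal sketch: points on a Gibbs manifold are matrix exponentials of an
# affine space of symmetric matrices  L(theta) = A0 + theta1*A1 + theta2*A2.
import numpy as np
from scipy.linalg import expm

A0 = np.diag([1.0, 2.0, 3.0])
A1 = np.array([[0., 1., 0.], [1., 0., 0.], [0., 0., 0.]])  # symmetric
A2 = np.array([[0., 0., 1.], [0., 0., 0.], [1., 0., 0.]])  # symmetric

def gibbs_point(theta1, theta2):
    """Image of one point of the affine space under the matrix exponential."""
    return expm(A0 + theta1 * A1 + theta2 * A2)

P = gibbs_point(0.5, -0.3)
print(np.allclose(P, P.T), np.linalg.eigvalsh(P).min() > 0)  # symmetric, PD
```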

Juan-Pablo ORTEGA

Nanyang Technological University

Learning of Dynamic Processes

The last decade has seen the emergence of learning techniques that use the computational power of dynamical systems for information processing. Some of these paradigms are based on architectures that are partially randomly generated and require a relatively cheap training effort, making them ideal for many applications. The need for a mathematical understanding of the working principles underlying this approach, collectively known as Reservoir Computing, has led to the construction of new techniques that combine well-known results in systems theory and dynamics with others from approximation and statistical learning theory. This combination has recently elevated Reservoir Computing to the realm of provable machine learning paradigms and, as we will see in this talk, it also reveals various connections with kernel maps, structure-preserving algorithms, and physics-inspired learning.
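As a concrete illustration of the paradigm (a generic echo state network sketch, not the speaker's models), only the linear readout below is trained; the recurrent reservoir stays fixed and random:

```python
# Minimal echo state network sketch (generic reservoir computing illustration):
# a fixed random reservoir is driven by the input, and only the linear readout
# is trained, by ridge regression, for one-step-ahead prediction.
import numpy as np

rng = np.random.default_rng(42)
n_res, T = 200, 1000
u = np.sin(0.1 * np.arange(T + 1))               # input signal
target = u[1:]                                   # one-step-ahead targets

W_in = rng.uniform(-0.5, 0.5, size=n_res)        # fixed random input weights
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 (echo state)

states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):                               # run the fixed reservoir
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

ridge = 1e-6                                     # train the readout only
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ target)
pred = states @ W_out
print("train MSE:", np.mean((pred - target) ** 2))
```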
