The Conference covers topics of mutual interest in Geometric Science of Information, with the aims to:
• Provide an overview on the most recent state-of-the-art
• Exchange mathematical information / knowledge / expertise in the area
• Identify research areas/applications for future collaboration
From the roots to the future of geometry, in Saint-Malo, town of Pierre-Louis Moreau de Maupertuis
5000 pages published in Springer's Lecture Notes in Computer Science
Three days of main and poster sessions, with tutorials
in the historic corsair seascape of Brittany
GSI’23 will bring together pure and applied mathematicians and engineers with a common interest in geometric tools and their applications to information analysis.
Sub-Riemannian Geometry and Neuromathematics - Statistical Manifold & Hessian Information Geometry - Information Geometry in Physics - Geometric & Symplectic Methods for Hydrodynamical Models - Geometry of Quantum States - Deformed Entropy, Cross-entropy, and Relative Entropy - Geometric Structures in Thermodynamics and Statistical Physics - Geometric Deep Learning - Computational Information Geometry - Optimal Transport & Learning - Statistics, Information and Topology - Topological and Geometrical Structures in Neurosciences - Manifolds & Optimization - Divergence Statistics - Transport Information Geometry
Head, Division of Mathematical Sciences. Associate Chair (Faculty), School of Physical and Mathematical Sciences. Nanyang Technological University, Singapore
- Learning of Dynamic Processes -
Abstract: The last decade has seen the emergence of learning techniques that use the computational power of dynamical systems for information processing. Some of those paradigms are based on architectures that are partially randomly generated and require a relatively cheap training effort, which makes them ideal in many applications. The need for a mathematical understanding of the working principles underlying this approach, collectively known as Reservoir Computing, has led to the construction of new techniques that combine well-known results in systems theory and dynamics with others coming from approximation and statistical learning theory. This combination has recently elevated Reservoir Computing to the realm of provable machine learning paradigms and, as we will see in this talk, it also hints at various connections with kernel maps, structure-preserving algorithms, and physics-inspired learning.
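As a minimal illustration of the paradigm described above, the sketch below drives a randomly generated echo state network (one well-known instance of Reservoir Computing) and trains only its linear readout; the toy task, the signal, and all parameter values are our own assumptions, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy task: one-step-ahead prediction of a scalar signal.
T, warmup, n = 500, 50, 100                      # length, discarded transient, reservoir size
t = np.arange(T + 1)
x = np.sin(0.2 * t) + 0.5 * np.sin(0.311 * t)    # input signal

# Randomly generated, untrained reservoir (the "cheap" part of the paradigm).
W_in = rng.uniform(-0.5, 0.5, size=n)
W = rng.normal(size=(n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 (echo state property)

# Drive the reservoir: s_{k+1} = tanh(W s_k + W_in * u_k).
states = np.zeros((T, n))
s = np.zeros(n)
for k in range(T):
    s = np.tanh(W @ s + W_in * x[k])
    states[k] = s

# Only the linear readout is trained, by ridge regression.
X, y = states[warmup:], x[warmup + 1 : T + 1]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n), X.T @ y)
pred = X @ W_out
mse = np.mean((pred - y) ** 2)
```

The reservoir weights are never adapted; all the learning happens in the final least-squares step, which is what makes the training effort cheap.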
Director for Strategic projects of the Réseau Figure® (network of 31 universities)
Former Regional Director of the A.U.F (Agence Universitaire de la Francophonie) for the Middle East
Former Vice-President of the University of Poitiers (France)
Title: Transverse Poisson Structures to adjoint orbits in a complex semi-simple Lie algebra
The notion of transverse Poisson structure was introduced by Alan Weinstein in his famous splitting theorem: any Poisson manifold M is, in the neighbourhood of each point m, the product of a symplectic manifold, the symplectic leaf S at m, and a submanifold N which can be endowed with a Poisson structure of rank 0 at m. N is called a transverse slice to S in M. When M is the dual of a complex Lie algebra g equipped with its standard Lie-Poisson structure, the symplectic leaf through x is the coadjoint orbit G·x of the adjoint Lie group G of g. Moreover, there is a natural way to describe the transverse slice to the coadjoint orbit and, using a canonical system of linear coordinates (q1, …, qk), it follows that the coefficients of the transverse Poisson structure are rational functions of (q1, …, qk). One can then ask in which cases this structure is polynomial. Nice answers have been given when g is semi-simple, taking advantage of the explicit machinery of semi-simple Lie algebras. One shows that a general adjoint orbit can be reduced to the case of a nilpotent orbit, where the transverse Poisson structure can be expressed in terms of quasi-homogeneous polynomials. In particular, in the case of the subregular nilpotent orbit the Poisson structure is given by a determinantal formula and is entirely determined by the singular variety of nilpotent elements of the slice.
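In local coordinates, the splitting recalled above can be written explicitly (notation ours: (xi, pi) are Darboux coordinates on the leaf S and (q1, …, qk) the linear coordinates on the slice N):

```latex
\{f,g\} \;=\; \sum_{i=1}^{n}\left(\frac{\partial f}{\partial x_i}\frac{\partial g}{\partial p_i}
  - \frac{\partial f}{\partial p_i}\frac{\partial g}{\partial x_i}\right)
  \;+\; \sum_{a,b=1}^{k}\pi_{ab}(q)\,\frac{\partial f}{\partial q_a}\frac{\partial g}{\partial q_b},
\qquad \pi_{ab}(m)=0 .
```

The vanishing of the matrix (πab) at m is exactly the statement that the transverse structure has rank 0 at m; the question raised in the abstract is when the entries πab(q), a priori rational, are in fact polynomial.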
Inria, Ecole Normale Supérieure
Title: Information Theory with Kernel Methods
Abstract: Estimating and computing entropies of probability distributions are key computational tasks throughout data science. In many situations, the underlying distributions are only known through the expectation of some feature vectors, which has led to a series of works within kernel methods. In this talk, I will explore the particular situation where the feature vector is a rank-one positive definite matrix, and show how the associated expectations (a covariance matrix) can be used with information divergences from quantum information theory to draw direct links with the classical notions of Shannon entropies.
Francis Bach. Information Theory with Kernel Methods. To appear in IEEE Transactions on Information Theory, 2022. https://arxiv.org/pdf/2202.08545
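The quantum-information flavour of the abstract can be illustrated numerically: for points x1, …, xn and a kernel with k(x, x) = 1, the covariance matrix Σ = (1/n) Σi φ(xi)φ(xi)ᵀ shares its nonzero spectrum with K/n, where K is the kernel matrix, so the von Neumann entropy −tr(Σ log Σ) can be computed from K. The sketch below is our own toy implementation of this idea; the function name, kernel choice, and parameters are assumptions, not the paper's API.

```python
import numpy as np

rng = np.random.default_rng(0)

def kernel_von_neumann_entropy(x, bandwidth=1.0):
    """Hypothetical helper: von Neumann entropy -tr(Sigma log Sigma) of the
    kernel covariance matrix of an empirical distribution, computed from the
    spectrum of K/n (which equals the nonzero spectrum of Sigma)."""
    d2 = (x[:, None] - x[None, :]) ** 2
    K = np.exp(-d2 / (2 * bandwidth**2))   # Gaussian kernel, so k(x, x) = 1
    lam = np.linalg.eigvalsh(K / len(x))   # eigenvalues of Sigma (plus zeros)
    lam = lam[lam > 1e-12]                 # drop numerical zeros
    return -np.sum(lam * np.log(lam))

# A spread-out sample has higher kernel entropy than a concentrated one.
h_wide = kernel_von_neumann_entropy(rng.normal(0, 5.0, 200))
h_tight = kernel_von_neumann_entropy(rng.normal(0, 0.1, 200))
```

Since tr(K/n) = 1, the eigenvalues behave like a probability vector, which is what allows the classical Shannon-type entropy interpretation mentioned in the abstract.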
Universitat Politècnica de Catalunya and Centre de Recerca Matemàtica
Title: From Alan Turing to Contact Geometry: towards a "Fluid computer"
Abstract: Is hydrodynamics capable of performing computations? (Moore, 1991). Can a mechanical system (including a fluid flow) simulate a universal Turing machine? (Tao, 2016).
Etnyre and Ghrist unveiled a mirror between contact geometry and fluid dynamics reflecting Reeb vector fields as Beltrami vector fields. With the aid of this mirror, we can answer the questions raised by Moore and Tao in the positive. This is a recent result that combines techniques from Alan Turing with modern geometry (contact geometry) to construct a "Fluid computer" in dimension 3. This construction shows, in particular, the existence of undecidable fluid paths. I will also explain applications of this mirror to the detection of escape trajectories in Celestial mechanics (for which I'll need to extend the mirror to a singular set-up). This mirror allows us to construct a tunnel connecting problems in Celestial mechanics and Fluid Dynamics.
Robert Cardona, Eva Miranda, Daniel Peralta-Salas, and Francisco Presas, Constructing Turing complete Euler flows in dimension 3. Proc. Natl. Acad. Sci. USA 118 (2021), no. 19, Paper No. e2026818118, 9 pp.
Eva Miranda, Cédric Oms and Daniel Peralta-Salas, On the singular Weinstein conjecture and the existence of escape orbits for b-Beltrami fields. Commun. Contemp. Math. 24 (2022), no. 7, Paper No. 2150076, 25 pp.
Alan Turing, On Computable Numbers, with an Application to the Entscheidungsproblem. Proceedings of the London Mathematical Society s2-42 (1937), no. 1, 230–265. doi:10.1112/plms/s2-42.1.230.
Institut Denis Poisson, UMR CNRS, Université d'Orléans & Université de Tours, France.
Title: Statistics Methods for Medical Image Processing and Reconstruction
Abstract: In this talk we will see how statistical methods, from the simplest to the most advanced ones, can be used to address various problems in medical image processing and reconstruction for different imaging modalities. Image reconstruction allows one to obtain the images in question, while image processing (on the already reconstructed images) aims at extracting some information of interest. We will review several statistical methods (mainly Bayesian) to address various problems of this type.
Keywords: image processing, image reconstruction, statistics, frequentist, Bayesian, parametric, nonparametric.
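As a minimal sketch of the Bayesian viewpoint mentioned in the abstract, the example below computes a MAP denoising estimate for a 1-D "image": a Gaussian noise likelihood combined with a Gaussian smoothness prior reduces to a linear system. The signal, the prior, and the parameter values are our own toy assumptions, not the speaker's methods.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D "image": a smooth signal corrupted by Gaussian noise.
n = 200
t = np.linspace(0, 1, n)
clean = np.sin(2 * np.pi * t) + 0.3 * np.cos(6 * np.pi * t)
noisy = clean + rng.normal(0, 0.3, n)

# Bayesian MAP estimate: Gaussian likelihood + Gaussian smoothness prior.
# The posterior maximiser solves (I + lam * D^T D) x = y, where D is a
# first-difference operator and lam balances data fit against smoothness.
lam = 20.0
D = np.diff(np.eye(n), axis=0)                   # (n-1, n) difference matrix
x_map = np.linalg.solve(np.eye(n) + lam * D.T @ D, noisy)

err_noisy = np.mean((noisy - clean) ** 2)
err_map = np.mean((x_map - clean) ** 2)
```

The same structure (likelihood plus prior, solved or sampled) underlies the more advanced Bayesian reconstruction methods the talk surveys; only the operators and priors become modality-specific.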
MPI-MiS Leipzig, Germany
- Algebraic Statistics and Gibbs Manifolds -
Abstract: Gibbs manifolds are images of affine spaces of symmetric matrices under the exponential map. They arise in applications such as optimization, statistics and quantum physics, where they extend the ubiquitous role of toric geometry. The Gibbs variety is the zero locus of all polynomials that vanish on the Gibbs manifold. This lecture gives an introduction to these objects from the perspective of Algebraic Statistics.
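A point of a Gibbs manifold is concretely computable: pick an affine space of symmetric matrices and push it through the matrix exponential. The sketch below is a toy example with matrices of our own choosing, normalised to trace 1 as in the quantum-state applications mentioned above.

```python
import numpy as np

def sym_expm(A):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.exp(w)) @ V.T

# A toy affine space of symmetric matrices L = { A0 + t1*A1 + t2*A2 }
# (the matrices A0, A1, A2 are our own assumptions).
A0 = np.diag([0.0, 1.0, 2.0])
A1 = np.array([[0., 1., 0.], [1., 0., 0.], [0., 0., 0.]])
A2 = np.array([[0., 0., 1.], [0., 0., 0.], [1., 0., 0.]])

def gibbs_point(t1, t2):
    """A point of the Gibbs manifold exp(L): symmetric, positive definite,
    here normalised to trace 1 so it is a density matrix."""
    P = sym_expm(A0 + t1 * A1 + t2 * A2)
    return P / np.trace(P)

P = gibbs_point(0.3, -0.5)
```

Sweeping (t1, t2) traces out the Gibbs manifold; the Gibbs variety of the lecture is the smallest algebraic variety containing this image.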
1, quai Duguay-Trouin – B.P.109
35407 Saint-Malo Cedex, France
Name: Frédéric BARBARESCO
Name: Imène AHMED