The natural world is full of complex systems characterized by intricate relations between their components: from social interactions between individuals in a social network to electrostatic interactions between atoms in a protein. Topological Deep Learning (TDL) provides a framework to process and extract knowledge from data associated with these systems, such as predicting the community to which an individual belongs or whether a protein is a promising target for drug development. By extending beyond traditional graph-based methods, TDL incorporates higher-order relational structures, providing a new lens to tackle challenges in applied sciences and beyond. This talk will introduce the core principles of TDL and provide a comprehensive review of its rapidly growing literature, with a particular focus on neural network architectures and their performance across various domains. I will present open-source implementations that make TDL methods more accessible and practical for real-world applications. Overall, this talk will showcase how TDL models can effectively capture and reason about the complexity of real-world systems, while highlighting the remaining challenges and exciting opportunities for future advancements in the field.
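To make the higher-order message-passing idea concrete, here is a minimal sketch in NumPy, not drawn from the talk or from any particular TDL library: it updates edge features on a tiny simplicial complex (four nodes, five edges, one triangle) by aggregating from the nodes below and the triangle above. The incidence matrices, feature shapes, and update rule are all illustrative assumptions.

    # Minimal sketch of higher-order message passing on a simplicial complex.
    # Matrices, shapes, and the update rule are illustrative assumptions,
    # not the API of any specific TDL library.
    import numpy as np

    rng = np.random.default_rng(0)

    # Tiny complex: 4 nodes; 5 edges (01, 02, 12, 13, 23); 1 triangle (012).
    B1 = np.array([[-1, -1,  0,  0,  0],       # signed node-edge incidence
                   [ 1,  0, -1, -1,  0],
                   [ 0,  1,  1,  0, -1],
                   [ 0,  0,  0,  1,  1]])
    B2 = np.array([[1], [-1], [1], [0], [0]])  # signed edge-triangle incidence

    d = 8                          # feature dimension per cell
    X0 = rng.normal(size=(4, d))   # node features
    X2 = rng.normal(size=(1, d))   # triangle features
    W_low = rng.normal(size=(d, d))
    W_high = rng.normal(size=(d, d))

    # One edge update: messages from incident nodes (below) and triangles
    # (above), followed by a ReLU nonlinearity.
    X1 = np.maximum(np.abs(B1).T @ X0 @ W_low + np.abs(B2) @ X2 @ W_high, 0.0)
    print(X1.shape)  # (5, 8): one updated feature vector per edge

A graph neural network would stop at B1; keeping B2 (and higher incidences) is what lets a model route information through relations among more than two entities at a time.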
Coisotropic reduction in symplectic geometry can be phrased cohomologically and goes under the name of BRST cohomology in Physics. It provides a quantisation procedure for gauge theories which is equivariant under global symmetries. It was first discovered in the context of gauge theories in the mid-1970s, but it plays a very important role in the quantisation of string theories, where it usually appears in the guise of semi-infinite cohomology, a cohomology theory for certain infinite-dimensional Lie algebras which sits in between homology and cohomology. I will summarise some of the history of the subject and mention a recent application in the context of so-called non-relativistic strings.
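As a standard illustration of the cohomological phrasing (textbook material, not specific to the talk): given first-class constraints \(\phi_a\) closing under the Poisson bracket as \(\{\phi_a,\phi_b\} = f_{ab}{}^{c}\,\phi_c\), one adjoins ghost/antighost pairs \((c^a, b_a)\) and forms the classical BRST charge

\[
Q \;=\; c^a \phi_a \;-\; \tfrac{1}{2}\, f_{ab}{}^{c}\, c^a c^b\, b_c,
\qquad \{Q, Q\} = 0 .
\]

Observables of the coisotropically reduced system are then classes in the BRST cohomology of \(Q\): \(Q\)-invariant functions modulo \(Q\)-exact ones.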
Classical descriptions of matter give rise to many dynamical systems of fluid mechanics and kinetic theory. These include, e.g., the Navier-Stokes-Fourier system, the Cahn-Hilliard-Navier-Stokes system for multiphase fluid flow, and various collisional kinetic theories for gaseous and plasma modeling. A desirable feature of such modeling is thermodynamic consistency, i.e., conservation of energy and production of entropy, in agreement with the first and second laws of thermodynamics. Metriplectic dynamics is a class of dynamical systems (finite- or infinite-dimensional) that encapsulates this thermodynamic consistency in a geometric formalism. An algorithmic procedure for building such theories is based on the metriplectic 4-bracket, a bracket akin to the Poisson bracket, but one that maps four phase-space functions to a single function and possesses the algebraic symmetries of a curvature tensor. Metriplectic 4-brackets can be constructed using the Kulkarni-Nomizu product or via a purely Lie-algebraic formalism based on the Koszul connection. The formalism algorithmically produces many known and new dynamical systems, and it provides a pathway for constructing structure-preserving numerical algorithms.
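As a hedged sketch of the construction (the notation here is mine and follows the general pattern of the 4-bracket literature, not necessarily the talk's exact conventions): given two symmetric bilinear forms \(\sigma\) and \(\mu\) acting on differentials of phase-space functions, their Kulkarni-Nomizu product defines

\[
(f,k;g,n) \;=\; \sigma(df,dg)\,\mu(dk,dn) \;+\; \mu(df,dg)\,\sigma(dk,dn)
\;-\; \sigma(df,dn)\,\mu(dk,dg) \;-\; \mu(df,dn)\,\sigma(dk,dg),
\]

which is antisymmetric in \((f,k)\) and in \((g,n)\) and symmetric under exchange of the two pairs: exactly the algebraic symmetries of a Riemann curvature tensor. The dynamics of an observable \(f\), generated by a Hamiltonian \(H\) and an entropy \(S\), then takes the form

\[
\frac{df}{dt} \;=\; \{f, H\} \;+\; (f, H;\, S, H),
\]

so that \(dH/dt = 0\) follows from the antisymmetries (first law), while \(dS/dt = (S,H;S,H) \ge 0\) holds when \(S\) is a Casimir of the Poisson bracket and the 4-bracket satisfies a suitable definiteness condition (second law).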
Many statistical learning and inference methods, from Bayesian inference to empirical risk minimization, can be unified through a variational perspective that balances empirical risk and prior knowledge. We introduce a new, general class of variational methods based on Fenchel-Young (FY) losses. These losses, derived using Fenchel conjugation (a central tool of convex analysis), generalize the Kullback-Leibler divergence and encompass Bayesian as well as classical variational learning. This FY framework provides generalized notions of free energy, evidence, evidence lower bound, and posterior, while still enabling standard optimization techniques like alternating minimization and gradient backpropagation. This makes it possible to learn a broader class of models than previous variational formulations allow. This talk will review FY losses and then detail this new generalized variational inference approach to machine learning.
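For concreteness, the Fenchel-Young loss generated by a convex regularizer \(\Omega\) (this definition is standard in the FY-loss literature; the variational constructions in the talk build on it) is

\[
L_{\Omega}(\theta;\, y) \;=\; \Omega^{*}(\theta) \;+\; \Omega(y) \;-\; \langle \theta,\, y\rangle,
\]

where \(\Omega^{*}\) is the Fenchel conjugate. The Fenchel-Young inequality makes \(L_{\Omega}\) nonnegative, and it vanishes exactly when \(y \in \partial\Omega^{*}(\theta)\). Choosing \(\Omega\) to be the negative Shannon entropy restricted to the probability simplex makes \(\Omega^{*}\) the log-sum-exp function and recovers the familiar logistic (cross-entropy) loss; the generalized free energy, evidence, and posterior mentioned above arise by replacing the KL term of classical variational inference with such a loss.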