4th International Conference on Machine Learning for Networking (MLN'2021)

Virtual conference, December 1-3, 2021

Invited speaker: Jean-Claude Belfiore, Huawei, France

Title: Beyond Shannon: A theory of semantic communication


This talk is an attempt to answer the question “How can intelligent machines communicate efficiently?”, which is one of the main goals of so-called “Semantic Communication”. I will present joint work with Daniel Bennequin showing our progress towards a mathematical theory of semantic communication, inspired by the foundational works of Claude Shannon and Alexander Grothendieck. To communicate efficiently, we need a language. Using category theory, we can define a category that transports the semantics of a language. We will then see that the notion of semantics depends on many aspects that can be found in machine learning: sampling (the data), structures (a kind of presemantics that will be carefully defined), and the language itself.
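As a minimal illustration of what “transporting semantics through a category” can mean (a toy sketch of my own, not the speaker's construction): take sets as objects and functions as morphisms, so that composing morphisms carries a word's meaning from one level of description to another.

```python
# Toy category of sets and functions (an illustrative assumption,
# not the construction presented in the talk): objects are sets,
# morphisms are functions, and semantics is "transported" by
# composing morphisms along a chain of objects.

def compose(g, f):
    """Morphism composition g . f: apply f, then g."""
    return lambda x: g(f(x))

def identity(x):
    """Identity morphism on any object."""
    return x

# Hypothetical objects: words, concepts, and broader classes.
word_to_concept = {"cat": "feline", "dog": "canine"}.get
concept_to_class = {"feline": "animal", "canine": "animal"}.get

# Transport: a word's semantics carried through two morphisms.
word_to_class = compose(concept_to_class, word_to_concept)
print(word_to_class("cat"))  # animal

# The category laws hold for functions: identity is neutral.
assert compose(identity, word_to_concept)("dog") == word_to_concept("dog")
```

The point of the sketch is only that meaning is assigned functorially, by following arrows, rather than stored as a single symbol.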
Some important mathematical notions, such as Grothendieck toposes and stacks, will be introduced through simple examples, and we will see how neural networks can be modelled this way. Finally, after showing how a language is transported through the layers of a neural network, we will give a definition of semantic information measures, which are not numbers as in Shannon's information theory, but spaces. An example inspired by Carnap and Bar-Hillel will show the validity of such a definition. We will also propose semantic coding theorems.