Grants and contributions:
Grant or scholarship applying to more than one fiscal year. (2017-2018 to 2022-2023)
The concept of entropy is central to many areas of modern science and technology. Entropy is a quantitative measure of the information content of a corpus of data or of a statistical process. It is also a primary component in the computation of channel capacity, the optimal rate at which data can be transmitted over a communication channel, such as a network of cellular phones or computers, or stored in a data recording device, such as a computer disk drive or DVD.
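As a minimal illustration of entropy as a measure of information content, the sketch below computes the Shannon entropy of the empirical symbol distribution of a data sample. The function name and the sample corpus are illustrative, not part of the proposal.

```python
# Shannon entropy of the empirical symbol distribution of a data sample.
# A uniform source over k symbols carries log2(k) bits per symbol.
from collections import Counter
from math import log2

def empirical_entropy(data):
    """Shannon entropy (bits per symbol) of the empirical distribution of `data`."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Four equally frequent symbols: 2 bits per symbol.
print(empirical_entropy("abcd" * 100))  # → 2.0
```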
For some channels and devices, in order to improve reliability, it is necessary to restrict the sequences that can be transmitted or stored. This motivated the concept of an input-constrained channel. The capacity of such a channel determines the optimal data rate and in many cases is characterized in terms of optimal entropies of processes known as hidden Markov processes (HMPs). There is no known general formula for the entropy of an HMP. In Part I of this proposal we seek to develop methods to prove the existence of efficient schemes to approximate entropies of HMPs and capacities of input-constrained channels.
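To illustrate why HMP entropy must be approximated rather than computed in closed form, the sketch below estimates the entropy rate of an HMP as -(1/n) log2 P(y_1..n) along a long sample path (by the Shannon-McMillan-Breiman theorem), with the probability computed by the standard forward algorithm. The model (a symmetric binary Markov chain observed through a binary symmetric channel) and all parameter values are illustrative, not taken from the proposal.

```python
# Monte Carlo estimate of the entropy rate of a hidden Markov process:
# a binary Markov chain (flip probability p) observed through a binary
# symmetric channel (crossover probability eps).
import random
from math import log2

def hmp_entropy_estimate(p, eps, n, seed=0):
    rng = random.Random(seed)
    # Sample a hidden state path and its noisy observations.
    x = 0
    ys = []
    for _ in range(n):
        x = 1 - x if rng.random() < p else x            # Markov transition
        ys.append(1 - x if rng.random() < eps else x)   # channel noise
    # Forward algorithm, normalized at each step so the accumulated
    # log-probability stays numerically stable.
    alpha = [0.5, 0.5]                                  # stationary start
    logprob = 0.0
    for y in ys:
        new = []
        for s in (0, 1):
            trans = alpha[1 - s] * p + alpha[s] * (1 - p)
            emit = eps if y != s else 1 - eps
            new.append(trans * emit)
        total = new[0] + new[1]
        logprob += log2(total)
        alpha = [v / total for v in new]
    return -logprob / n  # estimated entropy rate, bits per symbol

print(hmp_entropy_estimate(p=0.3, eps=0.1, n=100_000))
```

The estimate converges slowly (at a Monte Carlo rate) as n grows, which is one reason provably efficient approximation schemes are of interest.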
The capacity of an input-constrained channel incorporates the information content of the set of input-constrained sequences itself as well as the noise inherent in the channel. The former is quantified by the so-called noiseless capacity. Sets of constrained input sequences are of a type nearly identical to those used in the theory of dynamical systems to model chaotic systems. In dynamical systems, these sets are known as shifts of finite type (SFTs), and in this setting noiseless capacity is known as topological entropy of the SFT. There is an explicit, general and extremely useful formula known for the topological entropy of an SFT.
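The explicit formula mentioned above: for a one-dimensional SFT presented by a 0-1 transition matrix A, the topological entropy is log(lambda), where lambda is the Perron (largest) eigenvalue of A. The sketch below, with an illustrative power-iteration helper of our own, checks this on the golden-mean shift (binary sequences with no two consecutive 1s), whose entropy is the log of the golden ratio.

```python
# Topological entropy of a 1-D shift of finite type: log of the Perron
# eigenvalue of its transition matrix.
from math import log, sqrt

def perron_eigenvalue(A, iters=1000):
    """Largest eigenvalue of a nonnegative primitive matrix A, by power iteration."""
    n = len(A)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)
        v = [x / lam for x in w]
    return lam

# Golden-mean shift: state 1 (a just-emitted 1) must transition to state 0.
A = [[1, 1],
     [1, 0]]
h = log(perron_eigenvalue(A))
print(h, log((1 + sqrt(5)) / 2))  # entropy equals log of the golden ratio
```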
Input constraints arise in two dimensions as well as one, i.e., for arrays instead of sequences, in applications such as holographic recording. There is a corresponding notion of SFT and of topological entropy. In contrast to one dimension, there is no general formula for this entropy in two dimensions. However, the topological entropies of some two-dimensional SFTs can be approximated efficiently.
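As a hedged illustration of such approximation, the sketch below estimates the topological entropy of the hard-square shift (0-1 arrays with no two horizontally or vertically adjacent 1s), a standard two-dimensional SFT. It uses transfer matrices on strips of bounded width, whose Perron eigenvalue ratios converge to the entropy; this naive strip scheme is our own illustration, not the proposal's method.

```python
# Strip transfer-matrix approximation of the hard-square topological entropy:
# h = lim_m (log lam_m)/m, where lam_m is the Perron eigenvalue of the
# width-m transfer matrix; log(lam_{m+1}/lam_m) converges faster.
from math import log

def strip_states(m):
    """Binary column configurations of height m with no vertically adjacent 1s."""
    return [s for s in range(1 << m) if s & (s >> 1) == 0]

def perron(states):
    """Largest eigenvalue of the hard-square transfer matrix, by power iteration."""
    v = {s: 1.0 for s in states}
    lam = 1.0
    for _ in range(200):
        # Columns s, t may be adjacent iff they share no 1 in the same row.
        w = {s: sum(v[t] for t in states if s & t == 0) for s in states}
        lam = max(w.values())
        v = {s: x / lam for s, x in w.items()}
    return lam

m = 9
h_est = log(perron(strip_states(m + 1)) / perron(strip_states(m)))
print(h_est)  # approaches the hard-square entropy, about 0.4075
```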
In Part II of this proposal, we focus on methods of proving the existence of efficient approximation schemes to compute topological entropy for certain two-dimensional SFTs. The methods make use of a representation of topological entropy in terms of a maximal-entropy statistical process compatible with the SFT. In this way, under certain conditions, the topological entropy can be expressed as an average of a function of a few samples of this process, each of which can be computed efficiently. The methods naturally generalize to efficient approximation of the so-called topological pressure of interactions of interest in statistical physics, in particular for such classical models as the Ising model and the hard core model.