P. Baudot and D. Bennequin, The Homological Nature of Entropy, Entropy, vol.17, pp.3253-3318, 2015.

J. Vigneaux, The structure of information: From probability to homology, 2017.

J. P. Vigneaux, Topology of Statistical Systems. A Cohomological Approach to Information Theory, Ph.D. Thesis, Université Paris Diderot, Paris, France, 2019.

M. Tapia, P. Baudot, C. Formisano-Trézény, M. Dufour, S. Temporal et al., Neurotransmitter identity and electrophysiological phenotype are genetically coupled in midbrain dopaminergic neurons, Sci. Rep, vol.8, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01963481

J. Gibbs, Elementary Principles in Statistical Mechanics, Charles Scribner's Sons: New York, NY, USA, 1902.

C. E. Shannon, A Mathematical Theory of Communication, Bell Syst. Tech. J, vol.27, pp.379-423, 1948.

C. Shannon, A lattice theory of information, Trans. IRE Prof. Group Inform. Theory, vol.1, pp.105-107, 1953.

W. McGill, Multivariate information transmission, Psychometrika, vol.19, pp.97-116, 1954.

R. Fano, Transmission of Information: A Statistical Theory of Communication, 1961.

K. T. Hu, On the Amount of Information, Theory Probab. Appl, vol.7, pp.439-447, 1962.

T. S. Han, Linear dependence structure of the entropy space, Inf. Control, vol.29, pp.337-368, 1975.

T. S. Han, Nonnegative entropy measures of multivariate symmetric correlations, Inf. Control, vol.36, pp.133-156, 1978.

H. Matsuda, Information theoretic characterization of frustrated systems, Phys. A Stat. Mech. Its Appl, vol.294, pp.180-190, 2001.

A. Bell, The co-information lattice, Proceedings of the 4th International Symposium on Independent Component Analysis and Blind Signal Separation, pp.1-4, 2003.

N. Brenner, S. Strong, R. Koberle, and W. Bialek, Synergy in a Neural Code, Neural Comput, vol.12, pp.1531-1552, 2000.

J. Watkinson, K. Liang, X. Wang, T. Zheng, and D. Anastassiou, Inference of Regulatory Gene Interactions from Expression Data Using Three-Way Mutual Information, Ann. N. Y. Acad. Sci, vol.1158, pp.302-313, 2009.

H. Kim, J. Watkinson, V. Varadan, and D. Anastassiou, Multi-cancer computational analysis reveals invasion-associated variant of desmoplastic reaction involving INHBA, THBS2 and COL11A1, BMC Med. Genom, vol.3, p.51, 2010.

S. Watanabe, Information theoretical analysis of multivariate correlation, IBM J. Res. Dev, vol.4, pp.66-81, 1960.

G. Tononi and G. Edelman, Consciousness and Complexity, Science, vol.282, pp.1846-1851, 1998.

G. Tononi, G. Edelman, and O. Sporns, Complexity and coherency: Integrating information in the brain, Trends Cogn. Sci, vol.2, pp.474-484, 1998.

M. Studeny and J. Vejnarova, The multiinformation function as a tool for measuring stochastic dependence, in Learning in Graphical Models, M. I. Jordan, Ed., pp.261-296, 1999.

E. Schneidman, W. Bialek, and M. Berry, Synergy, redundancy, and independence in population codes, J. Neurosci, vol.23, pp.11539-11553, 2003.

N. Slonim, G. Atwal, G. Tkacik, and W. Bialek, Information-based clustering, Proc. Natl. Acad. Sci, vol.102, pp.18297-18302, 2005.

N. Brenner, W. Bialek, and R. de Ruyter van Steveninck, Adaptive Rescaling Maximizes Information Transmission, Neuron, vol.26, pp.695-702, 2000.

S. Laughlin, A simple coding procedure enhances the neuron's information capacity, Z. Naturforsch, vol.36, pp.910-912, 1981.

A. Margolin, K. Wang, A. Califano, and I. Nemenman, Multivariate dependence and genetic networks inference, IET Syst. Biol, vol.4, pp.428-440, 2010.

P. Williams and R. Beer, Nonnegative Decomposition of Multivariate Information. arXiv 2010, arXiv:1004.2515.

E. Olbrich, N. Bertschinger, and J. Rauh, Information Decomposition and Synergy, Entropy, vol.17, pp.3501-3517, 2015.

N. Bertschinger, J. Rauh, E. Olbrich, J. Jost, and N. Ay, Quantifying unique information, Entropy, vol.16, pp.2161-2183, 2014.

V. Griffith and C. Koch, Quantifying Synergistic Mutual Information, in Guided Self-Organization: Inception, M. Prokopenko, Ed., Springer, pp.159-190, 2014.

M. Wibral, C. Finn, P. Wollstadt, J. Lizier, and V. Priesemann, Quantifying Information Modification in Developing Neural Networks via Partial Information Decomposition, Entropy, vol.19, 2017.

J. Kay, R. Ince, B. Dering, and W. Phillips, Partial and Entropic Information Decompositions of a Neuronal Modulatory Interaction, Entropy, vol.19, 2017.

J. Rauh, N. Bertschinger, E. Olbrich, and J. Jost, Reconsidering unique information: Towards a multivariate information decomposition, Proceedings of the IEEE International Symposium on Information Theory, 2014.

S. A. Abdallah and M. D. Plumbley, Predictive Information, Multiinformation and Binding Information, 2010.

F. Valverde-Albacete and C. Pelaez-Moreno, Assessing Information Transmission in Data Transformations with the Channel Multivariate Entropy Triangle, Entropy, vol.20, 2018.

F. Valverde-Albacete and C. Pelaez-Moreno, The evaluation of data sources using multivariate entropy tools, Expert Syst. Appl, vol.78, pp.145-157, 2017.

P. Baudot, The Poincaré-Boltzmann Machine: From Statistical Physics to Machine Learning and back. arXiv 2019

A. Khinchin, Mathematical Foundations of Information Theory, translated from two Russian articles in Uspekhi Matematicheskikh Nauk, vol.7, 1953.

M. Artin, A. Grothendieck, and J. Verdier, Théorie des Topos et Cohomologie Étale des Schémas (SGA 4), Lecture Notes in Mathematics, 1963.

G. Rota, On the Foundations of Combinatorial Theory I. Theory of Möbius Functions, Z. Wahrscheinlichkeitstheorie, vol.2, pp.340-368, 1964.

T. Cover and J. Thomas, Elements of Information Theory, Wiley Series in Telecommunication, John Wiley & Sons, Inc., 1991.

H. G. Kellerer, Maßtheoretische Marginalprobleme, Math. Ann, vol.153, pp.168-198, 1964.

F. Matus, Discrete marginal problem for complex measures, Kybernetika, vol.24, pp.39-46, 1988.

D. Reshef, Y. Reshef, H. Finucane, S. Grossman, G. McVean et al., Detecting Novel Associations in Large Data Sets, Science, vol.334, pp.1518-1524, 2011.

M. Tapia, P. Baudot, M. Dufour, C. Formisano-Trézény, S. Temporal et al., Information topology of gene expression profile in dopaminergic neurons, bioRxiv, 2017.

R. Dawkins, The Selfish Gene, Oxford University Press, 1976.

S. Pethel and D. Hahs, Exact Test of Independence Using Mutual Information, Entropy, vol.16, pp.2839-2849, 2014.

T. Schreiber, Measuring Information Transfer, Phys. Rev. Lett, vol.85, pp.461-464, 2000.

L. Barnett, A. Barrett, and A. K. Seth, Granger Causality and Transfer Entropy Are Equivalent for Gaussian Variables, Phys. Rev. Lett, vol.103, 2009.

A. N. Kolmogorov, Grundbegriffe der Wahrscheinlichkeitsrechnung, 1933.

J. L. Loday and B. Vallette, Algebraic Operads, Springer, 2012.

G. Tkacik, O. Marre, D. Amodei, E. Schneidman, W. Bialek et al., Searching for collective behavior in a large network of sensory neurons, PLoS Comput. Biol, vol.10, 2014.
URL : https://hal.archives-ouvertes.fr/hal-01342627

E. Schneidman, M. Berry, R. Segev, and W. Bialek, Weak pairwise correlations imply strongly correlated network states in a neural population, Nature, vol.440, pp.1007-1012, 2006.

L. Merchan and I. Nemenman, On the Sufficiency of Pairwise Interactions in Maximum Entropy Models of Networks, J. Stat. Phys, vol.162, pp.1294-1308, 2016.

J. Humplik and G. Tkacik, Probabilistic models for neural populations that naturally capture global coupling and criticality, PLoS Comput. Biol, vol.13, 2017.

J. Atick, Could information theory provide an ecological theory of sensory processing? Netw. Comput. Neural Syst, vol.3, pp.213-251, 1992.

P. Baudot, Natural Computation: Much ado about Nothing? An Intracellular Study of Visual Coding in Natural Condition, Ph.D. Thesis, Université Pierre et Marie Curie, Paris, France, 2006.

J. Yedidia, W. Freeman, and Y. Weiss, Understanding belief propagation and its generalizations, Destin. Lect. Conf. Artif. Intell, vol.8, pp.236-239, 2001.

M. Reimann, M. Nolte, M. Scolamiero, K. Turner, R. Perin et al., Cliques of Neurons Bound into Cavities Provide a Missing Link between Structure and Function, Front. Comput. Neurosci, vol.11, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01706964

J. Gibbs, A Method of Geometrical Representation of the Thermodynamic Properties of Substances by Means of Surfaces, Trans. Conn. Acad, vol.2, pp.382-404, 1873.

R. Landauer, Irreversibility and heat generation in the computing process, IBM J. Res. Dev, vol.5, pp.183-191, 1961.

J. Shipman, Tkinter Reference: A GUI for Python, 2010.

J. Hunter, Matplotlib: A 2D graphics environment, Comput. Sci. Eng, vol.9, pp.22-30, 2007.

S. van der Walt, C. Colbert, and G. Varoquaux, The NumPy array: A structure for efficient numerical computation, Comput. Sci. Eng, vol.13, 2011.
URL : https://hal.archives-ouvertes.fr/inria-00564007

A. Hagberg, D. Schult, and P. Swart, Exploring network structure, dynamics, and function using NetworkX, in Proceedings of the 7th Python in Science Conference (SciPy2008), Pasadena, CA, USA, 19-24 August 2008, G. Varoquaux, T. Vaught, and J. Millman, Eds., pp.11-15.

G. Hughes, On the mean accuracy of statistical pattern recognizers, IEEE Trans. Inf. Theory, vol.14, pp.55-63, 1968.

S. Strong, R. de Ruyter van Steveninck, W. Bialek, and R. Koberle, On the application of information theory to neural spike trains, Pac. Symp. Biocomput, pp.621-632, 1998.

I. Nemenman, W. Bialek, and R. de Ruyter van Steveninck, Entropy and information in neural spike trains: Progress on the sampling problem, Phys. Rev. E, vol.69, p.056111, 2004.

E. Borel, La mécanique statistique et l'irréversibilité, J. Phys. Theor. Appl, vol.3, pp.189-196, 1913.

D. Scott, Multivariate Density Estimation. Theory, Practice and Visualization, 1992.

C. Epstein, G. Carlsson, and H. Edelsbrunner, Topological data analysis, Inverse Probl, vol.27, 2011.

P. Baudot, M. Tapia, and J. Goaillard, Topological Information Data Analysis: Poincaré-Shannon Machine and Statistical Physic of Finite Heterogeneous Systems, 2018.

A. Ly, M. Marsman, J. Verhagen, R. Grasman, and E. J. Wagenmakers, A Tutorial on Fisher Information, J. Math. Psychol, vol.80, pp.44-55, 2017.

R. Mori, New Understanding of the Bethe Approximation and the Replica Method, Ph.D. Thesis, Kyoto University, Kyoto, Japan, 2013.