This repository generates a plot that displays the dependence of marginal redundancy on time lag, with a separate curve for each embedding dimension; this dependence can be used to estimate the Kolmogorov-Sinai entropy of a time series [1], [3].
To build up to the definition of marginal redundancy, the concepts of mutual information and redundancy are first defined. Mutual information quantifies the amount of information that one variable, or system, provides about another [2].
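In terms of Shannon entropies, the mutual information of two discrete variables $X$ and $Y$ is

$$
I(X;Y) = H(X) + H(Y) - H(X,Y) = \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)},
$$

where $H$ denotes Shannon entropy and $p$ the probability distributions estimated from the data.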
Redundancy generalizes mutual information to an arbitrary number of variables, measuring the total amount of information shared among them [2].
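Following the information-theoretic functionals in [2] and [3], the redundancy of $n$ variables can be written as

$$
R(X_1;\ldots;X_n) = \sum_{i=1}^{n} H(X_i) - H(X_1,\ldots,X_n).
$$

For a time series $x(t)$, the variables are the delayed coordinates $x(t), x(t+\tau), \ldots, x(t+(n-1)\tau)$, so the redundancy $R_n(\tau)$ depends on the embedding dimension $n$ and the time lag $\tau$.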
Marginal redundancy is the difference between the redundancies of consecutive embedding dimensions; it measures the additional information that the newest delayed coordinate carries about the previous ones [2], [3].
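Concretely,

$$
\rho_n(\tau) = R_n(\tau) - R_{n-1}(\tau).
$$

As $n$ grows, the marginal redundancy curves are expected to converge to a function that decreases approximately linearly in $\tau$, with slope $-K$, where $K$ is the Kolmogorov-Sinai entropy defined below [1], [3]; this asymptotic behavior is what the plot produced by this repository is meant to reveal.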
Kolmogorov-Sinai entropy, also known as measure-theoretic entropy, metric entropy, Kolmogorov entropy, or simply KS entropy, represents the information production rate of a dynamical system [4]. For a given bin width ε, define the block entropy of n successive samples over boxes of size ε, together with the corresponding entropy rate.
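One standard formulation, following [3] and [4], is

$$
H_n(\varepsilon) = -\sum_{i_1,\ldots,i_n} p(i_1,\ldots,i_n)\,\log p(i_1,\ldots,i_n),
\qquad
K(\varepsilon) = \lim_{n\to\infty}\frac{H_{n+1}(\varepsilon) - H_n(\varepsilon)}{\tau},
$$

where $p(i_1,\ldots,i_n)$ is the probability that $n$ successive samples, taken at interval $\tau$, fall into the $\varepsilon$-sized boxes $i_1,\ldots,i_n$; the KS entropy is the $\varepsilon \to 0$ limit of $K(\varepsilon)$.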
For noiseless signals, non-chaotic systems exhibit zero KS entropy, while chaotic systems display positive values [5]. For signals containing noise, KS entropy diverges to infinity as ε approaches zero, regardless of the amount of noise present. In practical applications, a sufficiently large ε is chosen to minimize the effect of noise and keep the value finite.
The Python function `marginal_redundancies_calculation` takes in a time series array together with the parameters `max_dim` (the largest embedding dimension), `max_lag` (the largest time lag), and `bins` (the number of histogram bins per dimension) and outputs a plot with marginal redundancy on the y-axis and time lag on the x-axis. It plots a separate curve for each embedding dimension from 2 up to `max_dim`.
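The repository's implementation lives in marginal_redundancy.py; the sketch below is only a minimal illustration of the underlying calculation, assuming a histogram (box-counting) estimate of the block entropies. The helper names `block_entropy` and `marginal_redundancy` are hypothetical, not functions of this repository.

```python
# Minimal, hypothetical sketch of a histogram-based marginal redundancy
# estimate; not the code in marginal_redundancy.py.
import numpy as np

def block_entropy(x, dim, lag, bins):
    """Shannon entropy (in nats) of dim delayed coordinates of x,
    estimated from a dim-dimensional histogram with the given bin count."""
    n = len(x) - (dim - 1) * lag
    # Delay embedding: rows are (x[t], x[t+lag], ..., x[t+(dim-1)*lag]).
    emb = np.column_stack([x[k * lag : k * lag + n] for k in range(dim)])
    counts, _ = np.histogramdd(emb, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def marginal_redundancy(x, dim, lag, bins):
    """rho_dim(lag) for dim >= 2. For a stationary series all marginal
    entropies equal H_1, so rho_n = R_n - R_(n-1) = H_1 + H_(n-1) - H_n."""
    h1 = block_entropy(x, 1, lag, bins)
    return h1 + block_entropy(x, dim - 1, lag, bins) - block_entropy(x, dim, lag, bins)
```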
An example is shown below for the Lorenz system using 1,000,000 data points. The parameters and time series generation function are included in example.py. The computation time is approximately 30 seconds on an Intel® Core™ i7-1255U at base clock speed.
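A hypothetical driver in the same spirit is sketched below; the actual parameter values and time series generator live in example.py, so the call signature and all values here are assumptions for illustration only.

```python
# Hypothetical usage; the signature of marginal_redundancies_calculation is
# assumed from the README's description, and all parameter values are
# illustrative rather than those in example.py.
import numpy as np
import marginal_redundancy

def lorenz_x(n_points, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Return the x-coordinate of the Lorenz system, integrated with a
    simple Euler scheme (accurate enough for a sketch)."""
    state = np.array([1.0, 1.0, 1.0])
    out = np.empty(n_points)
    for i in range(n_points):
        x, y, z = state
        state = state + dt * np.array([sigma * (y - x),
                                       x * (rho - z),
                                       x * y - beta * z])
        out[i] = state[0]
    return out

series = lorenz_x(1_000_000)
marginal_redundancy.marginal_redundancies_calculation(series, max_dim=5,
                                                      max_lag=50, bins=16)
```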
- Installation: download `marginal_redundancy.py` into your project directory and import it as a Python module with `import marginal_redundancy`.
- Required Packages: numpy, matplotlib
[1] A. M. Fraser, “Using Mutual Information to Estimate Metric Entropy,” Springer Series in Synergetics, pp. 82–91, Jan. 1986, doi: https://doi.org/10.1007/978-3-642-71001-8_11.
[2] A. M. Fraser, “Information and entropy in strange attractors,” IEEE Transactions on Information Theory, vol. 35, no. 2, pp. 245–262, Mar. 1989, doi: https://doi.org/10.1109/18.32121.
[3] M. Palus, “Kolmogorov Entropy From Time Series Using Information-Theoretic Functionals,” Neural Network World, vol. 7, 1997.
[4] Y. Sinai, “Kolmogorov-Sinai entropy,” Scholarpedia, vol. 4, no. 3, p. 2034, 2009, doi: https://doi.org/10.4249/scholarpedia.2034.
[5] G. P. Williams, “Chaos Theory Tamed,” CRC Press eBooks, Sep. 1997, doi: https://doi.org/10.1201/9781482295412.