References
- Alizadeh, N. H. and Arghami, N. R. (2010). A new estimator of entropy. Journal of the Iranian Statistical Society (JIRSS).
- Amigó, J. M.; Balogh, S. G. and Hernández, S. (2018). A brief review of generalized entropies. Entropy 20, 813.
- Amigó, J. M.; Szczepański, J.; Wajnryb, E. and Sanchez-Vives, M. V. (2004). Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity. Neural Computation 16, 717–736, https://direct.mit.edu/neco/article-pdf/16/4/717/815838/089976604322860677.pdf.
- Anteneodo, C. and Plastino, A. R. (1999). Maximum entropy approach to stretched exponential probability distributions. Journal of Physics A: Mathematical and General 32, 1089.
- Arora, A.; Meister, C. and Cotterell, R. (2022). Estimating the Entropy of Linguistic Distributions, arXiv:2204.01469 [cs.CL].
- Azami, H. and Escudero, J. (2016). Amplitude-aware permutation entropy: Illustration in spike detection and signal segmentation. Computer Methods and Programs in Biomedicine 128, 40–51.
- Azami, H.; Rostaghi, M.; Abásolo, D. and Escudero, J. (2017). Refined Composite Multiscale Dispersion Entropy and its Application to Biomedical Signals. IEEE Transactions on Biomedical Engineering 64, 2872–2879.
- Azami, H.; da Silva, L. E.; Omoto, A. C. and Humeau-Heurtier, A. (2019). Two-dimensional dispersion entropy: An information-theoretic method for irregularity analysis of images. Signal Processing: Image Communication 75, 178–187.
- Bandt, C. and Pompe, B. (2002). Permutation Entropy: A Natural Complexity Measure for Time Series. Phys. Rev. Lett. 88, 174102.
- Berger, S.; Kravtsiv, A.; Schneider, G. and Jordan, D. (2019). Teaching Ordinal Patterns to a Computer: Efficient Encoding Algorithms Based on the Lehmer Code. Entropy 21.
- Charzyńska, A. and Gambin, A. (2016). Improvement of the k-nn Entropy Estimator with Applications in Systems Biology. Entropy 18.
- Costa, M.; Goldberger, A. L. and Peng, C.-K. (2002). Multiscale Entropy Analysis of Complex Physiologic Time Series. Phys. Rev. Lett. 89, 068102.
- Costa, M. D. and Goldberger, A. L. (2015). Generalized Multiscale Entropy Analysis: Application to Quantifying the Complex Volatility of Human Heartbeat Time Series. Entropy 17, 1197–1203.
- Curado, E. M. and Nobre, F. D. (2004). On the stability of analytic entropic forms. Physica A: Statistical Mechanics and its Applications 335, 94–106.
- Datseris, G. and Parlitz, U. (2022). Nonlinear dynamics: a concise introduction interlaced with code. Springer Nature.
- Diego, D.; Haaga, K. A. and Hannisdal, B. (2019). Transfer entropy computation using the Perron-Frobenius operator. Phys. Rev. E 99, 042212.
- Ebrahimi, N.; Pflughoeft, K. and Soofi, E. S. (1994). Two measures of sample entropy. Statistics & Probability Letters 20, 225–234.
- Fadlallah, B.; Chen, B.; Keil, A. and Príncipe, J. (2013). Weighted-permutation entropy: A complexity measure for time series incorporating amplitude information. Phys. Rev. E 87, 022911.
- Gao, S.; Ver Steeg, G. and Galstyan, A. (2015). Efficient Estimation of Mutual Information for Strongly Dependent Variables. In: Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, Proceedings of Machine Learning Research, 277–286. PMLR, San Diego, California, USA.
- Goria, M. N.; Leonenko, N. N.; Mergel, V. V. and Inverardi, P. L. (2005). A new class of random vector entropy estimators and its applications in testing statistical hypotheses. Journal of Nonparametric Statistics 17, 277–297, doi:10.1080/104852504200026815.
- Grassberger, P. (2022). On Generalized Schürmann Entropy Estimators. Entropy 24.
- Hausser, J. and Strimmer, K. (2009). Entropy inference and the James-Stein estimator, with application to nonlinear gene association networks. Journal of Machine Learning Research 10.
- James, W. and Stein, C. (1992). Estimation with quadratic loss. In: Kotz, S. and Johnson, N. L. (editors), Breakthroughs in statistics: Foundations and basic theory, 443–460. Springer.
- Kozachenko, L. F. and Leonenko, N. N. (1987). Sample estimate of the entropy of a random vector. Problemy Peredachi Informatsii 23, 9–16.
- Kraskov, A.; Stögbauer, H. and Grassberger, P. (2004). Estimating mutual information. Phys. Rev. E 69, 066138.
- Lad, F.; Sanfilippo, G. and Agrò, G. (2015). Extropy: Complementary Dual of Entropy. Statistical Science 30, 40–58.
- Lempel, A. and Ziv, J. (1976). On the Complexity of Finite Sequences. IEEE Transactions on Information Theory 22, 75–81.
- Leonenko, N.; Pronzato, L. and Savani, V. (2008). A class of Rényi information estimators for multidimensional densities. The Annals of Statistics 36, 2153–2182.
- Li, Y.; Gao, X. and Wang, L. (2019). Reverse Dispersion Entropy: A New Complexity Measure for Sensor Signal. Sensors 19.
- Liu, J. and Xiao, F. (2023). Rényi extropy. Communications in Statistics - Theory and Methods 52, 5836–5847.
- Llanos, F.; Alexander, J. M.; Stilp, C. E. and Kluender, K. R. (2017). Power spectral entropy as an information-theoretic correlate of manner of articulation in American English. The Journal of the Acoustical Society of America 141, EL127–EL133.
- Lord, W. M.; Sun, J. and Bollt, E. M. (2018). Geometric k-nearest neighbor estimation of entropy and mutual information. Chaos: An Interdisciplinary Journal of Nonlinear Science 28.
- Miller, G. (1955). Note on the bias of information estimates. In: Information theory in psychology: Problems and methods.
- Paninski, L. (2003). Estimation of entropy and mutual information. Neural Computation 15, 1191–1253.
- Pincus, S. M. (1991). Approximate entropy as a measure of system complexity. Proceedings of the National Academy of Sciences 88, 2297–2301.
- Prichard, D. and Theiler, J. (1995). Generalized redundancies for time series analysis. Physica D: Nonlinear Phenomena 84, 476–493.
- Rényi, A. (1961). On measures of entropy and information. In: Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics, 547–562.
- Ribeiro, H. V.; Zunino, L.; Lenzi, E. K.; Santoro, P. A. and Mendes, R. S. (2012). Complexity-Entropy Causality Plane as a Complexity Measure for Two-Dimensional Patterns. PLOS ONE 7, 1–9.
- Richman, J. S. and Moorman, J. R. (2000). Physiological time-series analysis using approximate entropy and sample entropy. American Journal of Physiology-Heart and Circulatory Physiology 278, H2039–H2049.
- Rosso, O. A.; Blanco, S.; Yordanova, J.; Kolev, V.; Figliola, A.; Schürmann, M. and Başar, E. (2001). Wavelet entropy: a new tool for analysis of short duration brain electrical signals. Journal of Neuroscience Methods 105, 65–75.
- Rosso, O. A.; Larrondo, H.; Martin, M. T.; Plastino, A. and Fuentes, M. A. (2007). Distinguishing noise from chaos. Physical Review Letters 99, 154102.
- Rosso, O. A.; Martín, M.; Larrondo, H. A.; Kowalski, A. and Plastino, A. (2013). Generalized statistical complexity: A new tool for dynamical systems. In: Concepts and recent advances in generalized information measures and statistics, 169–215.
- Rostaghi, M. and Azami, H. (2016). Dispersion entropy: A measure for time-series analysis. IEEE Signal Processing Letters 23, 610–614.
- Schlemmer, A.; Berg, S.; Lilienkamp, T.; Luther, S. and Parlitz, U. (2018). Spatiotemporal permutation entropy as a measure for complexity of cardiac arrhythmia. Frontiers in Physics 6, 39.
- Schürmann, T. (2004). Bias analysis in entropy estimation. Journal of Physics A: Mathematical and General 37, L295.
- Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal 27, 379–423.
- Singh, H.; Misra, N.; Hnizdo, V.; Fedorowicz, A. and Demchuk, E. (2003). Nearest neighbor estimates of entropy. American Journal of Mathematical and Management Sciences 23, 301–321.
- Sippel, S.; Lange, H. and Gans, F. (2016). statcomp: Statistical Complexity and Information Measures for Time Series Analysis. R package.
- Tian, Y.; Zhang, H.; Xu, W.; Zhang, H.; Yang, L.; Zheng, S. and Shi, Y. (2017). Spectral entropy can predict changes of working memory performance reduced by short-time training in the delayed-match-to-sample task. Frontiers in Human Neuroscience 11, 437.
- Tsallis, C. (1988). Possible generalization of Boltzmann-Gibbs statistics. Journal of Statistical Physics 52, 479–487.
- Tsallis, C. (2009). Introduction to nonextensive statistical mechanics: approaching a complex world. Springer.
- Vasicek, O. (1976). A test for normality based on sample entropy. Journal of the Royal Statistical Society Series B: Statistical Methodology 38, 54–59.
- Wang, X.; Si, S. and Li, Y. (2020). Multiscale diversity entropy: A novel dynamical measure for fault diagnosis of rotating machinery. IEEE Transactions on Industrial Informatics 17, 5419–5429.
- Wu, S.-D.; Wu, C.-W.; Lin, S.-G.; Wang, C.-C. and Lee, K.-Y. (2013). Time series analysis using composite multiscale entropy. Entropy 15, 1069–1084.
- Xue, Y. and Deng, Y. (2023). Tsallis extropy. Communications in Statistics - Theory and Methods 52, 751–762.
- Zahl, S. (1977). Jackknifing an index of diversity. Ecology 58, 907–913.
- Zhou, Q.; Shang, P. and Zhang, B. (2023). Using missing dispersion patterns to detect determinism and nonlinearity in time series data. Nonlinear Dynamics 111, 439–458.
- Zhu, J.; Bellanger, J.-J.; Shu, H. and Le Bouquin Jeannès, R. (2015). Contribution to transfer entropy estimation via the k-nearest-neighbors approach. Entropy 17, 4173–4201.
- Zunino, L.; Olivares, F.; Scholkmann, F. and Rosso, O. A. (2017). Permutation entropy based time series analysis: Equalities in the input signal can lead to false conclusions. Physics Letters A 381, 1883–1892.