References
- Abe, S. and Rajagopal, A. K. (2001). Nonadditive conditional entropy and its significance for local realism. Physica A: Statistical Mechanics and its Applications 289, 157–164.
- Alizadeh, N. H. and Arghami, N. R. (2010). A new estimator of entropy. Journal of the Iranian Statistical Society (JIRSS).
- Amigó, J. M. and Hirata, Y. (2018). Detecting directional couplings from multivariate flows by the joint distance distribution. Chaos 28, 075302.
- Andrzejak, R. G.; Kraskov, A.; Stögbauer, H.; Mormann, F. and Kreuz, T. (2003). Bivariate surrogate techniques: necessity, strengths, and caveats. Physical Review E 68, 066202.
- Arnhold, J.; Grassberger, P.; Lehnertz, K. and Elger, C. E. (1999). A robust method for detecting interdependences: application to intracranially recorded EEG. Physica D: Nonlinear Phenomena 134, 419–430.
- Arora, A.; Meister, C. and Cotterell, R. (2022). Estimating the Entropy of Linguistic Distributions, arXiv preprint arXiv:2204.01469 [cs.CL].
- Azadkia, M. and Chatterjee, S. (2021). A simple measure of conditional dependence. The Annals of Statistics 49, 3070–3102.
- Azami, H. and Escudero, J. (2016). Amplitude-aware permutation entropy: Illustration in spike detection and signal segmentation. Computer Methods and Programs in Biomedicine 128, 40–51.
- Bandt, C. and Pompe, B. (2002). Permutation Entropy: A Natural Complexity Measure for Time Series. Phys. Rev. Lett. 88, 174102.
- Berger, S.; Kravtsiv, A.; Schneider, G. and Jordan, D. (2019). Teaching Ordinal Patterns to a Computer: Efficient Encoding Algorithms Based on the Lehmer Code. Entropy 21.
- Charzyńska, A. and Gambin, A. (2016). Improvement of the k-nn Entropy Estimator with Applications in Systems Biology. Entropy 18.
- Chatterjee, S. (2021). A new coefficient of correlation. Journal of the American Statistical Association 116, 2009–2022.
- Chicharro, D. and Andrzejak, R. G. (2009). Reliable detection of directional couplings using rank statistics. Physical Review E 80, 026217.
- Cover, T. M. and Thomas, J. A. (1999). Elements of information theory (John Wiley & Sons).
- Datseris, G. and Haaga, K. A. (2024). ComplexityMeasures.jl: scalable software to unify and accelerate entropy and complexity timeseries analysis, arXiv preprint arXiv:2406.05011.
- Dette, H.; Siburg, K. F. and Stoimenov, P. A. (2013). A Copula-Based Non-parametric Measure of Regression Dependence. Scandinavian Journal of Statistics 40, 21–41.
- Ebrahimi, N.; Pflughoeft, K. and Soofi, E. S. (1994). Two measures of sample entropy. Statistics & Probability Letters 20, 225–234.
- van Erven, T. and Harremoës, P. (2014). Rényi Divergence and Kullback-Leibler Divergence. IEEE Transactions on Information Theory 60, 3797–3820.
- Frenzel, S. and Pompe, B. (2007). Partial Mutual Information for Coupling Analysis of Multivariate Time Series. Phys. Rev. Lett. 99, 204101.
- Furuichi, S. (2006). Information theoretical properties of Tsallis entropies. Journal of Mathematical Physics 47.
- Gao, S.; Ver Steeg, G. and Galstyan, A. (2015). Efficient Estimation of Mutual Information for Strongly Dependent Variables. In: Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, Vol. 38 of Proceedings of Machine Learning Research, edited by Lebanon, G. and Vishwanathan, S. V. (PMLR, San Diego, California, USA); pp. 277–286.
- Gao, W.; Kannan, S.; Oh, S. and Viswanath, P. (2017). Estimating Mutual Information for Discrete-Continuous Mixtures. In: Advances in Neural Information Processing Systems, Vol. 30, edited by Guyon, I.; Luxburg, U. V.; Bengio, S.; Wallach, H.; Fergus, R.; Vishwanathan, S. and Garnett, R. (Curran Associates, Inc.).
- Gao, W.; Oh, S. and Viswanath, P. (2018). Demystifying Fixed k-Nearest Neighbor Information Estimators. IEEE Transactions on Information Theory 64, 5629–5661.
- Golshani, L.; Pasha, E. and Yari, G. (2009). Some properties of Rényi entropy and Rényi entropy rate. Information Sciences 179, 2426–2433.
- Goria, M. N.; Leonenko, N. N.; Mergel, V. V. and Inverardi, P. L. (2005). A new class of random vector entropy estimators and its applications in testing statistical hypotheses. Journal of Nonparametric Statistics 17, 277–297.
- Grassberger, P. (2022). On Generalized Schuermann Entropy Estimators. Entropy 24.
- Jizba, P.; Kleinert, H. and Shefaat, M. (2012). Rényi's information transfer between financial time series. Physica A: Statistical Mechanics and its Applications 391, 2971–2989.
- Kozachenko, L. F. and Leonenko, N. N. (1987). Sample estimate of the entropy of a random vector. Problemy Peredachi Informatsii 23, 9–16.
- Kraskov, A.; Stögbauer, H. and Grassberger, P. (2004). Estimating mutual information. Phys. Rev. E 69, 066138.
- Leonenko, N.; Pronzato, L. and Savani, V. (2008). A class of Rényi information estimators for multidimensional densities. The Annals of Statistics 36, 2153–2182.
- Levy, K. J. and Narula, S. C. (1978). Testing hypotheses concerning partial correlations: Some methods and discussion. International Statistical Review/Revue Internationale de Statistique, 215–218.
- Lindner, M.; Vicente, R.; Priesemann, V. and Wibral, M. (2011). TRENTOOL: A Matlab open source toolbox to analyse information flow in time series data with transfer entropy. BMC Neuroscience 12, 119.
- Lord, W. M.; Sun, J. and Bollt, E. M. (2018). Geometric k-nearest neighbor estimation of entropy and mutual information. Chaos: An Interdisciplinary Journal of Nonlinear Science 28.
- Luo, M.; Kantz, H.; Lau, N.-C.; Huang, W. and Zhou, Y. (2015). Questionable dynamical evidence for causality between galactic cosmic rays and interannual variation in global temperature. Proceedings of the National Academy of Sciences 112, E4638–E4639.
- Manis, G.; Aktaruzzaman, M. and Sassi, R. (2017). Bubble entropy: An entropy almost free of parameters. IEEE Transactions on Biomedical Engineering 64, 2711–2718.
- Martin, S.; Morison, G.; Nailon, W. H. and Durrani, T. S. (2004). Fast and accurate image registration using Tsallis entropy and simultaneous perturbation stochastic approximation. Electronics Letters 40, 595–597.
- McCracken, J. M. and Weigel, R. S. (2014). Convergent cross-mapping and pairwise asymmetric inference. Physical Review E 90, 062903.
- Mesner, O. C. and Shalizi, C. R. (2020). Conditional mutual information estimation for mixed, discrete and continuous data. IEEE Transactions on Information Theory 67, 464–484.
- Miller, G. (1955). Note on the bias of information estimates. Information theory in psychology: Problems and methods.
- Paluš, M. (2014). Cross-Scale Interactions and Information Transfer. Entropy 16, 5263–5289.
- Paninski, L. (2003). Estimation of entropy and mutual information. Neural Computation 15, 1191–1253.
- Papapetrou, M. and Kugiumtzis, D. (2020). Tsallis conditional mutual information in investigating long range correlation in symbol sequences. Physica A: Statistical Mechanics and its Applications 540, 123016.
- Póczos, B. and Schneider, J. (2012). Nonparametric estimation of conditional information and divergences. In: Artificial Intelligence and Statistics (PMLR); pp. 914–923.
- Quiroga, R. Q.; Arnhold, J. and Grassberger, P. (2000). Learning driver-response relationships from synchronization patterns. Physical Review E 61, 5142.
- Rahimzamani, A.; Asnani, H.; Viswanath, P. and Kannan, S. (2018). Estimators for multivariate information measures in general probability spaces. Advances in Neural Information Processing Systems 31.
- Ramos, A. M.; Builes-Jaramillo, A.; Poveda, G.; Goswami, B.; Macau, E. E.; Kurths, J. and Marwan, N. (2017). Recurrence measure of conditional dependence and applications. Phys. Rev. E 95, 052206.
- Rényi, A. (1961). On measures of entropy and information. In: Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics (University of California Press); pp. 547–562.
- Romano, M. C.; Thiel, M.; Kurths, J. and Grebogi, C. (2007). Estimation of the direction of the coupling by conditional probabilities of recurrence. Phys. Rev. E 76, 036211.
- Rostaghi, M. and Azami, H. (2016). Dispersion entropy: A measure for time-series analysis. IEEE Signal Processing Letters 23, 610–614.
- Runge, J. (2018). Conditional independence testing based on a nearest-neighbor estimator of conditional mutual information. In: Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, Vol. 84 of Proceedings of Machine Learning Research, edited by Storkey, A. and Perez-Cruz, F. (PMLR); pp. 938–947.
- Sarbu, S. (2014). Rényi information transfer: Partial Rényi transfer entropy and partial Rényi mutual information. In: 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (IEEE); pp. 5666–5670.
- Schmidt, C.; Huegle, J. and Uflacker, M. (2018). Order-independent constraint-based causal structure learning for Gaussian distribution models using GPUs. In: Proceedings of the 30th International Conference on Scientific and Statistical Database Management; pp. 1–10.
- Schreiber, T. (2000). Measuring information transfer. Physical Review Letters 85, 461.
- Schuermann, T. (2004). Bias analysis in entropy estimation. Journal of Physics A: Mathematical and General 37, L295.
- Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal 27, 379–423.
- Shi, H.; Drton, M. and Han, F. (2022). On the power of Chatterjee’s rank correlation. Biometrika 109, 317–333.
- Singh, H.; Misra, N.; Hnizdo, V.; Fedorowicz, A. and Demchuk, E. (2003). Nearest Neighbor Estimates of Entropy. American Journal of Mathematical and Management Sciences 23, 301–321.
- Spirtes, P.; Glymour, C. N. and Scheines, R. (2000). Causation, prediction, and search (MIT Press).
- Staniek, M. and Lehnertz, K. (2008). Symbolic Transfer Entropy. Phys. Rev. Lett. 100, 158101.
- Sugihara, G.; May, R. M.; Ye, H.; Hsieh, C.-h.; Deyle, E. R.; Fogarty, M. and Munch, S. B. (2012). Detecting Causality in Complex Ecosystems. Science 338, 496–500.
- Sun, J.; Taylor, D. and Bollt, E. M. (2015). Causal Network Inference by Optimal Causation Entropy. SIAM Journal on Applied Dynamical Systems 14, 73–106.
- Székely, G. J. and Rizzo, M. L. (2014). Partial distance correlation with methods for dissimilarities. The Annals of Statistics 42, 2382–2412.
- Székely, G. J.; Rizzo, M. L. and Bakirov, N. K. (2007). Measuring and testing dependence by correlation of distances. The Annals of Statistics 35, 2769–2794.
- Tsallis, C. (2009). Introduction to nonextensive statistical mechanics: Approaching a complex world (Springer).
- Vasicek, O. (1976). A test for normality based on sample entropy. Journal of the Royal Statistical Society Series B: Statistical Methodology 38, 54–59.
- Vejmelka, M. and Paluš, M. (2008). Inferring the directionality of coupling with conditional mutual information. Physical Review E 77, 026214.
- Wang, X.; Si, S. and Li, Y. (2020). Multiscale diversity entropy: A novel dynamical measure for fault diagnosis of rotating machinery. IEEE Transactions on Industrial Informatics 17, 5419–5429.
- Zahl, S. (1977). Jackknifing an index of diversity. Ecology 58, 907–913.
- Zhao, J.; Zhou, Y.; Zhang, X. and Chen, L. (2016). Part mutual information for quantifying direct associations in networks. Proceedings of the National Academy of Sciences 113, 5130–5135.
- Zhu, J.; Bellanger, J.-J.; Shu, H. and Le Bouquin Jeannès, R. (2015). Contribution to Transfer Entropy Estimation via the k-Nearest-Neighbors Approach. Entropy 17, 4173–4201.
- Zunino, L.; Olivares, F.; Scholkmann, F. and Rosso, O. A. (2017). Permutation entropy based time series analysis: Equalities in the input signal can lead to false conclusions. Physics Letters A 381, 1883–1892.