References
- Abe, S. and Rajagopal, A. K. (2001). Nonadditive conditional entropy and its significance for local realism. Physica A: Statistical Mechanics and its Applications 289, 157–164.
- Amigó, J. M. and Hirata, Y. (2018). Detecting directional couplings from multivariate flows by the joint distance distribution. Chaos 28, 075302.
- Andrzejak, R. G.; Kraskov, A.; Stögbauer, H.; Mormann, F. and Kreuz, T. (2003). Bivariate surrogate techniques: necessity, strengths, and caveats. Physical Review E 68, 066202.
- Arnhold, J.; Grassberger, P.; Lehnertz, K. and Elger, C. E. (1999). A robust method for detecting interdependences: application to intracranially recorded EEG. Physica D: Nonlinear Phenomena 134, 419–430.
- Azadkia, M. and Chatterjee, S. (2021). A simple measure of conditional dependence. The Annals of Statistics 49, 3070–3102.
- Chatterjee, S. (2021). A new coefficient of correlation. Journal of the American Statistical Association 116, 2009–2022.
- Chicharro, D. and Andrzejak, R. G. (2009). Reliable detection of directional couplings using rank statistics. Physical Review E 80, 026217.
- Cover, T. M. and Thomas, J. A. (1999). Elements of information theory (John Wiley & Sons).
- Datseris, G. and Haaga, K. A. (2024). ComplexityMeasures.jl: scalable software to unify and accelerate entropy and complexity timeseries analysis, arXiv preprint arXiv:2406.05011.
- Dette, H.; Siburg, K. F. and Stoimenov, P. A. (2013). A Copula-Based Non-parametric Measure of Regression Dependence. Scandinavian Journal of Statistics 40, 21–41.
- van Erven, T. and Harremoës, P. (2014). Rényi Divergence and Kullback-Leibler Divergence. IEEE Transactions on Information Theory 60, 3797–3820.
- Frenzel, S. and Pompe, B. (2007). Partial Mutual Information for Coupling Analysis of Multivariate Time Series. Phys. Rev. Lett. 99, 204101.
- Furuichi, S. (2006). Information theoretical properties of Tsallis entropies. Journal of Mathematical Physics 47.
- Gao, W.; Kannan, S.; Oh, S. and Viswanath, P. (2017). Estimating Mutual Information for Discrete-Continuous Mixtures. In: Advances in Neural Information Processing Systems, Vol. 30, edited by Guyon, I.; Luxburg, U. V.; Bengio, S.; Wallach, H.; Fergus, R.; Vishwanathan, S. and Garnett, R. (Curran Associates, Inc.).
- Gao, W.; Oh, S. and Viswanath, P. (2018). Demystifying Fixed $k$-Nearest Neighbor Information Estimators. IEEE Transactions on Information Theory 64, 5629–5661.
- Golshani, L.; Pasha, E. and Yari, G. (2009). Some properties of Rényi entropy and Rényi entropy rate. Information Sciences 179, 2426–2433.
- Jizba, P.; Kleinert, H. and Shefaat, M. (2012). Rényi's information transfer between financial time series. Physica A: Statistical Mechanics and its Applications 391, 2971–2989.
- Kraskov, A.; Stögbauer, H. and Grassberger, P. (2004). Estimating mutual information. Phys. Rev. E 69, 066138.
- Kubkowski, M.; Mielniczuk, J. and Teisseyre, P. (2021). How to gain on power: novel conditional independence tests based on short expansion of conditional mutual information. Journal of Machine Learning Research 22, 1–57.
- Levy, K. J. and Narula, S. C. (1978). Testing hypotheses concerning partial correlations: Some methods and discussion. International Statistical Review/Revue Internationale de Statistique, 215–218.
- Lindner, M.; Vicente, R.; Priesemann, V. and Wibral, M. (2011). TRENTOOL: A Matlab open source toolbox to analyse information flow in time series data with transfer entropy. BMC Neuroscience 12, 119–119.
- Luo, M.; Kantz, H.; Lau, N.-C.; Huang, W. and Zhou, Y. (2015). Questionable dynamical evidence for causality between galactic cosmic rays and interannual variation in global temperature. Proceedings of the National Academy of Sciences 112, E4638–E4639.
- Martin, S.; Morison, G.; Nailon, W. H. and Durrani, T. S. (2004). Fast and accurate image registration using Tsallis entropy and simultaneous perturbation stochastic approximation. Electronics Letters 40, 595–597.
- McCracken, J. M. and Weigel, R. S. (2014). Convergent cross-mapping and pairwise asymmetric inference. Physical Review E 90, 062903.
- Mesner, O. C. and Shalizi, C. R. (2020). Conditional mutual information estimation for mixed, discrete and continuous data. IEEE Transactions on Information Theory 67, 464–484.
- Paluš, M. (2014). Cross-Scale Interactions and Information Transfer. Entropy 16, 5263–5289.
- Papapetrou, M. and Kugiumtzis, D. (2020). Tsallis conditional mutual information in investigating long range correlation in symbol sequences. Physica A: Statistical Mechanics and its Applications 540, 123016.
- Póczos, B. and Schneider, J. (2012). Nonparametric estimation of conditional information and divergences. In: Artificial Intelligence and Statistics (PMLR); pp. 914–923.
- Quiroga, R. Q.; Arnhold, J. and Grassberger, P. (2000). Learning driver-response relationships from synchronization patterns. Physical Review E 61, 5142.
- Rahimzamani, A.; Asnani, H.; Viswanath, P. and Kannan, S. (2018). Estimators for multivariate information measures in general probability spaces. Advances in Neural Information Processing Systems 31.
- Ramos, A. M.; Builes-Jaramillo, A.; Poveda, G.; Goswami, B.; Macau, E. E.; Kurths, J. and Marwan, N. (2017). Recurrence measure of conditional dependence and applications. Phys. Rev. E 95, 052206.
- Romano, M. C.; Thiel, M.; Kurths, J. and Grebogi, C. (2007). Estimation of the direction of the coupling by conditional probabilities of recurrence. Phys. Rev. E 76, 036211.
- Runge, J. (2018). Conditional independence testing based on a nearest-neighbor estimator of conditional mutual information. In: Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, Vol. 84 of Proceedings of Machine Learning Research, edited by Storkey, A. and Perez-Cruz, F. (PMLR); pp. 938–947.
- Sarbu, S. (2014). Rényi information transfer: Partial Rényi transfer entropy and partial Rényi mutual information. In: 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (IEEE); pp. 5666–5670.
- Schmidt, C.; Huegle, J. and Uflacker, M. (2018). Order-independent constraint-based causal structure learning for gaussian distribution models using GPUs. In: Proceedings of the 30th International Conference on Scientific and Statistical Database Management; pp. 1–10.
- Schreiber, T. (2000). Measuring information transfer. Physical Review Letters 85, 461.
- Shi, H.; Drton, M. and Han, F. (2022). On the power of Chatterjee’s rank correlation. Biometrika 109, 317–333.
- Singh, H.; Misra, N.; Hnizdo, V.; Fedorowicz, A. and Demchuk, E. (2003). Nearest Neighbor Estimates of Entropy. American Journal of Mathematical and Management Sciences 23, 301–321.
- Spirtes, P.; Glymour, C. N. and Scheines, R. (2000). Causation, prediction, and search (MIT press).
- Staniek, M. and Lehnertz, K. (2008). Symbolic Transfer Entropy. Phys. Rev. Lett. 100, 158101.
- Sugihara, G.; May, R. M.; Ye, H.; Hsieh, C.-h.; Deyle, E. R.; Fogarty, M. and Munch, S. B. (2012). Detecting Causality in Complex Ecosystems. Science 338, 496–500.
- Sun, J.; Taylor, D. and Bollt, E. M. (2015). Causal Network Inference by Optimal Causation Entropy. SIAM Journal on Applied Dynamical Systems 14, 73–106.
- Székely, G. J. and Rizzo, M. L. (2014). Partial distance correlation with methods for dissimilarities. The Annals of Statistics 42, 2382–2412.
- Székely, G. J.; Rizzo, M. L. and Bakirov, N. K. (2007). Measuring and testing dependence by correlation of distances. The Annals of Statistics 35, 2769–2794.
- Vejmelka, M. and Paluš, M. (2008). Inferring the directionality of coupling with conditional mutual information. Physical Review E 77, 026214.
- Zhao, J.; Zhou, Y.; Zhang, X. and Chen, L. (2016). Part mutual information for quantifying direct associations in networks. Proceedings of the National Academy of Sciences 113, 5130–5135.
- Zhu, J.; Bellanger, J.-J.; Shu, H. and Le Bouquin Jeannès, R. (2015). Contribution to Transfer Entropy Estimation via the k-Nearest-Neighbors Approach. Entropy 17, 4173–4201.