References
-
Abe, S. and Rajagopal, A. K. (2001). Nonadditive conditional entropy and its significance for local realism. Physica A: Statistical Mechanics and its Applications 289, 157–164.
-
Amigó, J. M. and Hirata, Y. (2018). Detecting directional couplings from multivariate flows by the joint distance distribution. Chaos 28, 075302.
-
Andrzejak, R. G.; Kraskov, A.; Stögbauer, H.; Mormann, F. and Kreuz, T. (2003). Bivariate surrogate techniques: necessity, strengths, and caveats. Physical Review E 68, 066202.
-
Anishchenko, V. S.; Strelkova, G. I. et al. (1998). Irregular attractors. Discrete Dynamics in Nature and Society 2, 53–72.
-
Arnhold, J.; Grassberger, P.; Lehnertz, K. and Elger, C. E. (1999). A robust method for detecting interdependences: application to intracranially recorded EEG. Physica D: Nonlinear Phenomena 134, 419–430.
-
Cao, L.; Mees, A. and Judd, K. (1997). Modeling and predicting non-stationary time series. International Journal of Bifurcation and Chaos 7, 1823–1831.
-
Chen, Y.; Rangarajan, G.; Feng, J. and Ding, M. (2004). Analyzing multiple nonlinear time series with extended Granger causality. Physics Letters A 324, 26–35.
-
Chicharro, D. and Andrzejak, R. G. (2009). Reliable detection of directional couplings using rank statistics. Physical Review E 80, 026217.
-
Diego, D.; Haaga, K. A. and Hannisdal, B. (2019). Transfer entropy computation using the Perron-Frobenius operator. Physical Review E 99, 042212.
-
Elowitz, M. B. and Leibler, S. (2000). A synthetic oscillatory network of transcriptional regulators. Nature 403, 335–338.
-
Frenzel, S. and Pompe, B. (2007). Partial Mutual Information for Coupling Analysis of Multivariate Time Series. Phys. Rev. Lett. 99, 204101.
-
Furuichi, S. (2006). Information theoretical properties of Tsallis entropies. Journal of Mathematical Physics 47.
-
Gao, W.; Kannan, S.; Oh, S. and Viswanath, P. (2017). Estimating Mutual Information for Discrete-Continuous Mixtures. In Advances in Neural Information Processing Systems, Curran Associates, Inc.
-
Gao, W.; Oh, S. and Viswanath, P. (2018). Demystifying Fixed $k$-Nearest Neighbor Information Estimators. IEEE Transactions on Information Theory 64, 5629–5661.
-
Haaga, K. A.; Diego, D.; Brendryen, J. and Hannisdal, B. (2020). A simple test for causality in complex systems. arXiv:2005.01860 [stat.AP].
-
Jizba, P.; Kleinert, H. and Shefaat, M. (2012). Rényi's information transfer between financial time series. Physica A: Statistical Mechanics and its Applications 391, 2971–2989.
-
Kalisch, M. and Bühlmann, P. (2008). Robustification of the PC-Algorithm for Directed Acyclic Graphs. Journal of Computational and Graphical Statistics 17, 773–789. https://doi.org/10.1198/106186008X381927.
-
Kraskov, A.; Stögbauer, H. and Grassberger, P. (2004). Estimating mutual information. Phys. Rev. E 69, 066138.
-
Levy, K. J. and Narula, S. C. (1978). Testing hypotheses concerning partial correlations: Some methods and discussion. International Statistical Review/Revue Internationale de Statistique, 215–218.
-
Lindner, M.; Vicente, R.; Priesemann, V. and Wibral, M. (2011). TRENTOOL: A Matlab open source toolbox to analyse information flow in time series data with transfer entropy. BMC Neuroscience 12, 119.
-
Luo, M.; Kantz, H.; Lau, N.-C.; Huang, W. and Zhou, Y. (2015). Questionable dynamical evidence for causality between galactic cosmic rays and interannual variation in global temperature. Proceedings of the National Academy of Sciences 112, E4638–E4639.
-
Martin, S.; Morison, G.; Nailon, W. H. and Durrani, T. S. (2004). Fast and accurate image registration using Tsallis entropy and simultaneous perturbation stochastic approximation. Electronics Letters 40, 595–597.
-
McCracken, J. M. and Weigel, R. S. (2014). Convergent cross-mapping and pairwise asymmetric inference. Physical Review E 90, 062903.
-
Murali, K. and Lakshmanan, M. (1993). Chaotic dynamics of the driven Chua's circuit. IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications 40, 836–840.
-
Paluš, M. (2014). Cross-Scale Interactions and Information Transfer. Entropy 16, 5263–5289.
-
Papana, A.; Kyrtsou, C.; Kugiumtzis, D. and Diks, C. (2013). Simulation study of direct causality measures in multivariate time series. Entropy 15, 2635–2661.
-
Quiroga, R. Q.; Arnhold, J. and Grassberger, P. (2000). Learning driver-response relationships from synchronization patterns. Physical Review E 61, 5142.
-
Rahimzamani, A.; Asnani, H.; Viswanath, P. and Kannan, S. (2018). Estimators for multivariate information measures in general probability spaces. Advances in Neural Information Processing Systems 31.
-
Ramos, A. M.; Builes-Jaramillo, A.; Poveda, G.; Goswami, B.; Macau, E. E.; Kurths, J. and Marwan, N. (2017). Recurrence measure of conditional dependence and applications. Phys. Rev. E 95, 052206.
-
Romano, M. C.; Thiel, M.; Kurths, J. and Grebogi, C. (2007). Estimation of the direction of the coupling by conditional probabilities of recurrence. Phys. Rev. E 76, 036211.
-
Runge, J. (2018). Causal network reconstruction from time series: From theoretical assumptions to practical estimation. Chaos: An Interdisciplinary Journal of Nonlinear Science 28.
-
Runge, J. (2018). Conditional independence testing based on a nearest-neighbor estimator of conditional mutual information. In Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, Proceedings of Machine Learning Research, 938–947, PMLR.
-
Sarbu, S. (2014). Rényi information transfer: Partial Rényi transfer entropy and partial Rényi mutual information. In 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 5666–5670.
-
Schmidt, C.; Huegle, J. and Uflacker, M. (2018). Order-independent constraint-based causal structure learning for gaussian distribution models using GPUs. In Proceedings of the 30th International Conference on Scientific and Statistical Database Management, 1–10.
-
Schreiber, T. (2000). Measuring information transfer. Physical Review Letters 85, 461.
-
Singh, H.; Misra, N.; Hnizdo, V.; Fedorowicz, A. and Demchuk, E. (2003). Nearest Neighbor Estimates of Entropy. American Journal of Mathematical and Management Sciences 23, 301–321.
-
Spirtes, P.; Glymour, C. N. and Scheines, R. (2000). Causation, Prediction, and Search. MIT Press.
-
Staniek, M. and Lehnertz, K. (2008). Symbolic Transfer Entropy. Phys. Rev. Lett. 100, 158101.
-
Sugihara, G.; May, R. M.; Ye, H.; Hsieh, C.-h.; Deyle, E. R.; Fogarty, M. and Munch, S. B. (2012). Detecting Causality in Complex Ecosystems. Science 338, 496–500.
-
Sun, J.; Cafaro, C. and Bollt, E. M. (2014). Identifying the coupling structure in complex systems through the optimal causation entropy principle. Entropy 16, 3416–3433.
-
Sun, J.; Taylor, D. and Bollt, E. M. (2015). Causal Network Inference by Optimal Causation Entropy. SIAM Journal on Applied Dynamical Systems 14, 73–106. https://doi.org/10.1137/140956166.
-
Tang, W. K.; Zhong, G.; Chen, G. and Man, K. (2001). Generation of n-scroll attractors via sine function. IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications 48, 1369–1372.
-
Vejmelka, M. and Paluš, M. (2008). Inferring the directionality of coupling with conditional mutual information. Physical Review E 77, 026214.
-
Verdes, P. (2005). Assessing causality from multivariate time series. Physical Review E 72, 026222.
-
Ye, H.; Deyle, E. R.; Gilarranz, L. J. and Sugihara, G. (2015). Distinguishing time-delayed causal interactions using convergent cross mapping. Scientific reports 5, 14750.
-
Zhao, J.; Zhou, Y.; Zhang, X. and Chen, L. (2016). Part mutual information for quantifying direct associations in networks. Proceedings of the National Academy of Sciences 113, 5130–5135.
-
Zhu, J.; Bellanger, J.-J.; Shu, H. and Le Bouquin Jeannès, R. (2015). Contribution to Transfer Entropy Estimation via the k-Nearest-Neighbors Approach. Entropy 17, 4173–4201.