#### Date of Award

Spring 2019

#### Document Type

Open Access Dissertation

#### Department

Mathematics

#### First Advisor

Joshua Cooper

#### Abstract

The quantification of causal relationships between time series data is a fundamental problem in fields including neuroscience, social networking, finance, and machine learning. Among the various means of measuring such relationships, information-theoretic approaches are a rapidly developing area. One such approach makes use of the notion of transfer entropy (TE). Broadly speaking, TE is an information-theoretic measure of information transfer between two stochastic processes. Schreiber’s 2001 definition of TE characterizes information transfer as an informational divergence between conditional probability mass functions. The original definition is native to discrete-time stochastic processes whose comprising random variables have a discrete state space. While this formalism is applicable to a wealth of practical scenarios, there is a wide range of circumstances under which the processes of interest are indexed over an uncountable set (usually an interval). One can generalize Schreiber’s definition to handle the case when the random variables comprising the processes have state space R via the Radon-Nikodym Theorem, as demonstrated by Kaiser and Schreiber in 2002. A rigorous treatment of TE among processes that are either indexed over an uncountable set or do not have R as the state space of their comprising random variables has been lacking in the literature. A common workaround to this theoretical deficiency is to discretize time to create new stochastic processes and then apply Schreiber’s definition to the resulting processes. These time-discretization workarounds have been widely used as a means to intuitively capture the notion of information transfer between processes in continuous time, that is, those indexed by an interval. While effective and practicable, these approaches do not provide a native definition of TE in continuous time.
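For reference, Schreiber's discrete-time definition described above can be sketched as follows; the history lengths $k$ and $l$ and the symbols used here follow Schreiber's own formulation and are not necessarily the dissertation's notation:

```latex
% Transfer entropy from process Y to process X (discrete time, discrete states),
% with source and target history lengths l and k.
% Notation follows Schreiber's formulation; it is not taken from this dissertation.
T_{Y \to X} \;=\; \sum p\!\left(x_{n+1},\, x_n^{(k)},\, y_n^{(l)}\right)
  \log \frac{p\!\left(x_{n+1} \mid x_n^{(k)},\, y_n^{(l)}\right)}
            {p\!\left(x_{n+1} \mid x_n^{(k)}\right)}
```

The sum over target histories is an expected Kullback-Leibler divergence between the two conditional probability mass functions, which is the "informational divergence" the abstract refers to.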
We generalize Schreiber’s definition to the case when the processes are composed of random variables with a Polish state space, and generalize further, via projective limits, to the case when the indexing set is an interval. Our main result, Theorem 5, is a rigorous recasting of a claim made by Spinney, Prokopenko, and Lizier in 2016, which characterizes when continuous-time TE can be obtained as a limit of discrete-time TE.

In many applications, the instantaneous transfer entropy or transfer entropy rate is of particular interest. Using our definitions, we define the transfer entropy rate as the right-hand derivative of the expected pathwise transfer entropy (EPT) defined in Section 2.3. To this end, we use our main results to prove some of its properties, including a rigorous version of a result stated without proof in work by Spinney, Prokopenko, and Lizier regarding a particularly well-behaved class of stationary processes. We then consider time-homogeneous Markov jump processes and provide an analytic form of the EPT via a Girsanov formula. Finally, using a corollary of our main result, we demonstrate how to apply our main result to a lagged Poisson point process, providing a concrete example of two processes to which our aforementioned results apply.
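The rate construction described above can be sketched as a right-hand derivative; writing $T(t)$ for the EPT accumulated over $[0, t]$ (illustrative notation, not necessarily the dissertation's), the transfer entropy rate at time $t$ is:

```latex
% Transfer entropy rate as the right-hand derivative of the expected
% pathwise transfer entropy T(t); notation is illustrative only.
\dot{T}(t) \;=\; \lim_{h \to 0^{+}} \frac{T(t+h) - T(t)}{h}
```

When this limit exists, the rate captures the instantaneous information transfer at time $t$, consistent with the abstract's description of the rate as a right-hand derivative of the EPT.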

#### Recommended Citation

Edgar, C. D. (2019). *A Development of Transfer Entropy in Continuous-Time.* (Doctoral dissertation). Retrieved from https://scholarcommons.sc.edu/etd/5243