Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/137769
Type: Thesis
Title: Characterisation and Estimation of Entropy Rate for Long Range Dependent Processes
Author: Feutrill, Andrew Robert
Issue Date: 2023
School/Discipline: School of Mathematical Sciences
Abstract: Much of the theory of random processes has been developed under the assumption that distant time periods are weakly correlated. However, in many real-world phenomena this assumption does not hold, and these findings have generated extensive research interest in stochastic processes with strong correlations that persist over long time periods. This phenomenon is called long range dependence. It is defined in the time domain by the slow decay of the autocorrelation function, and in the frequency domain by the existence of a pole of the spectral density function at the origin. Information theory has proved very useful in statistics and probability theory; however, there has been little research into the information theoretic properties and characterisations of this phenomenon. This thesis characterises long range dependence, for discrete- and continuous-valued stochastic processes in discrete time, by an information theoretic measure, the entropy rate. The entropy rate measures the average amount of information contained in a stochastic process, per random variable. Common characterisations of long range dependence in the time and frequency domains involve slow convergence of estimators to quantities of interest, such as the sample mean. We show that the same behaviour is present in the entropy rate function: under long range dependence the conditional entropy converges slowly to the entropy rate, with some related entropic quantities diverging to infinity. As an extension, we show for classes of Gaussian processes and Markov chains that long range dependence is characterised by an infinite amount of shared information between the past and future of a stochastic process.
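For reference, the time- and frequency-domain definitions of long range dependence mentioned above, and the entropy rate as a limit of conditional entropies, can be sketched in the conventional Hurst-parameter notation (the thesis's exact notation may differ):

```latex
% Long range dependence: non-summable autocorrelations, equivalently a
% pole of the spectral density f at the origin, with Hurst parameter
% H in (1/2, 1).
\sum_{k=0}^{\infty} \rho(k) = \infty, \qquad
\rho(k) \sim c_\rho \, k^{2H-2} \ \text{as } k \to \infty, \qquad
f(\lambda) \sim c_f \, |\lambda|^{1-2H} \ \text{as } \lambda \to 0.

% Entropy rate: average information per random variable; for a
% stationary process it equals the limit of the conditional entropies.
H(\mathcal{X}) = \lim_{n \to \infty} \frac{1}{n} H(X_1, \dots, X_n)
              = \lim_{n \to \infty} H(X_n \mid X_{n-1}, \dots, X_1).
```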
The slow convergence makes accurate estimation of the differential entropy rate from long range dependent data difficult, to the extent that existing techniques are either inaccurate or computationally intensive. We introduce a new estimation technique that balances these two concerns, producing quick and accurate estimates of the differential entropy rate from continuous-valued data. As the basis of the technique, we develop and utilise a connection between the differential entropy rate of a process and the Shannon entropy rate of its quantised process. This allows us to draw on the extensive research into Shannon entropy rate estimation for discrete-valued data, and we show that properties of the differential entropy rate estimator are inherited from the chosen Shannon entropy rate estimator.
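The quantisation connection described above can be sketched as follows. This is an illustrative plug-in estimator, not the thesis's technique: the function names are hypothetical, the block-entropy (conditional-entropy) estimator stands in for whatever Shannon entropy rate estimator is chosen, and the identity h(X) ≈ H(quantised) + log2(Δ) is exact only in the limit Δ → 0.

```python
from collections import Counter
import numpy as np

def quantise(x, delta):
    """Map continuous samples to integer bin indices of width delta."""
    return np.floor(np.asarray(x) / delta).astype(int)

def block_entropy(symbols, k):
    """Plug-in Shannon entropy (bits) of overlapping length-k blocks."""
    counts = Counter(tuple(symbols[i:i + k])
                     for i in range(len(symbols) - k + 1))
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-np.sum(p * np.log2(p)))

def differential_entropy_rate_estimate(x, delta, k):
    """Estimate the differential entropy rate (bits per sample) by
    quantising to bins of width delta, estimating the Shannon entropy
    rate as the conditional entropy H_k - H_{k-1} of the quantised
    process, and adding the quantisation correction log2(delta)."""
    q = quantise(x, delta)
    shannon_rate = block_entropy(q, k) - block_entropy(q, k - 1)
    return shannon_rate + np.log2(delta)
```

Any estimator of the Shannon entropy rate of the quantised sequence could be substituted for the `H_k - H_{k-1}` step; the plug-in version here is only adequate for short blocks, and (as the abstract notes) slow convergence under long range dependence is precisely what makes this step delicate in practice.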
Advisor: Roughan, Matthew
Yarom, Yuval
Ross, Josh
Dissertation Note: Thesis (Ph.D.) -- University of Adelaide, School of Mathematical Sciences, 2023
Keywords: long range dependence
long memory
entropy rate
entropy rate estimation
gaussian process
markov chain
Provenance: This electronic version is made publicly available by the University of Adelaide in accordance with its open access policy for student theses. Copyright in this thesis remains with the author. This thesis may incorporate third party material which has been used by the author pursuant to Fair Dealing exceptions. If you are the owner of any included third party copyright material you wish to be removed from this electronic version, please complete the take down form located at: http://www.adelaide.edu.au/legals
Appears in Collections:Research Theses

Files in This Item:
File: Feutrill2023_PhD.pdf
Size: 1.33 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.