²Laboratory of Synaptic Imaging, Department of Clinical and Experimental Epilepsy, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom

Calculations of the entropy of a signal, or of the mutual information between two variables, are valuable analytical tools in the field of neuroscience. They can be applied to all types of data, capture non-linear interactions, and are model independent. Yet the limited size and number of recordings one can collect in a series of experiments make their calculation highly prone to sampling bias. Mathematical methods to overcome this so-called "sampling disaster" exist, but they require significant expertise and come at great time and computational cost. As such, there is a need for a simple, unbiased, and computationally efficient tool for estimating entropy and mutual information.

In this article, we propose that entropy-encoding compression algorithms, widely used in text and image compression, fulfill these requirements. By simply saving the signal in PNG picture format and measuring the size of the file on the hard drive, we can estimate entropy changes across different conditions. Furthermore, with some simple modifications of the PNG file, we can also estimate the evolution of the mutual information between a stimulus and the observed responses across conditions. We first demonstrate the applicability of this method using white-noise-like signals. Then, while the method can be used in all kinds of experimental conditions, we provide examples of its application to patch-clamp recordings, the detection of place cells, and histological data. Although this method does not give an absolute value of entropy or mutual information, it is mathematically correct, and its simplicity and broad use make it a powerful tool for estimating both from experiments.

The entropy of a signal is defined as

$$ H(X) = -\sum_{i=1}^{N} p(x_i)\,\log_2 p(x_i) $$

where p(x_i) is the probability that the signal takes the configuration x_i among all the configurations (x_1, x_2, x_3, …, x_N) of the signal. By convention, if p(x_i) = 0, then p(x_i) log_2 p(x_i) = 0, since lim_{x→0} x log_2 x = 0. With a base-2 logarithm, entropy is expressed in bits (Shannon, 1948; Cover and Thomas, 2006). However, correctly estimating a probability distribution works only if each configuration occurs many times, and, by definition, one cannot know beforehand how many experiments will be needed.
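To make the sampling problem concrete, the direct "plug-in" estimator below bins a signal into discrete configurations, estimates each p(x_i) from the empirical histogram, and applies the formula above. This is a minimal Python sketch, not code from the article; the function name `plugin_entropy_bits` and the choice of `n_bins` are illustrative.

```python
import numpy as np

def plugin_entropy_bits(signal, n_bins=16):
    """Direct ("plug-in") Shannon entropy estimate, in bits.

    Bins the signal into n_bins configurations, estimates p(x_i)
    from the empirical histogram, and evaluates
    H = -sum_i p(x_i) log2 p(x_i), dropping zero-probability bins
    (the p log2 p -> 0 convention).
    """
    counts, _ = np.histogram(signal, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]  # 0 * log2(0) is taken as 0
    return -np.sum(p * np.log2(p))

# The sampling bias described above: with few samples, rare
# configurations are never observed and entropy is underestimated.
rng = np.random.default_rng(0)
print(plugin_entropy_bits(rng.normal(size=50)))      # biased low
print(plugin_entropy_bits(rng.normal(size=50_000)))  # closer to the limit
```

Running it shows the estimate from 50 samples falling noticeably below the large-sample value, which is the bias that motivates the compression approach.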
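The compression method itself reduces to a few lines: write the signal to a PNG file and read back its size, since PNG's DEFLATE stage is an entropy coder and more predictable signals compress to smaller files. The sketch below, assuming an 8-bit grayscale encoding via Pillow, is one possible implementation rather than the authors' exact pipeline; the helper name `png_size_bytes` and the rescaling of every signal to the same 0-255 alphabet are illustrative choices.

```python
import io

import numpy as np
from PIL import Image

def png_size_bytes(signal):
    """Compressed size (bytes) of a 1-D signal saved as an 8-bit PNG.

    The absolute size includes fixed header overhead, so only
    *relative* changes across conditions are meaningful, matching
    the caveat that the method gives no absolute entropy value.
    """
    x = np.asarray(signal, dtype=float)
    # Rescale to 0..255 so every condition uses the same alphabet.
    span = x.ptp() or 1.0
    x = (255 * (x - x.min()) / span).astype(np.uint8)
    img = Image.fromarray(x[np.newaxis, :], mode="L")  # 1-pixel-tall image
    buf = io.BytesIO()  # stands in for the file on the hard drive
    img.save(buf, format="PNG", optimize=True)
    return buf.getbuffer().nbytes

rng = np.random.default_rng(0)
print(png_size_bytes(rng.integers(0, 256, 10_000)))  # white noise: large file
print(png_size_bytes(np.zeros(10_000)))              # constant: tiny file
```

Comparing the two printed sizes reproduces the intuition: the unpredictable signal costs close to one byte per sample, while the constant signal collapses to little more than the PNG header.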
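The mutual-information variant is only outlined above ("simple modifications of the PNG file"), so the following is one plausible reading, not the published procedure: stack the stimulus and the response as two rows of a single image, and use the identity I(S;R) = H(S) + H(R) − H(S,R) with PNG file sizes standing in for the three entropies. When the response tracks the stimulus, the compressor exploits the redundancy between the rows, the joint file shrinks, and the proxy grows. The function name `compression_mi_proxy` is hypothetical.

```python
import io

import numpy as np
from PIL import Image

def _png_bytes(arr2d):
    """Size in bytes of a uint8 array saved as a grayscale PNG."""
    buf = io.BytesIO()
    Image.fromarray(arr2d, mode="L").save(buf, format="PNG", optimize=True)
    return buf.getbuffer().nbytes

def compression_mi_proxy(stimulus, response):
    """Relative mutual-information proxy from PNG file sizes.

    Mirrors I(S;R) = H(S) + H(R) - H(S,R), replacing each entropy
    with a compressed size; the joint term stacks the two signals
    as rows of one image so the compressor can exploit any shared
    structure. Units are bytes, so only comparisons across
    conditions are meaningful.
    """
    s = np.asarray(stimulus, dtype=np.uint8)[np.newaxis, :]
    r = np.asarray(response, dtype=np.uint8)[np.newaxis, :]
    return _png_bytes(s) + _png_bytes(r) - _png_bytes(np.vstack([s, r]))

rng = np.random.default_rng(1)
stim = rng.integers(0, 256, 10_000).astype(np.uint8)
# A response that copies the stimulus except at 5% of the samples.
resp = stim.copy()
flip = rng.random(stim.size) < 0.05
resp[flip] = rng.integers(0, 256, flip.sum()).astype(np.uint8)
unrelated = rng.integers(0, 256, 10_000).astype(np.uint8)
print(compression_mi_proxy(stim, resp))       # large: rows share structure
print(compression_mi_proxy(stim, unrelated))  # small: nothing to exploit
```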