# Introduction to Information Theory

## Q1

This is mostly a computational question: you will get familiar with some coding, and with how to sample from a probability distribution.

• a Take a sample from each of the following distributions: uniform, exponential, Gaussian, and binomial. Draw a plot for each sample/distribution. Repeat for at least two different values of the parameters of each distribution.

• b Take several samples of different sizes, and make visual observations as to how well each sample resembles the actual distribution.

• c Design ways to quantitatively compare your samples to the actual distributions.

• d Calculate the entropy of each distribution. (This is not a computational question, but you could check that you got the right answer computationally.)
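A minimal sketch for parts (a)–(d), assuming `numpy` and `scipy` are available. The parameter values are arbitrary examples, and plotting calls are omitted so the snippet stays self-contained:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 10_000

# (a) one sample from each distribution, for one choice of parameters
samples = {
    "uniform":     rng.uniform(0.0, 1.0, n),
    "exponential": rng.exponential(2.0, n),    # scale = 1/rate = 2
    "gaussian":    rng.normal(0.0, 1.5, n),    # mean 0, std 1.5
    "binomial":    rng.binomial(20, 0.3, n),   # 20 trials, p = 0.3
}

# (c) one quantitative comparison: the Kolmogorov-Smirnov statistic
# between the sample and the true CDF (small value = good fit)
ks_unif = stats.kstest(samples["uniform"], stats.uniform(0, 1).cdf).statistic

# (d) analytic differential entropies (in nats) for the same parameters,
# checked against scipy's closed-form entropy() for each distribution
h_exp   = 1.0 + np.log(2.0)                        # 1 + ln(scale)
h_gauss = 0.5 * np.log(2 * np.pi * np.e * 1.5**2)  # (1/2) ln(2*pi*e*sigma^2)
assert np.isclose(stats.expon(scale=2.0).entropy(), h_exp)
assert np.isclose(stats.norm(0.0, 1.5).entropy(), h_gauss)
```

For the plots in (a) and (b), a histogram of each sample overlaid with the analytic density (e.g. via `matplotlib`) is the natural choice; for (c), other reasonable metrics include the KL divergence between a binned empirical distribution and the true one.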

## Q2

Consider a system of 3 neurons that are all interconnected. We assume that each neuron $n_i$ has two states: $+1$ for firing and $-1$ for not firing.

Your experimental design allows you to measure only two neurons simultaneously (but, unfortunately, not all three at a time), and what you can obtain are the averages of those pair measurements:

$$\langle n_1 n_2 \rangle_{\mbox{obs}}, \qquad \langle n_1 n_3 \rangle_{\mbox{obs}}, \qquad \langle n_2 n_3 \rangle_{\mbox{obs}}.$$

Obviously, these averages are symmetric, so we only need to consider the three cases above, since $\langle n_2 n_1 \rangle_{\mbox{obs}} = \langle n_1 n_2 \rangle_{\mbox{obs}}$, etc.

• Calculate the maximum entropy distribution

$$P(n_1, n_2, n_3)$$

that describes the state of the three neurons, if you impose the average correlations $\langle n_i n_j \rangle_{\mbox{obs}}$ that you have observed.

• Can you generalize to $n$ neurons for which you know all average pairwise measurements?
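If it helps to see where the calculation starts, here is a sketch of the standard Lagrange-multiplier setup for maximum entropy under pairwise constraints (the couplings $J_{ij}$ and the partition function $Z$ are notation introduced here, not part of the problem statement):

```latex
% Maximize S = -\sum_{\{\mathbf{n}\}} P(\mathbf{n}) \ln P(\mathbf{n})
% subject to normalization and the observed pair averages:
\mathcal{L} = -\sum_{\{\mathbf{n}\}} P(\mathbf{n}) \ln P(\mathbf{n})
  - \lambda_0 \Bigl( \sum_{\{\mathbf{n}\}} P(\mathbf{n}) - 1 \Bigr)
  - \sum_{i<j} J_{ij} \Bigl( \sum_{\{\mathbf{n}\}} P(\mathbf{n})\, n_i n_j
      - \langle n_i n_j \rangle_{\mbox{obs}} \Bigr)

% Setting \partial\mathcal{L}/\partial P(\mathbf{n}) = 0 yields an Ising-like form:
P(\mathbf{n}) = \frac{1}{Z} \exp\Bigl( \sum_{i<j} J_{ij}\, n_i n_j \Bigr),
\qquad
Z = \sum_{\{\mathbf{n}\}} \exp\Bigl( \sum_{i<j} J_{ij}\, n_i n_j \Bigr)
```

The multipliers $J_{ij}$ are then fixed (in general numerically) by requiring that the model's pair averages match the observed ones.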

If you need some inspiration, you can look at this paper: Schneidman E, Berry MJ II, Segev R, Bialek W (2006) Weak pairwise correlations imply strongly correlated network states in a neural population. Nature 440:1007–1012.
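To check a candidate answer numerically, one can fit the pairwise maximum entropy model by gradient ascent, matching the model's pair averages to the observed ones. A sketch assuming `numpy`; the target correlations below are made-up illustrative values, not data from the problem:

```python
import itertools
import numpy as np

# All 8 states of 3 neurons with values in {-1, +1}
states = np.array(list(itertools.product([-1, 1], repeat=3)))
pairs = [(0, 1), (0, 2), (1, 2)]
target = np.array([0.4, 0.1, -0.2])   # hypothetical <n_i n_j>_obs

# Fit couplings J so that P(n) ∝ exp(sum_{i<j} J_ij n_i n_j)
# reproduces the target pair averages (concave dual, so gradient
# ascent on the log-likelihood converges).
J = np.zeros(3)
for _ in range(5000):
    energy = sum(J[k] * states[:, i] * states[:, j]
                 for k, (i, j) in enumerate(pairs))
    p = np.exp(energy)
    p /= p.sum()                      # normalized model distribution
    model = np.array([(p * states[:, i] * states[:, j]).sum()
                      for i, j in pairs])
    J += 0.5 * (target - model)       # gradient step

# model now matches target up to numerical tolerance
```

Enumerating all $2^n$ states only works for small $n$; for the generalization to many neurons the same fit is usually done with Monte Carlo estimates of the model averages.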