MCB111: Mathematics in Biology (Fall 2024)


week 08:

Neural Networks - Learning as inference

Preliminaries

Present all your reasoning, derivations, plots, and code as part of the homework. Imagine that you are writing a short paper that anyone in class should be able to understand. If you are stuck at some point, please describe the issue and how far you got. A Jupyter notebook is not required if you are working in Python, but it is recommended.

Learn and remember your ABCs?

For this homework you have two options, described below: a feedforward one-neuron network, or a feedback Hopfield network.

The representations of the different letters are given in these files (you can build your own if you prefer): A, B, C, D, E, X, Z.

Here you can find imperfect examples for each of those letters: A, B, C, D, E, X, Z.

(As always, remember to check the data you are given.)
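For concreteness, here is a minimal sketch of how you might read one of the letter files into a flattened \(\pm 1\) vector. The function name `load_letter` and the assumed file format (five lines of five characters, with `1` or `*` marking a filled cell) are hypothetical; adapt the parsing to whatever format the files you download actually use.

```python
import numpy as np

def load_letter(path):
    """Read a 5x5 letter pattern and flatten it to a length-25 vector of +/-1.

    Assumes (hypothetically) that the file contains 5 lines of 5 characters,
    with '1' or '*' marking a filled cell and anything else an empty cell.
    Adapt the parsing to the actual format of the files you are given.
    """
    grid = []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line:
                continue
            grid.append([1 if c in "1*" else -1 for c in line])
    return np.array(grid, dtype=float).reshape(25)
```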

Feedforward one-neuron network

To do this homework, you want to build a single-neuron feedforward network and train it to separate two of the letters (for example, A's from B's), then test it on the imperfect examples.

Because each letter is represented by a \(5\times 5\) grid, your neuron has 26 inputs (one for each of the 25 grid points, plus the bias).

At the end, I expect you to describe how you built the network and which parameters you used for training and regularization. Once you are happy with the network, please report the number of iterations and your success in separating A's from B's (or any other pair of letters you decide to consider).
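As one possible starting point, here is a minimal sketch of a single neuron with a sigmoid output, trained by gradient descent on the cross-entropy error with an L2 (weight-decay) regularizer. The function names (`train_neuron`, `classify`) and the default hyperparameters (`eta`, `alpha`, `n_iter`) are illustrative choices, not prescribed values; you should choose, tune, and report your own.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def train_neuron(X, t, eta=0.1, alpha=0.01, n_iter=1000):
    """Train a single neuron by gradient descent on the cross-entropy error
    plus an L2 (weight-decay) regularizer.

    X      : (n_examples, 25) array of flattened letter patterns
    t      : (n_examples,) array of targets (e.g. 1 for A, 0 for B)
    eta    : learning rate (illustrative value; tune it yourself)
    alpha  : regularization strength (illustrative value)
    n_iter : number of gradient steps
    """
    # Prepend a constant 1 to every input so that w[0] acts as the bias,
    # giving the 26 inputs (25 grid points + bias) mentioned above.
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    w = 0.01 * rng.standard_normal(Xb.shape[1])
    for _ in range(n_iter):
        y = sigmoid(Xb @ w)                  # neuron output for every example
        grad = Xb.T @ (y - t) + alpha * w    # gradient of error + regularizer
        w -= eta * grad / Xb.shape[0]
    return w

def classify(w, x):
    """Return the neuron's output (probability of the '1' class) for pattern x."""
    return sigmoid(w[0] + w[1:] @ x)
```

For instance, if the rows of `X` hold the A and B patterns with targets 1 and 0, you could count how many of the imperfect examples `classify` puts on the correct side of 0.5.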

Extra credit if you approach any of these:

Feedback Hopfield network

Here, you want to incrementally add letters to the network, starting with one letter (one memory), then two letters, and so on up to five letters, testing at each stage how many of the imperfect representations recover the correct letter from the Hopfield network.

Your Hopfield network will need 25 neurons and \(\binom{25}{2} = \frac{25 \times 24}{2} = 300\) independent weights connecting all pairs of neurons. The weights are calculated using Hebb’s learning rule.
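Below is a minimal sketch of such a network, assuming \(\pm 1\) neuron states, Hebb’s rule \(w_{ij} = \frac{1}{N}\sum_\mu x_i^{(\mu)} x_j^{(\mu)}\) with \(w_{ii}=0\), and asynchronous sign updates; the function names and the number of update sweeps are illustrative, not prescribed.

```python
import numpy as np

def hebb_weights(patterns):
    """Build the symmetric Hopfield weight matrix from a list of +/-1 patterns
    of length 25, using Hebb's rule W_ij = (1/N) * sum_mu x_i^mu x_j^mu,
    with no self-connections (W_ii = 0).
    """
    P = np.array(patterns, dtype=float)      # shape (n_memories, 25)
    N = P.shape[1]
    W = (P.T @ P) / N
    np.fill_diagonal(W, 0.0)                 # no neuron connects to itself
    return W

def recall(W, x0, n_sweeps=10, rng=None):
    """Run asynchronous updates x_i <- sign(sum_j W_ij x_j) starting from the
    (possibly imperfect) pattern x0 and return the final state.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float).copy()
    for _ in range(n_sweeps):
        for i in rng.permutation(len(x)):    # update neurons in random order
            h = W[i] @ x                     # local field on neuron i
            x[i] = 1.0 if h >= 0 else -1.0
    return x
```

Storing one or more letters with `hebb_weights` and then calling `recall` on an imperfect version lets you check whether the network settles back to the stored letter.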

Extra credit if you quantify the effect of adding more and more “memories” to the network.