MCB111: Mathematics in Biology (Fall 2024)
week 08:
Neural Networks - Learning as inference
Preliminaries
Present all your reasoning, derivations, plots and code as part of the homework. Imagine that you are writing a short paper that anyone in class should be able to understand. If you are stuck at some point, please describe the issue and how far you got. If you are working in Python, a Jupyter notebook is not required, but it is recommended.
Learn and remember your ABCs?
For this homework you have two options:
- Building a feedforward one-neuron network to distinguish any two letters.
- Building a feedback Hopfield network to distinguish letters.
The representations for the different letters are given in these files (you can build your own if you prefer): A, B, C, D, E, X, Z.
Here you can find imperfect examples for each of those letters: A, B, C, D, E, X, Z.
(As always, remember to check the data you are given.)
Feedforward one-neuron network
To do this homework, you want to
- select any two of those letters,
- build a one-neuron network,
- train the network by backpropagation on the examples given in the “imperfect representations” files, and
- after you consider the network trained, go back and check how well it classifies each of the examples.
Because each letter is represented by a \(5\times 5\) grid, your neuron has to have 26 inputs (one for each grid point, plus the bias).
At the end, I expect you to describe how you built the network and which parameters you used for training and regularization. After you are happy with the network, please report the number of iterations, and your success separating A’s from B’s (or any other pair that you decide to consider).
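As a starting point, here is a minimal sketch of such a network in Python. It assumes each letter has been flattened into a length-25 vector of 0/1 pixel values; for a single neuron, backpropagation reduces to gradient descent on the cross-entropy loss. The function names, learning rate, regularization strength, and iteration count below are illustrative choices, not prescribed values.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_one_neuron(X, y, eta=0.1, lam=0.01, n_iter=1000):
    """Train a single sigmoid neuron by gradient descent on cross-entropy.

    X: (n_examples, 25) flattened 5x5 grids; y: labels in {0, 1}.
    A bias column is appended, giving the 26 weights mentioned above.
    lam sets the strength of an L2 (weight-decay) regularizer.
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias input
    w = rng.normal(scale=0.1, size=Xb.shape[1])    # small random init
    for _ in range(n_iter):
        p = sigmoid(Xb @ w)                        # predicted P(class 1)
        grad = Xb.T @ (p - y) / len(y) + lam * w   # cross-entropy gradient + L2
        w -= eta * grad
    return w

def classify(X, w):
    """Threshold the neuron's output at 0.5."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (sigmoid(Xb @ w) > 0.5).astype(int)
```

Training then amounts to `w = train_one_neuron(X, y)` followed by `classify(X, w)`; you can report how the classification accuracy changes with `n_iter` and with the regularization strength `lam`.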
Extra credit if you approach any of these:
- Is any particular pair of letters harder to distinguish?
- Use a Monte Carlo approach to sample different sets of weights, instead of just one.
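For the Monte Carlo extra credit, one possible approach (my choice, not a prescribed method) is a Metropolis sampler over the 26 weights: treat the regularized cross-entropy as an energy, accept downhill moves always and uphill moves with probability \(e^{-\Delta E}\), and then average predictions over the sampled weights instead of committing to a single optimum. The step size and sample count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neg_log_posterior(w, Xb, y, lam=0.01):
    """Cross-entropy plus a Gaussian (L2) prior on the weights."""
    p = sigmoid(Xb @ w)
    eps = 1e-12  # guard against log(0)
    nll = -np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    return nll + 0.5 * lam * w @ w

def metropolis_weights(X, y, n_samples=2000, step=0.05, lam=0.01):
    """Metropolis sampling of the weight posterior for a one-neuron network."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias input
    w = np.zeros(Xb.shape[1])
    E = neg_log_posterior(w, Xb, y, lam)
    samples = []
    for _ in range(n_samples):
        w_new = w + step * rng.normal(size=w.size)     # random-walk proposal
        E_new = neg_log_posterior(w_new, Xb, y, lam)
        if rng.random() < np.exp(min(0.0, E - E_new)):  # Metropolis accept rule
            w, E = w_new, E_new
        samples.append(w.copy())
    return np.array(samples)
```

Averaging `sigmoid(Xb @ w)` over the second half of the samples (discarding a burn-in) gives a posterior-averaged prediction that you can compare against the single-weight-vector classifier.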
Feedback Hopfield network
Here, you want to incrementally add letters to the network, starting with one letter (one memory), then two, and so on up to five letters, testing at each stage how many of the imperfect representations return the correct letter from the Hopfield network.
Your Hopfield network will need 25 neurons, and \(\binom{25}{2} = \frac{25 \times 24}{2} = 300\) independent weights connecting all neurons. Weights will be calculated using Hebb’s learning rule.
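A minimal sketch of the Hopfield side, assuming each letter is encoded as a ±1 vector of length 25 (the function names and the asynchronous, random-order update schedule are my choices):

```python
import numpy as np

def hebb_weights(patterns):
    """Hebb's rule: W = (1/N) * sum_mu x_mu x_mu^T, with zero diagonal.

    patterns: list of (25,) arrays with entries in {-1, +1}.
    The symmetric off-diagonal entries are the 300 independent weights.
    """
    P = np.asarray(patterns, dtype=float)
    N = P.shape[1]
    W = P.T @ P / N
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, x, n_sweeps=10, rng=None):
    """Asynchronous updates until the state stops changing (or n_sweeps)."""
    if rng is None:
        rng = np.random.default_rng(0)
    s = np.array(x, dtype=float)
    for _ in range(n_sweeps):
        changed = False
        for i in rng.permutation(len(s)):       # update neurons in random order
            new = 1.0 if W[i] @ s >= 0 else -1.0
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:                          # reached a fixed point
            break
    return s
```

To test a memory, corrupt a stored pattern (flip a few pixels, or use the imperfect examples) and check whether `recall` returns the original letter.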
Extra credit if you quantify the effect of adding more and more “memories” to the network.