MCB111: Mathematics in Biology (Fall 2019)
week 08:
Neural Networks: Learning as inference
Learn and remember your ABCs?
For this homework you have two options:

Building a feedforward one-neuron network to distinguish any two letters.

Building a feedback Hopfield network to distinguish letters.
The representations for the different letters are given in these files (you can build your own if you prefer): A, B, C, D, E, X, Z.
Here you can find imperfect examples for each of those letters: A, B, C, D, E, X, Z.
(As always, remember to check the data you are given.)
Feedforward one-neuron network
To do this homework, you want to:

* select any two of those letters,
* build a one-neuron network,
* train the network by backpropagation on the examples given in the “imperfect representations” files.

After you consider the network trained, you want to go back and check how well it classifies each of the examples.
Because each letter is represented by a 25-point grid, your neuron has to have 26 inputs (one for each of the 25 grid points, plus the bias).
At the end, I expect you to describe how you built the network and which parameters you used for training and regularization. After you are happy with the network, please report the number of iterations, and your success in separating A’s from B’s (or any other pair you decide to consider).
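As one possible starting point, the training loop above can be sketched as a single logistic neuron trained by gradient descent (backpropagation reduces to the chain rule through one sigmoid here). The learning rate, regularization strength, and toy patterns below are illustrative assumptions, not prescribed values; in the actual homework you would read the letter files instead.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_one_neuron(X, y, lr=0.5, lam=0.01, n_iter=2000):
    """Train a single logistic neuron by gradient descent.

    X: (n_examples, 25) pixel vectors in {0, 1}; y: labels in {0, 1}.
    A constant bias input is appended, giving 26 weights in total.
    lr, lam (L2 regularization strength), and n_iter are illustrative choices.
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])     # append the bias input
    w = rng.normal(scale=0.1, size=Xb.shape[1])
    for _ in range(n_iter):
        p = sigmoid(Xb @ w)                           # forward pass
        grad = Xb.T @ (p - y) / len(y) + lam * w      # cross-entropy gradient + L2
        w -= lr * grad                                # gradient-descent step
    return w

def classify(w, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (sigmoid(Xb @ w) > 0.5).astype(int)

# toy stand-ins for two letters: a random 25-pixel pattern and its complement,
# each corrupted by flipping every pixel with probability 0.1
letter_a = rng.integers(0, 2, size=25)
letter_b = 1 - letter_a

def noisy_copies(v, n, p_flip=0.1):
    flips = (rng.random((n, 25)) < p_flip).astype(int)
    return np.abs(v - flips)                          # XOR: flip the chosen pixels

X = np.vstack([noisy_copies(letter_a, 20), noisy_copies(letter_b, 20)])
y = np.array([0] * 20 + [1] * 20)
w = train_one_neuron(X, y)
acc = (classify(w, X) == y).mean()
```

Reporting `acc` on held-out imperfect examples, rather than on the training set as above, gives a fairer measure of success.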
Extra credit if you approach any of these:

Is any particular pair of letters harder to distinguish?

Use a Monte Carlo approach to draw different samples of the weights, instead of using just one set.
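For the Monte Carlo option, one possibility (a sketch under an assumed step size and prior strength, not the required method) is a random-walk Metropolis chain over the 26 weights, then averaging predictions over the sampled weight vectors:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def log_posterior(w, Xb, y, lam=0.01):
    # Bernoulli log-likelihood plus a Gaussian (L2) prior on the weights;
    # lam is an assumed prior strength
    p = sigmoid(Xb @ w)
    eps = 1e-12                                       # guard against log(0)
    return np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps)) - lam * (w @ w)

def metropolis_weights(Xb, y, n_samples=3000, step=0.05):
    """Random-walk Metropolis over the weights; returns the whole chain."""
    w = np.zeros(Xb.shape[1])
    lp = log_posterior(w, Xb, y)
    chain = np.empty((n_samples, len(w)))
    for t in range(n_samples):
        w_prop = w + rng.normal(scale=step, size=len(w))
        lp_prop = log_posterior(w_prop, Xb, y)
        if np.log(rng.random()) < lp_prop - lp:       # Metropolis acceptance rule
            w, lp = w_prop, lp_prop
        chain[t] = w
    return chain

# toy data: random 25-pixel patterns with a bias column appended
X = rng.integers(0, 2, size=(40, 25)).astype(float)
y = (X[:, 0] > 0.5).astype(float)   # an artificial, trivially learnable label
Xb = np.hstack([X, np.ones((40, 1))])
chain = metropolis_weights(Xb, y)
# discard the first half as burn-in, then average the sigmoid over samples
p_mean = sigmoid(Xb @ chain[1500:].T).mean(axis=1)
```

Averaging over many sampled weight vectors, instead of committing to a single trained one, is what makes this a Bayesian treatment of the neuron.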
Feedback Hopfield network
Here, you want to incrementally add letters to the network, starting with one letter (one memory), then two, and so on up to five, testing at each stage how many of the imperfect representations the Hopfield network restores to the correct letter.
Your Hopfield network will need 25 neurons (one per grid point), with a weight connecting every pair of neurons. The weights will be calculated using Hebb’s learning rule.
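A minimal sketch of that setup, assuming ±1 pixel values, Hebbian weights, and asynchronous updates; the toy random patterns below stand in for the letter files:

```python
import numpy as np

def train_hopfield(patterns):
    """Hebb's rule: W = (1/N) * sum_p outer(p, p), with no self-connections.

    patterns: array of shape (n_patterns, N) with entries in {-1, +1}.
    """
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0)                   # no neuron connects to itself
    return W

def recall(W, state, n_sweeps=50, seed=0):
    """Asynchronous updates: each neuron flips toward its local field in turn."""
    rng = np.random.default_rng(seed)
    s = state.copy()
    for _ in range(n_sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# store two toy 25-pixel patterns, then try to recover one from a corrupted copy
rng = np.random.default_rng(2)
p1 = rng.choice([-1, 1], size=25)
p2 = rng.choice([-1, 1], size=25)
W = train_hopfield(np.vstack([p1, p2]))
corrupted = p1.copy()
corrupted[:3] *= -1                          # flip three pixels
recovered = recall(W, corrupted)
```

To grade recall, you can count how many corrupted inputs converge back to the pattern they came from; states that settle into a different memory, or into a spurious mixture, count as failures.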
Extra credit if you quantify the effect of adding more and more “memories” to the network.