At some point in my life, as perhaps in yours, I had to write a multilayer perceptron from scratch. It is a useful exercise, and the result posted here is a nice, barebones implementation that I use on occasion to get a peek under the hood of how my networks are working. If you use the code, please cite this page, and please let me know whether or not you found it useful.

my-neural-network.zip This is the archive (5.8 kB) containing five MATLAB source files. Each is described below, and the files are heavily commented.

The code files are as follows:

Script for generating synthetic data, calling the ANN, plotting, and testing the output. This is the one to run; it will call all the others.

This function initializes the model, presents the training patterns a given number of times, and returns the trained model. Its arguments are the training dataset, the target model output, the number of hidden-layer neurons, the number of iterations over the whole dataset, and the learning rate. Its outputs are the trained model, the model's output on the training set, and the correlation of that output with the target output.
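In outline, a training routine like that initializes the weights, then loops over the dataset for a fixed number of iterations, updating after each pattern. The original is MATLAB; as an illustrative sketch only (the names and details here are mine, not the actual `train_mlp.m`), the same idea in Python/NumPy might look like:

```python
import numpy as np

def train_mlp(X, T, n_hidden, n_epochs, lr, seed=0):
    """Initialize a 1-hidden-layer sigmoid MLP, present every training
    pattern (column of X) n_epochs times, and return the trained model,
    its output on the training set, and the output/target correlation."""
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[0], T.shape[0]
    W1 = rng.normal(scale=0.1, size=(n_hidden, n_in)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.1, size=(n_out, n_hidden)); b2 = np.zeros(n_out)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(n_epochs):
        for i in range(X.shape[1]):          # present one pattern at a time
            x, t = X[:, i], T[:, i]
            h = sig(W1 @ x + b1)             # forward pass
            y = sig(W2 @ h + b2)
            d2 = (y - t) * y * (1 - y)       # backpropagate the error
            d1 = (W2.T @ d2) * h * (1 - h)
            W2 -= lr * np.outer(d2, h); b2 -= lr * d2
            W1 -= lr * np.outer(d1, x); b1 -= lr * d1
    Y = sig(W2 @ sig(W1 @ X + b1[:, None]) + b2[:, None])
    r = np.corrcoef(Y.ravel(), T.ravel())[0, 1]
    return (W1, b1, W2, b2), Y, r
```

Note the weights start from small random values, which is why repeated runs of such a script give slightly different results.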

This function is called many times (number of iterations × number of training examples) from train_mlp.m, and it does most of the complicated calculations for the ANN. It first calculates the activations of each layer of neurons, then calculates the error at the output and propagates it back through the network, and finally updates the weights and biases accordingly. Because this is MATLAB, the calculations are done as matrix operations, which is nice in the end but took some head-scratching to figure out the first time.
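That forward/backward/update step in matrix form can be sketched as follows. This is an illustrative Python/NumPy version, not the author's MATLAB function; it assumes a single hidden layer with sigmoid activations and a squared-error loss:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def backprop_step(x, t, W1, b1, W2, b2, lr):
    """One forward/backward pass for one training pattern x with target t,
    followed by a gradient-descent update of the weights and biases."""
    # forward pass: activations of each layer
    h = sigmoid(W1 @ x + b1)                      # hidden layer
    y = sigmoid(W2 @ h + b2)                      # output layer

    # backward pass: output error, propagated back through the network
    delta_out = (y - t) * y * (1 - y)             # error at the output
    delta_hid = (W2.T @ delta_out) * h * (1 - h)  # error at the hidden layer

    # update weights and biases accordingly
    W2 -= lr * np.outer(delta_out, h)
    b2 -= lr * delta_out
    W1 -= lr * np.outer(delta_hid, x)
    b1 -= lr * delta_hid
    return W1, b1, W2, b2
```

The `delta` vectors are exactly where the matrix formulation pays off: each layer's error is one matrix product away from the next layer's.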

A simple function for calculating the output of a trained model on a test dataset. Arguments to the function are the model, the input, and the target output; it returns the output and the correlation of output to target output.

This is just a helper function for generating some synthetic data that I use in run_mlp.m. It makes a 2D dataset consisting of populations scattered on the plane. There are some color-coded plots which make it obvious what is going on.
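Populations scattered on the plane are easy to fake with Gaussian clouds. A minimal Python/NumPy sketch of such a generator (the cluster centers and spread here are arbitrary choices of mine, not the values in the original helper):

```python
import numpy as np

def make_blobs(n_per_class=100, centers=((0, 0), (3, 3), (0, 3)),
               spread=0.5, seed=0):
    """Scatter one Gaussian point cloud per class on the 2D plane and
    return the points with integer class labels."""
    rng = np.random.default_rng(seed)
    points, labels = [], []
    for k, center in enumerate(centers):
        points.append(rng.normal(loc=center, scale=spread,
                                 size=(n_per_class, 2)))
        labels.append(np.full(n_per_class, k))
    return np.vstack(points), np.concatenate(labels)
```

Plotting each class in its own color (e.g. with `matplotlib.pyplot.scatter`) makes the cluster structure obvious, as the post describes.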

## About Keith Kelleher

A neurobiologist, number cruncher, data miner, runner (lacking only speed), and a kung fu master (lacking only the knowledge of kung fu).

Thanks, it was useful.

Hi,

Did you try this in Octave as well? I get an error:

```
octave:13> run_mlp
percent_correct = 96.667
confusion =

   98    1    0    1
    2  195    3    0
    0    8  190    2
    3    0    0   97

error: sort: only cell arrays of character strings may be sorted
error: called from:
error: C:\Octave\Octave3.6.4_gcc4.6.2\share\octave\3.6.4\m\set\unique.m at line 136, column 9
error: C:\Users\XXX\Documents\my-neural-network\run_mlp.m at line 107, column 15
```

Any experience with this? From what I have seen so far, Octave does have problems with cells sometimes…

Hi Chris,

Yes, I’m getting that error now too. It used to work in Octave. I’m thinking they must have changed the way their “unique” function works to break it for me. This code was to do some preprocessing of text class inputs, and the rest of it still seems to work for now. I’ll write another “unique” function to hide the builtin one so that my code works. I’ll update here in comments. Thanks for pointing it out.

Best, Keith.

Cool!

I just managed to get the “built-in” nnet package to work, although it’s not as nice and comprehensive as yours…

Did you think about sharing your package on octave-forge?

Hi Chris,

Thanks for the compliment. I haven’t looked into octave-forge; maybe I should.

I figured out the problem. Octave was including an empty element in my cell array (because concatenating an empty element with another cell results in two cells, one empty and one not, but I expect it to just have one non-empty cell after doing that). Then, when I called unique, it found an empty cell and freaked out. I changed how that cell array is made and it should work now. Hopefully it still works in Matlab, but I don’t have a license anymore so that might be interesting.

Here’s the surgical fix if you don’t want to re-download it, or have modified your code since downloading. Change the text array preprocessing code to read like this:

```matlab
% preprocess data, make text into binary attributes
input_strings = data{1};
output_strings = veg{1};
for i = 2:length(data)
    input_strings = [input_strings data{i}];
    output_strings = [output_strings veg{i}];
end
```

Best,

Keith.

Yeah, that works! Thanks a lot! I just got your network to work with my data – and still like it more than the nnet-package from octave-forge.

I don’t get 100% correct classification, but I probably still have to tune the size and shape.

Great work!

Hello, I am a beginner MATLAB user. I need to apply an MLP to my pictures, but I do not know how to use images as the input.

I have used your code, but I could not get it to work on my images.

Can you help me, please?

Regards

Pretty! This has been a really wonderful post. Thank you for providing this information.

Hi, thanks for the nice script. By the way, I tried your script, but the results are not consistent each time I run it: for example, 91% on the first run, 94% on the second, 92% on the third, and so on. Do you know why this is happening? Which part of the algorithm creates this “randomness”? Thank you.