
I've been wanting to build my own neural network in Python, in order to better understand how it works. I've been following this series of videos as a sort of guide, but it seems that backpropagation gets much more difficult when you use a larger network, which I plan to do. The author doesn't really explain how to scale the approach up to larger networks.

Currently, my network feeds forward, but I don't have much of an idea of where to start with backpropagation. My code is posted below to show where I currently am (I'm not asking for coding help, just for pointers to good sources, and I figure that knowing my current state might help):

import numpy

class NN:
    def __init__(self, input_length):
        self.layers = []
        self.input_length = input_length
        self.prediction = []

    def addLayer(self, layer):
        self.layers.append(layer)
        if len(self.layers) > 1:
            # Each neuron needs one weight per neuron in the previous layer.
            layer.setWeights(len(self.layers[-2].neurons))
        else:
            # The first layer is connected directly to the inputs.
            layer.setWeights(self.input_length)

    def feedForward(self, inputs):
        _inputs = inputs
        for layer in self.layers:
            layer.process(_inputs)
            _inputs = layer.output
        self.prediction = _inputs

    def calculateErr(self, target):
        # Squared error per output neuron.
        return [(p - t) ** 2 for p, t in zip(self.prediction, target)]

class Layer:
    def __init__(self, length, function):
        # These must be instance attributes; as class attributes they
        # would be shared between every Layer instance.
        self.neurons = []
        self.weights = []
        self.biases = []
        self.output = []
        for _ in range(length):
            self.neurons.append(Neuron(function))
            self.biases.append(numpy.random.randn())

    def setWeights(self, inlength):
        # One row of weights per neuron, with one weight per input.
        self.weights = [[numpy.random.randn() for _ in range(inlength)]
                        for _ in range(len(self.neurons))]

    def process(self, inputs):
        self.output = []  # reset, otherwise outputs accumulate across calls
        for i in range(len(self.neurons)):
            self.output.append(self.neurons[i].run(inputs, self.weights[i], self.biases[i]))

class Neuron:
    def __init__(self, function):
        self.function = function
        self.output = 0

    def run(self, inputs, weights, bias):
        self.output = self.function(inputs, weights, bias)
        return self.output

def sigmoid(n):
    return 1 / (1 + numpy.exp(-n))  # note the minus sign

def inputlayer_func(inputs, weights, bias):
    # Identity function for a pass-through input layer (unused in this example).
    return inputs

def l2_func(inputs, weights, bias):
    # Weighted sum of the inputs plus a bias, squashed by the sigmoid.
    out = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(out)

NNet = NN(2)
l2 = Layer(1, l2_func)
NNet.addLayer(l2)
NNet.feedForward([2.0, 1.0])
print(NNet.prediction)

So, is there any resource that explains how to implement the back-propagation algorithm step-by-step?


2 Answers


Backpropagation isn't too much more complicated than the forward pass, but understanding it well requires a bit of mathematics.

This tutorial is my go-to resource when students want more detail, because it includes fully worked examples.

Chapter 18 of Russell & Norvig's Artificial Intelligence: A Modern Approach includes pseudocode for this algorithm, as well as a derivation, but without good examples.
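
To give a flavour of that mathematics, here is the standard chain-rule computation for a single sigmoid output unit with squared error, which matches the setup in your code (this is the generic derivation, not something specific to either reference):

$$z = \sum_i w_i x_i + b, \qquad a = \sigma(z) = \frac{1}{1 + e^{-z}}, \qquad E = (a - t)^2$$

$$\frac{\partial E}{\partial w_i} = \frac{\partial E}{\partial a} \cdot \frac{\partial a}{\partial z} \cdot \frac{\partial z}{\partial w_i} = 2(a - t) \cdot \sigma(z)\bigl(1 - \sigma(z)\bigr) \cdot x_i$$

For hidden layers the same pattern repeats: each layer's error term $\delta = \partial E / \partial z$ is a weighted sum of the error terms of the layer above it, multiplied by the local activation derivative, which is exactly what lets the algorithm scale to deeper networks.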


Nowadays, there are many resources that cover the back-propagation algorithm, and some of them provide step-by-step examples.

However, in addition to the other answer, I would like to mention the online book Neural Networks and Deep Learning by Michael Nielsen, which covers the back-propagation algorithm (and other topics) in detail and, at the same time, intuitively, although some could disagree. You can find the associated source code here (which I consulted a few years ago when I was learning about the topic).
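
To make the step-by-step structure concrete, below is a minimal sketch of one gradient-descent update for a single sigmoid neuron, roughly matching the architecture in the question. It follows the same chain-rule equations Nielsen derives, but it is not his code; the function name train_step, the learning rate, and the toy training target are all illustrative choices of mine.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(w, b, x, t, lr=0.1):
    """One gradient-descent update for a single sigmoid neuron
    with squared error E = (a - t)^2. Illustrative sketch only."""
    z = np.dot(w, x) + b               # pre-activation
    a = sigmoid(z)                     # prediction
    delta = 2.0 * (a - t) * a * (1.0 - a)  # dE/dz via the chain rule
    grad_w = delta * x                 # dE/dw_i = delta * x_i
    grad_b = delta                     # dE/db = delta
    return w - lr * grad_w, b - lr * grad_b

# Toy usage: learn to map the input [2.0, 1.0] to the target 0.0.
w = np.random.randn(2)
b = np.random.randn()
for _ in range(1000):
    w, b = train_step(w, b, np.array([2.0, 1.0]), 0.0)
print(sigmoid(np.dot(w, np.array([2.0, 1.0])) + b))  # close to 0.0

For a multi-layer network, the loop body grows: you store each layer's activations during the forward pass, compute delta for the output layer as above, and then propagate delta backward through each layer's weight matrix, which is the part Nielsen's Chapter 2 walks through in full.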
