3. The Back Propagation Algorithm
Having established the basis of neural nets in the previous chapters, let’s now have a look at some practical networks, their applications and how they are trained. Many hundreds of Neural Network types have been proposed over the years. In fact, because Neural Nets are so widely studied (for example, by Computer Scientists, Electronic Engineers, Biologists and Psychologists), they are given many different names. You’ll see them referred to as Artificial Neural Networks (ANNs), Connectionism or Connectionist Models, Multi-Layer Perceptrons (MLPs) and Parallel Distributed Processing (PDP).

However, despite all the different terms and different types, there is a small group of “classic” networks which are widely used and on which many others are based: Back Propagation, Hopfield Networks, Competitive Networks and networks using Spiky Neurons. There are many variations even on these themes. We’ll consider these networks in this and the following chapters, starting with Back Propagation.

3.1 The algorithm

Most people would consider the Back Propagation network to be the quintessential Neural Net. Actually, Back Propagation [1,2,3] is the training or learning algorithm rather than the network itself. The network used is generally of the simple type shown in figure 1.1, in chapter 1 and in the examples up until now. These are called FeedForward Networks (we’ll see why in chapter 7 on Hopfield Networks) or occasionally Multi-Layer Perceptrons (MLPs).

The network operates in exactly the same way as the others we’ve seen (if you need to remind yourself, look at worked example 2.3). Now, let’s consider what Back Propagation is and how to use it.
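
To make that forward calculation concrete, here is a minimal sketch of a single neuron’s forward pass in Python, in the spirit of the earlier worked examples. The pixel pattern, weight values and threshold activation are illustrative assumptions, not values taken from the chapter.

    # A minimal sketch of one neuron's forward calculation: a weighted
    # sum of the inputs passed through a hard threshold. The pattern,
    # weights and threshold below are illustrative values only.

    def neuron_output(inputs, weights, threshold=0.5):
        activation = sum(i * w for i, w in zip(inputs, weights))
        return 1 if activation > threshold else 0

    pattern = [1, 0, 1, 0]            # a 4-pixel input pattern (1 = black, 0 = white)
    weights = [0.4, -0.2, 0.6, 0.1]   # one weight per input
    print(neuron_output(pattern, weights))   # 0.4 + 0.6 = 1.0 > 0.5, so prints 1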
A Back Propagation network learns by example. You give the algorithm examples of what you want the network to do and it changes the network’s weights so that, when training is finished, it will give you the required output for a particular input. Back Propagation networks are ideal for simple Pattern Recognition and Mapping Tasks [4]. As just mentioned, to train the network you need to give it examples of what you want – the output you want (called the Target) for a particular input, as shown in Figure 3.1.

Figure 3.1, a Back Propagation training set.

[Figure 3.1 shows a column of example input patterns alongside their Targets (the output you want for each pattern), with the annotation: “For this particular input pattern to the network, we would like to get this output.”]
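
In code, such a training set is simply a list of input patterns paired with their targets. The sketch below is a hypothetical stand-in for figure 3.1: the pixel values and targets are made-up placeholders, since the actual patterns live in the figure rather than the text.

    # A hypothetical training set in the style of figure 3.1: each entry
    # pairs an input pattern (pixel values, 1 = black, 0 = white) with the
    # Target output we want the trained network to produce for it.
    training_set = [
        ([0, 1, 1, 0], [0, 1]),   # for this input pattern we want the output 0 1
        ([1, 0, 0, 1], [1, 0]),   # a second pattern with its own target
    ]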

So, if we put the first pattern into the network, we would like the output to be 0 1, as shown in figure 3.2 (a black pixel is represented by 1 and a white by 0, as in the previous examples). The input and its corresponding target are called a Training Pair.

Figure 3.2, applying a training pair to a network.

[Figure 3.2 shows the first pattern applied to the network at Inputs 1 to 4, with the Targets marked on the two output neurons: “We’d like this neuron to give a ‘0’ out” on the first and “We’d like this neuron to give a ‘1’ out” on the second.]

Tutorial question 3.1:
Redraw the diagram in figure 3.2 to show the inputs and targets for the second pattern.

Once the network is trained, it will provide the desired output for any of the input patterns. Let’s now look at how the training works.
The network is first initialised by setting up all its weights to be small random numbers – say between –1 and +1. Next, the input pattern is applied and the output calculated (this is called the forward pass). The calculation gives an output which is completely different to what you want (the Target), since all the weights are random. We then calculate the Error of each neuron, which is essentially: Target - Actual Output (i.e. What you want – What you actually get). This error is then used mathematically to change the weights in such a way that the error will get smaller. In other words, the Output of each neuron will get closer to its Target (this part is called the reverse pass). The process is repeated again and again until the error is minimal. Let's do an example with an actual network to see how the...
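
Before the worked example, here is a minimal sketch of that forward pass / reverse pass cycle in Python. The training pair matches figure 3.2 and the weights start as small random numbers between -1 and +1 as described above, but the single hidden layer, its size, the sigmoid activation, the learning rate and the use of NumPy are illustrative assumptions rather than the chapter’s own network.

    # A minimal back propagation sketch: repeat the forward pass, measure
    # the error (Target - Actual Output), then run the reverse pass to
    # change the weights so the error gets smaller. Layer sizes, sigmoid
    # activation and learning rate are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Training pair from figure 3.2: four pixels in, two target outputs.
    inputs  = np.array([[0.0, 1.0, 1.0, 0.0]])
    targets = np.array([[0.0, 1.0]])

    # Initialise all weights to small random numbers between -1 and +1.
    w_hidden = rng.uniform(-1, 1, size=(4, 3))   # 4 inputs -> 3 hidden neurons
    w_output = rng.uniform(-1, 1, size=(3, 2))   # 3 hidden -> 2 output neurons
    learning_rate = 0.5

    for epoch in range(10000):
        # Forward pass: calculate the network's actual output.
        hidden = sigmoid(inputs @ w_hidden)
        output = sigmoid(hidden @ w_output)

        # Error of each output neuron: Target - Actual Output.
        error = targets - output

        # Reverse pass: propagate the error back through the layers and
        # adjust the weights so the output moves closer to the Target.
        delta_output = error * output * (1.0 - output)
        delta_hidden = (delta_output @ w_output.T) * hidden * (1.0 - hidden)
        w_output += learning_rate * hidden.T @ delta_output
        w_hidden += learning_rate * inputs.T @ delta_hidden

    print(output)   # after training, close to the target [0, 1]

Repeated over many passes, the error shrinks steadily; a real network would of course cycle through every training pair in the set rather than just one.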