3. The Back Propagation Algorithm
Having established the basis of neural nets in the previous chapters, let’s now have a look at some practical networks, their applications and how they are trained.
Many hundreds of Neural Network types have been proposed over the years. In fact, because Neural Nets are so widely studied (for example, by Computer Scientists, Electronic Engineers, Biologists and Psychologists), they are given many different names. You'll see them referred to as Artificial Neural Networks (ANNs), Connectionism or Connectionist Models, Multi-Layer Perceptrons (MLPs) and Parallel Distributed Processing (PDP).
However, despite all the different terms and different types, there is a small group of "classic" networks which are widely used and on which many others are based. These are: Back Propagation, Hopfield Networks, Competitive Networks and networks using Spiky Neurons. There are many variations even on these themes. We'll consider these networks in this and the following chapters, starting with Back Propagation.
3.1 The algorithm
Most people would consider the Back Propagation network to be the quintessential Neural Net. Actually, Back Propagation [1,2,3] is the training or learning algorithm rather than the network itself. The network used is generally of the simple type shown in figure 1.1, in chapter 1 and in the examples up until now. These are called Feed-Forward Networks (we'll see why in chapter 7 on Hopfield Networks) or occasionally Multi-Layer Perceptrons (MLPs).
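
To make the picture concrete, here is a minimal sketch (not code from the book) of the feed-forward pass for a small network of the kind shown in figure 1.1. The layer sizes, the sigmoid activation function and the omission of bias/threshold terms are illustrative assumptions chosen for brevity.

    # Minimal feed-forward pass for a 2-input, 3-hidden, 1-output network.
    # Illustrative sketch only; layer sizes and the sigmoid activation are
    # assumptions made for this example, not taken from the text.
    import numpy as np

    def sigmoid(x):
        """Logistic activation: 1 / (1 + exp(-x))."""
        return 1.0 / (1.0 + np.exp(-x))

    def forward(inputs, w_hidden, w_output):
        """Propagate an input vector through the hidden layer, then the output layer."""
        hidden = sigmoid(w_hidden @ inputs)    # hidden-layer activations
        output = sigmoid(w_output @ hidden)    # output-layer activations
        return hidden, output

    rng = np.random.default_rng(0)
    w_hidden = rng.uniform(-1.0, 1.0, size=(3, 2))   # input-to-hidden weights
    w_output = rng.uniform(-1.0, 1.0, size=(1, 3))   # hidden-to-output weights

    _, y = forward(np.array([0.5, 0.9]), w_hidden, w_output)
    print(y)   # the (untrained) network's output for this input

The name "feed-forward" simply reflects that signals flow in one direction, from the inputs through the hidden layer to the outputs, with no connections feeding back.
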
The network operates in exactly the same way as the others we've seen (if you need to remind yourself, look at worked example 2.3). Now, let's consider what Back Propagation is and how to use it.

A Back Propagation network learns by example. You give the algorithm examples of what you want the network to do and it changes the network's weights so that, when training is finished, it will give you the required output for a particular input. Back Propagation…
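
To make the learning-by-example idea concrete, here is a minimal continuation of the sketch above (it reuses the forward() helper and the two weight matrices defined there). The learning rate eta, the squared-error measure and the single training example are assumptions made for the illustration; none of them come from the text.

    # One "learn by example" step: show the network an (input, target) pair,
    # measure the output error, and adjust both weight layers to reduce it.
    # Continues the sketch above (reuses forward, w_hidden, w_output).
    def train_step(inputs, target, w_hidden, w_output, eta=0.5):
        hidden, output = forward(inputs, w_hidden, w_output)

        # Output-layer error term: (target - output) times the sigmoid
        # derivative, output * (1 - output).
        delta_out = (target - output) * output * (1.0 - output)

        # Hidden-layer error term: the output error propagated back through
        # the output weights (hence the name "Back Propagation").
        delta_hid = (w_output.T @ delta_out) * hidden * (1.0 - hidden)

        # Weight changes proportional to the error terms (eta is an
        # illustrative learning rate, not a value from the text).
        w_output += eta * np.outer(delta_out, hidden)
        w_hidden += eta * np.outer(delta_hid, inputs)

        return 0.5 * np.sum((target - output) ** 2)   # squared error, for monitoring

    # Repeatedly present one example; the output drifts towards the target.
    example_input = np.array([0.5, 0.9])
    example_target = np.array([0.1])
    for _ in range(2000):
        error = train_step(example_input, example_target, w_hidden, w_output)
    print(error)   # close to zero once training has converged

In a real application the loop would cycle over a whole set of training examples rather than a single pair, but the principle is the same: the weights are adjusted until the network produces the required output for each input.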
