Detail view

Neural Networks

A Systematic Introduction
ISBN/EAN: 9783540605058
Umbreit no.: 4368616

Language: English
Extent: xx, 502 pages, 154 b/w illustrations
Binding: paperback

Published on 12 July 1996
Edition: 1/1996
€96.29
(incl. VAT)
Deliverable within 1-2 weeks
  • Additional text
    • Table of contents
      1. The Biological Paradigm
        1.1 Neural computation
          1.1.1 Natural and artificial neural networks
          1.1.2 Models of computation
          1.1.3 Elements of a computing model
        1.2 Networks of neurons
          1.2.1 Structure of the neurons
          1.2.2 Transmission of information
          1.2.3 Information processing at the neurons and synapses
          1.2.4 Storage of information - learning
          1.2.5 The neuron - a self-organizing system
        1.3 Artificial neural networks
          1.3.1 Networks of primitive functions
          1.3.2 Approximation of functions
          1.3.3 Caveat
        1.4 Historical and bibliographical remarks
      2. Threshold Logic
        2.1 Networks of functions
          2.1.1 Feed-forward and recurrent networks
          2.1.2 The computing units
        2.2 Synthesis of Boolean functions
          2.2.1 Conjunction, disjunction, negation
          2.2.2 Geometric interpretation
          2.2.3 Constructive synthesis
        2.3 Equivalent networks
          2.3.1 Weighted and unweighted networks
          2.3.2 Absolute and relative inhibition
          2.3.3 Binary signals and pulse coding
        2.4 Recurrent networks
          2.4.1 Stored state networks
          2.4.2 Finite automata
          2.4.3 Finite automata and recurrent networks
          2.4.4 A first classification of neural networks
        2.5 Harmonic analysis of logical functions
          2.5.1 General expression
          2.5.2 The Hadamard-Walsh transform
          2.5.3 Applications of threshold logic
        2.6 Historical and bibliographical remarks
      3. Weighted Networks - The Perceptron
        3.1 Perceptrons and parallel processing
          3.1.1 Perceptrons as weighted threshold elements
          3.1.2 Computational limits of the perceptron model
        3.2 Implementation of logical functions
          3.2.1 Geometric interpretation
          3.2.2 The XOR problem
        3.3 Linearly separable functions
          3.3.1 Linear separability
          3.3.2 Duality of input space and weight space
          3.3.3 The error function in weight space
          3.3.4 General decision curves
        3.4 Applications and biological analogy
          3.4.1 Edge detection with perceptrons
          3.4.2 The structure of the retina
          3.4.3 Pyramidal networks and the neocognitron
          3.4.4 The silicon retina
        3.5 Historical and bibliographical remarks
      4. Perceptron Learning
        4.1 Learning algorithms for neural networks
          4.1.1 Classes of learning algorithms
          4.1.2 Vector notation
          4.1.3 Absolute linear separability
          4.1.4 The error surface and the search method
        4.2 Algorithmic learning
          4.2.1 Geometric visualization
          4.2.2 Convergence of the algorithm
          4.2.3 Accelerating convergence
          4.2.4 The pocket algorithm
          4.2.5 Complexity of perceptron learning
        4.3 Linear programming
          4.3.1 Inner points of polytopes
          4.3.2 Linear separability as linear optimization
          4.3.3 Karmarkar's algorithm
        4.4 Historical and bibliographical remarks
      5. Unsupervised Learning and Clustering Algorithms
        5.1 Competitive learning
          5.1.1 Generalization of the perceptron problem
          5.1.2 Unsupervised learning through competition
        5.2 Convergence analysis
          5.2.1 The one-dimensional case - energy function
          5.2.2 Multidimensional case - the classical methods
          5.2.3 Unsupervised learning as minimization problem
          5.2.4 Stability of the solutions
        5.3 Principal component analysis
          5.3.1 Unsupervised reinforcement learning
          5.3.2 Convergence of the learning algorithm
          5.3.3 Multiple principal components
        5.4 Some applications
          5.4.1 Pattern recognition
          5.4.2 Image compression
        5.5 Historical and bibliographical remarks
      6. One and Two Layered Networks
        6.1 Structure and geometric visualization
          6.1.1 Network architecture
          6.1.2 The XOR problem revisited
          6.1.3 Geometric visualization
        6.2 Counting regions in input and weight space
          6.2.1 Weight space regions for the XOR problem
          6.2.2 Bipolar vectors
          6.2.3 Projection of the solution regions
          6.2.4 Geometric interpretation
        6.3 Regions for two layered networks
          6.3.1 Regions in weight space for the XOR problem
          6.3.2 Number of regions in general
          6.3.3 Consequences
          6.3.4 The Vapnik-Chervonenkis dimension
          6.3.5 The problem of local minima
        6.4 Historical and bibliographical remarks
  • Short description
    • Artificial neural networks are an alternative computational paradigm, rooted in neurobiology, that has attracted increasing interest in recent years. This book is a comprehensive introduction to the topic that stresses the systematic development of the underlying theory. Starting from simple threshold elements, more advanced topics are introduced, such as multilayer networks, efficient learning methods, recurrent networks, and self-organization. The various branches of neural network theory are closely, and often unexpectedly, interrelated, so the chapters trace the underlying connections between neural models and offer a unified view of the current state of research in the field. The book has been written for anyone interested in understanding artificial neural networks or in learning more about them. The only mathematical tools needed are those learned during the first two years at university. The text contains more than 300 figures to stimulate the reader's intuition and to illustrate the kinds of computation performed by neural networks. Material from the book has been used successfully in courses in Germany, Austria, and the United States.