
SPIE Press Book

Artificial Neural Networks: An Introduction

Book Description

This tutorial text provides the reader with an understanding of artificial neural networks (ANNs) and their application, beginning with the biological systems that inspired them, continuing through the learning methods and data-collection processes that have been developed, and concluding with the many ways ANNs are used today.

The material is presented with a minimum of math (although the mathematical details are included in the appendices for interested readers), and with a maximum of hands-on experience. All specialized terms are included in a glossary. The result is a highly readable text that will teach the engineer the guiding principles necessary to use and apply artificial neural networks.


Book Details

Date Published: 30 August 2005
Pages: 180
ISBN: 9780819459879
Volume: TT68

Table of Contents
Preface ix
Acknowledgements xi
Chapter 1 Introduction 1
1.1. The Neuron 1
1.2. Modeling Neurons 2
1.3. The Feedforward Neural Network 8
1.3.1. The Credit-assignment Problem 9
1.3.2. Complexity 10
1.4. Historical Perspective on Computing with Artificial Neurons 11
Chapter 2 Learning Methods 13
2.1. Supervised Training Methods 13
2.2. Unsupervised Training Methods 13
Chapter 3 Data Normalization 15
3.1. Statistical or Z-Score Normalization 15
3.2. Min-Max Normalization 16
3.3. Sigmoidal or SoftMax Normalization 16
3.4. Energy Normalization 17
3.5. Principal Components Normalization 17
Chapter 4 Data Collection, Preparation, Labeling, and Input Coding 21
4.1. Data Collection 21
4.1.1. Data-collection Plan 21
4.1.2. Biased Data Set 23
4.1.3. Amount of Data 24
4.1.4. Features/Measurements 24
4.1.5. Data Labeling 25
4.2. Feature Selection and Extraction 25
4.2.1. The Curse of Dimensionality 26
4.2.2. Feature Reduction/Dimensionality Reduction 26
4.2.3. Feature Distance Metrics 28
Chapter 5 Output Coding 31
5.1. Classifier Coding 31
5.2. Estimator Coding 31
Chapter 6 Post-processing 33
Chapter 7 Supervised Training Methods 35
7.1. The Effects of Training Data on Neural-network Performance 36
7.1.1. Comparative Analysis 37
7.2. Rules of Thumb for Training Neural Networks 42
7.2.1. Foley's Rule 42
7.2.2. Cover's Rule 42
7.2.3. VC Dimension 43
7.2.4. The Number of Hidden Layers 43
7.2.5. Number of Hidden Neurons 43
7.2.6. Transfer Functions 43
7.3. Training and Testing 44
7.3.1. Split-sample Testing 44
7.3.2. Use of Validation Error 46
7.3.3. Use of Validation Error to Select Number of Hidden Neurons 47
Chapter 8 Unsupervised Training Methods 49
8.1. Self-organizing Maps (SOMs) 49
8.1.1. SOM Training 51
8.1.2. An Example Problem Solution Using the SOM 53
8.2. Adaptive Resonance Theory Network 56
Chapter 9 Recurrent Neural Networks 61
9.1. Hopfield Neural Networks 61
9.2. The Bidirectional Associative Memory (BAM) 63
9.3. The Generalized Linear Neural Network 66
9.3.1. GLNN Example 67
9.4. Real-time Recurrent Network 68
9.5. Elman Recurrent Network 68
Chapter 10 A Plethora of Applications 71
10.1. Function Approximation 71
10.2. Function Approximation--Boston Housing Example 74
10.3. Function Approximation--Cardiopulmonary Modeling 75
10.4. Pattern Recognition--Tree-classifier Example 80
10.5. Pattern Recognition--Handwritten Number-recognition Example 84
10.6. Pattern Recognition--Electronic-nose Example 88
10.7. Pattern Recognition--Airport-scanner Texture-recognition Example 91
10.8. Self-organization--Serial-killer Data-mining Example 94
10.9. Pulse-coupled Neural Networks--Image-segmentation Example 96
Chapter 11 Dealing with Limited Amounts of Data 101
11.1. k-fold Cross-validation 101
11.2. Leave-one-out Cross-validation 102
11.3. Jackknife Resampling 102
11.4. Bootstrap Resampling 103
Appendix A. The Feedforward Neural Network 107
A.1. Mathematics of the Feedforward Process 107
A.2. The Backpropagation Algorithm 109
A.2.1. Generalized Delta Rule 110
A.2.2. Backpropagation Process 113
A.2.3. Advantages and Disadvantages of Backpropagation 116
A.3. Alternatives to Backpropagation 116
A.3.1. Conjugate Gradient Descent 117
A.3.2. Cascade Correlation 117
A.3.3. Second-order Gradient Techniques 118
A.3.4. Evolutionary Computation 122
Appendix B. Feature Saliency 125
Appendix C. Matlab Code for Various Neural Networks 131
C.1. Matlab Code for Principal-components Normalization 131
C.2. Hopfield Network 132
C.3. Generalized Neural Network 133
C.4. Generalized Neural-network Example 134
C.5. ART-like Network 135
C.6. Simple Perceptron Algorithm 137
C.7. Kohonen Self-organizing Feature Map 138
Appendix D. Glossary of Terms 143
References 151
Index 163

Preface

This text introduces the reader to the fascinating world of artificial neural networks and serves as a guide for that journey. The authors have written this book for the reader who wants to understand artificial neural networks without being bogged down in the mathematics. A glossary is included to assist the reader with any unfamiliar terms. For those who want the math, sufficient detail for most of the common neural network algorithms is included in the appendices.

The concept of data-driven computing is the overriding principle upon which neural networks have been built. Many problems exist for which data are plentiful, yet there is no underlying knowledge of the process that converts the measured inputs into the observed outputs. Artificial neural networks are well suited to this class of problem because they are excellent data mappers, learning the relationship between inputs and outputs directly from examples. This text illustrates how this is done with examples and relevant snippets of theory.
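
As a brief illustration of this kind of input-output mapping, the short MATLAB sketch below (not taken from the book; the hidden-layer size, learning rate, and iteration count are illustrative assumptions) trains a one-hidden-layer feedforward network by backpropagation to learn the XOR mapping:

X = [0 0 1 1; 0 1 0 1];              % 2 inputs x 4 training samples
T = [0 1 1 0];                       % target output for each sample
n = size(X, 2);                      % number of samples
nH = 3; eta = 0.5;                   % hidden-layer size and learning rate (assumed values)
W1 = rand(nH, 2) - 0.5; b1 = rand(nH, 1) - 0.5;   % input-to-hidden weights and biases
W2 = rand(1, nH) - 0.5; b2 = rand(1, 1) - 0.5;    % hidden-to-output weights and biases
sig = @(a) 1 ./ (1 + exp(-a));       % logistic (sigmoid) transfer function
for it = 1:20000
    H = sig(W1 * X + b1 * ones(1, n));      % hidden-layer activations
    Y = sig(W2 * H + b2 * ones(1, n));      % network outputs
    dY = (Y - T) .* Y .* (1 - Y);           % output-layer delta (generalized delta rule)
    dH = (W2' * dY) .* H .* (1 - H);        % delta backpropagated to the hidden layer
    W2 = W2 - eta * dY * H';  b2 = b2 - eta * sum(dY, 2);   % gradient-descent updates
    W1 = W1 - eta * dH * X';  b1 = b1 - eta * sum(dH, 2);
end
disp(round(Y))                       % should display the learned mapping 0 1 1 0

Because the weights are initialized randomly, an occasional run may settle in a local minimum; re-running the script normally recovers the 0 1 1 0 mapping.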

The authors have enjoyed writing the text and welcome readers to dig further and learn how artificial neural networks are changing the world around them.

