Locally connected recurrent neural networks.

Language: English
Year of publication: 1993
Subject:
Document type: Bibliography
Description: by Evan Fung-yu Young.
Thesis (M.Phil.)--Chinese University of Hong Kong, 1993.
Includes bibliographical references (leaves 161-166).
List of Figures --- p.vi
List of Tables --- p.vii
List of Graphs --- p.viii
p.ix
Chapter Part I --- Learning Algorithms
Chapter 1 --- Representing Time in Connectionist Models --- p.1
Chapter 1.1 --- Introduction --- p.1
Chapter 1.2 --- Temporal Sequences --- p.2
Chapter 1.2.1 --- Recognition Tasks --- p.2
Chapter 1.2.2 --- Reproduction Tasks --- p.3
Chapter 1.2.3 --- Generation Tasks --- p.4
Chapter 1.3 --- Discrete Time vs. Continuous Time --- p.4
Chapter 1.4 --- Time Delay Neural Network (TDNN) --- p.4
Chapter 1.4.1 --- Delay Elements in the Connections --- p.5
Chapter 1.4.2 --- NETtalk: An Application of TDNN --- p.7
Chapter 1.4.3 --- Drawbacks of TDNN --- p.8
Chapter 1.5 --- Networks with Context Units --- p.8
Chapter 1.5.1 --- Jordan's Network --- p.9
Chapter 1.5.2 --- Elman's Network --- p.10
Chapter 1.5.3 --- Other Architectures --- p.14
Chapter 1.5.4 --- Drawbacks of Using Context Units --- p.15
Chapter 1.6 --- Recurrent Neural Networks --- p.16
Chapter 1.6.1 --- Hopfield Models --- p.17
Chapter 1.6.2 --- Fully Recurrent Neural Networks --- p.20
Chapter A. --- EXAMPLES OF USING RECURRENT NETWORKS --- p.22
Chapter 1.7 --- Our Objective --- p.25
Chapter 2 --- Learning Algorithms for Recurrent Neural Networks --- p.27
Chapter 2.1 --- Introduction --- p.27
Chapter 2.2 --- Gradient Descent Methods --- p.29
Chapter 2.2.1 --- Backpropagation Through Time (BPTT) --- p.29
Chapter 2.2.2 --- Real Time Recurrent Learning Rule (RTRL) --- p.30
Chapter A. --- RTRL WITH TEACHER FORCING --- p.32
Chapter B. --- TERMINAL TEACHER FORCING --- p.33
Chapter C. --- CONTINUOUS TIME RTRL --- p.33
Chapter 2.2.3 --- Variants of RTRL --- p.34
Chapter A. --- SUBGROUPED RTRL --- p.34
Chapter B. --- A FIXED SIZE STORAGE O(n³) TIME COMPLEXITY LEARNING RULE --- p.35
Chapter 2.3 --- Non-Gradient Descent Methods --- p.37
Chapter 2.3.1 --- Neural Bucket Brigade (NBB) --- p.37
Chapter 2.3.2 --- Temporal Driven Method (TD) --- p.38
Chapter 2.4 --- Comparison between Different Approaches --- p.39
Chapter 2.5 --- Conclusion --- p.41
Chapter 3 --- Locally Connected Recurrent Networks --- p.43
Chapter 3.1 --- Introduction --- p.43
Chapter 3.2 --- Locally Connected Recurrent Networks --- p.44
Chapter 3.2.1 --- Network Topology --- p.44
Chapter 3.2.2 --- Subgrouping --- p.46
Chapter 3.2.3 --- Learning Algorithm --- p.47
Chapter 3.2.4 --- Continuous Time Learning Algorithm --- p.50
Chapter 3.3 --- Analysis --- p.51
Chapter 3.3.1 --- Time Complexity --- p.51
Chapter 3.3.2 --- Space Complexity --- p.51
Chapter 3.3.3 --- Local Computations in Time and Space --- p.51
Chapter 3.4 --- Running on Parallel Architectures --- p.52
Chapter 3.4.1 --- Mapping the Algorithm to Parallel Architectures --- p.52
Chapter 3.4.2 --- Parallel Learning Algorithm --- p.53
Chapter 3.4.3 --- Analysis --- p.54
Chapter 3.5 --- Ring-Structured Recurrent Network (RRN) --- p.55
Chapter 3.6 --- Comparison between RRN and RTRL in Sequence Recognition --- p.55
Chapter 3.6.1 --- Training Sets and Testing Sequences --- p.56
Chapter 3.6.2 --- Comparison in Training Speed --- p.58
Chapter 3.6.3 --- Comparison in Recalling Power --- p.59
Chapter 3.7 --- Comparison between RRN and RTRL in Time Series Prediction --- p.59
Chapter 3.7.1 --- Comparison in Training Speed --- p.62
Chapter 3.7.2 --- Comparison in Predictive Power --- p.63
Chapter 3.8 --- Conclusion --- p.65
Chapter Part II --- Applications
Chapter 4 --- Sequence Recognition by Ring-Structured Recurrent Networks --- p.67
Chapter 4.1 --- Introduction --- p.67
Chapter 4.2 --- Related Works --- p.68
Chapter 4.2.1 --- Feedback Multilayer Perceptron (FMLP) --- p.68
Chapter 4.2.2 --- Back Propagation Unfolded Recurrent Rule (BURR) --- p.69
Chapter 4.3 --- Experimental Details --- p.71
Chapter 4.3.1 --- Network Architecture --- p.71
Chapter 4.3.2 --- Input/Output Representations --- p.72
Chapter 4.3.3 --- Training Phase --- p.73
Chapter 4.3.4 --- Recalling Phase --- p.73
Chapter 4.4 --- Experimental Results --- p.74
Chapter 4.4.1 --- Temporal Memorizing Power --- p.74
Chapter 4.4.2 --- Time Warping Performance --- p.80
Chapter 4.4.3 --- Fault Tolerance --- p.85
Chapter 4.4.4 --- Learning Rate --- p.87
Chapter 4.5 --- Time Delay --- p.88
Chapter 4.6 --- Conclusion --- p.91
Chapter 5 --- Time Series Prediction --- p.92
Chapter 5.1 --- Introduction --- p.92
Chapter 5.2 --- Modelling in Feedforward Networks --- p.93
Chapter 5.3 --- Methodology with Recurrent Networks --- p.94
Chapter 5.3.1 --- Network Structure --- p.94
Chapter 5.3.2 --- Model Building - Training --- p.95
Chapter 5.3.3 --- Model Diagnosis - Testing --- p.95
Chapter 5.4 --- Training Paradigms --- p.96
Chapter 5.4.1 --- A Quasiperiodic Series with White Noise --- p.96
Chapter 5.4.2 --- A Chaotic Series --- p.97
Chapter 5.4.3 --- Sunspot Numbers --- p.98
Chapter 5.4.4 --- Hang Seng Index --- p.99
Chapter 5.5 --- Experimental Results and Discussions --- p.99
Chapter 5.5.1 --- A Quasiperiodic Series with White Noise --- p.101
Chapter 5.5.2 --- Logistic Map --- p.103
Chapter 5.5.3 --- Sunspot Numbers --- p.105
Chapter 5.5.4 --- Hang Seng Index --- p.109
Chapter 5.6 --- Conclusion --- p.112
Chapter 6 --- Chaos in Recurrent Networks --- p.114
Chapter 6.1 --- Introduction --- p.114
Chapter 6.2 --- Important Features of Chaos --- p.115
Chapter 6.2.1 --- First Return Map --- p.115
Chapter 6.2.2 --- Long Term Unpredictability --- p.117
Chapter 6.2.3 --- Sensitivity to Initial Conditions (SIC) --- p.118
Chapter 6.2.4 --- Strange Attractor --- p.119
Chapter 6.3 --- Chaotic Behaviour in Recurrent Networks --- p.120
Chapter 6.3.1 --- Network Structure --- p.121
Chapter 6.3.2 --- Dynamics in Training --- p.121
Chapter 6.3.3 --- Dynamics in Testing --- p.122
Chapter 6.4 --- Experiments and Discussions --- p.123
Chapter 6.4.1 --- Henon Model --- p.123
Chapter 6.4.2 --- Lorenz Model --- p.127
Chapter 6.5 --- Conclusion --- p.134
Chapter 7 --- Conclusion --- p.135
Appendix A --- Series 1: Sine Function with White Noise --- p.137
Appendix B --- Series 2: Logistic Map --- p.138
Appendix C --- Series 3: Sunspot Numbers from 1700 to 1979 --- p.139
Appendix D --- A Quasiperiodic Series with White Noise --- p.141
Appendix E --- Hang Seng Daily Closing Index in 1991 --- p.142
Appendix F --- Network Model for the Quasiperiodic Series with White Noise --- p.143
Appendix G --- Network Model for the Logistic Map --- p.144
Appendix H --- Network Model for the Sunspot Numbers --- p.145
Appendix I --- Network Model for the Hang Seng Index --- p.146
Appendix J --- Henon Model --- p.147
Appendix K --- Network Model for the Henon Map --- p.150
Appendix L --- Lorenz Model --- p.151
Appendix M --- Network Model for the Lorenz Map --- p.159
Bibliography --- p.161
Database: Networked Digital Library of Theses & Dissertations