Preface.
Introduction.
Fundamentals.
Network Architectures for Prediction.
Activation Functions Used in Neural Networks.
Recurrent Neural Networks Architectures.
Neural Networks as Nonlinear Adaptive Filters.
Stability Issues in RNN Architectures.
Data-Reusing Adaptive Learning Algorithms.
A Class of Normalised Algorithms for Online Training of Recurrent Neural Networks.
Convergence of Online Learning Algorithms in Neural Networks.
Some Practical Considerations of Predictability and Learning Algorithms for Various Signals.
Exploiting Inherent Relationships Between Parameters in Recurrent Neural Networks.
Appendix A: The O Notation and Vector and Matrix Differentiation.
Appendix B: Concepts from the Approximation Theory.
Appendix C: Complex Sigmoid Activation Functions, Holomorphic Mappings and Modular Groups.
Appendix D: Learning Algorithms for RNNs.
Appendix E: Terminology Used in the Field of Neural Networks.
Appendix F: On the A Posteriori Approach in Science and Engineering.
Appendix G: Contraction Mapping Theorems.
Appendix H: Linear GAS Relaxation.
Appendix I: The Main Notions in Stability Theory.
Appendix J: Deseasonalising Time Series.
References.
Index.
Danilo Mandic, of Imperial College London, London, UK, was named a Fellow of the Institute of Electrical and Electronics Engineers in 2013 for contributions to multivariate and nonlinear learning systems.
Jonathon A. Chambers is co-author of Recurrent Neural Networks for Prediction: Learning Algorithms, Architectures and Stability, published by Wiley.