Backpropagation and it's Modifications
Details
Gradient-based methods are among the most widely used error-minimization techniques for training backpropagation networks. The backpropagation (BP) training algorithm is a supervised learning method for multi-layered feedforward neural networks. It is essentially a gradient-descent local optimization technique that corrects the network weights by propagating the output error backwards. Its limitations include slow convergence, the risk of getting trapped in local minima, and limited performance. To address these limitations, various modifications are used, such as introducing momentum and bias terms or switching to conjugate gradient methods. In the conjugate gradient algorithms, a search is performed along conjugate directions, which generally produces faster convergence than steepest-descent directions. In this monograph, we consider the parity-bit checking problem with the conventional backpropagation method as well as other methods. A suitable neural network has to be constructed and trained properly: the training dataset is used to train the classification engine, and the trained network is then tested for validation.
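The mechanics described above (forward pass, backward error correction, and a momentum term) can be sketched as follows. This is an illustrative NumPy implementation of backpropagation with momentum on the 2-bit parity (XOR) problem, not code from the monograph; the network size, learning rate, and momentum coefficient are assumptions chosen for the example.

```python
import numpy as np

# Hypothetical sketch: a 2-bit parity (XOR) classifier trained by plain
# backpropagation with a momentum term. Hyperparameters are assumptions.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = (X.sum(axis=1) % 2).reshape(-1, 1)   # parity target: odd count of 1s -> 1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 sigmoid units.
W1 = rng.normal(scale=1.0, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=1.0, size=(4, 1)); b2 = np.zeros(1)

lr, beta = 0.5, 0.9                       # step size and momentum coefficient
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)

initial_loss = None
for epoch in range(5000):
    # Forward pass through hidden and output layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    loss = 0.5 * np.mean((out - y) ** 2)
    if initial_loss is None:
        initial_loss = loss

    # Backward pass: the output error is propagated back through the
    # sigmoid derivatives (backward error correction of the weights).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Momentum update: the velocity accumulates a decaying sum of past
    # gradients, which damps oscillation and speeds up convergence.
    vW2 = beta * vW2 - lr * (h.T @ d_out); W2 += vW2
    vb2 = beta * vb2 - lr * d_out.sum(axis=0); b2 += vb2
    vW1 = beta * vW1 - lr * (X.T @ d_h); W1 += vW1
    vb1 = beta * vb1 - lr * d_h.sum(axis=0); b1 += vb1

final_loss = loss
print(f"loss: {initial_loss:.4f} -> {final_loss:.4f}")
```

With `beta = 0` this reduces to plain gradient descent; the momentum term is one of the modifications the text mentions for avoiding shallow local minima, and the same structure extends directly to wider parity problems (more input bits and hidden units).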
About the Author
Ms Richa Kathuria is in the final semester of her MTech (Computer Science) at Birla Institute of Technology, Ranchi. This work is part of the research done for the completion of her thesis.
Further Information
- General Information
- GTIN 09783847379355
- Language English
- Edition Aufl.
- Size H 4mm x W 220mm x D 150mm
- Year 2012
- EAN 9783847379355
- Format Paperback (Kt)
- ISBN 978-3-8473-7935-5
- Title Backpropagation and it's Modifications
- Author Richa Kathuria Karthikeyan
- Subtitle With Bit-Parity Example
- Weight 113g
- Publisher LAP Lambert Academic Publishing
- Number of Pages 72
- Genre Computer Science