Deep Neural Networks in a Mathematical Framework
Details
This SpringerBrief describes how to build a rigorous end-to-end mathematical framework for deep neural networks. The authors provide tools to represent and describe neural networks, casting previous results in the field in a more natural light. In particular, they derive gradient descent algorithms in a unified way for several neural network structures, including multilayer perceptrons, convolutional neural networks, deep autoencoders and recurrent neural networks. Furthermore, the framework the authors develop is both more concise and mathematically intuitive than previous representations of neural networks.
This SpringerBrief is one step towards unlocking the black box of Deep Learning. The authors believe that this framework will help catalyze further discoveries regarding the mathematical properties of neural networks. This SpringerBrief is accessible not only to researchers, professionals and students working and studying in the field of deep learning, but also to those outside of the neural network community.
Further Information
- General Information
- GTIN 09783319753034
- Genre Information Technology
- Edition 1st ed. 2018
- Reading motive Understanding
- Number of pages 84
- Dimensions H 234 mm x W 154 mm x D 6 mm
- Year 2018
- EAN 9783319753034
- Format Softcover
- ISBN 978-3-319-75303-4
- Title Deep Neural Networks in a Mathematical Framework
- Author Anthony L. Caterini, Dong Eui Chang
- Subtitle SpringerBriefs in Computer Science
- Weight 184 g
- Publisher Springer-Verlag GmbH
- Language English