Supervised Sequence Labelling with Recurrent Neural Networks

CHF 242.20
In stock
SKU
67DGU63KMBS
Stock: 1 available
Delivery between Wed, 28.01.2026 and Thu, 29.01.2026

Details

This book offers a complete framework for classifying and transcribing sequential data with recurrent neural networks. It uses state-of-the-art results in speech and handwriting recognition to show the framework in action.

Supervised sequence labelling is a vital area of machine learning, encompassing tasks such as speech, handwriting and gesture recognition, protein secondary structure prediction and part-of-speech tagging. Recurrent neural networks are powerful sequence learning tools (robust to input noise and distortion, and able to exploit long-range contextual information) that would seem ideally suited to such problems. However, their role in large-scale sequence labelling systems has so far been auxiliary.

The goal of this book is a complete framework for classifying and transcribing sequential data with recurrent neural networks alone. Three main innovations are introduced in order to realise this goal. Firstly, the connectionist temporal classification output layer allows the framework to be trained with unsegmented target sequences, such as phoneme-level speech transcriptions; this is in contrast to previous connectionist approaches, which were dependent on error-prone prior segmentation. Secondly, multidimensional recurrent neural networks extend the framework in a natural way to data with more than one spatio-temporal dimension, such as images and videos. Thirdly, the use of hierarchical subsampling makes it feasible to apply the framework to very large or high-resolution sequences, such as raw audio or video.
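
As a concrete illustration of the first of these innovations, the sketch below trains a small bidirectional LSTM with a connectionist temporal classification output layer on unsegmented targets, using PyTorch's nn.CTCLoss. The network size, feature dimension and random data are illustrative assumptions, not material from the book.

    # Minimal CTC training sketch (assumed setup; model and tensor shapes are illustrative).
    import torch
    import torch.nn as nn

    T, N, C = 50, 4, 20   # input time steps, batch size, label classes incl. blank
    S = 10                # maximum target sequence length

    # A small bidirectional LSTM followed by a per-frame classifier.
    lstm = nn.LSTM(input_size=13, hidden_size=64, bidirectional=True)
    classifier = nn.Linear(2 * 64, C)
    ctc = nn.CTCLoss(blank=0)          # label 0 is reserved for the CTC blank

    x = torch.randn(T, N, 13)                        # e.g. acoustic feature frames
    targets = torch.randint(1, C, (N, S))            # unsegmented label sequences
    input_lengths = torch.full((N,), T, dtype=torch.long)
    target_lengths = torch.randint(5, S + 1, (N,), dtype=torch.long)

    h, _ = lstm(x)
    log_probs = classifier(h).log_softmax(dim=-1)    # (T, N, C) log-probabilities
    loss = ctc(log_probs, targets, input_lengths, target_lengths)
    loss.backward()                                  # gradients for end-to-end training

Note that the loss only needs the overall label sequence for each example, not a frame-level segmentation, which is what distinguishes CTC from the segmentation-dependent connectionist approaches mentioned above.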

Experimental validation is provided by state-of-the-art results in speech and handwriting recognition.


  • Recent research in Supervised Sequence Labelling with Recurrent Neural Networks
  • New results in a hot topic
  • Written by leading experts

Contents
  • Introduction
  • Supervised Sequence Labelling
  • Neural Networks
  • Long Short-Term Memory
  • A Comparison of Network Architectures
  • Hidden Markov Model Hybrids
  • Connectionist Temporal Classification
  • Multidimensional Networks
  • Hierarchical Subsampling Networks

Further information

  • General information
    • GTIN 09783642247965
    • Edition 2012
    • Language English
    • Genre General & Reference
    • Reading motive Understanding
    • Size H241mm x W160mm x D13mm
    • Year 2012
    • EAN 9783642247965
    • Format Hardcover
    • ISBN 3642247962
    • Publication date 09.02.2012
    • Title Supervised Sequence Labelling with Recurrent Neural Networks
    • Author Alex Graves
    • Subtitle Studies in Computational Intelligence 385
    • Weight 412g
    • Publisher Springer Berlin Heidelberg
    • Number of pages 160
