Introduction to Transformers for NLP
Details
Get a hands-on introduction to Transformer architecture using the Hugging Face library. This book explains how Transformers are changing the AI domain, particularly in the area of natural language processing.
This book covers Transformer architecture and its relevance in natural language processing (NLP). It starts with an introduction to NLP and the progression of language models from n-grams to a Transformer-based architecture. Next, it offers some basic Transformer examples using Google Colab. Then it introduces the Hugging Face ecosystem and the different libraries and models it provides. Moving forward, it explains language models such as Google BERT with examples before providing a deep dive into the Hugging Face API, using different language models to address tasks such as sentence classification, sentiment analysis, summarization, and text generation.
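As a taste of that high-level API, here is a minimal sketch of sentiment analysis with the Hugging Face pipeline; the example sentence, the unspecified default model, and the printed output are illustrative assumptions rather than excerpts from the book.

```python
# Minimal sketch: sentiment analysis via the Hugging Face pipeline API.
# No model is pinned, so the library loads its default sentiment checkpoint;
# the input sentence and printed output are illustrative assumptions.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Transformers make modern NLP surprisingly approachable.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99}]
```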
After completing Introduction to Transformers for NLP, you will understand Transformer concepts and be able to solve problems using the Hugging Face library.
What You Will Learn
- Understand language models and their importance in NLP and NLU (Natural Language Understanding)
- Master Transformer architecture through practical examples
- Use the Hugging Face library in Transformer-based language models
- Create a simple code generator in Python based on Transformer architecture (see the generation sketch after this list)
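The code-generator idea can be previewed with a short sketch: GPT-2 is used here purely as an illustration of Transformer-based text generation (it is a general-purpose language model, not a dedicated code model), and the prompt and generation settings are assumptions for demonstration.

```python
# Toy sketch: Transformer-based text generation used as a naive "code
# generator". GPT-2 continues a Python function signature; since GPT-2 is a
# general language model, the generated code is illustrative only.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
prompt = "def fibonacci(n):\n    "
outputs = generator(prompt, max_new_tokens=40, num_return_sequences=1)
print(outputs[0]["generated_text"])
```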
Who This Book Is For
Data scientists and software developers interested in developing their skills in NLP and NLU (Natural Language Understanding).
- Explains how to create Hugging Face applications for NLP tasks like sentiment analysis and sentence masking (a masking sketch follows this list)
- Covers code generator examples using Transformers
- Explains language models such as Google BERT, OpenAI GPT-2, and other open-source models with examples
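To illustrate the sentence-masking task mentioned above, the following is a minimal sketch using a BERT checkpoint with the fill-mask pipeline; the model name bert-base-uncased and the example sentence are assumptions, not code from the book.

```python
# Minimal sketch: masked-word prediction (sentence masking) with BERT.
# "bert-base-uncased" and the example sentence are illustrative choices.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")
predictions = unmasker(
    "Transformers have changed the field of [MASK] language processing."
)
for p in predictions[:3]:
    # Each prediction carries the filled-in token and a confidence score.
    print(p["token_str"], round(p["score"], 3))
```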
About the Author
Shashank Mohan Jain has been working in the IT industry for around 20 years, mainly in the areas of cloud computing, machine learning, and distributed systems. He has keen interests in virtualization techniques, security, and complex systems. Shashank holds software patents in the areas of cloud computing, IoT, and machine learning. He is a speaker at multiple reputed cloud conferences and holds Sun, Microsoft, and Linux kernel certifications.
Contents
Chapter 1: Introduction to Language Models.- Chapter 2: Introduction to Transformers.- Chapter 3: BERT.- Chapter 4: Hugging Face.- Chapter 5: Tasks Using the Huggingface Library.- Chapter 6: Fine-Tuning Pre-Trained Models.- Appendix A: Vision Transformers.
Further Information
- General Information
- GTIN 09781484288436
- Genre Information Technology
- Edition First Edition
- Reading Motivation Understanding
- Number of Pages 180
- Dimensions H235mm x W155mm x D11mm
- Year 2022
- EAN 9781484288436
- Format Paperback
- ISBN 1484288432
- Publication Date 21.10.2022
- Title Introduction to Transformers for NLP
- Author Shashank Mohan Jain
- Subtitle With the Hugging Face Library and Models to Solve Problems
- Weight 283g
- Publisher Apress
- Language English