Title: Transformers for Natural Language Processing : build innovative deep neural network architectures for NLP with Python, PyTorch, TensorFlow, BERT, RoBERTa, and more
Document type: printed text
Authors: Rothman, Denis, Author
Edition: First published
Publisher: Packt Publishing
Publication year: 2021
Extent: xvi, 360 p.
Illustrations: ill.
Format: 19 x 23.5 cm
ISBN/ISSN/EAN: 978-1-8005-6579-1
Price: 93,50 €
General note: Index
Languages: English (eng)
Keywords: Language Processing
Call number: 006.1 ROT
Abstract: The transformer architecture has proved revolutionary, outperforming the classical RNN and CNN models in use today. With an apply-as-you-learn approach, Transformers for Natural Language Processing investigates in detail deep learning for machine translation, speech-to-text, text-to-speech, language modeling, question answering, and many more NLP domains with transformers.
Contents:
Getting Started with the Model Architecture of the Transformer
Fine-Tuning BERT Models
Pretraining a RoBERTa Model from Scratch
Downstream NLP Tasks with Transformers
Machine Translation with the Transformer
Text Generation with OpenAI GPT-2 and GPT-3 Models
Applying Transformers to Legal and Financial Documents for AI Text Summarization
Matching Tokenizers and Datasets
Semantic Role Labeling with BERT-Based Transformers
Let Your Data Do the Talking: Story, Questions, and Answers
Detecting Customer Emotions to Make Predictions
Analyzing Fake News with Transformers
Appendix: Answers to the Questions