Title : | Learning Deep Learning : Theory and Practice of Neural Networks, Computer Vision, Natural Language Processing, and Transformers Using TensorFlow | Document type : | printed text | Authors : | EKMAN, Magnus, Author | Edition : | 1st edition | Publisher : | NVIDIA | Publication year : | 2022 | Extent : | 688 p | Illustrations : | ill | Format : | 18.8 x 23 cm | ISBN/ISSN/EAN : | 978-0-13-747035-8 | Price : | 58.40 EUR | General note : | Index | Languages : | English (eng) | Keywords : | Deep Learning | Class number : | 006.31 EKM | Summary : | Learning Deep Learning is a complete guide to DL. Illuminating both the core concepts and the hands-on programming techniques needed to succeed, this text can be used by students with prior programming experience but no prior machine learning or statistics experience.
After introducing the essential building blocks of deep neural networks, such as artificial neurons and fully connected, convolutional, and recurrent layers, Ekman shows how to use them to build advanced architectures, including the Transformer. He describes how these concepts are used to build modern networks for computer vision and natural language processing (NLP), including Mask R-CNN, GPT, and BERT. He also explains how to build a natural language translator and a system that generates natural language descriptions of images.
Throughout, Ekman provides concise, well-annotated code examples using TensorFlow with Keras. Corresponding PyTorch examples are provided online, so the book covers the two dominant Python libraries for DL used in industry and academia. He concludes with an introduction to neural architecture search (NAS), exploring important ethical issues and providing resources for further learning.
Explore and master core concepts: perceptrons, gradient-based learning, sigmoid neurons, and backpropagation
See how DL frameworks make it easier to develop more complicated and useful neural networks
Discover how convolutional neural networks (CNNs) revolutionize image classification and analysis
Apply recurrent neural networks (RNNs) and long short-term memory (LSTM) to text and other variable-length sequences
Master NLP with sequence-to-sequence networks and the Transformer architecture
Build applications for natural language translation and image captioning |
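The first concept in the list above, the perceptron with its gradient-style weight-update rule, can be sketched in a few lines of plain Python. This is an illustrative sketch of the classic perceptron learning rule, not code from the book, and the function names are my own:

```python
# Minimal perceptron trained on the AND function (illustrative sketch;
# plain Python, no TensorFlow required).

def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]   # weights, one per input
    b = 0.0          # bias term
    for _ in range(epochs):
        for x, target in samples:
            # Step activation: output 1 if the weighted sum exceeds 0
            y = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            # Perceptron learning rule: nudge weights toward the target
            err = target - y
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
print([predict(w, b, x) for x, _ in and_data])  # [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop reaches a perfect separator; the later chapters' networks replace the step activation with differentiable ones (e.g., sigmoid) so that backpropagation can compute gradients through multiple layers.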