  • 1
    ISBN: 9781837633784
    Language: English
    Pages: 1 online resource (462 pages), illustrations
    Edition: Second edition.
    DDC: 006.3/5
    Keywords: Natural language processing (Computer science); Traitement automatique des langues naturelles
    Abstract: Explore transformer-based language models from BERT to GPT, delving into NLP and computer vision tasks, while tackling challenges effectively.
      Key Features
      • Understand the complexity of deep learning architecture and transformer architecture
      • Create solutions to industrial natural language processing (NLP) and computer vision (CV) problems
      • Explore challenges in the preparation process, such as problem- and language-specific dataset transformation
      • Purchase of the print or Kindle book includes a free PDF eBook
      Book Description
      Transformer-based language models such as BERT, T5, GPT, DALL-E, and ChatGPT have dominated NLP studies and become a new paradigm. Thanks to their accurate and fast fine-tuning capabilities, transformer-based language models have been able to outperform traditional machine learning-based approaches on many challenging natural language understanding (NLU) problems. Aside from NLP, a fast-growing area of multimodal learning and generative AI has recently emerged, showing promising results. Mastering Transformers will help you understand and implement multimodal solutions, including text-to-image; computer vision solutions based on transformers are also explained in the book. You'll get started by understanding various transformer models before learning how to train different autoregressive language models such as GPT and XLNet. The book will also get you up to speed with boosting model performance, as well as tracking model training using the TensorBoard toolkit. In the later chapters, you'll focus on using vision transformers to solve computer vision problems. Finally, you'll discover how to harness the power of transformers to model time series data and make predictions. By the end of this book, you'll understand transformer models and how to use them to solve challenges in NLP and CV. (A minimal usage sketch with the transformers library follows this record.)
      What you will learn
      • Focus on solving simple-to-complex NLP problems with Python
      • Discover how to solve classification/regression problems with traditional NLP approaches
      • Train a language model and explore how to fine-tune models for downstream tasks
      • Understand how to use transformers for generative AI and computer vision tasks
      • Build transformer-based NLP apps with the Python transformers library
      • Focus on language generation tasks such as machine translation and conversational AI in any language
      • Speed up transformer model inference to reduce latency
      Who this book is for
      This book is for deep learning researchers, hands-on practitioners, and ML/NLP researchers. Educators, as well as students with a good command of programming, knowledge of machine learning and artificial intelligence, and an interest in developing apps for NLP and multimodal tasks, will also benefit from this book's hands-on approach. Knowledge of Python (or another programming language) and the machine learning literature, as well as a basic understanding of computer science, are required.
    Note: Includes bibliographical references and index
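The description above mentions building transformer-based NLP apps and training autoregressive models such as GPT with the Python transformers library. As orientation only, and not as code taken from the book, a minimal sketch of that workflow with the Hugging Face pipeline API might look like the following; the task names and the gpt2 checkpoint are common public defaults, assumed here:

    # Minimal sketch (not from the book): the Hugging Face "transformers"
    # pipeline API applied to two of the tasks the description lists.
    from transformers import pipeline

    # Text classification with a pretrained model (downloads a default checkpoint).
    classifier = pipeline("sentiment-analysis")
    print(classifier("Transformers make transfer learning in NLP straightforward."))

    # Autoregressive generation with GPT-2, one of the model families the book covers.
    generator = pipeline("text-generation", model="gpt2")
    print(generator("Transformer-based language models", max_new_tokens=20))

The pipeline API is the highest-level entry point to the library; fine-tuning, TensorBoard tracking, and vision transformers, as described in the blurb, build on the lower-level model and trainer classes.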
  • 2
    Online Resource
    [Place of publication not identified] : Packt Publishing | Boston, MA : Safari
    ISBN: 9781801077651
    Language: English
    Pages: 1 online resource (374 pages)
    Edition: 1st edition
    DDC: 006.35
    Keywords: Natural language processing (Computer science); Electronic books; Traitement automatique des langues naturelles
    Abstract: Take a problem-solving approach to learning all about transformers and get up and running in no time by implementing methodologies that will build the future of NLP.
      Key Features
      • Explore quick prototyping with up-to-date Python libraries to create effective solutions to industrial problems
      • Solve advanced NLP problems such as named-entity recognition, information extraction, language generation, and conversational AI
      • Monitor your model's performance with the help of BertViz, exBERT, and TensorBoard
      Book Description
      Transformer-based language models have dominated natural language processing (NLP) studies and have now become a new paradigm. With this book, you'll learn how to build various transformer-based NLP applications using the Python Transformers library. The book gives you an introduction to Transformers by showing you how to write your first hello-world program. You'll then learn how a tokenizer works and how to train your own tokenizer. As you advance, you'll explore the architecture of autoencoding models, such as BERT, and autoregressive models, such as GPT. You'll see how to train and fine-tune models for a variety of natural language understanding (NLU) and natural language generation (NLG) problems, including text classification, token classification, and text representation. This book also helps you learn efficient models for challenging problems, such as long-context NLP tasks with limited computational capacity. You'll also work with multilingual and cross-lingual problems, optimize models by monitoring their performance, and discover how to deconstruct these models for interpretability and explainability. Finally, you'll be able to deploy your transformer models in a production environment. By the end of this NLP book, you'll have learned how to use Transformers to solve advanced NLP problems using advanced models. (A minimal hello-world sketch with the transformers library follows this record.)
      What you will learn
      • Explore state-of-the-art NLP solutions with the Transformers library
      • Train a language model in any language with any transformer architecture
      • Fine-tune a pre-trained language model to perform several downstream tasks
      • Select the right framework for the training, evaluation, and production of an end-to-end solution
      • Get hands-on experience in using TensorBoard and Weights & Biases
      • Visualize the internal representation of transformer models for interpretability
      Who this book is for
      This book is for deep learning researchers, hands-on NLP practitioners, as well as ML/NLP educators and students who w...
    Note: Online resource; Title from title page (viewed September 15, 2021); Mode of access: World Wide Web.
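This edition's description mentions a hello-world introduction to the Transformers library and an explanation of how tokenizers work. A minimal sketch of such a first program is shown below; it is an assumption based on the library's public API, not the book's own code, and the bert-base-uncased checkpoint is simply a common public choice:

    # Minimal "hello world" sketch (assumed, not taken from the book): load a
    # pretrained tokenizer and model with the Hugging Face "transformers" library
    # and inspect the subword tokens the model sees.
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

    # Tokenize a sentence and show the subword pieces produced by the tokenizer.
    encoding = tokenizer("Hello, Transformers!", return_tensors="pt")
    print(tokenizer.convert_ids_to_tokens(encoding["input_ids"][0]))

    # Forward pass; the classification head is freshly initialized, so the logits
    # are meaningless until the model is fine-tuned on labeled data.
    outputs = model(**encoding)
    print(outputs.logits)

Fine-tuning, multilingual training, and the interpretability tooling named in the blurb (BertViz, exBERT, TensorBoard) start from exactly this kind of tokenizer-plus-model pair.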