  • 1
    Online Resource
    [Place of publication not identifiable] : O'Reilly Media, Inc. | Boston, MA : Safari
    Language: English
    Edition: Revised Edition
    Keywords: Electronic books ; local ; Electronic books
    Abstract: Since their introduction in 2017, Transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library. Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf use a hands-on approach to teach you how Transformers work and how to integrate them in your applications. You'll quickly learn a variety of tasks they can help you solve:
      • Build, debug, and optimize Transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering
      • Learn how Transformers can be used for cross-lingual transfer learning
      • Apply Transformers in real-world scenarios where labeled data is scarce
      • Make Transformer models efficient for deployment using techniques such as distillation, pruning, and quantization
      • Train Transformers from scratch and learn how to scale to multiple GPUs and distributed environments
    Note: Online resource; Title from title page (viewed March 25, 2022)
  • 2
    Language: German
    Pages: 1 online resource (432 pages), illustrations
    Edition: 1. Auflage (1st edition).
    Uniform Title: Natural language processing with transformers
    DDC: 006.3/5
    Keywords: Natural language processing (Computer science) ; Python (Computer program language) ; Machine learning ; Cloud computing
    Abstract: Since their introduction, Transformers have become, almost overnight, the dominant architecture in natural language processing. They deliver the best results for a wide variety of machine language-processing tasks. If you are a data scientist or programmer, this practical book shows you how to train and scale NLP models using Hugging Face Transformers, a Python-based deep learning library. Transformers are used, for example, in the machine writing of news articles, in improving Google search queries, and in chatbots. In this guide, Lewis Tunstall, Leandro von Werra, and Thomas Wolf, who also helped develop the Hugging Face Transformers library, use a hands-on approach to show you how Transformer-based models work and how you can integrate them into your applications. You will quickly get to know a variety of tasks you can solve with them, such as text classification, named entity recognition, and question answering.
    Note: Includes bibliographical references and index
  • 3
    Language: English
    Pages: 1 online resource (1 video file (1 hr., 24 min.)), sound, color.
    Edition: Revised edition.
    DDC: 006.3/5
    Keywords: Natural language processing (Computer science) ; Machine learning ; Cloud computing ; Traitement automatique des langues naturelles ; Apprentissage automatique ; Infonuagique ; Instructional films ; Nonfiction films ; Internet videos ; Films de formation ; Films autres que de fiction ; Vidéos sur Internet
    Abstract: Sponsored by deepset. Join us for this edition of O'Reilly Book Club with Lewis Tunstall and Leandro von Werra, authors of Natural Language Processing with Transformers, to learn how transformers, the backbone of today's state-of-the-art LLMs, work and how to integrate them in your applications. Bringing insights from their experience working at Hugging Face, Lewis and Leandro will discuss the benefits and challenges of training your own model, scaling it, and deploying it for your system. Learn tricks of the trade, listen to stories, and gain new insights.
      What you'll learn and how you can apply it:
      • Learn about a variety of NLP tasks transformers can help you solve
      • Understand how transformers can be used for transfer learning
      • Learn about building, debugging, and optimizing transformer models
      This live event is for you because:
      • You want to go beyond the words on the page and ask your own questions.
      • You're a data scientist, machine learning engineer, or developer working with AI and machine learning who wants to learn about the latest developments in the field.
      Recommended follow-up:
      • Read Natural Language Processing with Transformers, Revised Edition (book)
      • Read Hands-On Large Language Models (early release book)
      • Read Hands-On Generative AI with Transformers and Diffusion Models (early release book)
      • Read Introduction to Transformers for NLP: With the Hugging Face Library and Models to Solve Problems (book)
      • Watch Natural Language Processing Using Transformer Architectures (video)
    Note: Online resource; title from title details screen (O'Reilly, viewed November 1, 2023)
  • 4
    ISBN: 1098136764, 9781098136765
    Language: English
    Pages: 1 online resource
    Edition: Revised edition.
    DDC: 006.3/5
    Keywords: Natural language processing (Computer science) ; Python (Computer program language) ; Machine learning ; Cloud computing ; Electronic books
    Abstract: Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book (now revised in full color) shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library. Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf, among the creators of Hugging Face Transformers, use a hands-on approach to teach you how transformers work and how to integrate them in your applications. You'll quickly learn a variety of tasks they can help you solve:
      • Build, debug, and optimize transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering
      • Learn how transformers can be used for cross-lingual transfer learning
      • Apply transformers in real-world scenarios where labeled data is scarce
      • Make transformer models efficient for deployment using techniques such as distillation, pruning, and quantization
      • Train transformers from scratch and learn how to scale to multiple GPUs and distributed environments
  • 5
    Original-script edition: 初版.
    Title: 機械学習エンジニアのためのTransformers : 最先端の自然言語処理ライブラリによるモデル開発 /
    Publisher: オライリー・ジャパン (O'Reilly Japan)
    ISBN: 9784873119953, 4873119952
    Language: Japanese
    Pages: 1 online resource (424 pages), color illustrations.
    Edition: Shohan (first edition).
    Uniform Title: Natural language processing with transformers
    DDC: 006.3/5
    Keywords: Natural language processing (Computer science) ; Python (Computer program language) ; Deep learning (Machine learning)
    Abstract: "Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book (now revised in full color) shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library. Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf, among the creators of Hugging Face Transformers, use a hands-on approach to teach you how transformers work and how to integrate them in your applications. You'll quickly learn a variety of tasks they can help you solve: build, debug, and optimize transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering; learn how transformers can be used for cross-lingual transfer learning; apply transformers in real-world scenarios where labeled data is scarce; make transformer models efficient for deployment using techniques such as distillation, pruning, and quantization; and train transformers from scratch and learn how to scale to multiple GPUs and distributed environments." --
    Note: Includes bibliographical references , In Japanese.