• Media type: E-book
  • Title: Transformers for natural language processing : build, train, and fine-tune deep neural network architectures for NLP with Python, PyTorch, TensorFlow, BERT, and GPT-3
  • Contributors: Rothman, Denis [author]; Gulli, Antonio [contributor]
  • Published: [Birmingham, United Kingdom]: Packt Publishing, [2022]
  • Published in: Expert insight
  • Edition: Second edition.
  • Extent: 1 online resource (564 pages); illustrations
  • Language: English
  • ISBN: 9781803247335
  • RVK notation: ST 306 : Natural language processing
  • Subject headings: Natural language > Deep learning
  • Notes: Includes index
  • Description: Transformers are a game-changer for natural language understanding (NLU) and have become one of the pillars of artificial intelligence. Transformers for Natural Language Processing, 2nd Edition, investigates deep learning for machine translation, speech-to-text, text-to-speech, language modeling, question answering, and many more NLP domains with transformers. An Industry 4.0 AI specialist needs to be adaptable; knowing just one NLP platform is no longer enough. Different platforms have different benefits depending on the application, whether it is cost, flexibility, ease of implementation, results, or performance. In this book, we analyze numerous use cases with Hugging Face, Google Trax, OpenAI, and AllenNLP. The book takes transformers' capabilities further by combining multiple NLP techniques, such as sentiment analysis, named entity recognition, and semantic role labeling, to analyze complex use cases, such as dissecting fake news on Twitter. It also shows how transformers can generate code from just a brief description. By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models to various datasets.
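
  As an illustration of the kind of workflow the description refers to (applying pretrained transformer models with Hugging Face), a minimal sketch of sentiment analysis via the Hugging Face pipeline API might look like the following; this is an illustrative example, not code taken from the book:

      # Illustrative sketch (not from the book): sentiment analysis with a
      # pretrained Hugging Face transformer via the pipeline API.
      from transformers import pipeline

      # Loads a default pretrained sentiment-analysis model.
      classifier = pipeline("sentiment-analysis")
      result = classifier("Transformers have become one of the pillars of AI.")
      print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]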