Address NLP tasks as well as multimodal tasks spanning both NLP and computer vision (CV) using modern Transformer architectures.
Key Features
- Understand the complexity of deep learning architectures and the Transformer architecture
- Learn how to create effective solutions to industrial NLP and CV problems
- Learn about the challenges in the data preparation process, such as problem- and language-specific dataset transformation

Book Description
Transformer-based language models such as BERT, T5, GPT, DALL-E, and ChatGPT have dominated natural language processing studies and become a new paradigm. With this book, you will understand and be able to implement multimodal solutions, including text-to-image. Computer vision solutions based on Transformers are also explained in the book.
Thanks to their accurate and fast fine-tuning capabilities, Transformer-based language models have outperformed traditional machine learning-based approaches on many challenging natural language understanding (NLU) problems.
Apart from NLP, a fast-growing area of multimodal learning and generative AI has recently emerged and is showing promising results; DALL-E and Stable Diffusion are examples of it.
Developers working with the Transformer architecture will be able to put their knowledge to work with this practical guide to NLP. The book provides a hands-on approach to implementation and associated methodologies in the field of NLP that will have you up and running and productive in no time. Developers who want to learn more about multimodal models and generative AI in the field of computer vision can also use this book as a resource.
What You Will Learn
- How NLP technologies have evolved over the past years
- How to solve simple and complex NLP problems with the Python programming language
- How to solve classification and regression problems with traditional NLP approaches
- How to train a language model and fine-tune models for downstream tasks
- How to use Transformers for generative AI and computer vision tasks
- How to build Transformer-based NLP applications with the Python transformers library
- How to build language generation applications, such as machine translation and conversational AI, in any language
- How to speed up Transformer model inference to reduce latency

Who This Book Is For
The book is for deep learning researchers, hands-on practitioners, ML/NLP researchers, educators, and their students who have a good command of programming, have knowledge in the fields of machine learning and artificial intelligence, and want to develop applications in cutting-edge natural language processing as well as multimodal tasks. Readers need to know at least Python (or another programming language), be familiar with the machine learning literature, and have a basic understanding of computer science, as this book covers the practical aspects of natural language processing and multimodal deep learning.
Table of Contents
1. From Bag-of-Words to the Transformer
2. A Hands-On Introduction to the Subject
3. Autoencoding Language Models
4. Autoregressive Language Models
5. Fine-Tuning Language Models for Text Classification
6. Fine-Tuning Language Models for Token Classification
7. Text Representation
8. Boosting Your Model Performance
9. Parameter-Efficient Fine-Tuning
10. Zero-Shot and Few-Shot Learning in NLP
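The opening chapter traces the path from bag-of-words to the Transformer. As a quick, illustrative sketch of that classical starting point (not code from the book), a bag-of-words representation can be built in plain Python:

```python
from collections import Counter

def bag_of_words(docs):
    """Turn a list of documents into count vectors over a shared vocabulary."""
    tokenized = [doc.lower().split() for doc in docs]
    # Sorted vocabulary gives every document vector the same column order.
    vocab = sorted({tok for doc in tokenized for tok in doc})
    vectors = []
    for doc in tokenized:
        counts = Counter(doc)
        # One count per vocabulary word; 0 if the word is absent.
        vectors.append([counts.get(tok, 0) for tok in vocab])
    return vocab, vectors

vocab, vectors = bag_of_words(["the cat sat", "the dog sat on the mat"])
print(vocab)    # ['cat', 'dog', 'mat', 'on', 'sat', 'the']
print(vectors)  # [[1, 0, 0, 0, 1, 1], [0, 1, 1, 1, 1, 2]]
```

Such count vectors ignore word order and context; the later chapters show how contextual Transformer embeddings overcome exactly this limitation.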