Transformers in Action

Author:   Nicole Koenigstein
Publisher:   Manning Publications
Our Price:   $158.37



Overview

Transformer models power the chatbots, coders, and translators reshaping every industry today, yet their architecture, math, and tuning often remain an intimidating black box. Stop copy-pasting tutorials and start truly understanding what happens under the hood. Transformers in Action walks you through every layer with practical Python and clear analogies. Master small, large, and multimodal models, then optimize them for speed and cost. Build solutions that translate, summarize, and generate with confidence, efficiency, and rigor.

Inside you will find:

- Layer-by-layer walkthrough: See how attention, embeddings, and positional encodings produce fluent output.
- Task adaptation recipes: Fine-tune models for summarization, classification, or translation in minutes.
- Optimization strategies: Reduce latency, shrink memory, and cut cloud bills without sacrificing accuracy.
- Reinforcement learning techniques: Refine text generation quality using reward models and policy gradients.
- Multimodal expansion: Combine text and vision to build next-generation, cross-media applications.
- Complete code repository: Experiment instantly, tweak hyperparameters, and validate concepts on real datasets.

Transformers in Action, by Quantmate CEO and Chief AI Officer Nicole Koenigstein, offers clear math walkthroughs, annotated Python, and production-ready patterns you can trust. The journey starts with encoder-only, decoder-only, and encoder-decoder variants, then moves to small language models for constrained environments. Each chapter couples theory with runnable notebooks, visual explanations, and performance benchmarks. You will finish knowing exactly when to deploy a lightweight model, how to tune hyperparameters, and how to monitor costs, and you will ship faster, safer, and leaner LLM solutions that impress users and stakeholders.

The book is ideal for software engineers and data scientists who are comfortable with Python and basic machine learning and eager to unlock the power of transformers.
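
The layer-by-layer walkthroughs described above center on the attention mechanism. As a rough illustration of the kind of code the book works through, here is a minimal sketch of scaled dot-product attention in NumPy; it is not taken from the book's code repository, and the function name and toy dimensions are assumptions for illustration only.

    # Minimal, illustrative sketch of scaled dot-product attention (NumPy assumed).
    # Not from the book's repository; names and sizes are hypothetical.
    import numpy as np

    def scaled_dot_product_attention(q, k, v):
        """attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V"""
        d_k = q.shape[-1]
        scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)            # query-key similarities
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights = weights / weights.sum(axis=-1, keepdims=True)   # softmax over keys
        return weights @ v                                        # weighted sum of values

    # Toy self-attention: 4 tokens with 8-dimensional embeddings (Q = K = V).
    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 8))
    out = scaled_dot_product_attention(x, x, x)
    print(out.shape)  # (4, 8)

Each output row is a mixture of all token embeddings, weighted by how strongly that token attends to the others; stacking this with learned projections, positional encodings, and feed-forward layers is what the book's early chapters unpack.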

Full Product Details

Author:   Nicole Koenigstein
Publisher:   Manning Publications
Imprint:   Manning Publications
Weight:   0.467kg
ISBN 13:   9781633437883
ISBN 10:   1633437884
Pages:   256
Publication Date:   04 March 2026
Audience:   Professional and Scholarly, Professional & Vocational
Format:   Hardback
Publisher's Status:   Forthcoming
Availability:   In Print
Limited stock is available. It will be ordered for you and shipped subject to the supplier's remaining stock.

Table of Contents

PART 1: FOUNDATIONS OF MODERN TRANSFORMER MODELS
1 The Need for Transformers
2 A Deeper Look into Transformers
PART 2: GENERATIVE TRANSFORMERS
3 Model Families and Architecture Variants
4 Text Generation Strategies and Prompting Techniques
5 Preference Alignment and RAG
PART 3: SPECIALIZED MODELS
6 Multimodal Models
7 Efficient and Specialized Large Language Models
8 Training and Evaluating Large Language Models
9 Optimizing and Scaling Large Language Models
10 Ethical and Responsible Large Language Models

Author Information

Nicole Koenigstein is the CEO and Chief AI Officer of Quantmate, renowned for transforming raw research into profitable AI systems. Drawing on years leading the company's agentic intelligence platform, she brings clarity, precision, and business focus to every page. She distills deep model-building expertise into accessible guidance that helps readers deliver faster, smarter transformer solutions.

Countries Available

All regions