Overview

This is an essential guide to the mathematics, algorithms, and trade-offs behind large language models, explained in everyday language. Learn how LLMs truly work behind the scenes. It is written for everyone, whether or not you have a background in computing. You will see how LLMs turn text into tokens, tokens into probabilities, and probabilities into coherent language. The book covers information theory, entropy, n-gram models, Byte Pair Encoding, embeddings, transformers, fine-tuning, and inference. It explains why scale improves performance, why overfitting wrecks reliability, how memory extends context, and how multimodal systems connect words with images, audio, and video. It also goes inside the research lab: data pipelines, compute infrastructure, failed experiments, ethical risks, and the human labor required to make these systems work at all.

Full Product Details

Author: Turing Editorial Team
Publisher: Turing App
Imprint: Turing App
Dimensions: Width 15.20cm, Height 0.30cm, Length 22.90cm
Weight: 0.091kg
ISBN: 9789199156910
ISBN-10: 9199156911
Pages: 58
Publication Date: 15 April 2026
Audience: General/trade
Format: Paperback
Publisher's Status: Active
Availability: Available to order. We have confirmation that this item is in stock with the supplier. It will be ordered in for you and dispatched immediately.
Countries Available: All regions