Overview

The book provides timely coverage of knowledge distillation, an efficient approach to model compression. Knowledge distillation is positioned in the general setting of transfer learning, in which a lightweight student model is trained to reproduce the behavior of a large teacher model. The book covers a variety of training schemes, teacher-student architectures, and distillation algorithms, along with recent developments in vision and language learning, relational architectures, multi-task learning, and representative applications to image processing, computer vision, edge intelligence, and autonomous systems. It is relevant to a broad audience of researchers and practitioners in machine learning pursuing fundamental and applied research in advanced learning paradigms.
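The teacher-student setup described above is most commonly realized as a temperature-scaled soft-target loss. The sketch below is not taken from the book; it is a minimal illustration of the standard formulation usually attributed to Hinton et al. (2015), with placeholder models and assumed hyperparameters (`T`, `alpha`).

```python
# Minimal sketch of soft-target knowledge distillation (illustrative,
# not from the book). Model sizes and hyperparameters are assumptions.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend the teacher's softened predictions with the hard-label loss."""
    # Soften both distributions with temperature T; the T^2 factor keeps
    # the soft-target gradients on a comparable scale as T varies.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Usage: the teacher is frozen; only the lightweight student is updated.
teacher = torch.nn.Linear(32, 10).eval()  # stand-in for a large teacher
student = torch.nn.Linear(32, 10)         # lightweight student
x = torch.randn(8, 32)
labels = torch.randint(0, 10, (8,))
with torch.no_grad():
    teacher_logits = teacher(x)
loss = distillation_loss(student(x), teacher_logits, labels)
loss.backward()
```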
Full Product Details

- Author: Witold Pedrycz, Shyi-Ming Chen
- Publisher: Springer International Publishing AG
- Imprint: Springer International Publishing AG
- Edition: 2023 ed.
- Volume: 1100
- ISBN-13: 9783031320972
- ISBN-10: 3031320972
- Pages: 232
- Publication Date: 15 June 2024
- Audience: Professional and scholarly; Professional & Vocational
- Format: Paperback
- Publisher's Status: Active
- Availability: Manufactured on demand
- Countries Available: All regions