Overview

Recent studies have shown that training large models such as GPT-3, which would take an estimated 355 years on a single fast GPU, requires thousands of GPUs working in parallel. The design of scalable distributed training systems therefore has significant implications for the future development of machine learning. One major bottleneck for the scalability of such systems is communication cost, which can outweigh the computation cost on commodity systems that offer limited network bandwidth or high network latency.

Full Product Details

Author: Greenfelder Grant
Publisher: Grant Greenfelder
Imprint: Grant Greenfelder
Dimensions: Width: 15.20cm, Height: 0.60cm, Length: 22.90cm
Weight: 0.163kg
ISBN: 9789434135113
ISBN 10: 9434135117
Pages: 116
Publication Date: 03 April 2023
Audience: General/trade, General
Format: Paperback
Publisher's Status: Active
Availability: Available To Order
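To see why communication can dominate on commodity networks, a back-of-envelope estimate helps. The sketch below compares the per-step gradient-synchronization time of a ring all-reduce against a per-step compute time; all numbers (model size, GPU count, bandwidth, compute time) are illustrative assumptions, not figures from the text.

```python
# Back-of-envelope estimate: communication vs. computation time per
# training step under data parallelism. All parameters are hypothetical.

def ring_allreduce_seconds(param_bytes, num_gpus, bandwidth_bytes_per_s):
    """Time to synchronize gradients with a ring all-reduce:
    each GPU sends and receives 2*(N-1)/N of the gradient buffer."""
    traffic = 2 * (num_gpus - 1) / num_gpus * param_bytes
    return traffic / bandwidth_bytes_per_s

# Assumed setup: 1.3B-parameter model with fp32 gradients (4 bytes each),
# 8 GPUs connected over a commodity 10 Gbit/s (~1.25 GB/s) network.
param_bytes = 1.3e9 * 4
comm = ring_allreduce_seconds(param_bytes, num_gpus=8,
                              bandwidth_bytes_per_s=1.25e9)

# Assume the forward+backward pass itself takes ~0.5 s per step.
compute = 0.5

print(f"communication per step: {comm:.2f} s")   # ~7.28 s
print(f"computation per step:   {compute:.2f} s")
print(f"comm/compute ratio:     {comm / compute:.1f}x")
```

Under these assumptions the network transfer takes an order of magnitude longer than the computation it synchronizes, which is exactly the regime where communication cost, not FLOPs, limits scalability.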