Overview

This book introduces readers to the fundamentals of and recent advances in federated learning, focusing on reducing communication costs, improving computational efficiency, and enhancing the level of security. Federated learning is a distributed machine learning paradigm that enables model training on a large body of decentralized data. Its goal is to make full use of data across organizations or devices while meeting regulatory, privacy, and security requirements. The book starts with a self-contained introduction to artificial neural networks, deep learning models, supervised learning algorithms, evolutionary algorithms, and evolutionary learning. Concise information is then presented on multi-party secure computation, differential privacy, and homomorphic encryption, followed by a detailed description of federated learning. In turn, the book addresses the latest advances in federated learning research, especially from the perspectives of communication efficiency, evolutionary learning, and privacy preservation. The book is particularly well suited for graduate students, academic researchers, and industrial practitioners in the field of machine learning and artificial intelligence. It can also be used as a self-learning resource for readers with a science or engineering background, or as a reference text for graduate courses.

Full Product Details

Author: Yaochu Jin, Hangyu Zhu, Jinjin Xu, Yang Chen
Publisher: Springer Verlag, Singapore
Imprint: Springer Verlag, Singapore
Edition: 1st ed. 2023
Weight: 0.560 kg
ISBN: 9789811970825
ISBN 10: 9811970823
Pages: 218
Publication Date: 30 November 2022
Audience: Professional and scholarly, College/higher education, Professional & Vocational, Undergraduate
Format: Hardback
Publisher's Status: Active
Availability: Manufactured on demand. We will order this item for you from a manufactured-on-demand supplier.

Table of Contents

Introduction
1.1 Artificial neural networks and deep learning
1.2 Evolutionary optimization and learning
1.3 Privacy-preserving computation
1.4 Federated learning
1.5 Summary
Communication-Efficient Federated Learning
2.1 Communication cost in federated learning
2.2 Main methodologies
2.3 Temporally weighted averaging and layer-wise weight update
2.4 Trained ternary compression for federated learning
2.5 Summary
Evolutionary Federated Learning
3.1 Motivations and challenges
3.2 Offline evolutionary multi-objective federated learning
3.3 Real-time evolutionary federated neural architecture search
3.4 Summary
Secure Federated Learning
4.1 Threats to federated learning
4.2 Distributed encryption for horizontal federated learning
4.3 Secure vertical federated learning
4.4 Summary
Summary and Outlook
5.1 Summary
5.2 Future directions

Author Information

Yaochu Jin is an “Alexander von Humboldt Professor for Artificial Intelligence” in the Faculty of Technology, Bielefeld University, Germany. He is also a part-time Distinguished Chair Professor in Computational Intelligence at the Department of Computer Science, University of Surrey, Guildford, UK. He was a “Finland Distinguished Professor” at the University of Jyväskylä, Finland, a “Changjiang Distinguished Visiting Professor” at Northeastern University, China, and a “Distinguished Visiting Scholar” at the University of Technology Sydney, Australia. His main research interests include data-driven optimization, multi-objective optimization, multi-objective learning, trustworthy machine learning, and evolutionary developmental systems.
Prof. Jin is a Member of Academia Europaea and an IEEE Fellow.

Hangyu Zhu received his B.Sc. degree from Yangzhou University, Yangzhou, China, in 2015, his M.Sc. degree from RMIT University, Melbourne, VIC, Australia, in 2017, and his Ph.D. degree from the University of Surrey, Guildford, UK, in 2021. He is currently a Lecturer with the Department of Artificial Intelligence and Computer Science, Jiangnan University, China. His main research interests are federated learning and evolutionary neural architecture search.

Jinjin Xu received his B.S. and Ph.D. degrees from East China University of Science and Technology, Shanghai, China, in 2017 and 2022, respectively. He is currently a researcher with the Intelligent Perception and Interaction Research Department, OPPO Research Institute, Shanghai, China. His research interests include federated learning, data-driven optimization and its applications.

Yang Chen received his Ph.D. degree from the School of Information and Control Engineering, China University of Mining and Technology, China, in 2019. He was a Research Fellow with the School of Computer Science and Engineering, Nanyang Technological University, Singapore, from 2019 to 2022. He is currently with the School of Electrical Engineering, China University of Mining and Technology, China. His research interests include deep learning, secure machine learning, edge computing, anomaly detection, evolutionary computation, and intelligent optimization.

Countries Available: All regions