Overview

This research monograph presents a groundbreaking unification of neural network approximation theory through the lens of Positive Linear Operators (PLOs). For the first time in the literature, neural network operators and activated convolution operators are rigorously analyzed as PLOs, providing a comprehensive, quantitative framework based on inequalities and the modulus of continuity.

The author develops a general, elegant, and highly versatile theory that applies uniformly to a wide variety of neural and convolution operators, bridging Pure and Applied Mathematics with modern Artificial Intelligence and Machine Learning. The results open new directions for the mathematical understanding of neural network approximation, with applications across computational analysis, engineering, statistics, and economics.

This volume is an essential resource for mathematicians, computer scientists, and engineers seeking a rigorous analytical foundation for AI and deep learning models.

Full Product Details

Author: George A. Anastassiou (The University of Memphis, USA)
Publisher: World Scientific Publishing Co Pte Ltd
Imprint: World Scientific Publishing Co Pte Ltd
Volume: 28
ISBN: 9789819826186
ISBN 10: 9819826187
Pages: 420
Publication Date: 26 March 2026
Audience: College/Higher Education; Professional and Scholarly; Tertiary & Higher Education; Professional & Vocational
Format: Hardback
Publisher's Status: Forthcoming
Availability: Not yet available