This research monograph presents a groundbreaking unification of neural network approximation theory through the lens of Positive Linear Operators (PLOs). For the first time in the literature, neural network operators and activated convolution operators are rigorously analyzed as PLOs, yielding a comprehensive, quantitative framework based on inequalities and the modulus of continuity.
The author develops a general, elegant, and highly versatile theory that applies uniformly to a wide variety of neural and convolution operators, bridging Pure and Applied Mathematics with modern Artificial Intelligence and Machine Learning. The results open new directions for mathematical understanding of neural network approximation, with applications across computational analysis, engineering, statistics, and economics.
This volume is an essential resource for mathematicians, computer scientists, and engineers seeking a rigorous analytical foundation for AI and deep learning models.
By: George A. Anastassiou (The University of Memphis, USA)
Imprint: World Scientific Publishing Co Pte Ltd
Country of Publication: Singapore
Volume: 28
ISBN: 9789819826186
ISBN 10: 9819826187
Series: Series on Concrete & Applicable Mathematics
Pages: 420
Publication Date: 13 April 2026
Audience: College/higher education, Professional and scholarly, Primary, Undergraduate
Format: Hardback
Publisher's Status: Forthcoming