
$209.95

Hardback


English
Cambridge University Press
03 January 2000
"This important work describes recent theoretical advances in the study of artificial neural networks. It explores probabilistic models of supervised learning problems, and addresses the key statistical and computational questions. Chapters survey research on pattern classification with binary-output networks, including a discussion of the relevance of the Vapnik Chervonenkis dimension, and of estimates of the dimension for several neural network models. In addition, Anthony and Bartlett develop a model of classification by real-output networks, and demonstrate the usefulness of classification with a ""large margin."" The authors explain the role of scale-sensitive versions of the Vapnik Chervonenkis dimension in large margin classification, and in real prediction. Key chapters also discuss the computational complexity of neural network learning, describing a variety of hardness results, and outlining two efficient, constructive learning algorithms. The book is self-contained and accessible to researchers and graduate students in computer science, engineering, and mathematics."

By:   Martin Anthony,  Peter L. Bartlett
Imprint:   Cambridge University Press
Country of Publication:   United Kingdom
Dimensions:   Height: 229mm,  Width: 152mm,  Spine: 27mm
Weight:   760g
ISBN:   9780521573535
ISBN 10:   052157353X
Pages:   404
Publication Date:   03 January 2000
Audience:   Professional and scholarly ,  Undergraduate
Format:   Hardback
Publisher's Status:   Active

Reviews for Neural Network Learning: Theoretical Foundations

'The book is a useful and readable monograph. For beginners it is a nice introduction to the subject, for experts a valuable reference.' Zentralblatt MATH
