Inference and Learning from Data

$396

English
Cambridge University Press
22 December 2022
This extraordinary three-volume work, written in an engaging and rigorous style by a world authority in the field, provides an accessible, comprehensive introduction to the full spectrum of mathematical and statistical techniques underpinning contemporary methods in data-driven learning and inference. The first volume, Foundations, establishes core topics in inference and learning, and prepares readers for studying their practical application. The second volume, Inference, introduces readers to cutting-edge techniques for inferring unknown variables and quantities. The final volume, Learning, provides a rigorous introduction to state-of-the-art learning methods. A consistent structure and pedagogy are employed throughout all three volumes to reinforce student understanding, with over 1280 end-of-chapter problems (including solutions for instructors), over 600 figures, over 470 solved examples, datasets, and downloadable MATLAB code. Unique in its scale and depth, this textbook sequence is ideal for early-career researchers and graduate students across many courses in signal processing, machine learning, statistical analysis, data science, and inference.

By:   Ali H. Sayed
Imprint:   Cambridge University Press
Country of Publication:   United Kingdom
Dimensions:   Height: 255mm,  Width: 180mm,  Spine: 120mm
Weight:   5.420kg
ISBN:   9781009218108
ISBN 10:   1009218107
Pages:   3370
Publication Date:   22 December 2022
Audience:   General/trade, ELT Advanced
Publisher's Status:   Active
Volume I. Foundations: 1. Matrix theory; 2. Vector differentiation; 3. Random variables; 4. Gaussian distribution; 5. Exponential distributions; 6. Entropy and divergence; 7. Random processes; 8. Convex functions; 9. Convex optimization; 10. Lipschitz conditions; 11. Proximal operator; 12. Gradient descent method; 13. Conjugate gradient method; 14. Subgradient method; 15. Proximal and mirror descent methods; 16. Stochastic optimization; 17. Adaptive gradient methods; 18. Gradient noise; 19. Convergence analysis I: stochastic gradient algorithms; 20. Convergence analysis II: stochastic subgradient algorithms; 21. Convergence analysis III: stochastic proximal algorithms; 22. Variance-reduced methods I: uniform sampling; 23. Variance-reduced methods II: random reshuffling; 24. Nonconvex optimization; 25. Decentralized optimization I: primal methods; 26. Decentralized optimization II: primal-dual methods; Author index; Subject index.

Volume II. Inference: 27. Mean-Square-Error inference; 28. Bayesian inference; 29. Linear regression; 30. Kalman filter; 31. Maximum likelihood; 32. Expectation maximization; 33. Predictive modeling; 34. Expectation propagation; 35. Particle filters; 36. Variational inference; 37. Latent Dirichlet allocation; 38. Hidden Markov models; 39. Decoding HMMs; 40. Independent component analysis; 41. Bayesian networks; 42. Inference over graphs; 43. Undirected graphs; 44. Markov decision processes; 45. Value and policy iterations; 46. Temporal difference learning; 47. Q-learning; 48. Value function approximation; 49. Policy gradient methods; Author index; Subject index.

Volume III. Learning: 50. Least-squares problems; 51. Regularization; 52. Nearest-neighbor rule; 53. Self-organizing maps; 54. Decision trees; 55. Naive Bayes classifier; 56. Linear discriminant analysis; 57. Principal component analysis; 58. Dictionary learning; 59. Logistic regression; 60. Perceptron; 61. Support vector machines; 62. Bagging and boosting; 63. Kernel methods; 64. Generalization theory; 65. Feedforward neural networks; 66. Deep belief networks; 67. Convolutional networks; 68. Generative networks; 69. Recurrent networks; 70. Explainable learning; 71. Adversarial attacks; 72. Meta learning; Author index; Subject index.

Ali H. Sayed is Professor and Dean of Engineering at École Polytechnique Fédérale de Lausanne (EPFL), Switzerland. He has also served as Distinguished Professor and Chairman of Electrical Engineering at the University of California, Los Angeles, USA, and as President of the IEEE Signal Processing Society. He is a member of the US National Academy of Engineering (NAE) and The World Academy of Sciences (TWAS), and a recipient of the 2022 IEEE Fourier Award and the 2020 IEEE Norbert Wiener Society Award. He is a Fellow of the IEEE.

Reviews for Inference and Learning from Data

'Inference and Learning from Data is a uniquely comprehensive introduction to the signal processing foundations of modern data science. Lucidly written, with a carefully balanced choice of topics, this textbook is an indispensable resource for both graduate students and data science practitioners, a piece of lasting value.' Helmut Bölcskei, ETH Zurich

'This textbook provides a lucid and magisterial treatment of methods for inference and learning from data, aided by hundreds of solved examples, computer simulations, and over 1000 problems. The material ranges from fundamentals to recent advances in statistical learning theory; variational inference; neural, convolutional, and Bayesian networks; and several other topics. It is aimed at students and practitioners, and can be used for several different introductory and advanced courses.' Thomas Kailath, Stanford University

'A tour de force comprehensive three-volume set for the fast-developing areas of data science, machine learning, and statistical signal processing. With masterful clarity and depth, Sayed covers, connects, and integrates background fundamentals and classical and emerging methods in inference and learning. The books are rich in worked-out examples, exercises, and links to data sets. Commentaries with historical background and contexts for the topics covered in each chapter are a special feature.' Mostafa Kaveh, University of Minnesota

'This is the first of a three-volume series covering from fundamentals to the many various methods in inference and learning from data. Professor Sayed is a prolific author of award-winning books and research papers who has himself contributed significantly to many of the topics included in the series. With his encyclopedic knowledge, his careful attention to detail, and in a very approachable style, this first volume covers the basics of matrix theory, probability and stochastic processes, convex and non-convex optimization, gradient-descent, convergence analysis, and several other advanced topics that will be needed for volume II (Inference) and volume III (Learning). This series, and in particular this volume, will be a must-have for educators, students, researchers, and technologists alike who are pursuing a systematic study, want a quick refresh, or may use it as a helpful reference to learn about these fundamentals.' José Moura, Carnegie Mellon University

'Volume I of Inference and Learning from Data provides a foundational treatment of one of the most topical aspects of contemporary signal and information processing, written by one of the most talented expositors in the field. It is a valuable resource both as a textbook for students wishing to enter the field and as a reference work for practicing engineers.' Vincent Poor, Princeton University

'Inference and Learning from Data, Vol. I: Foundations offers an insightful and well-integrated primer with just the right balance of everything that new graduate students need to put their research on a solid footing. It covers foundations in a modern way - emphasizing the most useful concepts, including proofs, and timely topics which are often missing from introductory graduate texts. All in one beautifully written textbook. An impressive feat! I highly recommend it.' Nikolaos Sidiropoulos, University of Virginia

'This exceptional encyclopedic work on learning from data will be the bible of the field for many years to come. Totaling more than 3000 pages, this three-volume book covers in an exhaustive and timely manner the topic of data science, which has become critically important to many areas and lies at the basis of modern signal processing, machine learning, artificial intelligence, and their numerous applications. Written by an authority in the field, the book is really unique in scale and breadth, and it will be an invaluable source of information for students, researchers, and practitioners alike.' Peter Stoica, Uppsala University

'Very meticulous, thorough, and timely. This volume is largely focused on optimization, which is so important in the modern-day world of data science, signal processing, and machine learning. The book is classical and modern at the same time - many classical topics are nicely linked to modern topics of current interest. All the necessary mathematical background is covered. Professor Sayed is one of the foremost researchers and educators in the field and the writing style is unhurried and clear with many examples, truly reflecting the towering scholar that he is. This volume is so complete that it can be used for self-study, as a classroom text, and as a timeless research reference.' P. P. Vaidyanathan, Caltech

'The book series is timely and indispensable. It is a unique companion for graduate students and early-career researchers. The three volumes provide an extraordinary breadth and depth of techniques and tools, and encapsulate the experience and expertise of a world-class expert in the field. The pedagogically crafted text is written lucidly, yet never compromises rigor. Theoretical concepts are enhanced with illustrative figures, well-thought problems, intuitive examples, datasets, and MATLAB codes that reinforce readers' learning.' Abdelhak Zoubir, TU Darmstadt

