EE 210A Course Overview

EE 210A Adaptation and Learning

This course is recommended for students following specializations in the signal processing, communications, or data science tracks. Students master a wide range of tools and concepts related to adaptation and learning from data through a methodical and in-depth treatment.

Description of the course

The course covers the fundamentals of inference and learning from data, with emphasis on adaptation and online learning mechanisms. Students learn the foundations of adaptive filtering and machine learning techniques in a unified treatment that brings out their commonalities and practical relevance. In particular, the course covers optimal and linear estimation methods; stochastic-gradient algorithms for optimization, adaptation, and learning; Bayes and naive Bayes classifiers; nearest-neighbor rules; self-organizing maps; decision trees; logistic regression; discriminant analysis; the Perceptron; support vector machines; kernel methods; bagging, boosting, and random forests; and cross-validation. The course works through several examples related to adaptation and learning, including channel estimation, channel equalization, echo cancellation, pattern classification, and other machine learning scenarios.

Background students will need

Although prerequisites are not enforced for graduate students, it is recommended that students have some familiarity with matrix theory, linear algebra, and random variables. Supplemental material is provided, and tools and concepts from these subjects are reviewed by the instructor as they arise for the benefit of the students.

About the instructor: Ali Sayed

Ali H. Sayed is professor and former chairman of electrical engineering at the University of California, Los Angeles, where he directs the UCLA Adaptive Systems Laboratory. An author of over 480 scholarly publications and six books, he conducts research in several areas, including adaptation and learning, network science, information processing theories, and biologically inspired designs. His work has been recognized with several awards, including the 2014 Athanasios Papoulis Award from the European Association for Signal Processing; the 2015 Education Award, the 2013 Meritorious Service Award, and the 2012 Technical Achievement Award from the IEEE Signal Processing Society; the 2005 Terman Award from the American Society for Engineering Education; the 2003 Kuwait Prize; and the 1996 IEEE Donald G. Fink Prize. He served as a 2005 Distinguished Lecturer of the IEEE Signal Processing Society and as Editor-in-Chief of the IEEE Transactions on Signal Processing (2003-2005). His articles received several best paper awards from the IEEE Signal Processing Society (2002, 2005, 2012, 2014). He is a Fellow of the IEEE and of the American Association for the Advancement of Science (AAAS), the publisher of the journal Science. He is recognized as a Highly Cited Researcher by Thomson Reuters and is serving as President-Elect of the IEEE Signal Processing Society (2016-2017).

TOPICS

Part A: Estimation Theory

Random Variables; Optimal Estimation; Linear Estimation; Regression; Linear Models; Modeling; Equalization; Design Examples.
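
For a concrete flavor of this estimation-theory material, the following minimal Python sketch recovers an unknown parameter vector from noisy linear measurements via least squares. The data model and all numerical values are hypothetical and chosen only for illustration; they are not taken from the course assignments.

    # Hypothetical sketch: least-squares estimation of w from d = H w + v.
    import numpy as np

    rng = np.random.default_rng(0)
    H = rng.standard_normal((200, 4))                  # data (regression) matrix
    w_true = np.array([1.0, -0.5, 2.0, 0.3])           # unknown parameter vector
    d = H @ w_true + 0.1 * rng.standard_normal(200)    # noisy observations

    # Least-squares estimate: w_hat = argmin_w ||d - H w||^2
    w_hat, *_ = np.linalg.lstsq(H, d, rcond=None)
    print("estimation error norm:", np.linalg.norm(w_hat - w_true))
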
Part B: Adaptation Theory

Gradient-Descent Algorithms; Stochastic-Gradient Algorithms; Recursive Least-Squares; Mean-Square-Error Performance; Tracking Performance; Transient Performance; Sub-gradient Learning; Proximal Learning.
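
For a taste of the adaptation algorithms listed above, here is a minimal Python sketch of a stochastic-gradient (LMS) filter that identifies an unknown system one sample at a time. The step size, noise level, and data model are illustrative assumptions only.

    # Hypothetical sketch: LMS (stochastic-gradient) adaptation for system identification.
    import numpy as np

    rng = np.random.default_rng(1)
    M, N, mu = 4, 5000, 0.01                   # filter order, number of samples, step size
    w_o = rng.standard_normal(M)               # unknown system to be identified
    w = np.zeros(M)                            # adaptive weight estimate

    for _ in range(N):
        u = rng.standard_normal(M)                     # regressor (input) vector
        d = u @ w_o + 0.05 * rng.standard_normal()     # desired (reference) signal
        e = d - u @ w                                  # a priori estimation error
        w = w + mu * u * e                             # LMS update
    print("weight error norm:", np.linalg.norm(w - w_o))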

Part C: Learning Theory

Learning and Generalization; Bayes Classifiers; Nearest-Neighbor (NN) Rules; Decision Trees; Risk Functions; Regularization; Sparsity; Logistic Regression; Discriminant Analysis (LDA, FDA); The Perceptron; Support Vector Machines (SVM); Kernel Methods; Bagging and Boosting; Random Forests; Cross-Validation; Principal Component Analysis (PCA).
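
As an illustration of one classifier from this list, the following Python sketch trains the Perceptron rule on synthetic, linearly separable data. The data-generating rule and the number of training passes are hypothetical choices for illustration only.

    # Hypothetical sketch: Perceptron training on synthetic +1/-1 labeled data.
    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.standard_normal((200, 2))
    y = np.sign(X @ np.array([2.0, -1.0]) + 0.5)   # ground-truth linear labeling rule

    w, b = np.zeros(2), 0.0
    for _ in range(20):                             # a few passes over the data
        for x_i, y_i in zip(X, y):
            if y_i * (x_i @ w + b) <= 0:            # misclassified sample
                w += y_i * x_i                      # Perceptron update
                b += y_i
    print("training accuracy:", np.mean(np.sign(X @ w + b) == y))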

Part D: Experimentation and Projects (selected from):

Adaptive Channel Estimation; Adaptive Channel Equalization; Adaptive Echo Cancellation; SVM Learning Machines; Boosting and Cross-Validation; Discriminant Analysis; Deep Learning; Convolutional Networks.