**Shai Ben-David, Professor at the University of Waterloo, taught a machine learning course of 23 lectures (CS 485/685), beginning on January 14, 2015.**

*Machine learning is the science of getting computers to act without being explicitly programmed. In the past decade, machine learning has given us self-driving cars, practical speech recognition, effective web search, and a vastly improved understanding of the human genome. Machine learning is so pervasive today that you probably use it dozens of times a day without knowing it. Many researchers also think it is the best way to make progress towards human-level AI.*

Shai Ben-David holds a PhD in mathematics from the Hebrew University of Jerusalem. He has held postdoctoral positions at the University of Toronto in both the Mathematics and CS departments. He was a professor of computer science at the Technion in Haifa, Israel. Ben-David has held visiting positions at the Australian National University and Cornell University, and since 2004 has been a professor of computer science at the University of Waterloo in Canada.

His research interests are in CS theory and machine learning. He has been program chair for COLT and ALT (the two main annual learning-theory conferences), and has also served long terms on their steering committees. Ben-David has also been an area chair and senior program committee member for ICML and NIPS (the two major general machine learning conferences).

**Machine Learning Course- Shai Ben-David: Lecture 1**

Introduction: what is machine learning? An outline of the course.

**Machine Learning Course- Shai Ben-David: Lecture 2**

First formal learnability theorem: assuming realizability, ERM (Empirical Risk Minimization) is guaranteed to succeed over finite classes.
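The ERM rule itself is simple to state: pick any hypothesis in the class that minimizes error on the training sample. A minimal sketch, where the finite class of threshold classifiers, the helper names (`erm`, `empirical_error`), and the data-generating setup are all illustrative assumptions rather than taken from the lecture:

```python
import random

# Illustrative finite class H: 11 threshold classifiers on [0, 1].
H = [lambda x, t=t: int(x >= t) for t in [i / 10 for i in range(11)]]

def empirical_error(h, sample):
    """Fraction of labeled examples (x, y) that h misclassifies."""
    return sum(h(x) != y for x, y in sample) / len(sample)

def erm(H, sample):
    """Empirical Risk Minimization: return a hypothesis in the finite
    class H with minimal error on the training sample."""
    return min(H, key=lambda h: empirical_error(h, sample))

# Realizable setting: labels come from the true threshold 0.5, which is in H,
# so ERM drives the empirical error to zero.
random.seed(0)
sample = [(x, int(x >= 0.5)) for x in (random.random() for _ in range(200))]
h_hat = erm(H, sample)
print(empirical_error(h_hat, sample))  # 0.0 under realizability
```

The theorem then says that for a finite class, with enough samples, low empirical error also implies low true error with high probability.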

**Machine Learning Course- Shai Ben-David: Lecture 3**

The basic definitions of learnability of a class H: PAC learnability and Agnostic PAC learnability.

**Machine Learning Course- Shai Ben-David: Lecture 4**

Extensions of the definition of PAC learnability to various more realistic scenarios. The basic notion of representative samples.

**Machine Learning Course- Shai Ben-David: Lecture 5 (by Mohammad-Hassan Zokaei Ashtiani)**

Proving that every finite class is Agnostically PAC learnable.

**Machine Learning Course- Shai Ben-David: Lecture 6 (by Mohammad-Hassan Zokaei Ashtiani)**

Learnability of the class of threshold functions, and the No-Free-Lunch theorem.

**Machine Learning Course- Shai Ben-David: Lecture 7**

Introducing the VC-dimension.
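The VC dimension rests on the notion of shattering: a class H shatters a set of points if it can realize every possible labeling of them. A small sketch of this check, using threshold functions as the example class (the `shatters` helper and the particular thresholds are illustrative assumptions, not from the lecture):

```python
def shatters(H, points):
    """True iff the class H realizes every one of the 2^n labelings
    of the given points."""
    labelings = {tuple(h(x) for x in points) for h in H}
    return len(labelings) == 2 ** len(points)

# Threshold functions on the reals: h_t(x) = 1 iff x >= t.
thresholds = [lambda x, t=t: int(x >= t) for t in [-1, 0.5, 1.5, 2.5, 10]]

print(shatters(thresholds, [1]))     # True: a single point gets both labels
print(shatters(thresholds, [1, 2]))  # False: the labeling (1, 0) is impossible
```

Since some single point is shattered but no pair of points is, the VC dimension of threshold functions is 1.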

**Machine Learning Course- Shai Ben-David: Lecture 8**

The relationship of VC dimension and learning: Statement of the “fundamental theorem of statistical machine learning” and a proof of the non-learnability of classes of infinite VC-dimension.

**Machine Learning Course- Shai Ben-David: Lecture 9**

The VC dimension of linear predictors, and the quantitative version of the fundamental theorem (how the VC dimension determines the sample complexity of learning).

**Machine Learning Course- Shai Ben-David: Lecture 10**

Lower bounding the sample complexity of learning in terms of the VC-dimension and the desired accuracy.

**Machine Learning Course- Shai Ben-David: Lecture 11**

The Sauer Lemma: Proof and its relevance to sample complexity.
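The Sauer lemma bounds the growth function: a class of VC dimension d induces at most Σ_{i≤d} C(m, i) distinct labelings on any m points. A quick numeric sketch for the threshold class from earlier lectures (the helper names are illustrative; the exact growth m + 1 for thresholds is a standard fact):

```python
from math import comb

def sauer_bound(m, d):
    """Sauer-Shelah bound: a class of VC dimension d induces at most
    sum_{i=0}^{d} C(m, i) distinct labelings on m points."""
    return sum(comb(m, i) for i in range(d + 1))

def growth_thresholds(m):
    """Exact growth function for thresholds h_t(x) = 1[x >= t]: on m
    sorted points the labelings are all-zeros plus m suffix patterns."""
    return m + 1

# Thresholds have VC dimension 1, and their growth meets the bound exactly.
for m in (1, 5, 10):
    assert growth_thresholds(m) <= sauer_bound(m, 1)
print(sauer_bound(10, 1))  # 11: matches the exact growth m + 1
```

The key consequence for sample complexity is that the growth is polynomial in m (degree d) rather than the trivial 2^m, which feeds directly into the uniform convergence bounds.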

**Machine Learning Course- Shai Ben-David: Lecture 12**

A more realistic notion: non-uniform learnability.

**Machine Learning Course- Shai Ben-David: Lecture 13**

VC-dimension-based characterization of non-uniform learnability.

**Machine Learning Course- Shai Ben-David: Lecture 14**

VC-dimension-based characterization of non-uniform learnability (continued).

**Machine Learning Course- Shai Ben-David: Lecture 15**

The Consistency version of learning, and defining the computational complexity of learning.

**Machine Learning Course- Shai Ben-David: Lecture 16**

Computational complexity: examples of (provably) computationally efficient learning.

**Machine Learning Course- Shai Ben-David: Lecture 17**

**Machine Learning Course- Shai Ben-David: Lecture 18**

The AdaBoost algorithm

**Machine Learning Course- Shai Ben-David: Lecture 19**

Analyzing and applying AdaBoost
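The core AdaBoost loop is short: maintain a distribution over examples, repeatedly pick the weak hypothesis with smallest weighted error, weight it by α = ½ ln((1−ε)/ε), and re-weight the examples. A minimal sketch over 1-D decision stumps; the toy data set and names (`stump`, `adaboost`) are illustrative assumptions, not from the lecture:

```python
import math

def stump(t, s):
    """Decision stump predicting s for x >= t and -s otherwise, labels in {-1, +1}."""
    return lambda x: s if x >= t else -s

def adaboost(data, base, rounds):
    m = len(data)
    D = [1.0 / m] * m            # distribution over examples, initially uniform
    ensemble = []                # list of (alpha, hypothesis) pairs
    for _ in range(rounds):
        # choose the base hypothesis with smallest weighted error under D
        eps, h = min(((sum(d for d, (x, y) in zip(D, data) if h(x) != y), h)
                      for h in base), key=lambda p: p[0])
        if not 0.0 < eps < 0.5:  # perfect or useless weak learner: stop
            break
        alpha = 0.5 * math.log((1.0 - eps) / eps)
        ensemble.append((alpha, h))
        # up-weight misclassified examples, down-weight correct ones, normalize
        D = [d * math.exp(-alpha * y * h(x)) for d, (x, y) in zip(D, data)]
        Z = sum(D)
        D = [d / Z for d in D]
    return lambda x: 1 if sum(a * g(x) for a, g in ensemble) >= 0 else -1

# Alternating labels that no single stump fits, but a 3-stump ensemble does.
data = [(0, -1), (1, 1), (2, -1), (3, 1)]
base = [stump(t, s) for t in (0.5, 1.5, 2.5) for s in (1, -1)]
H = adaboost(data, base, rounds=5)
print([H(x) for x, _ in data])  # [-1, 1, -1, 1]
```

The point of the toy data is that each weak learner alone has error 1/4 at best, yet the weighted vote over a few stumps classifies the whole sample correctly, illustrating the exponential drop of training error analyzed in the lecture.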

**Machine Learning Course- Shai Ben-David: Lecture 20**

Convexity of sets and functions

**Machine Learning Course- Shai Ben-David: Lecture 21**

Convex optimization problems, learning, and convex loss functions.

**Machine Learning Course- Shai Ben-David: Lecture 22**

**Machine Learning Course- Shai Ben-David: Lecture 23**

Last lecture: Support Vector Machines.

**Source: Understanding Machine Learning – Shai Ben-David**