MATS2900 Mathematics of machine learning (5 cr)

Study level:
Advanced studies
Grading scale:
0-5
Language:
English
Responsible organisation:
Department of Mathematics and Statistics
Curriculum periods:
2024-2025, 2025-2026, 2026-2027, 2027-2028

Description

-

Learning outcomes

After taking the course, the student will understand the basic mathematical principles behind machine learning. What does probably approximately correct (PAC) learning mean? Which hypothesis classes are PAC learnable, and how is this related to the Vapnik-Chervonenkis (VC) dimension of the hypothesis class? As the main special case of the theory, we will treat the classes of perceptrons and neural networks. What is the VC dimension of neural networks? What functions can be represented by neural networks? What is the (stochastic) gradient descent algorithm, and when does it work?
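
As a brief orientation to the first question above, the standard definition of PAC learnability in the realizable setting, as given in the book by Shalev-Shwartz and Ben-David that the lecture notes follow, can be sketched as follows; the exact formulation used in the course may differ in details. A hypothesis class $\mathcal{H}$ is PAC learnable if there exist a sample-complexity function $m_{\mathcal{H}} : (0,1)^2 \to \mathbb{N}$ and a learning algorithm $A$ such that for every accuracy $\varepsilon \in (0,1)$, every confidence $\delta \in (0,1)$, and every distribution $\mathcal{D}$ realizable by $\mathcal{H}$,
\[
\Pr_{S \sim \mathcal{D}^m}\bigl( L_{\mathcal{D}}(A(S)) \le \varepsilon \bigr) \ge 1 - \delta
\qquad \text{whenever } m \ge m_{\mathcal{H}}(\varepsilon, \delta),
\]
where $L_{\mathcal{D}}(h)$ denotes the true error of the hypothesis $h$ under $\mathcal{D}$.

For the last question, the following minimal Python sketch illustrates plain stochastic gradient descent on a least-squares objective. The objective, learning rate, and number of passes are illustrative assumptions, not part of the course material.

import numpy as np

def sgd_least_squares(X, y, lr=0.01, epochs=100, seed=0):
    """Minimal SGD for linear least squares: minimize the mean of (x . w - y)^2."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):                  # one pass over the shuffled data
            grad = 2.0 * (X[i] @ w - y[i]) * X[i]     # gradient of the single-example loss
            w -= lr * grad                            # stochastic gradient step
    return w

# Tiny usage example with synthetic data (hypothetical, for illustration only)
X = np.random.randn(200, 3)
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * np.random.randn(200)
print(sgd_least_squares(X, y))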

Description of prerequisites

The course assumes familiarity with, and routine use of, the notions of probability, expectation, and basic linear algebra. These can be obtained from the following courses:

- Foundations of stochastics, Probability theory 1

- Measure and Integration Theory 1

- Linear algebra and geometry 1 and 2

- Vector analysis 1

Study materials

Preliminary lecture notes are available at

https://drive.google.com/file/d/1lQsGGwV-hJZnmN1jr1e6ouQD_uA8zv4X/view

These lecture notes are largely based on the book "Understanding Machine Learning" by Shai Shalev-Shwartz and Shai Ben-David. A version of the book is (at the time of writing) freely available at the first author's website. 

Completion methods

Method 1

Evaluation criteria:
The grade is based on exercise points and exam points.

Method 2

Evaluation criteria:
The grade is based on the points from the final exam.
Parts of the completion methods

Participation in teaching (5 cr)

Type:
Participation in teaching
Grading scale:
0-5
Language:
English

Teaching


Exam (5 cr)

Type:
Exam
Grading scale:
0-5
Language:
English
No published teaching