Syllabus MAI391 Sp24


No  Title            Details

1   Document type    SYLLABUS
2   Program          UNDERGRADUATE PROGRAM
3   Decision No.     378/QĐ-ĐHFPT
4   Course Name      Mathematics for Machine Learning
5   Course Code      MAI391
6   No of credits    3
7   Degree Level     Bachelor
8   Time Allocation  Contact time: 30 sessions; 1 session = 90'. Lectures: 30.
9   Pre-requisite    MAE101
10  Description      This course introduces the mathematical concepts and foundations needed to talk
                     about the three main components of a machine learning system: data, models, and
                     learning. Upon completing this course, students will be able to understand:
                     • Fundamental concepts about matrices and matrix decompositions.
                     • Concepts of gradients.
                     • The basics of probability and some distributions.
                     • Optimization to find maxima/minima of functions.
                     • Dimensionality reduction using principal component analysis.
                     • Classification in the context of support vector machines.
                     Method of teaching and learning: lecture- and project-based learning.
11  Student's task   - Students must attend more than 80% of contact slots in order to be accepted to the
                       final examination.
                     - Students are responsible for doing all exercises and assignments given by the
                       instructor in class or at home.
                     - Use laptops in class only for learning purposes.
                     - Promptly access the FU CMS at http://cms.fpt.edu.vn for up-to-date course
                       information.
12  Tools            - Internet
                     - Python
13  Note
14  Min GPA to pass  5
15  Scoring scale    10
16  Approved date    4/2/2021

05.02-BM/ĐH/HDCV/FE 1/0
No  Material                           Purpose    Author / Source                                Publisher / Date / Edition
1   Mathematics for Machine Learning   textbook   M. P. Deisenroth, A. Faisal, Ch. S. Ong        Cambridge University Press, 2020, 1st
2   PDF Slides                                    Instructors
3   Machine Learning cơ bản            reference  Vũ Hữu Tiệp                                    https://github.com/tiepvupsu/ebookMLCB/blob/master/book_ML.pdf
4   Multivariate Calculus              reference  Denis Auroux, MIT OpenCourseWare               https://ocw.mit.edu/courses/mathematics/18-02sc-multivariable-calculus-fall-2010/index.htm
5   Mathematics for machine learning   reference  Mirror Neuron channel                          https://www.youtube.com/playlist?list=PLcQCwsZDEzFmlSc6levE3UV9rZ8yY-D_7
6   Mathematics for Machine Learning   reference  Digital Learning Hub - Imperial College London channel  https://www.youtube.com/channel/UCSzae1ITUdw9DCdELMduaQw/playlists
7   Linear Algebra Done Right          reference  Sheldon Axler                                  Springer, 2015, 3rd
8   Machine Learning                   reference  Andrew Ng, Stanford channel                    https://www.youtube.com/watch?v=UzxYlbK2c7E&list=PLA89DCFA6ADACE599&index=1
9   Mathematics for Machine Learning   reference  Coursera                                       https://www.coursera.org/specializations/mathematics-machine-learning

No  LO    Name
    LO 0  Analytic Geometry
1   LO 1  Matrix Decomposition
2   LO 2  Vector Calculus
3   LO 3  Probability and Distributions
4   LO 4  Continuous Optimization
5   LO 5  Linear Regression
6   LO 6  Dimensionality Reduction with PCA
7   LO 7  Classification with SVM
8   LO 8  Attitude and soft skills
Analytic Geometry
LO Description
LO 0.1 Define a norm and an inner product on a vector space
LO 0.2 Determine bilinear mapping and check if a square matrix is positive definite.
LO 0.3 Find the length of a vector as well as the distance between two vectors in a vector space by using inner product.
LO 0.4 Find the angle between two vectors and check if they are orthogonal. Check if a square matrix is orthogonal.
LO 0.5 Check if a set of vectors is an orthogonal or an orthonormal basis of a given vector space. Determine an orthogonal complement of a subspace.
LO 0.6 Construct the projection onto a subspace.
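The Chapter 3 computations behind these outcomes can be sketched in a few lines of NumPy (an illustrative sketch only; the syllabus specifies Python but not a particular library, and the vectors here are made up):

```python
import numpy as np

# Illustrative vectors; x . y = 1*2 + 2*(-2) + 2*1 = 0, so they are orthogonal.
x = np.array([1.0, 2.0, 2.0])
y = np.array([2.0, -2.0, 1.0])

norm_x = np.linalg.norm(x)           # length of x via the Euclidean inner product
dist = np.linalg.norm(x - y)         # distance between x and y
cos_angle = x @ y / (np.linalg.norm(x) * np.linalg.norm(y))
orthogonal = np.isclose(x @ y, 0.0)  # True for this pair

# Orthogonal projection of b onto the column space of A (LO 0.6):
# pi(b) = A (A^T A)^-1 A^T b
A = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # basis of the xy-plane in R^3
b = np.array([3.0, 4.0, 5.0])
proj = A @ np.linalg.solve(A.T @ A, A.T @ b)        # projection is [3, 4, 0]
```

The same projection formula works for any subspace given as the column space of a full-rank matrix A.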

Matrix Decomposition
LO 1.1 Compute the determinant of a square matrix; decide when a square matrix is invertible by equivalent conditions. Find the characteristic polynomial, eigenvalues and eigenvectors of a matrix. Understand some theorems concerning eigenvalues.
LO 1.2 Diagonalize a matrix by eigendecomposition. Decompose a matrix by singular value decomposition.
LO 1.3 Compute matrix approximations.
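These outcomes can be sketched with NumPy on small illustrative matrices (the matrices below are made up for the example, not taken from the textbook):

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])
det = np.linalg.det(A)         # 4*3 - 1*2 = 10; nonzero, so A is invertible
eigvals, P = np.linalg.eig(A)  # eigenvalues 5 and 2, roots of x^2 - 7x + 10

# Eigendecomposition (LO 1.2): A = P D P^-1 with D diagonal
D = np.diag(eigvals)
A_rebuilt = P @ D @ np.linalg.inv(P)

# Singular value decomposition and the best rank-1 approximation (LO 1.3)
M = np.array([[3.0, 0.0], [0.0, 1.0]])
U, s, Vt = np.linalg.svd(M)
M1 = s[0] * np.outer(U[:, 0], Vt[0])  # keeps only the largest singular value
```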

Vector Calculus
LO 2.1 Revisit differentiation of a univariate function. Find the Taylor expansion of a smooth function at a given point.
LO 2.2 Find partial derivatives using basic formulas and the chain rule, and gradients of multivariate functions.
LO 2.3 Find gradients and Jacobian matrices of vector-valued functions.
LO 2.4 Take gradients of matrices with respect to vectors.
LO 2.5 Understand the idea of backpropagation in deep learning.
LO 2.6 Find higher-order partial derivatives, the linearization and the multivariate Taylor series of multivariate functions.
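A gradient computed by the basic rules (LO 2.2) can be checked numerically with central differences; a sketch on a made-up function f(x, y) = x²y + y³, assuming NumPy:

```python
import numpy as np

def f(v):
    # f(x, y) = x^2 * y + y^3, an illustrative smooth function
    x, y = v
    return x**2 * y + y**3

def grad_f(v):
    # Analytic gradient from the basic rules: (2xy, x^2 + 3y^2)
    x, y = v
    return np.array([2 * x * y, x**2 + 3 * y**2])

def numerical_gradient(func, v, h=1e-6):
    # Central differences approximate each partial derivative.
    g = np.zeros_like(v)
    for i in range(len(v)):
        e = np.zeros_like(v)
        e[i] = h
        g[i] = (func(v + e) - func(v - e)) / (2 * h)
    return g

v0 = np.array([1.0, 2.0])  # analytic gradient at (1, 2) is (4, 13)
```

The same finite-difference check extends componentwise to Jacobians of vector-valued functions (LO 2.3).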

Probability and Distributions


LO 3.1 Grasp the notions of discrete and continuous random variables; find the probability mass function, the probability density function, the cumulative distribution function and their properties.
LO 3.2 Calculate conditional probabilities; use the multiplication rule, the total probability rule and Bayes' theorem to find probabilities.
LO 3.3 Compute the mean and variance of a random variable; compute the covariance and the correlation between two variables.
LO 3.4 Determine whether two variables are independent.
LO 3.5 Describe a Gaussian distribution, and the marginals, conditionals, sums and products of Gaussian distributions.
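The total probability rule and Bayes' theorem (LO 3.2), and the mean and variance of a discrete random variable (LO 3.3), can be sketched with made-up numbers (a standard diagnostic-test illustration, not taken from the textbook):

```python
import numpy as np

# Bayes' theorem: P(D|+) = P(+|D) P(D) / P(+),
# with P(+) obtained from the total probability rule.
p_d = 0.01                   # prior P(D)
p_pos_given_d = 0.95         # P(+|D)
p_pos_given_not_d = 0.05     # P(+|not D)
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)
p_d_given_pos = p_pos_given_d * p_d / p_pos   # posterior, roughly 0.161

# Mean and variance of a discrete random variable
values = np.array([0.0, 1.0, 2.0])
probs = np.array([0.2, 0.5, 0.3])
mean = values @ probs                 # E[X] = 1.1
var = probs @ (values - mean) ** 2    # Var[X] = 0.49
```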

Continuous Optimization
LO 4.1 Find a local minimum of a function using gradient descent.
LO 4.2 Use Lagrange multipliers to find the maximum/minimum value of a multivariate function with constraints.
LO 4.3 Understand convex optimization problems and solve linear programs.
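Gradient descent (LO 4.1) in its simplest form, on a toy one-dimensional objective (the step size and iteration count below are illustrative choices):

```python
# Gradient descent on f(x) = (x - 3)^2, whose gradient is 2(x - 3).
x = 0.0
step_size = 0.1
for _ in range(200):
    x = x - step_size * 2 * (x - 3)
# x is now very close to the minimiser x* = 3
```

Each step moves x against the gradient; for this convex objective the iterates contract toward the minimiser by a factor of 0.8 per step.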
Linear Regression
LO 5.1 Use maximum likelihood and maximum a posteriori (MAP) estimation to find optimal model parameters.
LO 5.2 Use these parameter estimates to take a brief look at generalization error and overfitting.
LO 5.3 Discuss Bayesian linear regression.
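For linear regression with Gaussian noise, maximum likelihood estimation (LO 5.1) reduces to least squares, θ_ML = (XᵀX)⁻¹Xᵀy. A sketch on synthetic data (the data are noiseless here so the estimate recovers the generating parameters exactly):

```python
import numpy as np

# Synthetic regression data: a bias column plus one feature.
rng = np.random.default_rng(0)
features = rng.uniform(-1.0, 1.0, size=50)
X = np.column_stack([np.ones(50), features])
theta_true = np.array([2.0, -1.0])
y = X @ theta_true                              # noiseless targets

# Maximum likelihood / least-squares estimate via the normal equations.
theta_ml = np.linalg.solve(X.T @ X, X.T @ y)    # recovers [2, -1]
```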
Dimensionality Reduction with PCA
LO 6.1 Compute the data covariance matrix of a data set; construct the m-dimensional subspace with maximum variance.
LO 6.2 Find a basis of the principal subspace.
LO 6.3 Use low-rank matrix approximation to approximate PCA.
LO 6.4 Discuss PCA in high dimensions and the key steps of PCA.
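The key steps of PCA (centre, covariance, eigendecompose, project) can be sketched on a toy data set; the data below are constructed to be exactly rank one so the reconstruction from the principal subspace is exact:

```python
import numpy as np

rng = np.random.default_rng(1)
codes = rng.normal(size=(100, 1))
X = codes @ np.array([[2.0, 1.0]])    # 2-D points on the line span{(2, 1)}
X = X - X.mean(axis=0)                # step 1: centre the data

S = X.T @ X / len(X)                  # step 2: data covariance matrix
eigvals, eigvecs = np.linalg.eigh(S)  # step 3: eigendecomposition (ascending)
B = eigvecs[:, [-1]]                  # basis of the 1-D principal subspace
Z = X @ B                             # step 4: project to 1-D codes
X_reconstructed = Z @ B.T             # exact here, since the data are rank one
```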

05.02-BM/ĐH/HDCV/FE 1/0
Classification with SVM
LO 7.1 Understand binary classification by a separating hyperplane.
LO 7.2 Grasp the mathematical ideas behind support vector machines.
LO 7.3 Understand the advantages of the dual support vector machine.
LO 7.4 Get introduced to non-linear classifiers via kernels.
LO 7.5 Consider two different approaches for finding the optimal solution of the SVM.
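A primal soft-margin SVM (LO 7.2) can be sketched as subgradient descent on the regularised hinge loss; the toy data and hyperparameters below are illustrative, and this is one of the numerical approaches rather than the textbook's only method:

```python
import numpy as np

# Toy linearly separable data: two positives, two negatives.
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

w = np.zeros(2)
b = 0.0
reg, lr = 0.01, 0.1
for _ in range(500):
    margins = y * (X @ w + b)
    active = margins < 1    # points inside or on the wrong side of the margin
    # Subgradient of (reg/2)||w||^2 + mean(max(0, 1 - y (w.x + b)))
    grad_w = reg * w - (y[active, None] * X[active]).sum(axis=0) / len(X)
    grad_b = -y[active].sum() / len(X)
    w -= lr * grad_w
    b -= lr * grad_b

predictions = np.sign(X @ w + b)   # agrees with y on this toy data
```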
Attitude and soft skills
LO 8 Develop teamwork skills and self-discipline.

Session No | Content | Topics | LO | ITU | Student's materials | Teacher's material | Student's task

Chapter 3. Analytic Geometry (sessions 1-4, CLO0)
1. 3.1 Norm; 3.2 Inner Products (T, U)
2. 3.3 Lengths and Distances; 3.4 Angles and Orthogonality (T, U)
3. 3.5 Orthonormal Basis; 3.6 Orthogonal Complement (T, U)
4. 3.8 Orthogonal Projections (T, U)
Student's materials: Textbook and Slides. Teacher's material: Textbook Chapter 3. Student's task: Read Chapter 2; Exercises Chapter 3.

Chapter 4. Matrix Decompositions (sessions 5-10, CLO1)
5. 4.1 Determinants and Traces (T, U)
6. 4.2 Eigenvalues and Eigenvectors (T, U)
7. 4.4 Eigendecomposition and Diagonalization (T, U)
8. 4.5 Singular Value Decomposition (T, U)
9. 4.5 Singular Value Decomposition (cont.) (T)
10. 4.6 Matrix Approximation (T)
Student's materials: Textbook and Slides. Teacher's material: Textbook Chapter 4. Student's task: Exercises Chapter 4.

Chapter 5. Vector Calculus (sessions 11-18, CLO2)
11. 5.1 Differentiation of Univariate Functions (T)
12. 5.2 Partial Differentiation and Gradients (T, U)
13. 5.2 Partial Differentiation and Gradients (cont.) (T, U)
14. 5.3 Gradients of Vector-Valued Functions (T, U)
15. 5.3 Gradients of Vector-Valued Functions (cont.) (T, U)
16. 5.6 Backpropagation and Automatic Differentiation (T)
17. 5.7 Higher-Order Derivatives (T, U)
18. 5.8 Linearization and Multivariate Taylor Series (T, U)
Student's materials: Textbook and Slides. Teacher's material: Textbook Chapter 5. Student's task: Exercises Chapter 5.
19. Review (CLO0, CLO1, CLO2, CLO8; U)
20. Progress Test 1 (CLO0, CLO1, CLO2, CLO8). Student's task: Submit Assignment 1.
Chapter 6. Probability and Distributions (sessions 21-28, CLO3)
21. 6.1 Construction of a Probability Space (T)
22. 6.2 Discrete and Continuous Probabilities (T, U)
23. 6.3 Sum Rule, Product Rule, and Bayes' Theorem (T, U)
24. 6.3 Sum Rule, Product Rule, and Bayes' Theorem (cont.) (T, U)
25. 6.4 Summary Statistics and Independence (T, U)
26. 6.4 Summary Statistics and Independence (cont.) (T, U)
27. 6.5 Gaussian Distribution (T)
28. 6.5 Gaussian Distribution (cont.) (T, U)
Student's materials: Textbook and Slides. Teacher's material: Textbook Chapter 6. Student's task: Exercises Chapter 6.

Chapter 7. Continuous Optimization (sessions 29-34, CLO4)
29. 7.1 Optimization Using Gradient Descent (T, U)
30. 7.2 Constrained Optimization and Lagrange Multipliers (T)
31. 7.2 Constrained Optimization and Lagrange Multipliers (cont.) (T)
32. 7.3 Convex Optimization (T)
33. 7.3 Convex Optimization (cont.) (T, U)
34. 7.3 Convex Optimization (cont.) (T, U)
Student's materials: Textbook and Slides. Teacher's material: Textbook Chapter 7. Student's task: Exercises Chapter 7.
35. Review (CLO3, CLO4, CLO8; U)
36. Progress Test 2 (CLO3, CLO4, CLO8). Student's task: Submit Assignment 2.

Chapter 9. Linear Regression (sessions 37-42, CLO5)
37. 9.1 Problem Formulation (T)
38. 9.1 Problem Formulation (cont.) (T)
39. 9.2 Parameter Estimation (T)
40. 9.2 Parameter Estimation (cont.) (T, U)
41. Optional: 9.3 Bayesian Linear Regression (T)
42. Optional: 9.4 Maximum Likelihood as Orthogonal Projection
Student's materials: Textbook and Slides. Teacher's material: Textbook Chapter 9. Student's task: Linear Regression in Python: read and code.
Chapter 10. Dimensionality Reduction with Principal Component Analysis (sessions 43-50, CLO6)
43. 10.1 Problem Setting (T)
44. 10.2 Maximum Variance Perspective (T)
45. 10.2 Maximum Variance Perspective (cont.) (T, U)
46. 10.6 Key Steps of PCA in Practice (T, U)
47. 10.6 Key Steps of PCA in Practice (cont.) (T, U)
48-50. Optional: 10.3 Projection Perspective; 10.4 Eigenvector Computation and Low-Rank Approximations; 10.5 PCA in High Dimensions (T)
Student's materials: Textbook and Slides. Teacher's material: Textbook Chapter 10. Student's task: Write code that represents images as vectors; Submit Assignment 3.

Chapter 12. Classification with Support Vector Machines (sessions 51-58, CLO7)
51. 12.1 Separating Hyperplanes (T)
52. 12.2 Primal Support Vector Machine (T)
53. 12.2 Primal Support Vector Machine (cont.) (T, U)
54. 12.2 Primal Support Vector Machine (cont.) (T)
55. 12.3 Dual Support Vector Machine (T)
56. 12.3 Dual Support Vector Machine (cont.) (T, U)
57. Optional: 12.4 Kernels (T)
58. 12.5 Numerical Solution (T)
Student's materials: Textbook and Slides. Teacher's material: Textbook Chapter 12. Student's task: Create a Support Vector Machine in Python.
59. Presentation of Computer Project (CLO5, CLO6, CLO7, CLO8; U)
60. Presentation of Computer Project (CLO5, CLO6, CLO7, CLO8; U)


# | Assessment Component | Type | Weight | Number of parts | Minimum value to meet Completion Criteria | Duration | LO | Type of questions | Number of questions | Scope of knowledge and skill

1. Assignments (Assignment). Weight: 30. Parts: 3. Minimum value to meet completion criteria: 0.0001. Duration: at home. LO: Assignment 1: LO 0, LO 1, LO 2, LO 8; Assignment 2: LO 3, LO 4, LO 8; Assignment 3: LO 5, LO 8. Type of questions: open; 5-10 questions. Scope: Assignment 1 covers chapters 3, 4, 5; Assignment 2 covers chapters 6, 7; Assignment 3 covers chapter 9.
2. Progress tests (Progress Test). Weight: 20. Parts: 2. Minimum value to meet completion criteria: 0.0001. Duration: 30'. LO: Progress Test 1: LO 0, LO 1, LO 2; Progress Test 2: LO 3, LO 4. Type of questions: multiple choice; 12 questions. Scope: Test 1 covers chapters 3, 4, 5; Test 2 covers chapters 6, 7.
3. Computer Project (Computer Project). Weight: 20. Parts: 1. Minimum value to meet completion criteria: 0.0001. Duration: at home. LO: LO 5, LO 6, LO 7, LO 8. Type of questions: open. Scope: covers chapters 10, 12.
4. Final exam (Final exam). Weight: 30. Parts: 1. Minimum value to meet completion criteria: 4. Duration: 60'. LO: LO 0, LO 1, LO 2, LO 3, LO 4. Type of questions: multiple choice, 40 questions, marked by computer. Scope: selected chapters.
How? | Note

1. Assignments: guided by the instructor in class, completed by students at home, submitted by the deadline.
2. Progress tests: administered by the instructor, by suitable means (computer, paper, oral). Note: Instructions and schedules for progress tests must be presented in the Course Implementation Plan approved by the director of the campus. A progress test must be taken right after the last lectures of the required material. The instructor is responsible for reviewing the test with students after it is graded.
3. Computer Project: assessed by the instructor. Note: Presented by groups of students.
4. Final exam: administered by the exam board, using computers. Note: The exam questions must be updated or at least 70% different from the previous ones.
