Mathematics for Data Science 2 (2018)

Click Here for the Semester 2, 2020 version of the course.
Welcome to MATH7502. This course is part of the Master of Data Science program at the University of Queensland.

The prerequisite for the course is knowledge comparable to that of MATH7501. This includes basic discrete mathematics, calculus and elementary manipulation of matrices. Feel free to use the MATH7501 course reader to brush up as needed.

The current course, MATH7502, is a linear algebra foundations course focusing on data-science applications, with an emphasis on numerical computation via software. It also covers a few basic elements of multi-variable calculus and continuous optimization. At the end of the course students will possess a strong mathematical foundation allowing understanding and execution of activities such as these:
  1. Understanding how basic clustering algorithms work.
  2. Solving linear systems of equations.
  3. Using Jacobians for solving smooth non-linear systems of equations by iteration.
  4. Formulating optimality conditions for unconstrained and constrained optimization of smooth multi-variable functions.
  5. Using and understanding least squares approximations and generalizations.
  6. Modelling evolution of linear systems over time and understanding the role of eigenvalues in such evolution.
  7. Understanding linear transformations of multi-variate normal distributions.
  8. Understanding the operation of Principal Component Analysis (PCA).
  9. Understanding the use of singular value decomposition as used for lossy data compression.
  10. Understanding the mathematics of the gradient-descent, Gauss-Newton and Levenberg-Marquardt non-linear optimization methods.
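As a small computational taste of items 4 and 10, here is a minimal sketch of gradient descent on a smooth quadratic. It is written in Python (one of the accepted software alternatives to Julia); the objective function, starting point and step size are all invented for illustration:

```python
# Gradient descent on f(x, y) = (x - 1)^2 + 2*(y + 2)^2, minimised at (1, -2).
x, y = 0.0, 0.0   # arbitrary starting point
step = 0.1        # fixed step size, small enough for this quadratic
for _ in range(200):
    gx, gy = 2 * (x - 1), 4 * (y + 2)     # gradient of f at (x, y)
    x, y = x - step * gx, y - step * gy   # step in the direction of steepest descent
# (x, y) is now very close to the minimiser (1, -2)
```

Each coordinate's error shrinks by a constant factor per iteration here; choosing the step size for general smooth functions is one of the topics the course builds towards.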
As a motivating example for the use of linear algebra in data science and machine learning, see this recent talk presented at UQ by Prof. Stephen Wright (UW-Madison).

In addition to the final exam, the course assessment includes 5 homework assignments.

Homework assignments are to be submitted individually, with each student submitting a unique assignment (copying assignments will not be tolerated). Nevertheless, students are encouraged to collaborate and discuss the homework assignments in an open and constructive manner. Sharing ideas, helping each other and jointly working towards a goal is great.

The course is coordinated by Yoni Nazarathy (y.nazarathy@uq.edu.au) and the tutor is Maria Kleshnina (m.kleshnina@uq.edu.au). The homework assignments were created with the help of Zhihao Qiao.

The course is mostly (but not solely) taught in "flipped mode". It relies on materials from the following:
  1. [VMLS] The book: Introduction to Applied Linear Algebra: Vectors, Matrices, and Least Squares (2018) by Stephen Boyd and Lieven Vandenberghe. You can use the free online version or you can order the book.

  2. [STRANG] The book: Introduction to Linear Algebra, Fifth Edition (2016) by Gilbert Strang. Here it is in the university book store. Selected sections are in the UQ Library. These are Sections 3.1, 5.1 and Chapter 7.

  3. [3B1B] The video series: Essence of linear algebra by 3Blue1Brown (Grant Sanderson).

  4. [LECT] Complementary lectures (presented in class by Yoni Nazarathy), including material (not found in 1-3) covering: Matrix Exponentials, Cramer's rule, Fourier Analysis, Principal Component Analysis, software packages and related terms.

  5. The Julia programming language. This is the recommended software for the course, though you can use alternatives if you wish (R, Python, Mathematica, Matlab, ...). For basics, see the Julia linear algebra docs. Another resource for some code examples is this draft book co-authored with Hayden Klok [JSTAT]. Code examples from the book are available in this GitHub repo. There are several modes in which you can run Julia; the recommended one is JuliaBox. Also note that we are using Julia 0.6 (not far from 1.0); there was a significant change in Julia linear algebra between 0.5 and 0.6. For extra joy, see this video.

Material from [VMLS], [STRANG] and [3B1B] is used for learning in flipped mode (this means still attending class). Students cover this material independently out of class. Then it is discussed in class, with computational activities and exercises solved. Only a few selected parts are in [LECT] mode, where a lecture is presented in class, not requiring prior preparation.

Below are links to Units 1--10 of the course, including an activity schedule using #week1 -- #week13 for scheduling. Material covered by the students during #weekN will be reviewed and discussed in class in #week(N+1).

  1. Unit 1 - Vectors and clustering. Video
    1. [LECT] Introduction. Based on Markov chain example from [JSTAT], chapter 1. #week1
    2. [VMLS] Chapter 1 (Vectors) + [3B1B] Chapter 1 (Vectors, what even are they?). #week1
    3. [VMLS] Chapter 2 (Linear functions). #week2
    4. [VMLS] Chapter 3 (Norm and distance). #week2
    5. [VMLS] Chapter 4 (Clustering). #week3 Video
    6. [Homework 1] Vectors and clustering (#week2, #week3) - due #week4 (Aug 18). Assignment1.ipynb, Assignment1.pdf, Solution.
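As a companion to the clustering chapter above, here is a bare-bones sketch of the k-means (Lloyd) iteration, in Python rather than the course's Julia. The six data points and the initial centres are invented for illustration, and the sketch skips practical details such as handling empty clusters:

```python
# k-means with k = 2 on six 2-D points: assign each point to its nearest
# centre, then recompute each centre as the mean of its assigned points.
points = [(0.0, 0.0), (0.5, 0.2), (0.2, 0.4), (5.0, 5.0), (5.5, 4.8), (4.8, 5.3)]
centers = [points[0], points[3]]  # crude initialisation: two of the data points

def nearest(p, centers):
    # index of the centre closest to p (squared Euclidean distance)
    return min(range(len(centers)),
               key=lambda i: (p[0] - centers[i][0])**2 + (p[1] - centers[i][1])**2)

for _ in range(10):  # a few Lloyd iterations (enough for this toy data)
    groups = [[] for _ in centers]
    for p in points:
        groups[nearest(p, centers)].append(p)
    centers = [(sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g))
               for g in groups]
```

On this data the iteration settles immediately: one centre near (0.23, 0.2) for the first three points and one near (5.1, 5.03) for the last three.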

  2. Unit 2 - Tasting Linear Algebra I. Video
    1. [3B1B] Chapter 2 (Linear combinations, span, and basis vectors). #week3
    2. [3B1B] Chapter 3 (Linear transformations and matrices). #week3
    3. [3B1B] Chapter 4 (Matrix multiplication as composition). #week3
    4. [3B1B] Chapter 5 (The determinant) + [STRANG] Chapter 5 (Determinants). #week3
    5. [3B1B] Chapter 6 (Inverse matrices, column space and null space) + [STRANG] Sections 3.1 and 3.2. #week3
    6. [3B1B] Footnote (Non-square matrices as transformations between dimensions). #week3

  3. Unit 3 - Tasting Linear Algebra II. Video
    1. [3B1B] Chapter 7 (Dot products and duality). #week4
    2. [3B1B] Chapter 9 (Change of basis). #week4
    3. [3B1B] Chapter 10 (Eigenvectors and eigenvalues) + [STRANG] Section 6.1 (Introduction to eigenvalues). #week4
    4. [3B1B] Chapter 11 (Abstract vector spaces). #week4

  4. Unit 4 - Bases and Orthogonality
    1. [VMLS] Chapter 5 (Linear independence). #week4
    2. [STRANG] - vector spaces. #week4
    3. Lecture on Fourier series and transforms. (Cancelled or deferred to end).

  5. Unit 5 - Matrices and Equations. Video
    1. [VMLS] Chapter 6 (Matrices). #week5
    2. [VMLS] Chapter 7 (Matrix examples). #week5
    3. [VMLS] Chapter 8 (Linear equations). #week6
    4. [Homework 2] Matrices, equations and more (#week4, #week5, #week6, #week7) - due #week7 (Sep 8). Assignment2.ipynb, Assignment2.pdf, Solution.

  6. Unit 6 - Linear Dynamical Systems. Video
    1. [VMLS] Chapter 9 (Linear dynamical systems). #week6
    2. [VMLS] Chapter 10 (Matrix multiplication). #week6
    3. [VMLS] Chapter 11 (Matrix inverses). #week7
    4. Lecture on complements: Matrix Exponential, Laplace transforms, the resolvent and Cramer's rule. #week7
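To illustrate the role of eigenvalues in a linear dynamical system x_{t+1} = A x_t, here is a sketch in Python (the course itself uses Julia). The transition matrix is an invented 2-state Markov chain whose eigenvalues are 1 and 0.7, so the state converges to the eigenvector of eigenvalue 1:

```python
# State evolution x_{t+1} = A x_t for a 2-state Markov chain.
# A is column-stochastic: each column sums to 1.
A = [[0.9, 0.2],
     [0.1, 0.8]]
x = [1.0, 0.0]  # start with all probability mass in state 1
for _ in range(100):
    x = [A[0][0] * x[0] + A[0][1] * x[1],
         A[1][0] * x[0] + A[1][1] * x[1]]
# The component along the eigenvalue-0.7 eigenvector decays like 0.7^t,
# so x approaches the stationary distribution (2/3, 1/3), which is the
# eigenvector of eigenvalue 1 normalised to sum to 1.
```

This is exactly the kind of long-run behaviour that eigenvalue analysis of A predicts without simulating anything.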

  7. Unit 7 - Least Squares
    1. [VMLS] Chapter 12 (Least squares). #week8
    2. [VMLS] Chapter 13 (Least squares data fitting). #week8
    3. [Homework 3] Linear dynamical systems and least squares (#week8, #week9) - due #week9 (Sep 22). Assignment3.ipynb, Assignment3.pdf, Solution.
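To accompany the least-squares chapters, here is a closed-form straight-line fit y ≈ a + b·x via the 2×2 normal equations, sketched in plain Python with invented data points:

```python
# Least squares line fit y ≈ a + b*x via the normal equations.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 2.9, 5.1, 7.0]   # roughly on the line y = 1 + 2x
n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
# Normal equations for the design matrix [1 x]:
#   [n   sx ] [a]   [sy ]
#   [sx  sxx] [b] = [sxy]
# Solved here in closed form for the 2x2 case.
det = n * sxx - sx * sx
a = (sy * sxx - sx * sxy) / det
b = (n * sxy - sx * sy) / det
```

For larger models one solves the normal equations (or, better, a QR factorization) with a linear-algebra library rather than by hand; this toy version just makes the structure visible.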

  8. Unit 8 - Constraints, Lagrange Multipliers and Non-linear least squares
    1. [VMLS] Chapter 16 (Constrained least squares). #week9
    2. [VMLS] Chapter 17 (Constrained least squares applications). #week9
    3. [VMLS] Chapter 18 (Non-linear least squares). #week10
    4. [STRANG] Section 11.2 (Norms and condition numbers). #week10
    5. [Homework 4] Optimization and least squares (#week10, #week11) - due #week12 (Oct 20). Assignment4.ipynb, Assignment4.pdf

  9. Unit 9 - Eigenvalues, Eigenvectors, Covariance and PCA
    1. [STRANG] Section 6.2 (Diagonalizing a matrix). #week11
    2. [STRANG] Section 6.4 (Symmetric matrices). #week11
    3. [STRANG] Section 6.5 (Positive definite matrices). #week11
    4. [STRANG] Section 12.2 (Covariance matrices and joint probabilities). #week11
    5. [STRANG] Section 12.3 (Multivariate Gaussian and weighted least squares). #week12
    6. Lecture on Principal Component Analysis (PCA). #week12
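Tying this unit's pieces together, here is a toy PCA sketch in Python: form the sample covariance matrix of some 2-D data, then find its leading eigenvector (the first principal direction) by power iteration. The data is invented to lie roughly along the direction (1, 1):

```python
import math

data = [(-2.0, -2.1), (-1.0, -0.9), (1.0, 1.1), (2.0, 1.9)]
n = len(data)
mx = sum(p[0] for p in data) / n
my = sum(p[1] for p in data) / n
# 2x2 sample covariance matrix (dividing by n for simplicity)
cxx = sum((p[0] - mx)**2 for p in data) / n
cyy = sum((p[1] - my)**2 for p in data) / n
cxy = sum((p[0] - mx) * (p[1] - my) for p in data) / n
C = [[cxx, cxy], [cxy, cyy]]
# Power iteration: repeatedly apply C and normalise; converges to the
# eigenvector of the largest eigenvalue, i.e. the first principal direction.
v = [1.0, 0.0]
for _ in range(50):
    w = [C[0][0] * v[0] + C[0][1] * v[1],
         C[1][0] * v[0] + C[1][1] * v[1]]
    norm = math.hypot(w[0], w[1])
    v = [w[0] / norm, w[1] / norm]
```

Since the data is spread almost entirely along (1, 1), the computed unit vector v ends up very close to (1/√2, 1/√2).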

  10. Unit 10 - The Singular Value Decomposition and Applications
    1. [STRANG] Chapter 7 (The Singular Value Decomposition). #week12
    2. [Homework 5] Covariance, PCA and SVD (#week12) - due #week13 (Oct 27). Assignment5.ipynb, Assignment5.pdf
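Finally, to connect the SVD to lossy compression: the best rank-1 approximation of a matrix A is σ₁u₁v₁ᵀ, built from the leading singular triple. Below is a toy Python sketch that finds that triple by power iteration on AᵀA (a stand-in for a full SVD routine; the 2×2 matrix is invented for illustration):

```python
import math

A = [[3.0, 0.0],
     [4.0, 5.0]]
# Power iteration on A^T A converges to the top right singular vector v1.
v = [1.0, 0.0]
for _ in range(50):
    Av = [A[0][0] * v[0] + A[0][1] * v[1], A[1][0] * v[0] + A[1][1] * v[1]]
    w = [A[0][0] * Av[0] + A[1][0] * Av[1],   # w = A^T (A v)
         A[0][1] * Av[0] + A[1][1] * Av[1]]
    norm = math.hypot(w[0], w[1])
    v = [w[0] / norm, w[1] / norm]
# sigma1 = |A v1|, u1 = A v1 / sigma1
Av = [A[0][0] * v[0] + A[0][1] * v[1], A[1][0] * v[0] + A[1][1] * v[1]]
sigma = math.hypot(Av[0], Av[1])
u = [Av[0] / sigma, Av[1] / sigma]
# Best rank-1 approximation: sigma * u v^T (stores 2 vectors + 1 number
# instead of the full matrix - the essence of SVD-based compression)
A1 = [[sigma * u[i] * v[j] for j in range(2)] for i in range(2)]
```

For an m×n matrix, keeping the top k singular triples stores k(m + n + 1) numbers instead of mn, which is the compression trade-off studied in this unit.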