UQ MATH7502
Mathematics for Data Science 2 (2019)

Click here for the Semester 2, 2020 version of the course.


Welcome to MATH7502. This course is part of the Master of Data Science program at the University of Queensland. The course is coordinated by Yoni Nazarathy (y.nazarathy@uq.edu.au). The tutors are Samuel Hambleton (samuelahambleton@gmail.com) and Chris Raymond (christopher.raymond@uqconnect.edu.au).

Communication about technical matters is best handled via the dedicated Slack workspace, where you can communicate with fellow students and teaching staff (use this invite link to sign up). Formal messages and grades are broadcast via Blackboard.

The prerequisite for the course is knowledge comparable to that of MATH7501. This includes basic discrete mathematics, calculus and elementary manipulation of vectors and matrices. Feel free to use the MATH7501 course reader to brush up as needed.

The current course, MATH7502, is a linear algebra foundations course focusing on data-science applications, with some emphasis on numerical computation via software. It also covers a few basic elements of multivariable calculus and continuous optimization. By the end of the course, students will possess mathematical foundations allowing them to understand and carry out activities such as least squares fitting, principal component analysis, gradient descent and clustering. For more motivation, see also 20 methods of the data scientist and the mathematics behind them.


Q: Should I take this course?
A: If you have done enough linear algebra and multi-variable calculus, and you can independently see how to apply them to machine learning and data science, then perhaps there is no need. In particular, if you already understand tools such as least squares, principal component analysis, gradient descent and clustering algorithms well, then maybe you can skip it. However, if you want to improve your mathematical understanding of such tools, please join.

Q: What if I haven't done MATH7501?
A: Depending on your background, you can perhaps make up (or review) the needed parts of MATH7501 independently. Consult with the teaching staff. You can also attend the "First Year Maths Support Tutes", which run every afternoon between 2pm and 4pm in Room 205 of Building 41.

Q: Is the format of the course similar to the previous offerings of MATH7501 or MATH7502?
A: No. The format is quite different both in comparison to MATH7501 and last year's MATH7502.

The course is mostly (but not solely) taught in "flipped mode": students are assigned sections from two textbooks, which they are required to read prior to the lectures. In the lectures, highlights from the reading are then discussed, and problems and examples are solved. One exception is the first week, which focuses on an introduction to the course and to the Julia language. Note that in the first week, tutorials take place in the form of a lecture.

The three required resources are:
  1. [VMLS] The book: Vectors, Matrices and Least Squares (2018) by Stephen Boyd and Lieven Vandenberghe. You can use the free on-line version or you can order the book. Here is the Julia Language Companion for the book.

  2. [LALFD] The book: Linear Algebra and Learning from Data (2019) by Gilbert Strang. Up to 10% of the book can be supplied via the library; students need to obtain further sections of the book independently. Here it is in the university book store. Here are (2 x 24 hour loan) copies at UQ Library. The UQ Library has also scanned a few sections from the book here (these are VI.4, I.1, I.2, I.3, I.4).

  3. The Julia programming language. This is the recommended software for the course. However, you can use alternatives if you insist (R, Python, Mathematica, Matlab,...). For basics, see the Julia linear algebra docs. There are several modes in which you can run Julia. An easy option is JuliaBox. However, installing Julia and Jupyter locally on your computer is recommended; see for example this explainer video.
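  To get a feel for Julia's built-in linear algebra, here is a minimal sketch (assuming any recent Julia 1.x; everything below is in the LinearAlgebra standard library):

      using LinearAlgebra

      A = [1.0 2.0; 3.0 4.0]      # a 2x2 matrix
      b = [1.0, 1.0]
      x = A \ b                   # solve Ax = b
      @show norm(A * x - b)       # residual is (numerically) zero
      @show dot(b, b), norm(b)    # inner product and Euclidean norm
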
There are also additional useful resources:
  1. [ILA] The book: Introduction to Linear Algebra, Fifth Edition (2016) by Gilbert Strang. Here it is in the university book store. Selected sections are in the UQ Library. These are Sections 3.1, 5.1 and Chapter 7.


  2. [3B1B] The video series: Essence of linear algebra by 3Blue1Brown (Grant Sanderson) as well as other selected videos.


  3. [SWJ] The draft book: Statistics with Julia: Fundamentals for Data Science, Machine Learning and Artificial Intelligence (2019) by Hayden Klok and Yoni Nazarathy. Code examples from the book are available in this GitHub repo.



The course assessment includes the following:
  1. Three homework assignments. These assignments are to be submitted individually, with each student submitting a unique assignment (copying assignments will not be tolerated). Nevertheless, students are encouraged to collaborate and discuss the homework assignments in an open and constructive manner: sharing ideas, helping each other and jointly working towards a goal is great. More details below.
  2. A project report presented via a Jupyter notebook with an accompanying YouTube video.
    These are group assignments covering material that extends the core material taught in the course. More details below.
  3. Individual review of peer project reports.
    This is an individual review of project reports (of other groups). More details below.
  4. A final exam.
    This is a (UQ central) final exam for the course. More details below.
Due dates for assessment items are listed on UQ's official course profile for MATH7502.


Outline of Material and Reading

Below is a detailed reading list. The semester has 13 weeks. The first week is an introductory lecture. Lectures during weeks 2 to 12 then require students to read (and watch videos) prior to the lecture, as per the schedule below. Minor refinements of the reading schedule will be communicated via Blackboard. Week 13 is for wrap-up and exam review.


Lectures are recorded via UQ's Blackboard system; however, these recordings don't capture images of the whiteboard. For whiteboard content, see the course's GitHub page, which also contains Jupyter notebooks from class.

    Unit 1: Introduction
    Motivating examples. Using Julia, Jupyter, Markdown and basic LaTeX formulas. No prior reading is needed.
    MATH7502-Introduction-Lecture.ipynb, MATH7502-Introduction-Lecture.pdf.
    Optional video: (embedded on the course page).


    Unit 2: Vectors (week 2)
    From [VMLS]: 1.1 Vectors, 1.2 Vector addition, 1.3 Scalar-vector multiplication, 1.4 Inner product, 1.5 Complexity of vector computations, 2.1 Linear functions, 2.2 Taylor approximation, C.1.2 Scalar-valued function of a vector, C.1.3 Vector-valued function of a vector.
    From [3B1B]: selected videos (embedded on the course page).
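    To illustrate 1.4 and 2.2, here is a minimal Julia sketch of the first-order Taylor approximation f(x) ≈ f(z) + ∇f(z)ᵀ(x − z), using the made-up example f(x) = x₁² + x₂² with a hand-coded gradient:

        using LinearAlgebra

        f(x) = x[1]^2 + x[2]^2                  # a scalar-valued function of a vector
        ∇f(x) = [2 * x[1], 2 * x[2]]            # its gradient, computed by hand
        z = [1.0, 1.0]
        fhat(x) = f(z) + dot(∇f(z), x - z)      # first-order Taylor approximation at z
        @show f([1.1, 0.9]), fhat([1.1, 0.9])   # close, since [1.1, 0.9] is near z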


    Unit 3: Using Vectors (week 3)
    From [VMLS]: 3.1 Norm, 3.2 Distance, 3.3 Standard deviation, 3.4 Angle, 4.1 Clustering, 4.2 A clustering objective, 4.3 The k-means algorithm.
    From [LALFD]: VI.4 Gradient Descent Toward the Minimum (only some parts of this section are covered).
    From [3B1B]: selected videos (embedded on the course page).
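    To connect the norm and distance sections with 4.3, here is a minimal, unoptimized Julia sketch of the k-means iteration on made-up data (random initialization, no handling of empty groups):

        using LinearAlgebra, Statistics, Random

        function kmeans(X, k; iters = 10)
            reps = X[randperm(length(X))[1:k]]                # random initial representatives
            groups = ones(Int, length(X))
            for _ in 1:iters
                groups = [argmin([norm(x - r) for r in reps]) for x in X]  # assign to nearest representative
                reps = [mean(X[groups .== j]) for j in 1:k]   # move representatives to group means
            end
            return reps, groups
        end

        Random.seed!(0)
        X = vcat([randn(2) .+ 4 for _ in 1:20], [randn(2) for _ in 1:20])
        reps, groups = kmeans(X, 2)
        @show reps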





    Unit 4: Matrices (weeks 4 and 5)
    From [LALFD]: I.1 Multiplication Ax Using Columns of A, I.2 Matrix-Matrix Multiplication AB.
    From [VMLS]: 5.1 Linear dependence, 5.2 Basis, 5.3 Orthonormal vectors, 5.4 Gram-Schmidt algorithm, 6.1 Matrices, 6.2 Zero and identity matrices, 6.3 Transpose, addition and norm, 6.4 Matrix-vector multiplication, 7.1 Geometric transformations, 7.2 Selectors, 8.1 Linear and affine functions, 8.2 Linear function models, 8.3 Systems of linear equations, 10.1 Matrix-matrix multiplication, 10.2 Composition of linear functions, 10.3 Matrix power, 10.4 QR factorization.
    From [3B1B]: selected videos (embedded on the course page).
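    As a companion to 5.4 and 10.4, here is a minimal Julia sketch of the Gram-Schmidt algorithm (assuming linearly independent inputs; the vectors are made up):

        using LinearAlgebra

        function gram_schmidt(a)
            q = similar(a, 0)                  # orthonormal vectors built so far
            for ai in a
                qi = copy(ai)
                for qj in q
                    qi -= dot(qj, ai) * qj     # remove the component along qj
                end
                push!(q, qi / norm(qi))        # normalize
            end
            return q
        end

        a = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]
        Q = hcat(gram_schmidt(a)...)
        @show norm(Q' * Q - I)                 # columns are orthonormal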






    Unit 5: Matrices and Vector Spaces (week 6)
    From [VMLS]: 11.1 Left and right inverses, 11.2 Inverse, 11.3 Solving linear equations, 11.5 Pseudo-inverse.
    From [LALFD]: I.3 The Four Fundamental Subspaces, I.4 Elimination and A = LU, I.5 Orthogonal Matrices and Subspaces.
    From [3B1B]: selected videos (embedded on the course page).
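    A minimal Julia sketch tying together inverses (11.2), the pseudo-inverse (11.5) and A = LU (I.4), with made-up matrices:

        using LinearAlgebra

        A = [2.0 1.0; 1.0 3.0]
        @show norm(inv(A) * A - I)             # the (two-sided) inverse
        F = lu(A)                              # elimination with row exchanges: PA = LU
        @show norm(F.L * F.U - A[F.p, :])

        B = [1.0 0.0; 0.0 1.0; 1.0 1.0]        # tall matrix with independent columns
        @show norm(pinv(B) * B - I)            # the pseudo-inverse is a left inverse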




    Unit 6: Spectral Analysis (weeks 7 and 8)
    From [LALFD]: I.6 Eigenvalues and Eigenvectors, I.7 Symmetric Positive Definite Matrices, I.8 Singular Values and Singular Vectors in the SVD, I.9 Principal Components and Best Low Rank Matrix, V.4 Covariance Matrices and Joint Probabilities.
    See also (extra): Chapter 6 from [ILA].
    From [3B1B]: selected videos (embedded on the course page).
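    A minimal Julia sketch of the objects in I.6-I.9, using random matrices for illustration:

        using LinearAlgebra, Random

        Random.seed!(0)
        S = Symmetric(randn(4, 4))             # symmetric: real eigenvalues, orthonormal eigenvectors
        λ, V = eigen(S)
        @show norm(S * V[:, 1] - λ[1] * V[:, 1])

        A = randn(5, 3)
        U, σ, V = svd(A)
        A1 = σ[1] * U[:, 1] * V[:, 1]'         # best rank-1 approximation
        @show opnorm(A - A1) ≈ σ[2]            # Eckart-Young: the error is the next singular value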


    Unit 7: Least Squares #1 (weeks 9 and 10)
    From [VMLS]: 12.1 Least squares problem, 12.2 Solution (to the least squares problem), 12.3 Solving least squares problems, 13.1 Least squares data fitting, 14.1 Classification, 14.2 Least squares classifier, 14.3 Multi-class classifiers.
    From [LALFD]: II.2 Least Squares: Four Ways.
    Optional video: (embedded on the course page).

    Video resources: LeastSquaresForDataScience.ipynb, LeastSquaresForDataScience.pdf, bestValue.gif.
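    A minimal Julia sketch of 12.2-12.3: the least squares solution x̂ = (AᵀA)⁻¹Aᵀb computed three equivalent ways on made-up data:

        using LinearAlgebra, Random

        Random.seed!(0)
        A, b = randn(20, 3), randn(20)
        x1 = A \ b                     # QR-based solve (the recommended method)
        x2 = inv(A' * A) * (A' * b)    # normal equations (fine here, but less stable)
        x3 = pinv(A) * b               # via the pseudo-inverse
        @show norm(x1 - x2), norm(x1 - x3)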


    Unit 8: Least Squares #2 (weeks 11 and 12)
    From [VMLS]: 15.1 Multi-objective least squares, 15.3 Estimation and inversion, 15.4 Regularized data fitting, 15.5 Complexity (regularized data fitting).
    From [LALFD]: III.4 Split Algorithms for l^2 + l^1, V.5 Multivariate Gaussians and Weighted Least Squares.
    Optional: lecture notes by guest lecturer Phil Isaac.
    Optional video: (embedded on the course page).
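    For 15.4, a minimal Julia sketch of regularized (ridge) data fitting via the stacked least squares formulation, with an arbitrary λ and made-up data:

        using LinearAlgebra, Random

        Random.seed!(0)
        A, b = randn(20, 5), randn(20)
        λ = 0.1
        reg = sqrt(λ) * Matrix(I, 5, 5)
        Astack = [A; reg]                          # minimize ‖Ax − b‖² + λ‖x‖²
        bstack = [b; zeros(5)]
        x = Astack \ bstack
        @show norm(x - (A' * A + λ * I) \ (A' * b))   # matches the regularized normal equations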

Homework assignments





    HW1 on Units 1, 2 and 3. Due 17/08/2019. Solution (PDF) Solution (ipynb).

    HW2 on Units 4 and 5. Due 14/09/2019. Solution (PDF) Solution (ipynb).

    HW3 on Units 6 and 7. Due 12/10/2019. Solution (PDF) Solution (ipynb).

Project Reports

Projects are to be carried out in groups of 3 to 5 people. Each group needs to choose one project topic from the topics below. A topic has associated reading from [VMLS], [LALFD] and, in certain cases, [SWJ]. The group then needs to study the material and present key ideas, principles and methods. The presentation is via a Julia Jupyter notebook with an accompanying YouTube video. Here are detailed instructions. Due 18/10/2019.

After projects are submitted, individual peer reviews of projects will be carried out (you review the projects of other groups). This review (summarized as a written document) is also part of the course assessment. The review questionnaire is here.

    Topic 1: Constrained Optimization

    From [VMLS]: 16.1 Constrained least squares problem, 16.2 Solution (to constrained least squares problem), 16.3 Solving constrained least squares problems, 17.2 Linear quadratic control, 17.3 Linear quadratic state estimation, C.3 Lagrange multipliers.
    From [LALFD]: VI.2 Lagrange Multipliers = Derivatives of the Components.
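    For orientation before the reading, a minimal Julia sketch of the KKT approach of 16.2: minimize ‖Ax − b‖² subject to Cx = d by solving one linear system (all data made up):

        using LinearAlgebra, Random

        Random.seed!(0)
        A, b = randn(10, 4), randn(10)
        C, d = randn(2, 4), randn(2)
        G = 2 * (A' * A)
        KKT = [G C'; C zeros(2, 2)]    # KKT matrix: stationarity and feasibility
        rhs = [2 * (A' * b); d]
        xz = KKT \ rhs                 # unknowns: x and the Lagrange multipliers z
        x = xz[1:4]
        @show norm(C * x - d)          # the constraint holds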



    Group 9:

    Summary_group9.pdf
    Code_group9.pdf (Code_Problems_group9.ipynb, Code_Application_group9.ipynb).

    Group 13:

    Summary_group13.pdf
    Code_group13.pdf (ipynb file).

    Peer feedback.


    Topic 2: Second Order Optimization

    From [VMLS]: 18.1 Nonlinear equations and least squares, 18.2 Gauss-Newton Algorithm, 18.3 Levenberg-Marquardt algorithm, 18.4 Nonlinear model fitting.
    From [LALFD]: VI.1 Minimum Problems: Convexity and Newton's Method.
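    For a feel of 18.2 before the reading, a minimal Julia sketch of the Gauss-Newton iteration for fitting θ₁e^{θ₂t} to data (made-up noiseless data; none of the Levenberg-Marquardt safeguards of 18.3):

        using LinearAlgebra

        t = collect(0:0.5:3)
        y = 2.0 .* exp.(-0.8 .* t)                # data generated with θ = (2, -0.8)
        r(θ) = θ[1] .* exp.(θ[2] .* t) .- y       # residual vector
        J(θ) = hcat(exp.(θ[2] .* t), θ[1] .* t .* exp.(θ[2] .* t))  # Jacobian of r

        function gauss_newton(r, J, θ; iters = 10)
            for _ in 1:iters
                θ = θ - J(θ) \ r(θ)               # each step solves a linearized least squares problem
            end
            return θ
        end

        @show gauss_newton(r, J, [1.0, 0.0])      # converges to (2, -0.8)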



    Group 10:

    Summary_group10.pdf
    Code_group10.pdf (ipynb file).

    Group 12:

    Summary_group12.pdf
    Code_group12.pdf (ipynb file).

    Peer feedback.


    Topic 3: Signal Processing

    From [VMLS]: 7.4 Convolution.
    From [LALFD]: IV.1 Fourier Transforms: Discrete and Continuous, IV.2 Shift Matrices and Circulant Matrices.
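    A minimal Julia sketch connecting 7.4 with the matrix view: convolution with b is multiplication by a banded (Toeplitz-style) matrix built by hand (toy vectors; practical code would use the FFT):

        a = [1.0, 2.0, 3.0]
        b = [1.0, -1.0]
        T = zeros(length(a) + length(b) - 1, length(a))
        for j in 1:length(a)
            T[j:j+length(b)-1, j] = b      # column j holds b shifted down by j-1
        end
        @show T * a                        # the convolution a * b = [1, 1, 1, -3]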

    Group 4:

    Summary_group4.pdf
    Code_group4.pdf (ipynb file, extra image file).

    Group 6:

    Summary_group6.pdf
    Code_group6.pdf (ipynb file, Jelly1.jpg).

    Peer feedback.


    Topic 4: Deep Learning

    From [LALFD]: VI.5 Stochastic Gradient Descent and ADAM, VII.1 The Construction of Deep Neural Networks, VII.2 Convolutional Neural Networks, VII.3 Backpropagation and the Chain Rule.
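    For intuition about VI.5 before the reading, a minimal Julia sketch of stochastic gradient descent on a least squares loss (a "network" with a single linear layer; the data and learning rate are made up):

        using LinearAlgebra, Random

        function sgd(X, y; η = 0.1, epochs = 50)
            w = zeros(size(X, 2))
            for _ in 1:epochs, i in shuffle(1:length(y))
                g = (dot(X[i, :], w) - y[i]) .* X[i, :]   # gradient of ½(xᵢᵀw − yᵢ)²
                w -= η .* g                               # one stochastic gradient step
            end
            return w
        end

        Random.seed!(0)
        X, w_true = randn(200, 5), randn(5)
        y = X * w_true                                    # noiseless targets
        @show norm(sgd(X, y) - w_true)                    # SGD recovers w_true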

    Group 3:

    Summary_group3.pdf
    Code_group3.pdf (ipynb file, readme file).

    Group 5:

    Summary_group5.pdf
    Code_group5.pdf (ipynb file).

    Group 8:

    Summary_group8.pdf
    Code_group8.pdf (ipynb file).

    Peer feedback.


    Topic 5: Markovian (and Deterministic) Dynamical Systems

    From [VMLS]: 9. Linear Dynamical Systems.
    From [LALFD]: V.6 Markov Chains.
    From [SWJ]: Markov Chains, MDP and Q-learning.
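    A minimal Julia sketch for V.6: the stationary distribution of a made-up two-state Markov chain, via matrix powers and via an eigenvector of Pᵀ:

        using LinearAlgebra

        P = [0.9 0.1; 0.3 0.7]                 # row-stochastic transition matrix
        @show (P^100)[1, :]                    # rows of P^k approach the stationary distribution
        λ, V = eigen(Matrix(P'))               # stationary π solves Pᵀπ = π
        i = argmax(real.(λ))                   # index of the eigenvalue 1
        π_stat = real.(V[:, i]) / sum(real.(V[:, i]))   # normalize the eigenvector
        @show π_stat                           # (0.75, 0.25)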

    Group 1:

    Summary_group1.pdf
    Code_group1.pdf (ipynb file).

    Group 2:

    Summary_group2.pdf
    Code_group2.pdf (ipynb file, extra csv file).

    Group 11:

    Summary_group11.pdf
    Code_group11.pdf (ipynb file).

    Group 14:

    Summary_group14.pdf
    Code_group14.ipynb.

    Peer feedback.


    Topic 6: Graphs and Networks

    From [VMLS]: 7.3 Incidence Matrix.
    From [LALFD]: IV.6 Graphs and Laplacians and Kirchhoff's Laws, VI.3 (Max Flow-Min Cut subsection).
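    A minimal Julia sketch for 7.3 and IV.6: the incidence matrix of a small made-up directed graph and the graph Laplacian L = BᵀB (edge-by-node convention as in [LALFD]; [VMLS] uses the transpose):

        using LinearAlgebra

        edges = [(1, 2), (1, 3), (2, 3), (3, 4)]   # a directed graph on 4 nodes
        B = zeros(Int, length(edges), 4)           # incidence matrix: one row per edge
        for (k, (i, j)) in enumerate(edges)
            B[k, i], B[k, j] = -1, 1               # edge k leaves node i and enters node j
        end
        L = B' * B                                 # the graph Laplacian
        @show L * ones(4)                          # every row sums to zero: ones(4) is in the nullspace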



    Group 7:

    Summary_group7.pdf
    Code_group7.pdf (ipynb file, isochrone.html, directedGraph.png).

    Peer feedback.

Exam Information



    Here is the practice exam. A practice exam session will take place in class, 8am-9:50am on Tuesday, October 15.
    Here is the solution.

    Some review questions for Units 1-5 are here.

    A few selected review questions from [LALFD] are here.

    Here is the 2019 Final Exam and here is the solution. (Note a typo in Question 4a: x_j in the displayed equation should be x_i.)