- September 11, 2020

The Australian National University
Research School of Computer Science
Semester 2, 2020

COMP3670: Introduction to Machine Learning
Theory Assignment 2 of 5

Release Date. Aug 21st, 2020
Due Date. 23:59pm, Sep 13th, 2020
Maximum credit. 100

Exercise 1: Orthogonal Complements (10+10 credits)

Let V be a vector space together with an inner product 〈·, ·〉 : V × V → R, and let X and Y be vector subspaces of V. We define the orthogonal complement X^⊥ as

    X^⊥ := { v ∈ V : 〈x, v〉 = 0 for all x ∈ X }.

1. Prove that X ∩ X^⊥ = {0}, where 0 is the zero vector in V.
2. Prove that if X ⊆ Y, then Y^⊥ ⊆ X^⊥.

Exercise 2: Norms and Inner Products (10+20 credits)

1. Let (V, 〈·, ·〉) be an inner product space. Let

       proj_u(v) := (〈v, u〉 / 〈u, u〉) u

   denote the vector projection of v onto u. Prove that v − proj_u(v) and u are orthogonal.

2. Let (V, 〈·, ·〉) be an inner product space, and let ||x|| := √〈x, x〉. Prove that || · || is a norm. (Hint: to prove that the triangle inequality holds, you may need the Cauchy-Schwarz inequality, 〈x, y〉 ≤ ||x|| ||y||.)

Exercise 3: Vector Calculus (10+10+30 credits)

1. Let f, g : R^n → R, with f(x) = c^T x, c ∈ R^n, and g(x) = √(c^T x + µ^2), µ ∈ R.

   a) (3 points) Prove that df(x)/dx = c^T.
   b) (2 points) Calculate dg(x)/dx.

2. Given a system of linear equations Ax = b, with A ∈ R^{k×n}, x ∈ R^{n×1}, b ∈ R^{k×1}, sometimes no solution x exists. We would therefore like to find an approximate solution Ax ≈ b. To achieve this, we formulate the regularized least squares error

       ℓ(x) = ||Ax − b||_2^2 + λ ||x||_2^2,

   where λ ∈ R. Show that the gradient of the regularized least squares error above is given by

       dℓ(x)/dx = 2(x^T A^T A − b^T A) + 2λ x^T.

   (Hint: you can directly use the conclusions from questions 2 and 3 above, together with the definition of the Euclidean norm.)
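As a numerical illustration of Exercise 1 (not a proof), one can take X to be the span of the columns of an arbitrary matrix M in R^4 and compute a basis of X^⊥ as the null space of M^T via the SVD; every vector in that basis should then be orthogonal to every column of M. The matrix M and the dimensions below are arbitrary example choices, not part of the assignment.

```python
import numpy as np

# X = span of the columns of M, a 2-dimensional subspace of R^4 (arbitrary example).
rng = np.random.default_rng(2)
M = rng.standard_normal((4, 2))

# An orthonormal basis of X^perp: the null space of M^T, read off from the SVD.
_, s, Vt = np.linalg.svd(M.T)
perp = Vt[len(s):]                 # rows of Vt beyond rank(M^T) span null(M^T)

# Every basis vector of X^perp is orthogonal to every column of M,
# so all inner products below should vanish up to floating-point error.
print(np.max(np.abs(perp @ M)))
```

This also makes part 1 plausible: a vector lying in both X and X^⊥ must have zero inner product with itself, forcing it to be the zero vector.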
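A quick numerical sanity check for Exercise 2.1, using the standard dot product on R^n: the residual v − proj_u(v) should have zero inner product with u. The vectors u and v below are arbitrary examples.

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(5)
v = rng.standard_normal(5)

# proj_u(v) = (<v, u> / <u, u>) u, with <.,.> the dot product.
proj = (v @ u) / (u @ u) * u
residual = v - proj

# <v - proj_u(v), u> should be ~0 up to floating-point error.
print(abs(residual @ u))
```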
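The gradient claimed in Exercise 3.2 can be checked numerically against central finite differences of ℓ(x) = ||Ax − b||²₂ + λ||x||²₂. The values of A, b, x, and λ below are arbitrary test data, not part of the assignment.

```python
import numpy as np

rng = np.random.default_rng(1)
k, n, lam = 4, 3, 0.5
A = rng.standard_normal((k, n))
b = rng.standard_normal(k)
x = rng.standard_normal(n)

def loss(x):
    """Regularized least squares error l(x) = ||Ax - b||_2^2 + lam * ||x||_2^2."""
    r = A @ x - b
    return r @ r + lam * (x @ x)

# The claimed gradient, d l(x)/dx = 2(x^T A^T A - b^T A) + 2 lam x^T (a row vector).
grad = 2 * (x @ A.T @ A - b @ A) + 2 * lam * x

# Central finite differences along each coordinate direction.
eps = 1e-6
num = np.array([(loss(x + eps * e) - loss(x - eps * e)) / (2 * eps)
                for e in np.eye(n)])

print(np.max(np.abs(grad - num)))   # should be small
```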