Which of the following is NOT a property of the induced 2-norm of a matrix \(A\)?
Which of the following best describes the Frobenius norm of a matrix \(A \in \mathbb{R}^{n \times m}\)?
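As a quick numerical illustration (the random matrix `A` and seed below are assumptions, not part of any question): both norms can be read off the singular values, with the induced 2-norm equal to \(\sigma_1\) and the Frobenius norm equal to \(\sqrt{\sum_j \sigma_j^2}\), which in turn equals the square root of the sum of squared entries.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

sigma = np.linalg.svd(A, compute_uv=False)  # singular values, descending

# induced 2-norm = largest singular value
print(np.isclose(np.linalg.norm(A, 2), sigma[0]))
# Frobenius norm = sqrt(sum of squared singular values) = sqrt(sum of squared entries)
print(np.isclose(np.linalg.norm(A, 'fro'), np.sqrt(np.sum(sigma**2))))
print(np.isclose(np.linalg.norm(A, 'fro'), np.sqrt(np.sum(A**2))))
```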
Let \(A \in \mathbb{R}^{n \times m}\) be a matrix with compact SVD \(A = \sum_{j=1}^r \sigma_j \mathbf{u}_j \mathbf{v}_j^T\), where \(\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_r > 0\). Which of the following is true about the induced 2-norm of \(A\)?
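A minimal sketch (random `A` assumed for illustration): the maximum of \(\|A\mathbf{x}\|_2\) over unit vectors is attained at the top right singular vector \(\mathbf{v}_1\), with value \(\sigma_1\), and no unit vector exceeds it.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

v1 = Vt[0]  # top right singular vector v_1 (unit norm)
print(np.isclose(np.linalg.norm(A @ v1), s[0]))  # ||A v_1||_2 = sigma_1

# random unit vectors never beat sigma_1
X = rng.standard_normal((1000, 4))
X /= np.linalg.norm(X, axis=1, keepdims=True)
print(np.max(np.linalg.norm(X @ A.T, axis=1)) <= s[0] + 1e-12)
```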
Let \(A \in \mathbb{R}^{n \times m}\) with \(m \leq n\) and let \(B \in \mathbb{R}^{n \times m}\) be a matrix of rank \(k \leq \min\{n,m\}\). Let \(B_{\perp} \in \mathbb{R}^{n \times m}\) be the matrix obtained by projecting each row of \(A\) onto the row space of \(B\). Which of the following is true?
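A hedged sketch of this setup (the sizes, seed, and the use of \(B^+ B\) as the orthogonal projector onto the row space of \(B\) are illustrative choices): project the rows of \(A\) onto \(\mathrm{row}(B)\) and compare Frobenius errors against the rank-\(k\) truncated SVD \(A_k\).

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, k = 8, 5, 2
A = rng.standard_normal((n, m))
B = rng.standard_normal((n, k)) @ rng.standard_normal((k, m))  # rank k (a.s.)

P = np.linalg.pinv(B) @ B   # orthogonal projector onto row(B)
B_perp = A @ P              # each row of A projected onto row(B)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_k = U[:, :k] * s[:k] @ Vt[:k]  # rank-k truncated SVD of A

fro = lambda M: np.linalg.norm(M, 'fro')
# best rank-k error <= projection error <= error of B itself
print(fro(A - A_k) <= fro(A - B_perp) + 1e-12)
print(fro(A - B_perp) <= fro(A - B) + 1e-12)
```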
Let \(A \in \mathbb{R}^{n \times m}\) be a matrix with SVD \(A = \sum_{j=1}^r \sigma_j \mathbf{u}_j \mathbf{v}_j^T\) and let \(A_k = \sum_{j=1}^k \sigma_j \mathbf{u}_j \mathbf{v}_j^T\) be the truncated SVD with \(k < r\). Which of the following is true about the Frobenius norm of \(A - A_k\)?
Let \(A \in \mathbb{R}^{n \times m}\) be a matrix with SVD \(A = \sum_{j=1}^r \sigma_j \mathbf{u}_j \mathbf{v}_j^T\) and let \(A_k = \sum_{j=1}^k \sigma_j \mathbf{u}_j \mathbf{v}_j^T\) be the truncated SVD with \(k < r\). Which of the following is true about the induced 2-norm of \(A - A_k\)?
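The following sketch (random `A` assumed) checks both truncation-error formulas at once: \(\|A - A_k\|_2 = \sigma_{k+1}\) and \(\|A - A_k\|_F = \big(\sum_{j=k+1}^r \sigma_j^2\big)^{1/2}\).

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((7, 5))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2
A_k = U[:, :k] * s[:k] @ Vt[:k]  # truncated SVD

print(np.isclose(np.linalg.norm(A - A_k, 2), s[k]))  # = sigma_{k+1}
print(np.isclose(np.linalg.norm(A - A_k, 'fro'), np.sqrt(np.sum(s[k:]**2))))
```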
What is the pseudoinverse of an \(n \times m\) matrix \(A\) with compact SVD \(A = \sum_{l=1}^r \sigma_l \mathbf{u}_l \mathbf{v}_l^T\)?
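A short sketch (illustrative random `A`) constructing \(A^+ = \sum_{l=1}^r \sigma_l^{-1} \mathbf{v}_l \mathbf{u}_l^T\) from the SVD and comparing it with `np.linalg.pinv`:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((6, 4))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

A_plus = Vt.T * (1.0 / s) @ U.T  # V diag(1/sigma) U^T
print(np.allclose(A_plus, np.linalg.pinv(A)))
print(np.allclose(A @ A_plus @ A, A))  # one of the Moore-Penrose identities
```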
Let \(A\) be an \(n \times m\) matrix with \(m \le n\). Which of the following is true about the pseudoinverse \(A^+\)?
Let \(A \in \mathbb{R}^{n \times m}\) with \(m \leq n\) and assume that \(A\) has full column rank \(m\). Which of the following is true about the pseudoinverse \(A^+\)?
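Sketch for the full-column-rank case (a random tall `A` is assumed, which has full column rank with probability 1): here \(A^+ = (A^T A)^{-1} A^T\), so \(A^+ A = I_m\) while \(A A^+\) is only the projector onto \(\mathrm{col}(A)\).

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((6, 3))  # full column rank with probability 1

A_plus = np.linalg.inv(A.T @ A) @ A.T  # (A^T A)^{-1} A^T
print(np.allclose(A_plus, np.linalg.pinv(A)))
print(np.allclose(A_plus @ A, np.eye(3)))   # left inverse: A^+ A = I_m
```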
Let \(A \in \mathbb{R}^{n \times m}\) with \(m > n\) and assume that \(A\) has full row rank \(n\). Which of the following is true about the pseudoinverse \(A^+\)?
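Sketch for the full-row-rank case (random wide `A` assumed): here \(A^+ = A^T (A A^T)^{-1}\), so \(A A^+ = I_n\).

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((3, 6))  # m > n, full row rank with probability 1

A_plus = A.T @ np.linalg.inv(A @ A.T)  # A^T (A A^T)^{-1}
print(np.allclose(A_plus, np.linalg.pinv(A)))
print(np.allclose(A @ A_plus, np.eye(3)))   # right inverse: A A^+ = I_n
```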
Let \(A\) be an \(n \times m\) matrix with \(m > n\) and full row rank. Which of the following is true about the solution \(x^* = A^+b\) to the system \(Ax = b\)?
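A small check (random `A` and `b` assumed) that \(x^* = A^+ b\) solves the underdetermined system exactly and is the minimum-norm solution: adding any nonzero null-space component preserves \(Ax = b\) but increases \(\|x\|_2\).

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((3, 6))  # underdetermined, full row rank (a.s.)
b = rng.standard_normal(3)

x_star = np.linalg.pinv(A) @ b
print(np.allclose(A @ x_star, b))  # exact solution

# any other solution differs by a null-space vector and is strictly longer
P_null = np.eye(6) - np.linalg.pinv(A) @ A  # projector onto null(A)
z = P_null @ rng.standard_normal(6)
print(np.allclose(A @ (x_star + z), b))
print(np.linalg.norm(x_star) < np.linalg.norm(x_star + z))
```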
What is the purpose of ridge regression?
The ridge regression problem is formulated as \(\min_{x \in \mathbb{R}^m} \|Ax - b\|_2^2 + \lambda \|x\|_2^2\). What is the role of the parameter \(\lambda\)?
Let \(A\) be an \(n \times m\) matrix with compact SVD \(A = \sum_{j=1}^r \sigma_j \mathbf{u}_j \mathbf{v}_j^T\). How does the ridge regression solution \(x^{**}\) compare to the least squares solution \(x^*\)?
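The comparison can be made concrete via the SVD forms \(x^* = \sum_j \frac{\mathbf{u}_j^T \mathbf{b}}{\sigma_j} \mathbf{v}_j\) and \(x^{**} = \sum_j \frac{\sigma_j \, \mathbf{u}_j^T \mathbf{b}}{\sigma_j^2 + \lambda} \mathbf{v}_j\): ridge shrinks each coefficient by the factor \(\sigma_j^2/(\sigma_j^2 + \lambda)\). A sketch with assumed random data:

```python
import numpy as np

rng = np.random.default_rng(8)
A = rng.standard_normal((8, 4))
b = rng.standard_normal(8)
lam = 0.5

U, s, Vt = np.linalg.svd(A, full_matrices=False)
c = U.T @ b  # coefficients u_j^T b

x_ls    = Vt.T @ (c / s)                 # least squares: u_j^T b / sigma_j
x_ridge = Vt.T @ (s * c / (s**2 + lam))  # ridge: shrunken coefficients

print(np.allclose(x_ls, np.linalg.lstsq(A, b, rcond=None)[0]))
print(np.allclose(x_ridge, np.linalg.solve(A.T @ A + lam * np.eye(4), A.T @ b)))
```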
Let \(A \in \mathbb{R}^{n \times m}\) with \(n \geq m\) and \(\mathbf{b} \in \mathbb{R}^n\). Consider the ridge regression problem
\[ \min_{\mathbf{x} \in \mathbb{R}^m} \|A \mathbf{x} - \mathbf{b}\|_2^2 + \lambda \|\mathbf{x}\|_2^2, \]
where \(\lambda > 0\). Which of the following is the solution to this problem?
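A sketch (random data assumed) verifying the closed form \((A^T A + \lambda I)^{-1} A^T \mathbf{b}\) against the equivalent stacked least-squares formulation, which appends \(\sqrt{\lambda}\, I\) to \(A\) and zeros to \(\mathbf{b}\):

```python
import numpy as np

rng = np.random.default_rng(9)
n, m, lam = 8, 4, 0.3
A = rng.standard_normal((n, m))
b = rng.standard_normal(n)

x_ridge = np.linalg.solve(A.T @ A + lam * np.eye(m), A.T @ b)

# stacked formulation: min || [A; sqrt(lam) I] x - [b; 0] ||_2^2
A_aug = np.vstack([A, np.sqrt(lam) * np.eye(m)])
b_aug = np.concatenate([b, np.zeros(m)])
x_aug = np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]

print(np.allclose(x_ridge, x_aug))
```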
Let \(A \in \mathbb{R}^{n \times n}\) be a square nonsingular matrix with compact SVD \(A = \sum_{j=1}^n \sigma_j \mathbf{u}_j \mathbf{v}_j^T\). Which of the following is true about the induced 2-norm of the inverse \(A^{-1}\)?
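Sketch (random square `A` assumed, nonsingular with probability 1): the singular values of \(A^{-1}\) are the reciprocals \(1/\sigma_j\), so \(\|A^{-1}\|_2 = 1/\sigma_n\), the reciprocal of the smallest singular value of \(A\).

```python
import numpy as np

rng = np.random.default_rng(10)
A = rng.standard_normal((4, 4))  # nonsingular with probability 1
s = np.linalg.svd(A, compute_uv=False)

# the largest singular value of A^{-1} is 1/sigma_n
print(np.isclose(np.linalg.norm(np.linalg.inv(A), 2), 1.0 / s[-1]))
```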