Let \( \mathbf{a} = (a_1,\ldots,a_n), \mathbf{b} = (b_1,\ldots,b_n), \mathbf{c} = (c_1,\ldots,c_n) \in \mathbb{R}^n \). Which of the following is true about the Hadamard product \( \odot \)?
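For reference, the Hadamard product is the entrywise product, \( (\mathbf{a} \odot \mathbf{b})_i = a_i b_i \). A small sketch checking that it is commutative, associative, and distributive over vector addition (the vectors below are illustrative):
import torch
a = torch.tensor([1.0, 2.0, 3.0])
b = torch.tensor([4.0, 5.0, 6.0])
c = torch.tensor([7.0, 8.0, 9.0])
# In PyTorch, * on tensors of equal shape is exactly the Hadamard product.
print(torch.equal(a * b, b * a))                # True: commutative
print(torch.equal((a * b) * c, a * (b * c)))    # True: associative
print(torch.equal(a * (b + c), a * b + a * c))  # True: distributes over +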
Let \( A \in \mathbb{R}^{n \times m} \) and \( B \in \mathbb{R}^{p \times q} \). What are the dimensions of the Kronecker product \( A \otimes B \)?
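The dimensions follow from the block structure \( A \otimes B = (a_{ij} B) \): an \( n \times m \) matrix Kronecker-multiplied by a \( p \times q \) matrix gives an \( np \times mq \) matrix. A quick shape check with torch.kron (the sizes below are illustrative):
import torch
A = torch.randn(2, 3)  # n x m with n = 2, m = 3
B = torch.randn(4, 5)  # p x q with p = 4, q = 5
print(torch.kron(A, B).shape)  # torch.Size([8, 15]), i.e. (np) x (mq)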
What is the result of the Kronecker product \( A \otimes B \) if \( A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \) and \( B = \begin{pmatrix} 0 & 5 \\ 6 & 7 \end{pmatrix} \)?
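Working block by block, \( A \otimes B = \begin{pmatrix} 1 \cdot B & 2 \cdot B \\ 3 \cdot B & 4 \cdot B \end{pmatrix} \). One way to verify the arithmetic:
import torch
A = torch.tensor([[1., 2.], [3., 4.]])
B = torch.tensor([[0., 5.], [6., 7.]])
print(torch.kron(A, B))
# tensor([[ 0.,  5.,  0., 10.],
#         [ 6.,  7., 12., 14.],
#         [ 0., 15.,  0., 20.],
#         [18., 21., 24., 28.]])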
If \( f: \mathbb{R}^d \rightarrow \mathbb{R}^m \) is a continuously differentiable function, what is its Jacobian \( J_f(x_0) \) at an interior point \( x_0 \) of its domain?
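For reference, with the usual convention (rows indexed by output components, columns by input coordinates), the Jacobian is the \( m \times d \) matrix of first-order partial derivatives:
\[
J_f(x_0) = \begin{pmatrix} \frac{\partial f_1}{\partial x_1}(x_0) & \cdots & \frac{\partial f_1}{\partial x_d}(x_0) \\ \vdots & \ddots & \vdots \\ \frac{\partial f_m}{\partial x_1}(x_0) & \cdots & \frac{\partial f_m}{\partial x_d}(x_0) \end{pmatrix} \in \mathbb{R}^{m \times d}.
\]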
Let \( f: \mathbb{R}^d \rightarrow \mathbb{R} \) be a twice continuously differentiable function. What is the Jacobian of its gradient \( \nabla f \)?
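Under that convention, the Jacobian of the gradient map \( \nabla f : \mathbb{R}^d \to \mathbb{R}^d \) is the Hessian,
\[
J_{\nabla f}(x_0) = \nabla^2 f(x_0), \qquad \left( \nabla^2 f(x_0) \right)_{ij} = \frac{\partial^2 f}{\partial x_i \partial x_j}(x_0),
\]
which is symmetric because \( f \) is twice continuously differentiable.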
What is the relationship between the Jacobian of a continuously differentiable real-valued function \( f: \mathbb{R}^d \rightarrow \mathbb{R} \) and its gradient \( \nabla f \)?
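For a scalar-valued \( f \), the Jacobian is the \( 1 \times d \) row vector whose transpose is the gradient: \( J_f(x_0) = \nabla f(x_0)^T \).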
In the context of the Chain Rule, if \( f: \mathbb{R}^2 \to \mathbb{R}^3 \) and \( g: \mathbb{R}^3 \to \mathbb{R} \), what is the dimension of the Jacobian matrix \( J_{g \circ f}(x) \)?
Let \( \mathbf{f} : D_1 \to \mathbb{R}^m \), where \( D_1 \subseteq \mathbb{R}^d \), and let \( \mathbf{g} : D_2 \to \mathbb{R}^p \), where \( D_2 \subseteq \mathbb{R}^m \). Assume that \( \mathbf{f} \) is continuously differentiable at \( \mathbf{x}_0 \), an interior point of \( D_1 \), and that \( \mathbf{g} \) is continuously differentiable at \( \mathbf{f}(\mathbf{x}_0) \), an interior point of \( D_2 \). Which of the following is correct according to the Chain Rule?
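A quick numerical sanity check of both chain-rule questions above, using torch.autograd.functional.jacobian; the particular \( f \) and \( g \) below are illustrative choices with the stated dimensions:
import torch
from torch.autograd.functional import jacobian

def f(x):  # f : R^2 -> R^3
    return torch.stack([x[0] ** 2, x[0] * x[1], torch.sin(x[1])])

def g(y):  # g : R^3 -> R
    return (y ** 2).sum()

x0 = torch.tensor([1.0, 2.0])
Jf = jacobian(f, x0)                   # shape (3, 2)
Jg = jacobian(g, f(x0))                # shape (3,): the 1 x 3 row J_g(f(x0))
Jgf = jacobian(lambda x: g(f(x)), x0)  # shape (2,): the 1 x 2 row J_{g o f}(x0)
print(torch.allclose(Jgf, Jg @ Jf))    # True: J_{g o f}(x0) = J_g(f(x0)) J_f(x0)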
Given the function \( f(x) = \begin{pmatrix} x_1^2 \\ \sin(x_2) \end{pmatrix} \), what is the Jacobian \( J_f \) at the point \( (1, \frac{\pi}{2}) \)?
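Componentwise differentiation gives \( J_f(x) = \begin{pmatrix} 2x_1 & 0 \\ 0 & \cos(x_2) \end{pmatrix} \), which at \( (1, \frac{\pi}{2}) \) is \( \begin{pmatrix} 2 & 0 \\ 0 & 0 \end{pmatrix} \) since \( \cos(\frac{\pi}{2}) = 0 \). A quick numerical check:
import torch
from torch.autograd.functional import jacobian

def f(x):
    return torch.stack([x[0] ** 2, torch.sin(x[1])])

x0 = torch.tensor([1.0, torch.pi / 2])
print(jacobian(f, x0))  # [[2, 0], [0, ~0]]; the cos(pi/2) entry is zero only up to float error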
Let \( f: \mathbb{R}^2 \rightarrow \mathbb{R}^3 \) be defined by \( f(x_1, x_2) = (3x_1^2, x_2, x_1x_2)^T \). What is the Jacobian matrix \( J_f(x_1, x_2) \)?
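Differentiating each component with respect to \( x_1 \) and \( x_2 \) gives, row by row,
\[
J_f(x_1, x_2) = \begin{pmatrix} 6x_1 & 0 \\ 0 & 1 \\ x_2 & x_1 \end{pmatrix}.
\]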
Let \( A \in \mathbb{R}^{m \times d} \) and \( \mathbf{b} \in \mathbb{R}^{m} \). Define the vector-valued function \( \mathbf{f} : \mathbb{R}^d \to \mathbb{R}^m \) as \( \mathbf{f}(\mathbf{x}) = A \mathbf{x} + \mathbf{b} \). What is the Jacobian of \( \mathbf{f} \) at \( \mathbf{x}_0 \)?
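Since \( f_i(\mathbf{x}) = \sum_{j} A_{ij} x_j + b_i \), every partial derivative is \( \partial f_i / \partial x_j = A_{ij} \), so the Jacobian is the constant matrix \( A \), independent of \( \mathbf{x}_0 \). A quick numerical check (sizes below are illustrative):
import torch
from torch.autograd.functional import jacobian

A = torch.randn(3, 2)  # m x d with m = 3, d = 2
b = torch.randn(3)
x0 = torch.randn(2)
print(torch.allclose(jacobian(lambda x: A @ x + b, x0), A))  # True: J_f(x0) = A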
Consider the following PyTorch code:
import torch

x1 = torch.tensor(1.0, requires_grad=True)
x2 = torch.tensor(2.0, requires_grad=True)
f = 3 * (x1 ** 2) + x2 + torch.exp(x1 * x2)  # scalar expression built from x1 and x2
What does the option requires_grad=True do?
Which of the following PyTorch functions is used to compute gradients?
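In PyTorch the relevant machinery is torch.autograd, typically invoked through Tensor.backward() (which populates .grad on the leaf tensors) or torch.autograd.grad. A minimal sketch continuing the snippet above:
import torch

x1 = torch.tensor(1.0, requires_grad=True)
x2 = torch.tensor(2.0, requires_grad=True)
f = 3 * (x1 ** 2) + x2 + torch.exp(x1 * x2)
f.backward()    # backpropagates from f to the leaf tensors
print(x1.grad)  # df/dx1 = 6*x1 + x2*exp(x1*x2) = 6 + 2e^2
print(x2.grad)  # df/dx2 = 1 + x1*exp(x1*x2) = 1 + e^2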