Which of the following is NOT an example of an exponential family of distributions?
In the exponential family form \(p_\theta(x) = h(x) \exp(\theta^T \phi(x) - A(\theta))\), what does \(A(\theta)\) represent?
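A worked instance may help anchor the notation; the following sketch (not part of the question) writes the Bernoulli distribution in this form:

```latex
\[
p(x) = p^{x}(1-p)^{1-x}
     = \exp\!\Bigl(x \log\tfrac{p}{1-p} + \log(1-p)\Bigr),
\qquad x \in \{0,1\},
\]
so \(h(x)=1\), \(\phi(x)=x\), \(\theta=\log\tfrac{p}{1-p}\), and
\(A(\theta)=\log\bigl(1+e^{\theta}\bigr)\) is the log-partition (normalizing)
function: \(e^{A(\theta)} = \sum_{x\in\{0,1\}} e^{\theta\phi(x)} = 1+e^{\theta}\).
```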
Given \(n\) independent samples \(X_1, \ldots, X_n\) from a parametric family \(p_{\theta^*}\) with unknown \(\theta^* \in \Theta\), the maximum likelihood estimator \(\hat{\theta}_{\mathrm{MLE}}\) is defined as:
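As a concrete (toy) illustration of this definition, the sketch below finds the Bernoulli MLE by maximizing the log-likelihood over a grid of candidate parameters; the data is made up:

```python
import math

# Hypothetical sketch: the MLE for a Bernoulli(p) sample, found by
# maximizing the log-likelihood over a grid of candidate parameters.
def log_likelihood(p, xs):
    return sum(x * math.log(p) + (1 - x) * math.log(1 - p) for x in xs)

xs = [1, 0, 1, 1, 0]
grid = [i / 100 for i in range(1, 100)]            # 0.01, ..., 0.99
p_mle = max(grid, key=lambda p: log_likelihood(p, xs))
print(p_mle)  # 0.6, the sample mean (the closed-form Bernoulli MLE)
```

The grid search is for illustration only; for the Bernoulli family the maximizer has the closed form \(\hat{p} = \frac{1}{n}\sum_i X_i\).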
What is the Kullback-Leibler divergence \(\mathrm{KL}(p \parallel q)\) between two probability distributions \(p\) and \(q\)?
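For discrete distributions over a finite set, the divergence \(\mathrm{KL}(p \parallel q) = \sum_x p(x)\log\frac{p(x)}{q(x)}\) can be computed directly; a minimal sketch with illustrative numbers, assuming \(q(x) > 0\) wherever \(p(x) > 0\):

```python
import math

# Sketch: KL(p || q) for discrete distributions over a finite set,
# assuming q(x) > 0 wherever p(x) > 0 (toy numbers for illustration).
def kl(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl(p, q))  # positive: p and q differ
print(kl(p, p))  # 0.0: KL vanishes iff the distributions coincide
```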
What is the relationship between the maximum likelihood estimator and the Kullback-Leibler divergence?
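One standard way to make this connection concrete, sketched here as background:

```latex
\[
\hat{\theta}_{\mathrm{MLE}}
= \arg\max_{\theta} \frac{1}{n}\sum_{i=1}^{n} \log p_{\theta}(X_i)
\;\xrightarrow[n\to\infty]{}\;
\arg\max_{\theta} \mathbb{E}_{p_{\theta^*}}\!\bigl[\log p_{\theta}(X)\bigr]
= \arg\min_{\theta} \mathrm{KL}\bigl(p_{\theta^*} \parallel p_{\theta}\bigr),
\]
since \(\mathrm{KL}(p_{\theta^*}\parallel p_{\theta})
= \mathbb{E}_{p_{\theta^*}}[\log p_{\theta^*}(X)]
- \mathbb{E}_{p_{\theta^*}}[\log p_{\theta}(X)]\)
and the first term does not depend on \(\theta\).
```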
Under certain conditions, the maximum likelihood estimator is guaranteed to converge to the true parameter as the number of samples grows. This property is known as:
For exponential families, the maximum likelihood estimator (if it exists) solves which of the following equations?
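The stationarity condition here is moment matching, \(\nabla A(\hat{\theta}) = \frac{1}{n}\sum_i \phi(X_i)\). A sketch checking it for the Bernoulli family, where \(\nabla A(\theta) = \sigma(\theta)\) (the sigmoid) and the condition inverts in closed form; the data is illustrative:

```python
import math

# Sketch of the exponential-family stationarity condition
#   grad A(theta_hat) = (1/n) sum_i phi(x_i)   (moment matching).
# For Bernoulli, A(theta) = log(1 + e^theta) and grad A is the sigmoid,
# so theta_hat is the logit of the empirical mean (toy data below).
def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

xs = [1, 0, 1, 1, 0]
mean_phi = sum(xs) / len(xs)                      # empirical mean of phi(x) = x
theta_hat = math.log(mean_phi / (1 - mean_phi))   # logit: inverts grad A
print(sigmoid(theta_hat))  # ~ 0.6, recovering mean_phi
```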
What is the sufficient statistic for a multivariate Gaussian distribution in the context of exponential families?
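For reference, a sketch of how the Gaussian density factors into the exponential-family form:

```latex
\[
\mathcal{N}(x \mid \mu, \Sigma)
\;\propto\;
\exp\!\Bigl(\bigl\langle \Sigma^{-1}\mu,\; x\bigr\rangle
+ \bigl\langle -\tfrac{1}{2}\Sigma^{-1},\; xx^{T}\bigr\rangle\Bigr)
\quad\Longrightarrow\quad
\phi(x) = \bigl(x,\; xx^{T}\bigr),
\]
i.e. the sufficient statistic collects the first and second empirical moments.
```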
In a generalized linear model, the maximum likelihood estimator \(\hat{w}_{\mathrm{MLE}}\) solves the equation:
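For logistic regression (a GLM with the canonical link), the MLE solves the score equation \(\sum_i (y_i - \mu_i(w))\, x_i = 0\). A sketch on made-up, deliberately non-separable data, fitting by plain gradient ascent and then checking the score is near zero at the fitted weights:

```python
import numpy as np

# Sketch: in logistic regression (a GLM with canonical link), the MLE
# solves the score equation  sum_i (y_i - mu_i(w)) x_i = 0.  A tiny
# model is fit by gradient ascent on toy data, then the score is
# checked to be near zero at the fitted weights.
def fit_logistic(X, y, lr=0.1, steps=20000):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        mu = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - mu)      # gradient of the log-likelihood
    return w

X = np.array([[1.0, -1.0], [1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 1.0, 0.0, 1.0])    # deliberately non-separable
w_hat = fit_logistic(X, y)
mu_hat = 1.0 / (1.0 + np.exp(-X @ w_hat))
print(X.T @ (y - mu_hat))             # score vector, approximately [0, 0]
```

Non-separable labels matter here: on linearly separable data the Bernoulli MLE does not exist (the weights diverge), so the score equation has no finite solution.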
In logistic regression, which distribution is used for the outcome variable?
In logistic regression, the log-likelihood is:
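A minimal sketch of the Bernoulli log-likelihood \(\ell(w) = \sum_i \bigl[y_i \log \mu_i + (1-y_i)\log(1-\mu_i)\bigr]\) with \(\mu_i = \sigma(w^{T} x_i)\); the data and weights below are made up for illustration:

```python
import math

# Sketch of the logistic-regression log-likelihood with Bernoulli
# outcomes y_i in {0, 1} and mu_i = sigmoid(w . x_i); the data and
# weights below are made up for illustration.
def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def log_likelihood(w, X, y):
    total = 0.0
    for xi, yi in zip(X, y):
        mu = sigmoid(sum(wj * xij for wj, xij in zip(w, xi)))
        total += yi * math.log(mu) + (1 - yi) * math.log(1 - mu)
    return total

X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]   # bias term + one feature
y = [0, 1, 1]
print(log_likelihood([0.0, 0.0], X, y))    # = 3 * log(1/2), about -2.079
```

At \(w = 0\) every \(\mu_i = \tfrac{1}{2}\), so the log-likelihood is exactly \(3\log\tfrac{1}{2}\); a useful sanity check.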