Interview Preparation
Regression
Brief notes prepared for technical interviews

These notes cover two foundational regression models — the linear case for continuous targets and the logistic case for binary classification — and the probabilistic perspectives that connect them to MLE and MAP.

Linear Regression


Model

\[y = w^\top x + \epsilon, \quad \epsilon \sim \mathcal{N}(0, \sigma^2)\]

Objective

\[\mathcal{L}_{\text{MSE}} = \frac{1}{n} \sum_i (y_i - w^\top x_i)^2\]

Interpretation

Minimizing MSE fits the conditional mean \(\mathbb{E}[y \mid x]\); the objective is smooth and convex in \(w\), so it can be minimized either iteratively or in closed form.

Gradient-based optimization

\[\nabla_w \mathcal{L}_{\text{MSE}} = -\frac{2}{n} \sum_i (y_i - w^\top x_i)\, x_i\]

Update \(w \leftarrow w - \eta \nabla_w \mathcal{L}_{\text{MSE}}\); since the loss is convex, gradient descent converges to the global minimum for a suitable learning rate \(\eta\).
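A minimal numpy sketch of gradient descent on the MSE, using the gradient \(-\frac{2}{n}\sum_i (y_i - w^\top x_i)\,x_i\) (the helper name `fit_linear_gd`, the learning rate, and the step count are illustrative choices, not from the notes):

```python
import numpy as np

def fit_linear_gd(X, y, lr=0.1, n_steps=1000):
    """Minimize MSE by batch gradient descent.

    Gradient: -(2/n) * X^T (y - X w)
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_steps):
        residual = y - X @ w          # (n,) vector of errors
        grad = -(2.0 / n) * (X.T @ residual)
        w -= lr * grad
    return w
```

With well-conditioned features, a few hundred steps suffice to recover the least-squares weights to high accuracy.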

Closed-form solution (normal equation)

Setting the gradient to zero yields

\[w^* = (X^\top X)^{-1} X^\top y\]

which requires \(X^\top X\) to be invertible and costs roughly \(O(nd^2 + d^3)\), so iterative methods are preferred when \(d\) is large.
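The normal equation \(X^\top X w = X^\top y\) can be sketched in numpy; solving the linear system avoids forming an explicit inverse (the helper name is illustrative):

```python
import numpy as np

def fit_linear_closed_form(X, y):
    # Solve the normal equation X^T X w = X^T y.
    # np.linalg.solve is preferred over inverting X^T X explicitly;
    # np.linalg.lstsq is more robust when X^T X is singular.
    return np.linalg.solve(X.T @ X, X.T @ y)
```

This agrees with `np.linalg.lstsq(X, y)` whenever \(X^\top X\) is invertible.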

Probabilistic interpretation

Under the Gaussian noise model, the negative log-likelihood is

\[-\log p(y \mid X; w) = \frac{1}{2\sigma^2} \sum_i (y_i - w^\top x_i)^2 + \text{const},\]

so least squares is exactly MLE. Adding a Gaussian prior on \(w\) turns MLE into MAP and recovers ridge regression.

Limitations

Sensitive to outliers (quadratic penalty); assumes a linear relationship with homoscedastic Gaussian noise; the normal equation breaks down when \(X^\top X\) is singular (multicollinearity, or \(d > n\)).

Logistic Regression


Model

\[p(y = 1 \mid x; w) = \sigma(w^\top x) = \frac{1}{1 + e^{-w^\top x}}\]

Predict \(\hat{y} = 1\) when \(\sigma(w^\top x) \ge 1/2\), equivalently when \(w^\top x \ge 0\); the decision boundary \(w^\top x = 0\) is a hyperplane, so logistic regression is a linear classifier.

Likelihood (Bernoulli)

With \(\hat{p}_i = \sigma(w^\top x_i)\), each label is Bernoulli:

\[p(y_i \mid x_i; w) = \hat{p}_i^{\,y_i} (1 - \hat{p}_i)^{1 - y_i}\]

MLE Objective

Maximize the log-likelihood over the dataset:

\[\max_w \; \sum_i \left[ y_i \log \hat{p}_i + (1 - y_i) \log (1 - \hat{p}_i) \right]\]

NLL = Binary Cross-Entropy (BCE)

Negating and averaging the log-likelihood gives the BCE loss, so minimizing BCE is exactly MLE:

\[\mathcal{L}_{\text{BCE}} = -\frac{1}{n} \sum_i \left[ y_i \log \hat{p}_i + (1 - y_i) \log (1 - \hat{p}_i) \right]\]
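In practice BCE is computed from logits \(z_i = w^\top x_i\) for numerical stability, using the identity \(-[y \log \sigma(z) + (1-y)\log(1-\sigma(z))] = \log(1 + e^{z}) - yz\); a sketch (helper name illustrative):

```python
import numpy as np

def bce_from_logits(z, y):
    """Numerically stable binary cross-entropy, averaged over examples.

    log(1 + e^z) - y*z equals -[y log sigma(z) + (1-y) log(1 - sigma(z))]
    without ever forming sigma(z), so it avoids log(0) for large |z|.
    """
    return np.mean(np.logaddexp(0.0, z) - y * z)
```

For moderate logits this matches the naive formula; for extreme logits the naive version returns `inf` or `nan` while this stays finite.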

Training

\[\nabla_w \mathcal{L}_{\text{BCE}} = \frac{1}{n} \sum_i \left( \sigma(w^\top x_i) - y_i \right) x_i\]

No closed-form solution exists; the loss is convex in \(w\), so gradient descent (or Newton's method) converges to the global optimum.