Simple linear regression through the origin
y_i = \beta x_i + \epsilon_i, where \epsilon_i \sim N(0,\sigma^2).
The method of maximum likelihood will be used to find the estimator \hat{\beta} of \beta.
y_i \sim N(\beta x_i,\sigma^2)
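As a quick illustration, data can be simulated from this model directly. A minimal sketch in Python, assuming illustrative values \beta = 2, \sigma = 1 and n = 50 (none of these numbers come from the text):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative parameter values, not taken from the derivation.
beta_true, sigma, n = 2.0, 1.0, 50

x = rng.uniform(0.0, 10.0, size=n)                   # fixed covariates
y = beta_true * x + rng.normal(0.0, sigma, size=n)   # y_i ~ N(beta x_i, sigma^2)
\end{verbatim}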
Likelihood and log-likelihood functions
We assume y_i|x_i \sim N(\beta x_i, \sigma^2) and that our observations are independent. The probability density function of the normal distribution is
\begin{eqnarray*} p(y_i) &=& \frac{1}{\sqrt{2\pi}\sigma}\mathrm{exp}(-\frac{1}{2\sigma^2}\{y_i-(\beta x_i)\}^2)\\ \end{eqnarray*}
Hence \begin{eqnarray*} L(\beta) &=& \prod_{i=1}^n \frac{1}{{\sqrt{2\pi}}\sigma}\mathrm{exp}(-\frac{1}{2\sigma^2}\{y_i-(\beta x_i)\}^2)\\ &=&(2\pi)^{-\frac{n}{2}}\sigma^{-n}\mathrm{exp}(-\frac{1}{2\sigma^2}\sum_{i=1}^n\{y_i-(\beta x_i)\}^2) \end{eqnarray*} and taking logarithms, \begin{eqnarray*} l(\beta)&=& -\frac{n}{2}\mathrm{log_e}(2\pi)-n\,\mathrm{log_e}\sigma-\frac{1}{2\sigma^2}\sum_{i=1}^n\{y_i-(\beta x_i)\}^2 \end{eqnarray*}
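The log-likelihood is straightforward to evaluate numerically. A sketch continuing from the simulated x and y above, with \sigma treated as known; log_likelihood is a hypothetical helper name:

\begin{verbatim}
import numpy as np

def log_likelihood(beta, x, y, sigma=1.0):
    # l(beta) for the through-the-origin model, matching the
    # expression derived above (sigma treated as known).
    n = len(y)
    resid = y - beta * x
    return (-0.5 * n * np.log(2 * np.pi)
            - n * np.log(sigma)
            - np.sum(resid ** 2) / (2 * sigma ** 2))
\end{verbatim}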
Differentiating the log-likelihood with respect to \beta, \begin{eqnarray*} \frac{d l}{d \beta}&=&-\frac{1}{2\sigma^2}\sum_{i=1}^n( -2x_iy_i+2\beta x_i^2) \end{eqnarray*}
\frac{d l}{d \beta}=0 \mbox{ when}
\begin{eqnarray*} \sum_ix_iy_i-\beta\sum_ix_i^2 &=& 0\\ \beta\sum_ix_i^2 &=& \sum_ix_iy_i\\ \hat{\beta} &=& \frac{\sum_ix_iy_i}{\sum_ix_i^2} \end{eqnarray*}
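The closed-form estimator can be checked against a direct numerical maximisation of l(\beta). A sketch continuing from the blocks above (scipy assumed available):

\begin{verbatim}
from scipy.optimize import minimize_scalar
import numpy as np

# Closed-form MLE from the derivation above.
beta_hat = np.sum(x * y) / np.sum(x ** 2)

# Independent check: maximise l(beta) numerically by
# minimising its negative.
opt = minimize_scalar(lambda b: -log_likelihood(b, x, y))

print(beta_hat, opt.x)   # the two values should agree closely
\end{verbatim}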
\frac{d^2 l}{d \beta^2}=-\frac{1}{\sigma^2}\sum_{i=1}^n x_i^2 < 0 (provided the x_i are not all zero), hence we have found a maximum.
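Because maximising l(\beta) is the same as minimising \sum_i\{y_i-\beta x_i\}^2, the Gaussian MLE here coincides with least squares, and the same value falls out of an intercept-free least-squares fit. A sketch using numpy.linalg.lstsq on the data above:

\begin{verbatim}
import numpy as np

# Regression through the origin: the design matrix is the single column x.
coef, *_ = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)
print(coef[0])           # matches beta_hat = sum(x*y) / sum(x^2)
\end{verbatim}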