Bivariate normal and regression lines
Visualise how the regression lines $E(Y|X)$ and $E(X|Y)$ change as the parameters of a given bivariate normal distribution vary. The line of best orthogonal fit is the line that minimises the sum of squared orthogonal distances.
The ellipse is a level curve of the joint probability density function $f(x,y)$. For convenience we chose the means $\mu_X=\mu_Y=0$. The yellow line supports the major axis of the ellipse, joining the two points on the ellipse that are furthest from each other (the points of tangency of the ellipse with its circumscribed circle). Note that when $r=0$ and $\sigma_X=\sigma_Y$, every line through the origin is a line of best orthogonal fit, so no yellow line is displayed.
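The direction of the major axis (the yellow line) can be computed as the leading eigenvector of the covariance matrix. A minimal sketch, with illustrative parameter values that are not from the text:

```python
import numpy as np

# Hypothetical parameter values, chosen only for illustration.
sx, sy, r = 2.0, 1.0, 0.6
cov = np.array([[sx**2,    r*sx*sy],
                [r*sx*sy,  sy**2  ]])

# The line of best orthogonal fit through the origin is spanned by the
# eigenvector of the covariance matrix with the largest eigenvalue,
# i.e. the major axis of the level-curve ellipse.
vals, vecs = np.linalg.eigh(cov)
major = vecs[:, np.argmax(vals)]
slope_orth = major[1] / major[0]
print(slope_orth)
```

For these values the slope agrees with the closed form $\tan 2\theta = 2r\sigma_X\sigma_Y/(\sigma_X^2-\sigma_Y^2)$; when $r=0$ and $\sigma_X=\sigma_Y$ the covariance matrix is a multiple of the identity and every direction is an eigenvector, matching the degenerate case above.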
The green line is the regression line $E(Y|X)=\alpha + \beta X$, which minimises the sum of squared vertical distances. This regression line joins the two points where the ellipse is tangent to vertical lines. The explanation is simple: denote by $(x_1,y_1)$ the rightmost point on the ellipse. The restriction of the density function to the vertical line $X=x_1$ is a multiple of the density of the conditional normal variable $Y|X=x_1$, which is unimodal and hence takes every value below its maximum twice, attaining its maximum only at its mode $y=\alpha + \beta x_1$. Any other ellipse that is a level curve of $f(x,y)$ cuts the vertical line $X=x_1$ in two points or not at all, so the point of tangency is precisely the point of coordinates $(x_1,\alpha + \beta x_1)$.
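The vertical-tangency claim can be checked numerically: sample points on one level-curve ellipse, take the rightmost one, and verify it lies on $y=\beta x$ with $\beta = r\,\sigma_Y/\sigma_X$ (and $\alpha = 0$ for zero means). The parameter values below are hypothetical:

```python
import numpy as np

# Hypothetical parameters; means are zero as in the text.
sx, sy, r = 2.0, 1.0, 0.6
beta = r * sy / sx          # slope of the regression line E(Y|X)

# Points on one level-curve ellipse: the image of the unit circle
# under a matrix square root (Cholesky factor) of the covariance matrix.
cov = np.array([[sx**2, r*sx*sy], [r*sx*sy, sy**2]])
L = np.linalg.cholesky(cov)
t = np.linspace(0, 2*np.pi, 4001)
pts = L @ np.vstack([np.cos(t), np.sin(t)])

# The rightmost point of the ellipse should lie on y = beta * x.
i = np.argmax(pts[0])
x1, y1 = pts[:, i]
print(abs(y1 - beta * x1))  # close to 0
```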
Similarly, the magenta line is the regression line $E(X|Y)$, which minimises the sum of squared horizontal distances. This line joins the two points where the ellipse is tangent to horizontal lines.
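By the same argument, the topmost point of the ellipse should lie on the magenta line $x = r\,(\sigma_X/\sigma_Y)\,y$. A quick numerical check under the same hypothetical parameters:

```python
import numpy as np

# Hypothetical parameters; means are zero as in the text.
sx, sy, r = 2.0, 1.0, 0.6
cov = np.array([[sx**2, r*sx*sy], [r*sx*sy, sy**2]])
L = np.linalg.cholesky(cov)
t = np.linspace(0, 2*np.pi, 4001)
pts = L @ np.vstack([np.cos(t), np.sin(t)])

# E(X|Y=y) = r*(sx/sy)*y: the topmost ellipse point lies on this line.
i = np.argmax(pts[1])
x1, y1 = pts[:, i]
print(abs(x1 - r * (sx / sy) * y1))  # close to 0
```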
You may also observe that the line of best orthogonal fit bisects the angle formed by the two regression lines only in the case $\sigma_X=\sigma_Y$.
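The bisecting property in the case $\sigma_X=\sigma_Y$ is easy to verify: the green line then has slope $r$, the magenta line (viewed as a line in the $xy$-plane) has slope $1/r$, and the major axis has slope $1$ for $r>0$. A sketch with an arbitrary illustrative $r$:

```python
import numpy as np

# When sigma_X = sigma_Y, slopes reduce to r, 1/r, and 1 (for r > 0).
r = 0.4
slope_green, slope_magenta, slope_major = r, 1 / r, 1.0

# Acute angle between two lines with slopes m1 and m2.
def angle(m1, m2):
    return np.arctan(abs((m1 - m2) / (1 + m1 * m2)))

a1 = angle(slope_major, slope_green)
a2 = angle(slope_major, slope_magenta)
print(np.isclose(a1, a2))  # True: the major axis bisects the angle
```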
Other resources: