I've come across some equations for the regression coefficients of a multiple linear regression with two independent variables (Fig. 1 and Fig. 2) and was wondering how they are derived.
Fig. 1 [image of the coefficient equations not shown]
Fig. 2 [image of the coefficient equations not shown]
I'm aware of how to derive the regression coefficients using matrix notation, but this intrigued me (seeing the derivation of the equations in non-matrix form always builds my intuition better than the matrix-form proofs do). When I saw Fig. 2, I figured (no pun intended) it's just an extended version of the simple linear regression slope equation (Fig. 3)
Fig. 3 [image not shown; the simple-regression slope, b = r · (StdevY/StdevX)]
and assumed that one would have to use something along the lines of a multiple correlation coefficient, multiplied by (StdevY)/(Stdev of the respective independent variable). But then I realized that the equation in Fig. 2, while it might be using correlation coefficients, isn't actually the multiple correlation coefficient itself. I can't find any sources online that actually detail how these equations for the regression coefficients in Fig. 1 and Fig. 2 are derived, let alone explain the intuition behind them (which is very important to me, and was very useful in understanding the reasoning behind the simple linear regression equations). (I pretty much only use online sources; I'm not a stats student, just someone genuinely curious about statistics.) If anyone knows of sources/books that explain how to derive them in non-matrix form, even as the number of independent variables grows (although I understand that at that point it becomes very cumbersome, and linear algebra is a much neater trick), I'd be very grateful. Thank you very much.
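For what it's worth, since I couldn't find a derivation, I at least verified numerically that the formula I *think* Fig. 2 shows agrees with ordinary least squares. The formula below is my guess at the figure's content, and all variable names are mine:

```python
# Numerical check (not a derivation) of the two-predictor coefficient
# formula I believe Fig. 2 shows:
#   b1 = (r_y1 - r_y2 * r_12) / (1 - r_12**2) * (sy / s1)
#   b2 = (r_y2 - r_y1 * r_12) / (1 - r_12**2) * (sy / s2)
# where r_y1, r_y2 are correlations of y with x1, x2, and r_12 is the
# correlation between the two predictors.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)            # deliberately correlated predictors
y = 2.0 + 1.5 * x1 - 0.7 * x2 + rng.normal(size=n)

# pairwise correlations and standard deviations
r_y1 = np.corrcoef(y, x1)[0, 1]
r_y2 = np.corrcoef(y, x2)[0, 1]
r_12 = np.corrcoef(x1, x2)[0, 1]
sy, s1, s2 = y.std(), x1.std(), x2.std()

# correlation-based coefficients (the Fig. 2-style formula)
b1 = (r_y1 - r_y2 * r_12) / (1 - r_12**2) * (sy / s1)
b2 = (r_y2 - r_y1 * r_12) / (1 - r_12**2) * (sy / s2)

# ordinary least squares for comparison
X = np.column_stack([np.ones(n), x1, x2])
beta = np.linalg.lstsq(X, y, rcond=None)[0]   # [intercept, slope1, slope2]

print(b1, beta[1])   # these agree to floating-point precision
print(b2, beta[2])
```

The (r_y1 − r_y2·r_12) numerator makes intuitive sense to me as well: it looks like the correlation of y with x1 after discounting the part of that correlation that comes through x2, which is why it collapses to the simple Fig. 3 formula when r_12 = 0.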


