
If you accept a fitting criterion different from classical least squares (the regression below minimizes the residuals of the linearized relation, not the residuals on $y$ itself), the method is very simple: $$ y-1=p_1(x-x^2y) +p_2(x^2-x^2y) +q_1(x^2y-xy)$$ From the data $(x_1, y_1), (x_2, y_2), \dots , (x_k, y_k), \dots , (x_n, y_n)$, compute:

$F_k=y_k-1$

$u_k=x_k-x_k^2y_k$

$v_k=x_k^2-x_k^2y_k$

$w_k=x_k^2y_k-x_ky_k$

Then the linear regression $$\hat F=p_1u +p_2v +q_1w$$ yields approximate values of $p_1$, $p_2$ and $q_1$.
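For instance, in Python with NumPy (a minimal sketch; the function name `initial_estimates` is mine):

```python
import numpy as np

def initial_estimates(x, y):
    """Least-squares fit of the linearized relation
    F = p1*u + p2*v + q1*w, with F, u, v, w built from the data."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    F = y - 1.0                    # F_k = y_k - 1
    u = x - x**2 * y               # u_k = x_k - x_k^2 y_k
    v = x**2 - x**2 * y            # v_k = x_k^2 - x_k^2 y_k
    w = x**2 * y - x * y           # w_k = x_k^2 y_k - x_k y_k
    A = np.column_stack([u, v, w])            # one row per data point
    coeffs, *_ = np.linalg.lstsq(A, F, rcond=None)
    return coeffs                  # array [p1, p2, q1]
```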

Anyway, if the fitting criterion must imperatively be least squares (which is not always the best criterion in practice), the above method gives very good first estimates with which to start a usual iterative method of non-linear regression.
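Solving the linearized relation above for $y$ gives the equivalent explicit model $$y=\frac{1+p_1x+p_2x^2}{1+q_1x+(p_1+p_2-q_1)x^2},$$ which can then be refined iteratively. A sketch with SciPy (assuming this model form; the helper names are mine):

```python
import numpy as np
from scipy.optimize import least_squares

def model(x, p1, p2, q1):
    # The linearized relation solved for y (pure algebra).
    return (1 + p1*x + p2*x**2) / (1 + q1*x + (p1 + p2 - q1)*x**2)

def refine(x, y, p0):
    # Ordinary least squares on y itself, started from the
    # linearized estimates p0 = [p1, p2, q1].
    return least_squares(lambda p: model(x, *p) - y, p0).x

# usage:
# p0 = initial_estimates(x, y)   # from the sketch above
# p1, p2, q1 = refine(x, y, p0)
```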
