I am doing simple error propagation for a multivariate polynomial (in `x[i]`, `y[i]`, `z[i]`, ...). I write the polynomial as a symbolic expression and then substitute real values, so that the same code can be applied in any similar general case:
```mathematica
(* All are real functions/numbers *)
pol = a x[1] + b x[1] y[1] - c x[2] z[1] - d x[1] (* + ... *)

error = (Sum[D[pol, x[i]]^2*dx[i]^2, {i, 1, 2}] +
         Sum[D[pol, y[i]]^2*dy[i]^2, {i, 1, 2}] +
         Sum[D[pol, z[i]]^2*dz[i]^2, {i, 1, 2}] (* + ... *))^(1/2)
```

Now the above could lead to wrong errors, because Mathematica collects the terms `a x[1] - d x[1] + b y[1] x[1]` together, so in the error estimate it just sees `(a - d + b y[1])^2 dx[1]^2`, which is not the same as `(a^2 + d^2 + b^2 y[1]^2) dx[1]^2`. Does this mean I will have to propagate the terms one by one and sum them in quadrature explicitly (without using `Sum`, `For`, `Do`, etc.)?
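For comparison, here is a minimal sketch of both variants: the standard linear propagation (which sees the collected coefficient `a - d + b y[1]`) and a per-term version that expands `pol` into monomials and sums their variances. The helper names `linearError` and `perTermError`, and the variable/error lists, are illustrative assumptions, not built-ins:

```mathematica
(* Assumed lists pairing each variable with its uncertainty symbol *)
vars = {x[1], x[2], y[1], z[1]};
errs = {dx[1], dx[2], dy[1], dz[1]};

(* Standard linear (first-order Taylor) propagation *)
linearError[expr_] :=
  Sqrt[Total[MapThread[(D[expr, #1] #2)^2 &, {vars, errs}]]]

(* Per-term quadrature: expand into monomials and sum their variances.
   Assumes Expand[expr] yields a sum (head Plus). *)
perTermError[expr_] :=
  Sqrt[Total[linearError[#]^2 & /@ (List @@ Expand[expr])]]

pol = a x[1] + b x[1] y[1] - c x[2] z[1] - d x[1];

Simplify[linearError[pol]^2]   (* dx[1]^2 enters as (a - d + b y[1])^2 *)
Simplify[perTermError[pol]^2]  (* dx[1]^2 enters as a^2 + d^2 + b^2 y[1]^2 *)
```

Comparing the two squared results makes the discrepancy described above explicit; which one is statistically correct is the subject of the comments below.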
Comments:

- ... `Abs[..]` and not `Abs[..]^2` the whole time! Thanks!
- `a^2 + d^2 + b^2 y[1]^2` (link?). From context, I understand that we are assuming that `a`, `b`, `c`, `d` are given and fixed numbers. Take this example: if `a = d = 1`, then the first and last terms in `pol` cancel and do not propagate errors at all. What is the rationale for `a^2 + d^2` in that case?
- If `a - d + b*y[1] == 0`, then the partial derivative of `pol` w.r.t. `x[1]` vanishes at that point, and so, to linear order about that point, the function does not depend on `x[1]`; uncertainty in `x[1]` is not propagated to linear order (linear Taylor), which is all the usual error-propagation formula sees. (In higher-order corrections, one would see a contribution from the uncertainty of `x[1]`.)
- Summing in quadrature explicitly would treat the error `dx[1]` in one instance of `x[1]` as uncorrelated with the error `dx[1]` in another instance of `x[1]`. That would mean that somehow the different instances of `x[1]` are in fact independent variables, each with a different value of `dx[1]`. (The round-off errors in `a x[1]` and `-d x[1]` may be treated as independent if `a` and `d` are uncorrelated, but that's not a component of the propagated error.)
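The cancellation the comments describe can be checked directly (a small sketch; the substitution `a -> 1, d -> 1` is just the example values from the comment):

```mathematica
pol = a x[1] + b x[1] y[1] - c x[2] z[1] - d x[1];

(* Partial derivative w.r.t. x[1] at a = d = 1:
   the a and -d contributions cancel, leaving only b y[1] *)
D[pol, x[1]] /. {a -> 1, d -> 1}

(* If additionally y[1] == 0, the derivative vanishes entirely,
   so dx[1] contributes nothing to the linear-order error *)
D[pol, x[1]] /. {a -> 1, d -> 1, y[1] -> 0}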