Alecos Papadopoulos

My suggestion is to get into the habit of calling the "fixed" regressors "deterministic". This accomplishes two things: first, it clears up the not-infrequent misunderstanding that "fixed" means "invariant". Second, it clearly contrasts with "stochastic", and tells us that the regressors are decided upon (hence the "design matrix" terminology that comes from fields where the regressors are f... deterministic).

If regressors are deterministic, they have no distribution in the usual sense, so they have no moments in the usual sense, meaning in practice that $E(x^r) = x^r$. The only stochastic element in the sample rests in the error term (and so in the dependent variable).

This has the basic implication that a sample with even one varying deterministic regressor is no longer an identically distributed sample:

$$E(y_i) = bE(x_i) + E(u_i) \implies E(y_i) = bx_i$$

and since the deterministic $x_i$'s are varying, the dependent variable does not have the same expected value for all $i$'s. In other words, there is not one distribution; each $y_i$ has its own (possibly belonging to the same family, but with different parameters).
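This is easy to see in a short simulation. The sketch below (the values $b = 2$ and $x = 1,\dots,4$ are arbitrary, chosen purely for illustration) holds the design points fixed across many replications and redraws only the error term; the per-observation means of $y_i$ then differ across $i$, tracking $bx_i$:

```python
import numpy as np

rng = np.random.default_rng(0)
b = 2.0
x = np.array([1.0, 2.0, 3.0, 4.0])   # deterministic, varying design points
R = 100_000                          # replications: only u (hence y) is redrawn

# y_i = b*x_i + u_i, with the x_i held fixed across all replications
u = rng.normal(0.0, 1.0, size=(R, x.size))
y = b * x + u

# per-i means differ: each y_i has its own expected value b*x_i
print(np.round(y.mean(axis=0), 2))
```

Each column of `y` is a sample from a different distribution (same normal family, different mean), which is exactly the "not identically distributed" point above.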

So it is not about conditional moments: the implications of deterministic regressors relate to the unconditional moments. For example, averaging the dependent variable here does not give us anything meaningful, beyond descriptive statistics for the sample.

Reverse that to see the implication: if the $y_i$'s are draws from a population of identical random variables, in what sense, and with what validity, are we going to link them with deterministic regressors? We can always regress a series of numbers on a matrix of other numbers: if we use ordinary least squares, we will be estimating the related orthogonal projection. But this is devoid of any statistical meaning.
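The "always an orthogonal projection" point can be checked mechanically: fit OLS to completely arbitrary numbers and verify that the fitted values are the projection of $y$ onto the column space of $X$, i.e. $\hat y = X(X'X)^{-1}X'y$ (the data here are random noise with no model behind them, precisely to make the point):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 3))   # any matrix of numbers
y = rng.normal(size=20)        # any series of numbers; no model assumed

# OLS coefficients, and the projection ("hat") matrix P = X (X'X)^{-1} X'
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
P = X @ np.linalg.inv(X.T @ X) @ X.T

# fitted values are the orthogonal projection of y onto col(X)
assert np.allclose(X @ beta, P @ y)
# residuals are orthogonal to every column of X
assert np.allclose(X.T @ (y - X @ beta), 0.0)
```

The arithmetic always goes through; whether the projection means anything statistically is a separate question, which is the answer's point.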

Note also that $E(y_i \mid x_i) = E(y_i)$. Does this mean that $y_i$ is "mean-independent" of $x_i$? No; that would be the interpretation if $x_i$ were stochastic. Here, it tells us that there is no distinction between unconditional and conditional moments when deterministic regressors are involved.

We can certainly predict with deterministic regressors. $b$ is a common characteristic of all $y_i$'s, and we can recover it using deterministic regressors. Then we can take an out-of-sample value of the regressor and predict the corresponding value of $y$.
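As a minimal sketch of that last step (true $b = 2$, a grid of design points, and the out-of-sample value $x_{\text{new}} = 12$ are all arbitrary choices for illustration), recover $b$ by least squares through the origin and predict at a design value outside the sample:

```python
import numpy as np

rng = np.random.default_rng(2)
b = 2.0
x = np.linspace(1.0, 10.0, 50)             # chosen (deterministic) design points
y = b * x + rng.normal(0.0, 0.5, x.size)   # the only randomness is in u

# OLS through the origin: b_hat = sum(x_i * y_i) / sum(x_i^2)
b_hat = (x @ y) / (x @ x)

x_new = 12.0              # an out-of-sample design value
y_pred = b_hat * x_new    # predicted expected value of y at x_new
print(b_hat, y_pred)
```

The estimate $\hat b$ lands close to the true $b$, and the prediction is simply $\hat b \, x_{\text{new}}$: the common characteristic $b$ carries over to design points not in the sample.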
