### Product of two probabilities

The probability of a first arrival between times $t$ and $t+dt$ (the waiting time) is the product of

  • the probability for an arrival between $t$ and $t+dt$ (which can be related to the arrival rate $s(t)$ at time $t$)
  • and the probability of no arrival before time $t$ (or otherwise it would not be the first).

The latter term satisfies the recursion:

$$P(n=0,t+dt) = (1-s(t)dt) P(n=0,t)$$

or

$$\frac{\partial P(n=0,t)}{\partial t} = -s(t) P(n=0,t) $$

giving:

$$P(n=0,t) = e^{-\int_0^t s(\tau)\, d\tau}$$

and the probability density of the waiting time is:

$$f(t) = s(t)\,e^{-\int_0^t s(\tau)\, d\tau}$$
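
As a sketch (an illustration, not part of the original derivation, with an assumed rate function), the closed form can be checked numerically: accumulate $\int_0^t s(\tau)\,d\tau$, form $f(t) = s(t)\,e^{-\int_0^t s(\tau)\,d\tau}$, and verify that the density integrates to one.

```python
import math

# Hypothetical rate function, an assumption for illustration: s(t) = 0.5 + 0.1 t
def s(t):
    return 0.5 + 0.1 * t

dt = 0.001
t_max = 20.0
steps = int(t_max / dt)

cum = 0.0       # running value of the rate integral ∫_0^t s(τ) dτ
pdf_mass = 0.0  # running value of ∫_0^t f(τ) dτ
for i in range(steps):
    t = i * dt
    f_t = s(t) * math.exp(-cum)  # f(t) = s(t) e^{-∫_0^t s(τ) dτ}
    pdf_mass += f_t * dt
    cum += s(t) * dt

print(round(pdf_mass, 2))  # a valid density integrates to ≈ 1
```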

### Derivation via the cumulative distribution

Alternatively, you could use the probability of fewer than one arrival by time $t$,

$$P(n<1 \vert t) = F(n=0 \vert t)$$

and the probability of an arrival between times $t$ and $t+dt$ is given by the negative derivative:

$$f_{\text{arrival time}}(t) = - \frac{d}{d t} F(n=0 \vert t)$$

This method is useful, for instance, in deriving the gamma distribution as the waiting time for the $n$-th arrival in a Poisson process (waiting-time-of-poisson-process-follows-gamma-distribution).
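
As a numeric sketch of this route (with an assumed rate $\lambda$ and arrival index $n$): for a Poisson process, the probability of fewer than $n$ arrivals by time $t$ is $\sum_{k=0}^{n-1} e^{-\lambda t}(\lambda t)^k / k!$, and its negative derivative should match the gamma density.

```python
import math

lam, n = 2.0, 3  # assumed rate and arrival index, for illustration only

def survival(t):
    """P(fewer than n arrivals by time t) for a Poisson(λ) process,
    i.e. P(the n-th arrival has not yet happened)."""
    return sum(math.exp(-lam * t) * (lam * t) ** k / math.factorial(k)
               for k in range(n))

def gamma_pdf(t):
    """Closed-form gamma density for the n-th arrival time."""
    return lam * math.exp(-lam * t) * (lam * t) ** (n - 1) / math.factorial(n - 1)

# f(t) = -d/dt of the survival probability, estimated with a central difference
t, h = 1.3, 1e-5
numeric = -(survival(t + h) - survival(t - h)) / (2 * h)
print(abs(numeric - gamma_pdf(t)) < 1e-6)  # True
```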


### Two examples

You might relate this to the waiting paradox (see the question "Please explain the waiting paradox").

  • Exponential distribution: If arrivals are random, as in a Poisson process, then $s(t) = \lambda$ is constant. The probability of the next arrival is independent of how long you have already waited without an arrival (if you roll a fair die many times without a six, the next roll does not suddenly have a higher probability of a six; see the gambler's fallacy). You get the exponential distribution, and the pdf of the waiting time is: $$f(t) = \lambda e^{-\lambda t} $$

  • Uniform distribution: If arrivals occur at fixed intervals (such as trains running on a fixed schedule), then the probability of an arrival increases the longer a person has been waiting. Say a train is supposed to arrive every $T$ minutes; then the rate after already waiting $t$ minutes is $s(t) = 1/(T-t)$, and the pdf of the waiting time is: $$f(t)= \frac{e^{-\int_0^t \frac{1}{T-\tau} d\tau}}{T-t} = \frac{T-t}{T}\cdot\frac{1}{T-t} = \frac{1}{T}$$ which makes sense, since every time between $0$ and $T$ should be equally likely to be the first arrival.
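
The two examples can be checked by simulating the hazard construction directly. This is a sketch with assumed parameters ($\lambda = 1$, $T = 5$): in each small step $dt$ an arrival occurs with probability $s(t)\,dt$, and the resulting mean waiting times should come out near $1/\lambda$ and $T/2$ respectively.

```python
import random

random.seed(1)

def first_arrival(s, dt=0.01):
    """Simulate one waiting time from a hazard rate s(t): in each small
    interval [t, t+dt] an arrival occurs with probability min(1, s(t)*dt)."""
    t = 0.0
    while random.random() >= min(1.0, s(t) * dt):
        t += dt
    return t

N = 5_000

# Example 1: constant rate s(t) = λ  ->  exponential waiting time, mean 1/λ
lam = 1.0
mean_exp = sum(first_arrival(lambda t: lam) for _ in range(N)) / N

# Example 2: s(t) = 1/(T - t)  ->  uniform waiting time on [0, T], mean T/2
T = 5.0
mean_uni = sum(first_arrival(lambda t: 1.0 / max(T - t, 1e-9)) for _ in range(N)) / N

print(round(mean_exp, 1), round(mean_uni, 1))
```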


It is this second case, where the probability of an arrival increases the longer one has waited, that relates to your question.

This may need adjustment for your situation: with more information, the probability $s(t)\,dt$ of a train arriving at a given moment may be a more complex function.
