As it stands, your static and dynamic schedules are both trivially optimal. If you try to minimize the total execution time, then the distribution of the tasks over the slaves is irrelevant: the total execution time is simply the sum of all individual execution times. Given $n$ tasks with expected execution time $\mu$ each and a standard deviation of $\sigma$, the expected total execution time is $n \mu$, with a standard deviation of $\sqrt{n} \sigma$ (assuming the task times are independent).
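
If you want to see this numerically, here is a minimal Monte Carlo sketch; the gamma distribution, the particular values of $n$, $\mu$ and $\sigma$, and the independence of the task times are illustrative assumptions on my part, not something given in the question:

```python
# Minimal sketch: the total execution time has mean ~ n*mu and stddev ~ sqrt(n)*sigma,
# regardless of how the tasks are split over the slaves.
# Assumption: independent task times, here drawn from a gamma distribution.
import numpy as np

rng = np.random.default_rng(0)
n, mu, sigma = 100, 2.0, 0.5

# Gamma parameters chosen so that the per-task mean is mu and the stddev is sigma.
shape, scale = (mu / sigma) ** 2, sigma ** 2 / mu
tasks = rng.gamma(shape, scale, size=(10_000, n))  # 10,000 trials of n tasks each

totals = tasks.sum(axis=1)  # sum over all tasks; any split over slaves gives the same sum
print(totals.mean(), n * mu)              # ~ 200
print(totals.std(), np.sqrt(n) * sigma)   # ~ 5
```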

Update:

For the new version where you try to minimize the makespan, your static schedule still has the optimal expected value.

Let $M$ be the random variable for the makespan and let $X$ be the random variable for the execution time of a single task. Let $X_i$ be the time at which slave $i$ finishes, so that $M = \max_i(X_i)$. Let $c_i$ be the number of jobs allocated to slave $i$. Then we have that $X_i = \sum_{j=1}^{c_i} X = c_i X$ (modelling every task on a slave as taking the same time $X$).

If $F(x)$ is the cumulative distribution function for $X$, then, assuming the slaves finish independently of one another, $P(M < m) = P(\max_i(X_i) < m) = \prod_i P(X_i < m) = \prod_i P(c_i X < m) = \prod_i P(X < \frac{m}{c_i}) = \prod_i F(\frac{m}{c_i})$, so $\prod_i F(\frac{m}{c_i})$ is the cumulative distribution function for $M$. This means that $EM = \int_{-\infty}^{\infty} x \left(\prod_i F(\frac{x}{c_i})\right)' dx$ and $\operatorname{stddev}(M) = \sqrt{\int_{-\infty}^{\infty} (x - EM)^2 \left(\prod_i F(\frac{x}{c_i})\right)' dx}$, as usual.
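
As a quick sanity check of this formula, here is a small simulation under the model above (one independent draw of $X$ per slave, with all of that slave's $c_i$ tasks taking that long); the exponential distribution and the allocation $(3, 3, 4)$ are just illustrative choices:

```python
# Check that P(M < m) = prod_i F(m / c_i) under the model above:
# slave i finishes at time c_i * X_i, with the X_i independent copies of X.
# Assumption: X ~ Exponential(1), so F(t) = 1 - exp(-t).
import numpy as np

rng = np.random.default_rng(1)
c = np.array([3, 3, 4])                     # tasks per slave (illustrative)
x = rng.exponential(1.0, size=(200_000, len(c)))
makespan = (c * x).max(axis=1)              # M = max_i c_i * X_i

m = 6.0
print((makespan < m).mean())                # empirical P(M < m)
print(np.prod(1.0 - np.exp(-m / c)))        # analytic prod_i F(m / c_i)
```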

Minimizing $EM$ amounts to maximizing $\prod_i F(\frac{x}{c_i})$ for every $x$, which means we want to keep every $c_i$ as low as possible (as $F$ is monotonically increasing and between 0 and 1). Since $\sum_i c_i$ is fixed at the total number of tasks, we should distribute the tasks equally among the slaves, which is exactly what your static schedule achieves.
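
Continuing with the same assumed model, here is a rough comparison of a balanced allocation against two unbalanced ones (12 tasks over 3 slaves, exponential task times, all purely illustrative); the balanced split comes out with the smallest expected makespan:

```python
# Expected makespan for different allocations of 12 tasks to 3 slaves,
# under the same assumed model (slave i takes c_i * X_i, X_i ~ Exponential(1) i.i.d.).
import numpy as np

rng = np.random.default_rng(2)

def expected_makespan(c, trials=200_000):
    c = np.asarray(c)
    x = rng.exponential(1.0, size=(trials, len(c)))
    return (c * x).max(axis=1).mean()

print(expected_makespan([4, 4, 4]))    # balanced: smallest expected makespan (~7.3)
print(expected_makespan([2, 4, 6]))    # unbalanced: larger (~7.9)
print(expected_makespan([1, 1, 10]))   # very unbalanced: larger still (~10.2)
```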
