
In the load balancing problem we have $m$ machines and $n$ jobs, where job $j$ takes processing time $t_j$. The total processing time on machine $i$ is $T_i =\sum_{j\in A(i)}{t_j}$, where $A(i)$ is the set of jobs assigned to machine $i$. The goal is to find an assignment of jobs to machines such that $\max_i T_i$ is minimized.

It is known that the sorted greedy algorithm is only an approximation algorithm for the load balancing problem, but I can't find a problem instance on which it actually fails to find an optimal solution. Can someone provide an example where this algorithm gives a suboptimal answer?

  • What have you tried? This is easily solved by writing a program to enumerate many possible problem instances, and compare what greedy gives you to an optimal solution (obtained via brute force). We expect you to put in a significant effort before asking and to show us what you've tried. — Commented Jan 26, 2015 at 18:34

1 Answer


Take for instance $m=2$ machines and $n=5$ jobs with processing times $4,3,3,2,2$.

  • The optimal schedule puts $4+3$ on one machine and $3+2+2$ on the other machine, for a makespan of $7$.

  • Sorted greedy assigns $4$ to the first machine, $3$ to the second machine, the next $3$ again to the second machine (its load $3$ is smaller than $4$), then $2$ to the first machine, and the last $2$ to the first machine. Hence the loads are $4+2+2=8$ and $3+3=6$, for a makespan of $8 > 7$.
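You can verify this counterexample (and search for others, as the comment suggests) with a short script. Here is a sketch in Python; `lpt_greedy` and `optimal` are illustrative names, and the brute force only makes sense for tiny instances:

```python
import heapq
from itertools import product

def lpt_greedy(times, m):
    """Sorted (longest-processing-time-first) greedy: give each job,
    in decreasing order of processing time, to the currently
    least-loaded machine."""
    loads = [0] * m
    heap = [(0, i) for i in range(m)]  # (load, machine) min-heap
    for t in sorted(times, reverse=True):
        load, i = heapq.heappop(heap)
        loads[i] = load + t
        heapq.heappush(heap, (loads[i], i))
    return max(loads)

def optimal(times, m):
    """Brute force over all m^n assignments; feasible only for tiny n."""
    best = float("inf")
    for assignment in product(range(m), repeat=len(times)):
        loads = [0] * m
        for t, machine in zip(times, assignment):
            loads[machine] += t
        best = min(best, max(loads))
    return best

times = [4, 3, 3, 2, 2]
print(lpt_greedy(times, 2))  # 8
print(optimal(times, 2))     # 7
```

Running this on the instance above confirms the gap: greedy gives makespan $8$ while the optimum is $7$.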

