In the load balancing problem we have $m$ machines and $n$ jobs, where job $j$ takes processing time $t_j$. The total processing time on machine $i$ is $T_i = \sum_{j \in A(i)} t_j$, where $A(i)$ is the set of jobs assigned to machine $i$. The goal is to find an assignment of jobs to machines such that the makespan $\max_i T_i$ is minimized.
It is known that the sorted greedy algorithm (sort the jobs in decreasing order of processing time, then assign each job to the currently least-loaded machine) is only an approximation algorithm for the load balancing problem. But I can't find an instance of the problem where it actually fails to find an optimal solution. Can someone provide an example where this algorithm gives a suboptimal answer?
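For concreteness, here is a minimal Python sketch of the algorithm I mean (the function name `sorted_greedy` and the heap-based bookkeeping are my own choices, not from any particular source):

```python
import heapq

def sorted_greedy(times, m):
    """Sorted greedy (LPT): assign each job, in decreasing order of
    processing time, to the currently least-loaded machine.
    Returns the makespan and the per-machine job lists."""
    # Min-heap of (current load, machine index), so the least-loaded
    # machine is always at the top.
    loads = [(0, i) for i in range(m)]
    heapq.heapify(loads)
    assignment = [[] for _ in range(m)]
    for t in sorted(times, reverse=True):
        load, i = heapq.heappop(loads)
        assignment[i].append(t)
        heapq.heappush(loads, (load + t, i))
    makespan = max(load for load, _ in loads)
    return makespan, assignment
```

The min-heap just makes finding the least-loaded machine $O(\log m)$ per job; a linear scan over the $m$ loads would implement the same greedy rule.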