Haskell is one of the few non-strict languages out there.
In his paper *Why Functional Programming Matters*, John Hughes uses (memoized) lazy evaluation, together with higher-order functions, to implement some elegant algorithms in which he separates:
- how the data is generated,
- how that data might be manipulated,
- how the resulting data is finally consumed.
He shows two examples:
- Using an infinite stream to generate ever-increasing approximations of interesting numerical values (like a square root, a derivative, or an integral).
- Using an infinite multi-way tree to explore possible future positions of a game board.
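For the first example, a rough Haskell sketch in the spirit of the paper looks like this (the names loosely follow Hughes's; the starting guess of `1.0` and the epsilon are my own choices). The producer builds an infinite list of Newton-Raphson approximations, and the consumer decides, entirely separately, how much of it to force:

```haskell
-- Producer: an infinite stream of ever-better approximations of sqrt n.
approximations :: Double -> [Double]
approximations n = iterate next 1.0
  where next x = (x + n / x) / 2

-- Consumer: walk the stream until two consecutive values agree to eps.
-- Laziness means only as many approximations as needed are ever computed.
within :: Double -> [Double] -> Double
within eps (a : b : rest)
  | abs (a - b) <= eps = b
  | otherwise          = within eps (b : rest)
within _ _ = error "within: needs an infinite (or long enough) stream"

squareRoot :: Double -> Double
squareRoot n = within 1e-10 (approximations n)
```

The point is that `approximations` knows nothing about termination and `within` knows nothing about Newton's method; the infinite list is the interface between them.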
In my own work across a variety of languages, I have sometimes seen laziness used as well, but almost always in one of these two contexts: either a linear (finite or infinite) stream, or a multi-way tree.
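The multi-way-tree case can be sketched along the lines of Hughes's game-tree example (names approximate the paper's; the toy `moves` function in the test is invented for illustration). The full tree is defined as if it were built eagerly, and `prune` plus the minimax functions ensure only a bounded slice is ever materialized:

```haskell
-- A multi-way (rose) tree of game positions.
data Tree a = Node a [Tree a]

-- Lazily unfold the (possibly infinite) tree of all future positions.
reptree :: (a -> [a]) -> a -> Tree a
reptree moves p = Node p (map (reptree moves) (moves p))

-- Cut the tree off at depth d; thanks to laziness, nothing below
-- that depth is ever generated.
prune :: Int -> Tree a -> Tree a
prune 0 (Node a _)  = Node a []
prune d (Node a ts) = Node a (map (prune (d - 1)) ts)

-- Minimax over the pruned tree, alternating between players.
maximise :: Ord a => Tree a -> a
maximise (Node a []) = a
maximise (Node _ ts) = maximum (map minimise ts)

minimise :: Ord a => Tree a -> a
minimise (Node a []) = a
minimise (Node _ ts) = minimum (map maximise ts)
```

Again, generation (`reptree`), manipulation (`prune`), and consumption (`maximise`/`minimise`) are written as independent pieces glued together by laziness.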
Are there any other data structures whose consumption is advantageous if done in a lazy fashion?