I have a simple Mathematica program which writes some plots to image files for later conversion into a movie. Unfortunately, the program leaks so much memory that it quickly exhausts all 12G of RAM on my machine. The only way out is to quit the kernel(s).
I can't see why this program shouldn't use a bounded amount of memory. I've read through *Debugging memory leaks*, which unfortunately hasn't helped: the only "heavy" symbol is `data`, whose size is fixed. I don't see what's growing!
Note that the same problem occurs regardless of whether the loop is `Map` or `Do`, `Parallel`- or not. The problem also occurs in both Mathematica 8 and 9, both under Linux.
Help?
```mathematica
(* number of frames *)
n = 1000;

(* just some bogus data *)
makeData[_] := Table[{{x, y} = RandomReal[{-3, 3}, {2}], {y, -x}}, {200}];
data = Array[makeData, n];

(* Export the frames *)
ParallelMap[
  Export[
    "movie-" <> IntegerString[#, 10, 4] <> ".png",
    ListVectorPlot[data[[#]]]] &,
  Range[1, n]];
```

Edit:
Some more information: after running this code (the non-parallel version), I checked the memory usage of the processes involved. The percentages are out of 12 GB, so both the frontend and the kernel are consuming quite a bit of memory, while PNG.exe uses almost none.
```
  PID USER     PR NI  VIRT  RES  SHR S %CPU %MEM    TIME+ COMMAND
24575 gredner  20  0  4400  740  588 S    0  0.0  0:00.00 Mathematica
24646 gredner  20  0 3029m 2.3g  22m S    0 20.5  1:43.02 Mathematica
24655 gredner  20  0 3384m 1.6g  14m S    0 13.7  3:31.47 MathKernel
24984 gredner  20  0  387m 7364 2464 S    0  0.1  0:20.92 PNG.exe
```
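To see *when* the kernel's memory grows, one could instrument the non-parallel loop with the built-in `MemoryInUse[]` (bytes currently held by the kernel). This is only a diagnostic sketch, not a fix, and it assumes `data` and `n` are defined as above:

```mathematica
(* Non-parallel variant that logs kernel memory every 50 frames. *)
Do[
  Export["movie-" <> IntegerString[i, 10, 4] <> ".png",
    ListVectorPlot[data[[i]]]];
  If[Mod[i, 50] == 0,
    Print[i, ": ", N[MemoryInUse[]/2^20], " MB in use"]],
  {i, n}]
```

If the reported number climbs steadily per frame, the leak is in the kernel itself rather than in the frontend or the export subprocess.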
Have you tried `ParallelDo` rather than `ParallelMap`? I know it shouldn't make a difference, but some of these things are more involved than they seem. Also try just plain `Do`. Maybe your system is launching, like, infinitely many kernels, each of which is running 100 computations, so you may need to specify something like `Method -> "EvaluationsPerKernel" -> 1`.
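Putting the comment's suggestions together, a minimal sketch of the plain-`Do` variant might look like the following. `$HistoryLength = 0` (which stops the kernel from retaining `In`/`Out` history) and `ClearSystemCache[]` are real built-ins commonly tried for leaks like this, though whether they cure this particular case is untested:

```mathematica
(* Plain Do, with kernel history disabled so Out[] doesn't retain plots. *)
$HistoryLength = 0;
Do[
  Export["movie-" <> IntegerString[i, 10, 4] <> ".png",
    ListVectorPlot[data[[i]]]];
  ClearSystemCache[],  (* drop internal caches after each frame *)
  {i, n}]

(* Parallel variant with the Method setting from the comment. *)
ParallelDo[
  Export["movie-" <> IntegerString[i, 10, 4] <> ".png",
    ListVectorPlot[data[[i]]]],
  {i, n},
  Method -> "EvaluationsPerKernel" -> 1]
```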