I have (again/still) a very large LaTeX document that I'm working on:
- >340 pages
- many graphics (mostly included PDFs, but also some PNGs and other formats)
- many tables (including longtables)
- the resulting PDF is >18 MB in size
- many references and citations
Compilation normally takes between 2 and 7 minutes. I'm using latexmk for a complete build, so the times given are usually for several latex runs in total.
Nevertheless, I'd like to speed up recompiling and find the bottlenecks. Watching the console output, compilation seems to hang at certain points where it takes longer.
Is there a way to analyze how much time each step takes, and thus find the 20% of things that might cause 80% of the compile time?
E.g., can I print the time it takes to process each page and so see which pages take the longest to build?
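Something along these lines is what I have in mind. This is a rough, untested sketch assuming pdfLaTeX (its \pdfresettimer/\pdfelapsedtime timer primitives) and the atbegshi shipout hook:

```latex
% Rough sketch (pdfLaTeX assumed): report the time spent on each page.
% \pdfelapsedtime is measured in scaled seconds (65536 = 1 s);
% \pdfresettimer restarts the clock after every shipout.
\usepackage{atbegshi}
\AtBeginShipout{%
  \typeout{Page \thepage: \the\numexpr\pdfelapsedtime*100/65536\relax\space cs}%
  \pdfresettimer
}
```

The per-page times (here in centiseconds) would then appear in the console output and in the .log file.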
Pareto rules:
Thanks to @Robert's answer, I now have the compile time of each page, and:
- indeed, about 20 of the 340 pages account for more than 50% of the compile time!
- 2 pages even need about 20 s to compile!
- none of those "slow pages" seems to include PNG images
Very strange; I'll have to dig deeper - maybe cross-references or fixme remarks are the problem?
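As a quick check of the fixme suspicion (assuming the remarks come from the fixme package), switching it to final mode for one timing run should show whether the notes matter; a rough sketch:

```latex
% Rough sketch, assuming the fixme package produces the remarks:
% in final mode the notes are not typeset, so comparing compile times
% with and without them shows whether they are part of the problem.
\usepackage{fixme}
\fxsetup{status=final}
```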
Comments:
- Does PNG copying show up in the log file?
- Try the draft option, so that no images are included, only the bounding boxes.
- Why latexmk every time? Why compile multiple times to get everything resolved when you are still working on a draft? I mostly compile a single time when drafting (and I don't necessarily compile at all if it is not especially visual). It sounds as if you are compiling to perfection every time, and that seems quite unnecessary.
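For reference, a minimal sketch of the draft suggestion (the class name and its options here are just placeholders):

```latex
% Minimal sketch: in draft mode, graphicx replaces included images with
% framed boxes showing the file name, which usually makes test runs faster.
\documentclass[draft]{article}  % placeholder class; passes draft globally
% or restrict it to graphics only:
% \usepackage[draft]{graphicx}
```

Dropping the option again (or switching to final) brings the real images back for the final build.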