
We've got a Spring webapp (on Tomcat 7) that has grown over time and is now very slow to shut down, which has a negative impact on our continuous delivery pipeline.

My suspicion is that some bean is blocking (or taking very long) in its @PreDestroy method.
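
For illustration, a minimal sketch of the kind of bean I have in mind (the class and its executor are invented, not from our codebase): a @PreDestroy method that waits on an executor without an effective timeout stalls the context close for as long as the queued work keeps running.

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.TimeUnit;

    import javax.annotation.PreDestroy;

    // Hypothetical bean: awaitTermination() below has an effectively unbounded
    // timeout, so closing the Spring context blocks until every queued task has
    // finished (or forever, if one task never returns).
    public class ReportingService {

        private final ExecutorService workers;

        public ReportingService(ExecutorService workers) {
            this.workers = workers;
        }

        @PreDestroy
        public void shutdown() throws InterruptedException {
            workers.shutdown();
            workers.awaitTermination(Long.MAX_VALUE, TimeUnit.DAYS); // blocks shutdown
        }
    }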

So far I've ruled out thread pools that aren't shut down correctly: I gave distinct names to every pool, thread, and timer and made sure each one is either a daemon thread or shut down explicitly.
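
For reference, a rough sketch of the naming/daemon part (class and prefix names are made up):

    import java.util.concurrent.ThreadFactory;
    import java.util.concurrent.atomic.AtomicInteger;

    // Gives every thread of a pool a distinct, searchable name and marks it as a
    // daemon thread, so it is easy to spot in thread dumps and cannot keep the
    // JVM alive on its own.
    public class NamedDaemonThreadFactory implements ThreadFactory {

        private final String prefix;
        private final AtomicInteger counter = new AtomicInteger();

        public NamedDaemonThreadFactory(String prefix) {
            this.prefix = prefix;
        }

        @Override
        public Thread newThread(Runnable runnable) {
            Thread thread = new Thread(runnable, prefix + "-" + counter.incrementAndGet());
            thread.setDaemon(true);
            return thread;
        }
    }

Each pool is then created with something like Executors.newFixedThreadPool(4, new NamedDaemonThreadFactory("import-worker")).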

Has anybody ever solved a situation like this and can give me a hint on how to track it down?

BTW: killing the Tomcat process is not an option; we really need a clean shutdown for our production system.

  • You can try to run a profiler on your application and see where the time is spent when your application is shutting down. Commented Feb 20, 2014 at 13:48
  • Did you inspect the application logs? Maybe increase the log level before checking. You should be able to see where it is consuming time. Commented Feb 20, 2014 at 14:24
  • The profiler is a good idea. As for the application logs: I set all log levels to debug, but there's nothing to see there :( Commented Feb 20, 2014 at 14:31

2 Answers


Profiling would be the nuclear option. You can probably get a picture of what's happening (especially if it is just blocked threads, since that state will be long-lived) using thread dumps alone. If you take two dumps a few seconds apart and they show the same or similar output for one or more threads, then that is probably the bottleneck. You can get a thread dump using jstack or "kill -3" (on a sensible operating system).
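
If running jstack or kill -3 is awkward in your environment, a rough in-process alternative (a sketch, assuming you can call it from some shutdown callback such as a ServletContextListener's contextDestroyed()) is to log all live threads yourself:

    import java.util.Map;

    // Prints every live thread's name, daemon flag, state and stack trace to
    // stderr, giving roughly the same picture as an external thread dump.
    public final class ThreadDumper {

        private ThreadDumper() {
        }

        public static void dumpAllThreads() {
            for (Map.Entry<Thread, StackTraceElement[]> entry : Thread.getAllStackTraces().entrySet()) {
                Thread thread = entry.getKey();
                System.err.printf("\"%s\" daemon=%s state=%s%n",
                        thread.getName(), thread.isDaemon(), thread.getState());
                for (StackTraceElement frame : entry.getValue()) {
                    System.err.println("    at " + frame);
                }
            }
        }
    }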

And if you're on Windows: select the Java console window and hit Ctrl + Pause, and the thread dump is written to that window; just hit Enter to resume execution.
