
I saw some log entries indicating transaction-time outliers of up to 10s, where transaction times are typically below 1s. To get a view of how often this happens: is there a way to get transaction-time statistics in Postgres for transactions involving particular tables?

Even just a count of transactions over some threshold, which I could then compare to the total number, would do.

Thank you.

1 Answer


One of the problems with your wish is that transactions are not tied to tables.

Here are some ideas. None of them is exactly what you envision, but perhaps one is good enough:

  1. Set idle_in_transaction_session_timeout to 10s or 30s. That will disrupt processing somewhat, but you don't have to run like that for a long time. Sessions that get killed that way will leave a log entry in the PostgreSQL log and hopefully also in the application log, so you have something to go by.

  2. If a backend gets stuck behind a lock for more than a second (the default deadlock_timeout), enabling log_lock_waits will give you a helpful log message. This will only catch some long-running transactions, but it is generally useful.

  3. Use a monitoring system that takes regular snapshots of pg_stat_activity. That view shows the age of the transaction and the last statement executed in the session.
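A minimal sketch of ideas 1–3 in SQL, assuming superuser access and a live server; the timeout and threshold values are illustrative, not recommendations:

```sql
-- Ideas 1 and 2: enable the settings system-wide (values are illustrative).
ALTER SYSTEM SET idle_in_transaction_session_timeout = '30s';
ALTER SYSTEM SET log_lock_waits = on;  -- logs waits longer than deadlock_timeout
SELECT pg_reload_conf();

-- Idea 3: one snapshot of sessions whose transaction has been open
-- longer than 1 second; run this periodically from a monitoring job.
SELECT pid,
       now() - xact_start AS xact_age,
       state,
       query
FROM pg_stat_activity
WHERE xact_start IS NOT NULL
  AND now() - xact_start > interval '1 second'
ORDER BY xact_age DESC;
```

Counting the rows returned by the snapshot query and comparing against pg_stat_database.xact_commit + xact_rollback would give the ratio the question asks about, at the granularity of the snapshot interval.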

  • True, they may involve multiple tables; I just wasn't sure what might be available. The disruption wouldn't work in this case, but I can try the other two points, or maybe a logging/AOP approach in the backend. Thanks. Commented Jul 26 at 15:41
