Type: Suggestion
Resolution: Unresolved
The scheduled job "Purge Old Job Run Details" runs once per day, at 23:00 local time by default. Each run deletes at most 2,000 rows, a limit controlled by the system property 'jobs.limit.per.purge'.
In Confluence 9.2, with no third-party plugins installed, a minimum of 97 rows per minute are added to the scheduler_run_details table — approximately 140,000 rows per day that Confluence needs to remove (third-party plugins add further job entries on top of this). At that rate, a single 2,000-row purge removes less than 21 minutes' worth of entries, so the daily cleanup job cannot keep pace with the table's growth.
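The arithmetic above can be sketched as follows. The constants are the figures quoted in this report (default settings, no third-party plugins), not values read from a live instance:

```python
# Back-of-the-envelope check of the purge deficit in scheduler_run_details.
INSERT_RATE_PER_MIN = 97   # rows added per minute (observed minimum, Confluence 9.2)
PURGE_LIMIT = 2000         # default 'jobs.limit.per.purge'
PURGES_PER_DAY = 1         # job runs once daily at 23:00 by default

rows_added_per_day = INSERT_RATE_PER_MIN * 60 * 24       # 139,680
rows_purged_per_day = PURGE_LIMIT * PURGES_PER_DAY       # 2,000
minutes_covered = PURGE_LIMIT / INSERT_RATE_PER_MIN      # ~20.6 minutes

print(f"rows added per day:   {rows_added_per_day}")
print(f"rows purged per day:  {rows_purged_per_day}")
print(f"net growth per day:   {rows_added_per_day - rows_purged_per_day}")
print(f"one purge covers ~{minutes_covered:.1f} minutes of inserts")
```

The net growth of roughly 137,000 rows per day is why the table accumulates millions of rows within weeks.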
Currently, the only workaround is to stop Confluence and truncate this table to curb its volume. Running the purge job more frequently (i.e. hourly) with a higher 'jobs.limit.per.purge' value, as described in this article, is not a viable solution on databases with slow 'DELETE FROM' performance against large tables such as this one.