Type: Bug
Resolution: Fixed
Priority: Highest
Fix Version/s: 8.20.11, 8.20.12
Affects Version/s: 8.2
Symptom Severity: Severity 1 - Critical
Issue Summary
When a filter subscription has thousands of recipients, the com.atlassian.jira.mail.SubscriptionSingleRecepientMailQueueItem object retains too much heap memory, causing the JVM to fall into Full GC cycles or eventually hit an OutOfMemoryError.
The symptoms are exactly the same as in JRASERVER-31588 "Large filter subscriptions can crash a JIRA instance with an OutOfMemoryError", but the objects seen accumulating in the heap are different.
That issue was fixed in 7.1.6, while this one affects at least 8.20 (and possibly other Jira 8.x versions).
Steps to Reproduce
- Create an instance with 50 projects, 20 issues in each, and 100,000 users that all belong to a single group (a scripted sketch for this step follows the list)
- Create a filter matching all issues (~1,000) and create a subscription that e-mails it to all 100,000 users
- Wait for the subscription to run and take a Heap dump
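For the user-creation step, the sketch below uses the Jira Server REST API (the /rest/api/2/group, /rest/api/2/user and /rest/api/2/group/user endpoints) to create the users and add them to one group. The base URL, admin credentials, group name and username prefix are placeholders rather than values from this report, and creating 100,000 users one REST call at a time is slow:

#!/bin/bash
# Sketch: bulk-create test users and add them all to one group via the Jira Server REST API.
# BASE_URL, AUTH, GROUP and the "bulkuser" prefix are placeholders.
BASE_URL="http://localhost:8080"
AUTH="admin:admin"
GROUP="mass-subscribers"

# Create the group once (the call fails harmlessly if the group already exists).
curl -s -u "$AUTH" -H "Content-Type: application/json" \
     -X POST "$BASE_URL/rest/api/2/group" -d "{\"name\": \"$GROUP\"}"

# Create the users and add each one to the group.
for i in $(seq 1 100000); do
  USER="bulkuser$i"
  curl -s -u "$AUTH" -H "Content-Type: application/json" \
       -X POST "$BASE_URL/rest/api/2/user" \
       -d "{\"name\": \"$USER\", \"password\": \"changeit\", \"emailAddress\": \"$USER@example.com\", \"displayName\": \"Bulk User $i\"}" > /dev/null
  curl -s -u "$AUTH" -H "Content-Type: application/json" \
       -X POST "$BASE_URL/rest/api/2/group/user?groupname=$GROUP" \
       -d "{\"name\": \"$USER\"}" > /dev/null
done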
Expected Results
- Jira would not deplete the Heap memory
Actual Results
- The java.lang.Thread named "Sending mailitem com.atlassian.jira.mail.SubscriptionSingleRecepientMailQueueItem ..." grows to retain a large portion of the heap; Full GC cycles or an OutOfMemoryError may follow, depending on the heap size.
Real data
A customer had a Subscription to 130k recipients:
2022-09-09 00:00:01,945-0700 Sending mailitem com.atlassian.jira.mail.SubscriptionMailQueueItem id: '11111' owner: 'xxxxx(JIRAUSER111111)' INFO anonymous Mail Queue Service [c.a.jira.mail.SubscriptionMailQueueItem] Sending subscription '11111' of filter '222222' to 130948 recipients.
The SubscriptionSingleRecepientMailQueueItem retained heap grew to 32 GB.
Workaround
There is no direct workaround: subscriptions with too many recipients should be split into several subscriptions targeting smaller groups (and preferably scheduled at different times).
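Before subscribing a filter to a group, the group's size can be checked so that oversized groups are split first. A minimal sketch against the Jira Server REST API, assuming curl and jq are available (the base URL, credentials and group name are placeholders):

$ curl -s -u admin:admin "http://localhost:8080/rest/api/2/group/member?groupname=jira-users-all" | jq '.total'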
The grep below can be used to spot subscriptions with 1,000 recipients or more (the [0-9]{4} matches recipient counts of at least four digits):
$ grep -E "Sending subscription '[0-9]+' of filter '[0-9]+' to [0-9]{4}" <jira-home>/log/atlassian-jira.log*
On Windows, you can search the logs for the same keywords: "Sending mailitem com.atlassian.jira.mail.SubscriptionMailQueueItem" or "Sending subscription".
This alternative Linux pipeline prints a table of the key data, also for 1,000+ recipients:
$ grep -E "Sending subscription '[0-9]+' of filter '[0-9]+' to [0-9]{4}" <jira-home>/log/atlassian-jira.log* | awk 'BEGIN {printf "%s %s %s %s %s\n", "Time", "Owner", "Subscription_Id", "Filter_Id", "Recipients"}; {print $1"_"$2, $9, $18, $21, $23}' | sed 's/'\''//g' | column -tx;
Sample output (with usernames and user keys redacted):
Time                          Owner              Subscription_Id  Filter_Id  Recipients
2022-12-11_17:05:00,121+0000  username(userkey)  14305            31001      1962
2022-12-11_17:15:18,022+0000  username(userkey)  14304            31000      1962
2022-12-12_09:27:18,762+0000  username(userkey)  13300            19063      2026
2022-12-12_09:27:18,793+0000  username(userkey)  13901            18712      1962
2022-12-12_09:27:18,805+0000  username(userkey)  13902            18710      1962
2022-12-12_09:27:18,819+0000  username(userkey)  13903            18711      1962
2022-12-12_09:27:18,832+0000  username(userkey)  13904            18716      1962
2022-12-12_09:27:18,845+0000  username(userkey)  13905            18717      1962
2022-12-12_09:27:18,858+0000  username(userkey)  13906            18718      1962
2022-12-12_09:27:18,870+0000  username(userkey)  13907            18713      1962
2022-12-12_09:27:18,882+0000  username(userkey)  13908            18714      1962
2022-12-12_09:27:18,895+0000  username(userkey)  13909            18715      1962
2022-12-12_09:27:18,908+0000  username(userkey)  13910            18719      1962
2022-12-12_09:27:18,921+0000  username(userkey)  13911            18720      1962
2022-12-12_09:27:18,934+0000  username(userkey)  13912            18721      1962
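One possible variation of the same pipeline drops the header, uses grep -h to keep filenames out of the first column, and sorts the rows by recipient count (the fifth column), largest first:
$ grep -hE "Sending subscription '[0-9]+' of filter '[0-9]+' to [0-9]{4}" <jira-home>/log/atlassian-jira.log* | awk '{print $1"_"$2, $9, $18, $21, $23}' | sed "s/'//g" | sort -k5,5 -nr | column -t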
Mitigation
This bug can be mitigated by limiting who can create a group subscription: the Jira global permission "Manage Group Filter Subscriptions" can be used to limit group filter subscription assignment to administrators or specific users only.
Issue Links
- is related to: JRASERVER-31588 Large filter subscriptions can crash a JIRA instance with an OutOfMemoryError (Closed)
- relates to: JRASERVER-61543 Outgoing Mail Stopped Working due to Large Group Filter Subscription (Gathering Impact)
- followed by: MNSTR-6636