unexpected database load with JpaEventStorageEngine::fetchEvents ramping up #2641
Comments
What is the scale on the graph, and how many events are there? Since for a relational database all rows/events are considered the same, I would expect a fetch-events query to get a bit slower over time. The tokens are limited in size, so indeed I would expect token-related operations to stay constant over time.
That would only happen once the Event Store reached a size the RDBMS indices could no longer handle effectively. However, whether that is the case here isn't clear at the moment. So, what's the amount of events currently in the RDBMS, @junkdog? While waiting for your response, I have added the
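To answer the event-count question, a quick diagnostic query can be run directly against the store. This is a sketch assuming Axon's default JPA table names (`domain_event_entry` and `token_entry`); adjust if a custom naming strategy is configured:

```sql
-- Assumes Axon's default JPA table names; adjust if customized
SELECT COUNT(*) AS event_count FROM domain_event_entry;

-- Inspect the tracking tokens per processor segment
SELECT processor_name, segment, owner, timestamp FROM token_entry;
```

The second query also shows which node currently owns each token segment, which can help correlate token claims with the query load per instance.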
Closing this issue due to inactivity.
Basic information
Description
Hi,
We have identified an issue with one of our applications that is built using Spring Boot 3, JPA, and Axon Framework 4.7.1. In a particular environment, we have noticed that despite low traffic, the database load is uneven.
(requests to RDS grouped by query)
Upon investigation, we found that JpaEventStorageEngine::fetchEvents appeared to be ramping up over time, as indicated by the increasing blue bars. It is worth noting that during this period there was no scaling activity apart from a single deployment on 23 Feb at 15:00, which only caused a small bump. Interestingly, the UPDATE token_entry statement shown in the yellow bars was running at a constant rate.
Is this something you've seen in the past? Any ideas on how we could investigate this further?
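One way to make the individual fetchEvents selects visible for further investigation is to enable Hibernate's statistics and SQL logging. A minimal sketch, assuming Spring Boot's standard Hibernate integration (property names should be verified against the versions in use):

```yaml
# Sketch: standard Spring Boot / Hibernate properties for surfacing query activity
spring:
  jpa:
    properties:
      hibernate:
        generate_statistics: true   # collect per-session query statistics
logging:
  level:
    org.hibernate.stat: DEBUG       # emit the collected statistics
    org.hibernate.SQL: DEBUG        # log each executed SQL statement
```

With this in place, the log timestamps of the fetchEvents selects can be compared against the RDS graph to confirm which application instance is issuing the growing query volume.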
Expected behavior
During no load, JpaEventStorageEngine::fetchEvents should be invoked at a more-or-less constant rate.
Actual behaviour
In one of our environments, it ramped up over time.