My account keeps hitting rate limits, roughly every 5 minutes.
I’ve run a simple script that checks our usage every 5 seconds with the following query:
complexity {
  after
  reset_in_x_seconds
}
WARNING: 05/01/2022 15:09:47 | Reset: 60 secs. | Avail: 100.00% | Remaining: 10000000 | Used: 0
WARNING: 05/01/2022 15:09:52 | Reset: 55 secs. | Avail: 99.50% | Remaining: 9950000 | Used: 50000
WARNING: 05/01/2022 15:09:57 | Reset: 50 secs. | Avail: 99.00% | Remaining: 9900000 | Used: 100000
WARNING: 05/01/2022 15:10:02 | Reset: 45 secs. | Avail: 42.49% | Remaining: 4248952 | Used: 5751048
WARNING: 05/01/2022 15:10:08 | Reset: 39 secs. | Avail: 0.23% | Remaining: 23166 | Used: 9976834
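For reference, the script itself is nothing fancy. It is roughly the following (a rough Python sketch; the endpoint constant, token variable, and exact log formatting here are placeholders rather than my literal code):

```python
import os
import time
import logging
import requests

# Placeholder names: the token variable and budget constant are illustrative.
API_URL = "https://api.monday.com/v2"
API_TOKEN = os.environ["MONDAY_API_TOKEN"]
BUDGET = 10_000_000  # complexity budget per reset window (per minute on our plan)

QUERY = """
query {
  complexity {
    after
    reset_in_x_seconds
  }
}
"""

logging.basicConfig(format="%(levelname)s: %(message)s", level=logging.WARNING)

while True:
    resp = requests.post(
        API_URL,
        json={"query": QUERY},
        headers={"Authorization": API_TOKEN},
    )
    complexity = resp.json()["data"]["complexity"]
    remaining = complexity["after"]
    used = BUDGET - remaining
    logging.warning(
        "%s | Reset: %s secs. | Avail: %.2f%% | Remaining: %s | Used: %s",
        time.strftime("%d/%m/%Y %H:%M:%S"),
        complexity["reset_in_x_seconds"],
        remaining / BUDGET * 100,
        remaining,
        used,
    )
    time.sleep(5)
```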
I’m not aware of anything on our account that runs every 5 minutes, nor of any query or mutation that would consume 9,000,000 complexity in the span of 10 seconds.
Additionally, I have functions / API calls that no longer return information because they hit the limits. It seems as if something changed and querying for subitems now has a 10x or greater multiplier. I used to be able to query items and subitems, paginated, without issue; now the first page alone returns a query complexity of 6 million.
😕
I have a query that calculates to a complexity of 1,120. Adding subitems { id } causes the complexity to jump to 250,295.
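For anyone who wants to reproduce the comparison, here is a minimal sketch of one way to check it, running the same paginated items query with and without the subitems selection and printing the complexity the API reports for each (the board ID, page size, and token handling are placeholders, not my real setup):

```python
import os
import requests

API_URL = "https://api.monday.com/v2"
API_TOKEN = os.environ["MONDAY_API_TOKEN"]  # placeholder token handling

# Placeholder board ID and pagination arguments; not my real query.
ITEMS_QUERY = """
query {
  complexity { query }
  boards (ids: 1234567890) {
    items (limit: 25, page: 1) {
      id
      name
      %s
    }
  }
}
"""

for extra in ("", "subitems { id }"):
    resp = requests.post(
        API_URL,
        json={"query": ITEMS_QUERY % extra},
        headers={"Authorization": API_TOKEN},
    )
    cost = resp.json()["data"]["complexity"]["query"]
    label = extra or "items only"
    print(f"{label}: reported query complexity = {cost}")
```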