I’m kind of stuck here and have been googling this for several hours.
Use case:
- Running an app on Node/Express servers
- It makes a lot of calls to the Google Calendar API v3 to keep users’ calendars up to date
- There is a 4 requests/second API quota for each Google Calendar user
- Requests per second spike quite often, and at times there can be up to 200 actions that I need to perform through the API: delete some events, update other events, push new events, etc.
- A user can initiate an action that requires ~100 API calls and then immediately initiate another action that requires ~100 more (rough sketch of what I mean after this list)
- Since the app runs on several Node/Express instances, I can’t keep the queue in memory
- I might need to tell the queue to remove some API calls that are no longer valid/useful to perform, because of some other action the user performed in the system
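To make the burst concrete, here’s a rough sketch of the kind of job list one user action can expand into (`CalendarJob` and `planSync` are just illustrative names, not my real code):

```typescript
// Rough shape of what one user action expands into: a list of per-event
// Calendar API calls that all have to respect the 4 req/s/user quota.
interface CalendarJob {
  userId: string;
  action: 'insert' | 'update' | 'delete';
  eventId?: string;
  payload?: object;
}

function planSync(
  userId: string,
  diff: { toInsert: object[]; toUpdate: object[]; toDelete: string[] },
): CalendarJob[] {
  // For a busy calendar this list can easily be 100–200 jobs, and another
  // user action a moment later can produce another ~100.
  return [
    ...diff.toDelete.map((eventId) => ({ userId, action: 'delete' as const, eventId })),
    ...diff.toUpdate.map((payload) => ({ userId, action: 'update' as const, payload })),
    ...diff.toInsert.map((payload) => ({ userId, action: 'insert' as const, payload })),
  ];
}
```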
It’s the first time I’m dealing with this, and from what I gather there needs to be a separate worker server/process that takes care of all of these API calls/jobs, but my googling hasn’t turned up a clear answer or a recommended tech stack (with examples) to achieve this.
I’ve tried exploring Bull and other Redis-based libraries, but none of them seem to offer an option to track the job queue for each user in the system. Roughly what I tried with Bull is below.
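This is a simplified version of my Bull experiment (the queue name, job payload, and `performCalendarCall` are placeholders). As far as I can tell, the `limiter` here applies to the whole queue, i.e. 4 jobs/second across all users combined, not per user:

```typescript
import Queue from 'bull';

// One shared queue in Redis so every Node/Express instance can enqueue into it.
// NOTE: this limiter is global for the whole queue, not per user.
const calendarQueue = new Queue('calendar-sync', 'redis://127.0.0.1:6379', {
  limiter: { max: 4, duration: 1000 },
});

// A separate worker process would run this.
calendarQueue.process(async (job) => {
  // job.data carries the user, the action and its payload;
  // performCalendarCall is a placeholder for the real Google Calendar API call.
  // await performCalendarCall(job.data);
});

// Called from any Express instance; a deterministic jobId lets me find and
// remove the job later if it becomes useless.
export async function enqueueCalendarCall(userId: string, action: string, eventId: string): Promise<void> {
  await calendarQueue.add(
    { userId, action, eventId },
    { jobId: `${userId}:${action}:${eventId}` },
  );
}

// Dropping a queued call that no longer needs to happen.
export async function cancelCalendarCall(userId: string, action: string, eventId: string): Promise<void> {
  const job = await calendarQueue.getJob(`${userId}:${action}:${eventId}`);
  if (job) await job.remove();
}
```

So the removal part seems doable, but I still don’t see how to get a per-user queue/rate limit with this setup, or whether this is even the right kind of architecture.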