I have an SPA and for technical reasons I have different elements potentially firing the same fetch()
call pretty much at the same time.[1]
Rather than going insane trying to make multiple unrelated elements coordinate their loading, I am thinking about creating a globalFetch() call where:
- the `init` argument is serialised (along with the `resource` parameter) and used as a hash; when a request is made, it's queued and its hash is stored
- when another request comes in and the hash matches (which means it's in flight), a second request will NOT be made, and it will piggy-back on the previous one
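(One wrinkle I'm aware of: `JSON.stringify` output depends on property insertion order, so two logically identical `init` objects could produce different hashes and dodge the dedup. A stable serialiser (`stableSig` is just a name I made up, and this shallow version ignores nested objects) would look something like this:)

```javascript
// Sort the init keys before stringifying so that { a: 1, b: 2 } and
// { b: 2, a: 1 } produce the same hash. Shallow only: nested objects
// would need the same treatment recursively.
function stableSig(resource, init = {}) {
  const sorted = {}
  for (const key of Object.keys(init).sort()) {
    sorted[key] = init[key]
  }
  return JSON.stringify({ resource, ...sorted })
}
```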
```javascript
async function globalFetch(resource, init) {
  const sigObject = { ...init, resource }
  const sig = JSON.stringify(sigObject)

  // If it's already happening, return that one
  if (globalFetch.inFlight[sig]) {
    // NOTE: I know I don't yet have sig.timeStamp, this is just to show
    // the logic
    if (Date.now() - sig.timeStamp < 1000 * 5) {
      return globalFetch.inFlight[sig]
    } else {
      delete globalFetch.inFlight[sig]
    }
  }

  const ret = globalFetch.inFlight[sig] = fetch(resource, init)
  return ret
}
globalFetch.inFlight = {}
```
It’s obviously missing a way to record each request’s timestamp. Plus, it’s missing a way to delete stale requests in batch. Other than that… is this a good way to go about it?
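To make the idea concrete, here is a fuller sketch of what I have in mind: the timestamp is stored next to each in-flight promise, and a batch prune sweeps out stale entries (`MAX_AGE` and `globalFetch.prune` are names I've made up for this sketch):

```javascript
const MAX_AGE = 1000 * 5 // how long an in-flight entry may be reused

async function globalFetch(resource, init) {
  const sig = JSON.stringify({ ...init, resource })
  const entry = globalFetch.inFlight[sig]

  // Piggy-back on a request that is already in flight and still fresh
  if (entry && Date.now() - entry.timeStamp < MAX_AGE) {
    return entry.promise
  }

  // Otherwise start a new request and record when it began
  const promise = fetch(resource, init)
  globalFetch.inFlight[sig] = { promise, timeStamp: Date.now() }
  return promise
}
globalFetch.inFlight = {}

// Batch-delete every entry older than MAX_AGE
globalFetch.prune = function () {
  const now = Date.now()
  for (const [sig, entry] of Object.entries(globalFetch.inFlight)) {
    if (now - entry.timeStamp >= MAX_AGE) {
      delete globalFetch.inFlight[sig]
    }
  }
}
```

`prune` could be called on a timer, or opportunistically at the top of `globalFetch` itself.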
Or, is there something already out there, and I am reinventing the wheel…?
[1] If you are curious, I have several location-aware elements which will reload data independently based on the URL. It’s all nice and decoupled, except that it’s a little… too decoupled. Nested elements (with partially matching URLs) needing the same data potentially end up making the same request at the same time.