I have a fixed-size dataset (it won't grow).
Here are the figures:
STORAGE SIZE: 190.13MB
LOGICAL DATA SIZE: 253.71MB
TOTAL DOCUMENTS: 53452
INDEXES TOTAL SIZE: 1.44MB
I don't like that I'm loading the whole dataset into RAM on every network request:
const data = await collection.find().toArray();
I could cache the collection contents in a global variable, for example like the sketch below.
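A minimal sketch of that global-cache idea, assuming a long-lived Node.js process and the official `mongodb` driver; the names `cachedDocs` and `getAllDocs` are just illustrative, not from any library:

```js
// Module-level cache: populated once on the first request, then reused.
let cachedDocs = null;

async function getAllDocs(collection) {
  if (cachedDocs === null) {
    // First request (or first request after a restart):
    // load the full collection into memory once.
    cachedDocs = await collection.find().toArray();
  }
  return cachedDocs;
}

// Example usage inside a request handler:
// const data = await getAllDocs(collection);
// ...run the comparator logic over `data`...
```

Since the dataset never changes, the cache would never need invalidation; a process restart simply repopulates it on the first request.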
I would like to know what the optimal solutions are for this case.
FYI: I need the entire collection on each request to run comparator logic.