I’m using server-sent events (SSE) to send real-time updates to clients. I’ve thought of two ways to implement this:
- Subscribe to a single Redis channel and use a Node.js EventEmitter. Whenever a new message arrives on the channel, every registered listener writes it to its response stream. The problem I have with this is that the EventEmitter calls its listeners synchronously, so with a large number of connections each published message blocks the event loop while all the writes run.
- Whenever a new request comes in to subscribe to events, create a new Redis client and subscribe it to the channel. This leads to N Redis clients, where N is the number of users connected to the server, so I’m limited by the Redis server’s maximum connection count. Not a great way, but if this is the way to go, I’ll use a Redis cluster.
Which approach would scale better here?
I’ve added the code I wrote for both approaches.
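Both snippets call a `formatSseData` helper that isn’t shown. It isn’t part of either approach, but a minimal sketch consistent with the SSE wire format (`event:` and `data:` lines terminated by a blank line — my assumption, not the question’s code) could look like:

```typescript
// Hypothetical implementation of the formatSseData helper used below.
// SSE frames are "event: <name>" plus one "data: <line>" per payload line,
// terminated by a blank line.
function formatSseData(event: string, data: string): string {
  const dataLines = data
    .split("\n")
    .map(line => `data: ${line}`)
    .join("\n");
  return `event: ${event}\n${dataLines}\n\n`;
}
```

Splitting the payload per line matters because a raw newline inside a single `data:` field would prematurely terminate the frame on the client side.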
- using event emitter
import { EventEmitter } from "events";
import type { Request, Response } from "express";

const emitter = new EventEmitter();
// Node warns above 10 listeners per event; SSE will have one per client
emitter.setMaxListeners(0);

// Single shared subscription: fan each Redis message out to all listeners
redisClient.subscribe(CHANNEL_NAME, (msg, channel) => {
  emitter.emit("event", msg);
});

function subscribe(req: Request, res: Response) {
  const handleEvent = (msg: string) => {
    res.write(formatSseData("message", msg));
  };
  emitter.addListener("event", handleEvent);
  // Detach this client's listener when it disconnects
  req.on("close", () => {
    emitter.removeListener("event", handleEvent);
  });
}
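To make the concern about the first approach concrete: `emit()` invokes every registered listener synchronously, in registration order, before it returns. So per published message, the handler work for all N connected clients runs back to back on the event loop. This is easy to observe in isolation:

```typescript
import { EventEmitter } from "events";

const emitter = new EventEmitter();
const order: string[] = [];

// Register three listeners, as if three SSE clients were connected
for (let i = 0; i < 3; i++) {
  emitter.on("event", () => order.push(`listener ${i}`));
}

emitter.emit("event");
order.push("after emit");

// Every listener ran before emit() returned:
// ["listener 0", "listener 1", "listener 2", "after emit"]
console.log(order);
```

Whether this actually matters depends on how expensive each listener is; `res.write` itself is non-blocking (it buffers and returns), so the per-message cost is the loop over listeners, not the network I/O.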
- Setting up new redis client for every new user connected
import type { Request, Response } from "express";
import { createClient } from "redis";

async function subscribe(req: Request, res: Response) {
  try {
    // Dedicated subscriber connection for this client
    const redisClient = createClient({ url: "redis://localhost:6379" });
    await redisClient.connect();
    await redisClient.subscribe(CHANNEL_NAME, (msg, channel) => {
      res.write(formatSseData("message", msg));
    });
    req.on("close", () => {
      // Close the connection too, not just the subscription, otherwise
      // disconnected clients keep consuming Redis connections
      redisClient
        .unsubscribe(CHANNEL_NAME)
        .then(() => redisClient.quit())
        .catch(err => {
          console.log("error while cleaning up redis client", err);
        });
    });
  } catch (error) {
    console.error("Failed to connect: %s", (error as Error).message);
  }
}
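One thing both handlers assume is that the response has already been switched into SSE mode before `subscribe(req, res)` is called. If that isn’t done elsewhere, a hypothetical helper (the name and placement are mine, not the question’s code) would be:

```typescript
import { ServerResponse } from "http";

// Hypothetical glue: put the response into SSE mode before handing it
// to either subscribe() implementation above.
function openSseStream(res: ServerResponse): void {
  res.writeHead(200, {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    Connection: "keep-alive",
  });
}
```

In Express this would run at the top of the route handler, before the first `res.write`, so the client keeps the connection open and interprets subsequent writes as an event stream.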