Storing streaming data into MongoDB using Node.js

Where I’m At

I’m currently working on a streaming data pipeline for IoT data, which I read from a Kinesis data stream in a Node.js application.

I have data coming in from 200+ different sensors every 10 seconds. I batch the data in Node.js before writing it to MongoDB.
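The batching step looks roughly like this (a minimal sketch; `Batcher`, `maxSize`, `maxWaitMs`, and the `{ clientId, payload, timestamp }` record shape are my own illustrative names, assuming each Kinesis record has already been decoded):

```javascript
// Accumulates decoded records and flushes them either when the buffer is
// full or after a maximum wait, so partial batches still get written.
class Batcher {
  constructor({ maxSize = 500, maxWaitMs = 5000, onFlush }) {
    this.maxSize = maxSize;
    this.maxWaitMs = maxWaitMs;
    this.onFlush = onFlush; // async (records) => void, e.g. the MongoDB write
    this.buffer = [];
    this.timer = null;
  }

  add(record) {
    this.buffer.push(record);
    if (this.buffer.length >= this.maxSize) return this.flush();
    // Start a timer on the first buffered record so a slow trickle of data
    // is still flushed within maxWaitMs.
    if (!this.timer) this.timer = setTimeout(() => this.flush(), this.maxWaitMs);
  }

  async flush() {
    if (this.timer) { clearTimeout(this.timer); this.timer = null; }
    if (this.buffer.length === 0) return;
    const batch = this.buffer;
    this.buffer = [];
    await this.onFlush(batch);
  }
}
```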

I have a three-collection structure in MongoDB to store this data, which aims for a flexible schema. It goes as follows:

Datasource collection:

{
  _id: ObjectId("1234567890abcdef12345678"),
  name: "sensor-1"
}

Variables collection:

{
  _id: ObjectId("234567890abcdef123456789"), // a unique id for each variable of a datasource
  dataSourceId: ObjectId("1234567890abcdef12345678"),
  variable: "temperature"
}

Values collection:

{
  _id: ObjectId("4567890abcdef12345678901"),
  variableId: ObjectId("234567890abcdef123456789"),
  value: 25,
  timestamp: <timestamp>
}

I’ve designed it around this principle:

Data Sources have Variables.
Variables have Values.
Values are time-stamped data points containing a sensor’s reading.
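To keep those relationships consistent, I’m planning indexes along these lines (a sketch; the collection names and index choices are my own assumptions, intended for the official `mongodb` Node.js driver):

```javascript
// Index specs for the three collections. Kept as plain data so they can be
// reviewed separately from the (connection-dependent) createIndex calls.
const indexSpecs = {
  // One datasource document per device name (the clientID).
  datasources: { keys: { name: 1 }, options: { unique: true } },
  // A variable name is unique within a datasource, so upserts can't
  // create duplicates when the same payload key arrives twice.
  variables: { keys: { dataSourceId: 1, variable: 1 }, options: { unique: true } },
  // Time-series reads fetch one variable over a time range.
  values: { keys: { variableId: 1, timestamp: -1 }, options: {} },
};

// Applying them needs a live connection (db is a connected Db instance):
// for (const [name, spec] of Object.entries(indexSpecs)) {
//   await db.collection(name).createIndex(spec.keys, spec.options);
// }
```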

Here is what a payload looks like:

{
    "temperature" : 123,
    "humidity" : 32,
    "current" : 2,
    "voltage" : 23.23,
    "power" : 23,
    "windspeed" : 23,
    "directionwind" : "s",
    ....   
}
The device ID will be sent as the clientID from the sensor.

What I Want

How can I write this batched data to the collection structure I showed above?

For example, if datasource id 1 has 20 variables, I have to store each variable in the Variables collection, and each corresponding value in the Values collection with its timestamp. I also need to handle the scenario where the same device with id 1 sends more than 20 variables in the future.
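To frame the question, here is a rough sketch of what I was considering, assuming the official `mongodb` Node.js driver (v6, where `findOneAndUpdate` returns the document directly) and a batch shaped like `[{ clientId, payload, timestamp }, ...]`; the function names are my own:

```javascript
// Pure helpers (testable without a database): build the variable upsert ops
// and the value documents for one payload.
function buildVariableOps(dataSourceId, names) {
  return names.map((variable) => ({
    updateOne: {
      filter: { dataSourceId, variable },
      // $setOnInsert means previously unseen payload keys create new
      // variables, while existing ones are left untouched.
      update: { $setOnInsert: { dataSourceId, variable } },
      upsert: true,
    },
  }));
}

function buildValueDocs(idByName, payload, timestamp) {
  return Object.keys(payload).map((variable) => ({
    variableId: idByName.get(variable),
    value: payload[variable],
    timestamp,
  }));
}

// The write path per batch: upsert the datasource, upsert its variables,
// then insert the values. db is a connected Db instance.
async function writeBatch(db, batch) {
  for (const { clientId, payload, timestamp } of batch) {
    // 1. Upsert the datasource by name; we get its _id whether it existed
    //    or not. (Older driver versions wrap the result in `.value`.)
    const ds = await db.collection('datasources').findOneAndUpdate(
      { name: clientId },
      { $setOnInsert: { name: clientId } },
      { upsert: true, returnDocument: 'after' }
    );

    // 2. Upsert all variables for this payload in one bulkWrite.
    const names = Object.keys(payload);
    await db.collection('variables').bulkWrite(buildVariableOps(ds._id, names));

    // 3. Look up the variable ids, then insert the timestamped values.
    const vars = await db.collection('variables')
      .find({ dataSourceId: ds._id, variable: { $in: names } })
      .toArray();
    const idByName = new Map(vars.map((v) => [v.variable, v._id]));
    await db.collection('values').insertMany(buildValueDocs(idByName, payload, timestamp));
  }
}
```

I’m not sure whether the per-record datasource/variable round trips here are acceptable, or whether the ids should be cached in memory between batches.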

What I Hope To Learn

  1. Does this db schema make sense? If not, can you suggest a better alternative?
  2. How can I write this to the database in the manner I mentioned above?

I would appreciate any code examples or suggestions. Thanks!