My dataset looks like the array below. The data is also segmented into chunks of 500 lines for an easier bulk upload. This is the data var referenced in the loops below.
[
  [ // 500 lines in this sub-array
    ['value','value','value','value','value'],
    ['value','value','value','value','value'],
    ['value','value','value','value','value'],
    .........
  ],
  [
    ['value','value','value','value','value'],
    ['value','value','value','value','value'],
    ['value','value','value','value','value'],
    .........
  ],
  ....
]
I also have another array that is a one-to-one mapping between each column in the data and its name. This is the headers var referenced in the loops below.
['valueName','valueName','valueName','valueName','valueName']
I iterate over this data, pair each value with its column name in an object, and store the result in an array that mimics the initial structure of chunks of 500 lines each.
{ colName: 'valueName', data: 'value' } // The end result has the same shape as the data var, with these objects replacing each 'value'
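As a concrete (made-up) example, a single line ['Alice', '30'] with headers ['name', 'age'] would come out as:

[
  { colName: 'name', data: 'Alice' },
  { colName: 'age', data: '30' }
]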
My big question: is there a better way to iterate over this than what I'm doing? I'm trying to avoid filter and map since I figure they initialize a new array each time they're used, which takes longer to process. Instead I've opted for nested for loops, but I may be falling victim to the pyramid of doom, so I wanted some second opinions on how to optimize this data restructuring.
My for loops
let newDataChunks = []
for (let dataSet = 0; dataSet < data.length; dataSet++) { // Data chunk
  const newChunk = []
  newDataChunks.push(newChunk)
  for (let line = 0; line < data[dataSet].length; line++) { // Line of data
    const newLine = []
    newChunk.push(newLine)
    for (let headerPos = 0; headerPos < headers.length; headerPos++) { // Column within the line
      newLine.push({
        colName: headers[headerPos],
        data: data[dataSet][line][headerPos]
      })
    }
  }
}
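For comparison, the map-based version I'm steering away from would look roughly like this (just a sketch, using the same data and headers vars as above):

const newDataChunks = data.map(chunk =>
  chunk.map(line =>
    headers.map((colName, headerPos) => ({
      colName, // column name from headers
      data: line[headerPos] // value at the matching position in the line
    }))
  )
)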
Thanks for your time.