Ok, so I have an array similar to the following:

const array = ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l'];
This array is used to generate a table: I distribute the array onto a grid based on the number of columns and rows the table has. I'm trying to write a for loop that pushes the index values where new cells belong into an array whenever a column is added to the table.
// Known values
selectedColumn = 2; // A: index of the selected column
numRows = 3;        // B: derived from the array length, so 1 less than the rows actually displayed
numColumns = 2;     // C: derived from the array length, so 1 less than the columns actually displayed
// Mock table display, distributing the array into rows

       index:  0   1   2   3
row0:          a   b   c   d      A * (X) = 2
       index:  4   5   6   7
row1:          e   f   g   h      A * (X) = 6
       index:  8   9  10  11
row2:          i   j   k   l      A * (X) = 10

indexValues = [2, 6, 10]
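For reference, here is a minimal sketch of how I'm distributing the flat array into display rows (the 4-cells-per-row value here is just this example's width, i.e. numColumns + 1 in my terms above):

```javascript
const array = ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l'];
const cellsPerRow = 4; // numColumns + 1 for this example

// Chunk the flat array into rows for display.
const rows = [];
for (let i = 0; i < array.length; i += cellsPerRow) {
  rows.push(array.slice(i, i + cellsPerRow));
}
// rows → [['a','b','c','d'], ['e','f','g','h'], ['i','j','k','l']]
```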
The chart above hopefully illustrates what I'm after. I would like a value for X that works for any number of columns and rows and returns the correct index value. I assume X will need to be a mathematical expression involving A and C. I'm currently using row in a loop to track which row is being calculated:
for (let row = 0; row < numRows; row++) {
  let index = (row + 1) * selectedColumn;
  indexValues.push(index);
}
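When I run this loop with the values above, it produces [2, 4, 6] rather than the desired [2, 6, 10], so the (row + 1) * selectedColumn expression is clearly not the right formula:

```javascript
const selectedColumn = 2; // A
const numRows = 3;        // B
const indexValues = [];

// My current attempt at computing the new-cell indices.
for (let row = 0; row < numRows; row++) {
  let index = (row + 1) * selectedColumn;
  indexValues.push(index);
}
// indexValues → [2, 4, 6], but I want [2, 6, 10]
```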