In my application I process file uploads with the xlsx npm package (SheetJS), and I load the file contents with FileReader.readAsArrayBuffer. Parsing gets noticeably slower as the number of rows in the uploaded file increases.
Code block:
readExcelFile(file: File) {
  const reader = new FileReader();
  reader.onload = (e: any) => {
    // Parse the raw bytes with SheetJS (`xls` is the package import, e.g. `import * as xls from 'xlsx';`)
    const data = new Uint8Array(e.target.result);
    const workbook = xls.read(data, { type: 'array' });
    const sheetName = workbook?.SheetNames[0];
    const worksheet = workbook.Sheets[sheetName];
    // Convert the first sheet to JSON and drop rows where every cell is empty
    const jsonData = xls.utils
      .sheet_to_json(worksheet, {
        raw: true,
        defval: null
      })
      .filter((row: any) => {
        return Object.values(row).some(
          (cell) => cell !== null && cell !== undefined && cell !== ''
        );
      });
    const mappedData = this.mapExcelDataToColumns(jsonData);
    this.orders_list =
      mappedData.length > 0 ? mappedData : jsonData ? jsonData : [];
    this.uploadFile(file);
  };
  reader.readAsArrayBuffer(file);
}
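I realize the manual empty-row filter could probably be pushed into sheet_to_json itself via its blankrows option (untested sketch of just the conversion call, same raw/defval settings; it may not catch rows whose cells contain empty strings):

const jsonData = xls.utils.sheet_to_json(worksheet, {
  raw: true,
  defval: null,
  blankrows: false // skip rows that are entirely blank instead of filtering afterwards
});

but I assume the real cost is in xls.read parsing the whole workbook, not in the filtering step.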
How can I fix this slowdown, or is there a better way to handle large files?
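One direction I am considering is moving the parse into a Web Worker so the UI thread stays responsive while large files are processed. Rough sketch only (the worker file name, the Angular CLI new Worker(new URL(...)) pattern, and awaiting file.arrayBuffer() instead of FileReader are my assumptions, not code I have running):

// upload-parse.worker.ts (hypothetical file name)
import * as xls from 'xlsx';

addEventListener('message', ({ data }: MessageEvent<ArrayBuffer>) => {
  // The heavy parse happens off the main thread
  const workbook = xls.read(new Uint8Array(data), { type: 'array' });
  const firstSheet = workbook.Sheets[workbook.SheetNames[0]];
  const rows = xls.utils.sheet_to_json(firstSheet, { raw: true, defval: null, blankrows: false });
  postMessage(rows);
});

// component method
async readExcelFile(file: File) {
  const buffer = await file.arrayBuffer(); // replaces FileReader.readAsArrayBuffer
  const worker = new Worker(new URL('./upload-parse.worker', import.meta.url), { type: 'module' });
  worker.onmessage = ({ data }) => {
    const mapped = this.mapExcelDataToColumns(data);
    this.orders_list = mapped.length > 0 ? mapped : data ?? [];
    this.uploadFile(file);
    worker.terminate();
  };
  worker.postMessage(buffer, [buffer]); // transfer the buffer instead of copying it
}

Would this be the right way to handle it, or does xlsx itself offer a better option for large files?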
