Out of Memory Error in Browser When Loading Large DICOM Dataset with Cornerstone3D

I’m working on an Angular application using Cornerstone3D to process and render DICOM images. The DICOM images I am working with are quite large. For example, one CT image set has 4137 slices with a resolution of 1024×1024, resulting in a total file size of 8.14 GB.
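As a rough sanity check on where that size comes from (assuming uncompressed 16-bit CT pixels, which I haven't verified against Bits Allocated), the pixel data alone already accounts for essentially the whole file:

// Back-of-the-envelope size estimate, assuming uncompressed 16-bit pixels.
const slices = 4137;
const rows = 1024;
const columns = 1024;
const bytesPerPixel = 2; // assumption; Bits Allocated (0028,0100) would confirm this
const totalBytes = slices * rows * columns * bytesPerPixel;
console.log((totalBytes / 1024 ** 3).toFixed(2), 'GiB'); // ≈ 8.08 GiB of pixel data alone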

Here’s a general outline of how I handle the data:

1. I receive the DICOM data from the server.
2. I parse the data using the dicomParser JavaScript library.
3. The parsed dataset is then stored in a cache for future use.
Here’s a simplified example of the code I use to manage the datasets:

import dicomParser from 'dicom-parser';

// Cache of parsed datasets, keyed by the image URI.
// DataSet is the interface shown further below.
let loadedDataSets: Record<string, { dataSet: DataSet; cacheCount: number }> = {};

// buffer is the Uint8Array holding the raw DICOM file received from the server.
const dataSet = dicomParser.parseDicom(buffer);
dataSetCacheManager.create(uri, dataSet);

function create(uri: string, dataSet: DataSet, multiFrame = false) {
  loadedDataSets[uri] = {
    dataSet: dataSet,
    cacheCount: 1
  };
}
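The cacheCount field is there for reference counting; the matching get/release path looks roughly like this (a simplified sketch, not the exact production code), and an entry is only removed from loadedDataSets once nothing references it anymore:

// Simplified sketch of the reference-counting side of the cache. Deleting the
// entry drops the reference to the parsed DataSet (and its byteArray), so the
// memory can eventually be reclaimed by the garbage collector.
function get(uri: string): DataSet | undefined {
  const entry = loadedDataSets[uri];
  if (!entry) {
    return undefined;
  }
  entry.cacheCount++;
  return entry.dataSet;
}

function release(uri: string) {
  const entry = loadedDataSets[uri];
  if (!entry) {
    return;
  }
  entry.cacheCount--;
  if (entry.cacheCount <= 0) {
    delete loadedDataSets[uri];
  }
}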
The DataSet structure looks like this:

export interface DataSet {
   byteArray: ByteArray;
   byteArrayParser: ByteArrayParser;
   elements: {
     [tag: string]: Element;
   };
   warnings: string[];

   uint16: (tag: string, index?: number) => number | undefined;
   int16: (tag: string, index?: number) => number | undefined;
   uint32: (tag: string, index?: number) => number | undefined;
   int32: (tag: string, index?: number) => number | undefined;
   float: (tag: string, index?: number) => number | undefined;
   double: (tag: string, index?: number) => number | undefined;
   numStringValues: (tag: string) => number | undefined;
   string: (tag: string, index?: number) => string | undefined;
   text: (tag: string, index?: number) => string | undefined;
   floatString: (tag: string) => number | undefined;
   intString: (tag: string) => number | undefined;
   attributeTag: (tag: string) => string | undefined;
}

export type ByteArray = Uint8Array | Buffer;
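For context, downstream code reads both metadata and pixel data straight out of this structure, so the full byteArray has to stay resident for as long as the dataset is cached. A simplified example of how the accessors get used (the exact tags and the uncompressed 16-bit assumption are just for illustration):

// Metadata reads resolve the tag in `elements` and decode the value from the
// in-memory byteArray.
const rows = dataSet.uint16('x00280010');           // Rows
const columns = dataSet.uint16('x00280011');        // Columns
const sopInstanceUid = dataSet.string('x00080018'); // SOP Instance UID

// Pixel data is just a region of the same byteArray (dicom-parser elements
// expose dataOffset/length). Assumes uncompressed 16-bit pixels.
const pixelDataElement = dataSet.elements['x7fe00010'];
const pixelData = new Uint16Array(
  dataSet.byteArray.buffer,
  dataSet.byteArray.byteOffset + pixelDataElement.dataOffset,
  pixelDataElement.length / 2
);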

The Problem:

When the total size of the loaded data reaches around 8.5 GB (measured with a heap snapshot), the browser crashes with an “Out of Memory” error.

I understand that handling such large data entirely in RAM might not be feasible, but I’m wondering whether there are more memory-efficient ways to store or manage the DICOM datasets in the browser. Is there a strategy to offload or segment the data in a more RAM-efficient way, or another approach that would mitigate the memory issue?

Any suggestions or alternative approaches would be greatly appreciated.

What I have tried so far:

Using Heap Snapshots: I used Chrome’s developer tools to take heap snapshots, which confirmed that JSArrayBufferData (the backing stores of the ArrayBuffers holding the parsed byte arrays) accounts for about 8.5 GB when the crash occurs. This was unexpected, because the system has considerably more RAM available.

Exploring IndexedDB: I considered using IndexedDB to store parts of the DICOM data, but because of how the dataset is structured (in particular, the byteArrayParser and accessor functions that read values directly from the in-memory byteArray), it isn’t straightforward to offload the data without significant changes to the parsing and rendering flow.
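To make the “not straightforward” part concrete, the simplest offloading scheme I can see would keep only the raw bytes in IndexedDB and re-parse on demand, which turns every currently synchronous dataset access into an asynchronous one (rough sketch, raw IndexedDB API, hypothetical helper names):

function openDb(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open('dicom-cache', 1);
    request.onupgradeneeded = () => request.result.createObjectStore('files');
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

// Store the raw DICOM bytes instead of keeping the parsed DataSet in RAM.
async function storeBytes(uri: string, bytes: Uint8Array): Promise<void> {
  const db = await openDb();
  return new Promise((resolve, reject) => {
    const tx = db.transaction('files', 'readwrite');
    tx.objectStore('files').put(bytes, uri);
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}

// Re-parsing pulls the whole file back into RAM, so this only helps if the
// in-memory cache is also evicted aggressively.
async function loadDataSet(uri: string) {
  const db = await openDb();
  const bytes = await new Promise<Uint8Array>((resolve, reject) => {
    const request = db.transaction('files').objectStore('files').get(uri);
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
  return dicomParser.parseDicom(bytes);
}

Even with something like this in place, the rendering flow currently assumes a synchronous, fully in-memory DataSet, which is why I expect significant changes either way.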