I have a script that runs a `COPY ... TO STDOUT` query on Postgres. Node.js consumes this stream as a readable, which is passed to `pipeline` and written to a local CSV file. I am using node-postgres (`pg`) and `pg-copy-streams` to create the Postgres client and the copy stream. Lastly, I execute the script with Bun. My example read/write code is below.
```typescript
import os from "node:os";
import { createWriteStream } from "node:fs";
import { pipeline } from "node:stream/promises";
import pg from "pg";
import { to as copyTo } from "pg-copy-streams";

const saveCopyStreamToFile = async (
  pgPool: pg.Pool,
  copyQuery: string,
  fileName: string
) => {
  console.log(fileName);
  const client = await pgPool.connect();
  // COPY ... TO STDOUT arrives as a readable stream on this client.
  const readable = client.query(copyTo(copyQuery));
  const writable = createWriteStream(`${os.homedir()}/cms/${fileName}`, {
    highWaterMark: 64 * 1024,
  });
  try {
    await pipeline(readable, writable);
  } finally {
    client.release();
  }
};
```
The Issue: On one particular table, the copy stream hangs indefinitely. I attached event listeners, and the read stream never emits a `close` event, although it does receive `data` events and chunks. When I inspect the file, the data breaks off at a random point while writing a row of the CSV. For example, `XYZ utilizes a configuH djid` is the cell where the break begins. Interestingly, a different package, postgres.js, works fine on this table but runs into the same issue on a different table. Lastly, I inspected the `pg_stat_activity` view and can see the backend process there stuck in the `idle` state.
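For reference, the event listeners I used for debugging were attached roughly like this. This sketch uses a `PassThrough` as a stand-in for the real COPY readable (which in my code is `client.query(copyTo(copyQuery))`):

```typescript
import { PassThrough } from "node:stream";
import { finished } from "node:stream/promises";

// Stand-in for the COPY readable stream.
const stream = new PassThrough();

// Record every stream lifecycle event that fires.
const events: string[] = [];
for (const name of ["data", "end", "error", "close"] as const) {
  stream.on(name, () => events.push(name));
}

stream.end("chunk");
await finished(stream);
console.log(events);
```

On the problem table, `data` fires repeatedly but `end`/`close` never arrive, which is what makes `pipeline` wait forever.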
Expected: I can write the postgres copy stream to a csv file locally.
node-postgres: https://node-postgres.com/
postgres.js: https://github.com/porsager/postgres