Export PostgreSQL data to CSV file using Node.js

The previous tutorial showed you how to use Node.js to import CSV data into a PostgreSQL database table. Today we’re gonna do the opposite: export PostgreSQL data to a CSV file. There are 2 steps:

  • Read PostgreSQL data using the pg module
  • Export PostgreSQL data to a CSV file using one of three modules: fast-csv, csv-writer or fs

REST API: Node.js Download CSV file example


Read PostgreSQL data

Assume that our PostgreSQL database has a category table like this:

SELECT * FROM category;

 id |    name    |                  description                   | created_at
----+------------+------------------------------------------------+------------
  1 | Node.js    | JavaScript runtime environment                 | 2019-09-03
  2 | Vue.js     | JavaScript Framework for building UI           | 2019-09-06
  3 | Angular.js | Platform for building mobile & desktop web app | 2019-09-09
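
If you want to follow along, the table can be created with a statement along these lines (a sketch with our guess at the column types; adjust to your own schema):

-- hypothetical DDL for the category table shown above
CREATE TABLE category (
  id SERIAL PRIMARY KEY,
  name VARCHAR(255),
  description VARCHAR(255),
  created_at DATE
);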

We need to get all the table’s rows and parse them into a format that can be exported to a CSV file.
The pg module helps us connect to and query the PostgreSQL database.

Run this command to install the module: npm install pg.
Now we can use the pg module to make a connection and query data easily:

const Pool = require("pg").Pool;

// Create a connection to the database
const pool = new Pool({
  host: "localhost",
  user: "postgres",
  database: "testdb",
  password: "123",
  port: 5432
});

// open the PostgreSQL connection
pool.connect((err, client, done) => {
  if (err) throw err;

  client.query("SELECT * FROM category", (err, res) => {
    done();

    if (err) {
      console.log(err.stack);
    } else {
      const jsonData = JSON.parse(JSON.stringify(res.rows));
      console.log("jsonData", jsonData);

      // TODO: export to CSV file
    }
  });
});

Let’s explain the code above:
– First, we import the Pool class from the pg module; this class helps us create a connection pool to the PostgreSQL database.
– Next, we create a Pool object with the appropriate configuration: host, user, database, password, port.
– Then we open a connection with the connect() method and use the pool client to run the query: SELECT * FROM category.
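
As a side note, pg also supports Promises, so the same query can be written with async/await instead of callbacks. A minimal sketch, assuming the same pool object as above:

const getCategories = async () => {
  // pool.query() returns a Promise when no callback is given
  const res = await pool.query("SELECT * FROM category");
  return res.rows;
};

getCategories()
  .then(rows => console.log(rows))
  .catch(err => console.log(err.stack));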

Back in the callback version, the res parameter in the callback function contains the result of the query. It looks like:

Result {
  command: 'SELECT',
  rowCount: 3,
  oid: null,
  rows:
   [ { id: 1,
       name: 'Node.js',
       description: 'JavaScript runtime environment',
       created_at: 2019-09-02T17:00:00.000Z },
     { id: 2,
       name: 'Vue.js',
       description: 'JavaScript Framework for building UI',
       created_at: 2019-09-05T17:00:00.000Z },
     { id: 3,
       name: 'Angular.js',
       description: 'Platform for building mobile & desktop web app',
       created_at: 2019-09-08T17:00:00.000Z } ],
  fields:
   [ ... ],
  _parsers:
   [ [Function: parseInteger],
     [Function: noParse],
     [Function: noParse],
     [Function: parseDate] ],
  ...
}

So we convert res.rows into plain JSON data and assign it to the jsonData array.
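
A small note on that JSON round trip: created_at comes back from pg as a JavaScript Date object, and JSON.stringify() serializes Dates as ISO strings. So after the round trip, jsonData holds string dates, which is convenient for writing to CSV. A quick illustration:

const row = { id: 1, created_at: new Date("2019-09-03") };
const copy = JSON.parse(JSON.stringify(row));

console.log(row.created_at instanceof Date);  // true
console.log(typeof copy.created_at);          // "string" ("2019-09-03T00:00:00.000Z")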

In the next steps, we’ll show you 3 ways to export PostgreSQL data to a CSV file, using:

  • fast-csv module
  • csv-writer module
  • fs module

Export PostgreSQL data to CSV file using fast-csv

Run this command first: npm install fast-csv.
With this approach, we also need the fs module to create a WriteStream object.

Now import the module & use it:

const fastcsv = require("fast-csv");
const fs = require("fs");
const ws = fs.createWriteStream("bezkoder_postgresql_fastcsv.csv");

const jsonData = ...;

fastcsv
  .write(jsonData, { headers: true })
  .on("finish", function() {
    console.log("Write to bezkoder_postgresql_fastcsv.csv successfully!");
  })
  .pipe(ws);

The full code looks like this:

const Pool = require("pg").Pool;
const fastcsv = require("fast-csv");
const fs = require("fs");
const ws = fs.createWriteStream("bezkoder_postgresql_fastcsv.csv");

// Create a connection to the database
const pool = new Pool({
  host: "localhost",
  user: "postgres",
  database: "testdb",
  password: "123",
  port: 5432
});

// open the PostgreSQL connection
pool.connect((err, client, done) => {
  if (err) throw err;

  client.query("SELECT * FROM category", (err, res) => {
    done();

    if (err) {
      console.log(err.stack);
    } else {
      const jsonData = JSON.parse(JSON.stringify(res.rows));
      console.log("jsonData", jsonData);

      fastcsv
        .write(jsonData, { headers: true })
        .on("finish", function() {
          console.log("Write to bezkoder_postgresql_fastcsv.csv successfully!");
        })
        .pipe(ws);
    }
  });
});
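
One detail worth knowing: the "finish" event above fires on the fast-csv formatter stream, meaning fast-csv has finished producing data, but not necessarily that ws has flushed everything to disk. If you want to log only when the file is fully written, you can listen on the write stream instead (a small variation, same assumptions as above):

fastcsv.write(jsonData, { headers: true }).pipe(ws);

// "finish" on the WriteStream fires after all data is flushed to the file
ws.on("finish", () => {
  console.log("bezkoder_postgresql_fastcsv.csv is fully written to disk.");
});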

Either way, the CSV file bezkoder_postgresql_fastcsv.csv will be generated with this content:

id,name,description,created_at
1,Node.js,JavaScript runtime environment,2019-09-02T17:00:00.000Z
2,Vue.js,JavaScript Framework for building UI,2019-09-05T17:00:00.000Z
3,Angular.js,Platform for building mobile & desktop web app,2019-09-08T17:00:00.000Z
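
fast-csv’s headers option doesn’t have to be a boolean. As far as we know from the module’s formatting options, passing an array of column names picks and orders the columns to write. A sketch that exports only id and name, reusing jsonData and the fs module from above:

fastcsv
  .write(jsonData, { headers: ["id", "name"] })
  .pipe(fs.createWriteStream("bezkoder_postgresql_fastcsv_subset.csv"));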

Export PostgreSQL data to CSV file using csv-writer

To use the csv-writer module, run the command:
npm install csv-writer

From the module, we import the createObjectCsvWriter function, which returns a CsvWriter object. The function needs 2 parameters: path & header.

const createCsvWriter = require("csv-writer").createObjectCsvWriter;

const jsonData = ...;

const csvWriter = createCsvWriter({
  path: "bezkoder_postgresql_csvWriter.csv",
  header: [
    { id: "id", title: "id" },
    { id: "name", title: "name" },
    { id: "description", title: "description" },
    { id: "created_at", title: "created_at" }
  ]
});

csvWriter
  .writeRecords(jsonData)
  .then(() =>
    console.log("Write to bezkoder_postgresql_csvWriter.csv successfully!")
  );

To write data to the CSV file, we use the writer’s writeRecords() method, which returns a Promise.

Here is the full code:

const Pool = require("pg").Pool;
const createCsvWriter = require("csv-writer").createObjectCsvWriter;

// Create a connection to the database
const pool = new Pool({
  host: "localhost",
  user: "postgres",
  database: "testdb",
  password: "123",
  port: 5432
});

// open the PostgreSQL connection
pool.connect((err, client, done) => {
  if (err) throw err;

  client.query("SELECT * FROM category", (err, res) => {
    done();

    if (err) {
      console.log(err.stack);
    } else {
      const jsonData = JSON.parse(JSON.stringify(res.rows));
      console.log("jsonData", jsonData);

      const csvWriter = createCsvWriter({
        path: "bezkoder_postgresql_csvWriter.csv",
        header: [
          { id: "id", title: "id" },
          { id: "name", title: "name" },
          { id: "description", title: "description" },
          { id: "created_at", title: "created_at" }
        ]
      });

      csvWriter
        .writeRecords(jsonData)
        .then(() =>
          console.log("Write to bezkoder_postgresql_csvWriter.csv successfully!")
        );
    }
  });
});
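
A handy variation: createObjectCsvWriter also accepts an append flag, so you can add rows to an existing CSV file without writing the header line again. A minimal sketch (moreRows is a hypothetical array of extra records):

const appendWriter = createCsvWriter({
  path: "bezkoder_postgresql_csvWriter.csv",
  header: [
    { id: "id", title: "id" },
    { id: "name", title: "name" },
    { id: "description", title: "description" },
    { id: "created_at", title: "created_at" }
  ],
  append: true // skip the header line, add below the existing rows
});

appendWriter
  .writeRecords(moreRows)
  .then(() => console.log("Appended to bezkoder_postgresql_csvWriter.csv!"));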

Export PostgreSQL data to CSV file using fs

fs is a Node.js core module, so we don’t need to install it, but we do need json2csv.
The json2csv module provides a Parser class; its parse() method returns the CSV-formatted data as a string.

So, run the command: npm install json2csv.

const Json2csvParser = require("json2csv").Parser;
const fs = require("fs");

const jsonData = ...;

const json2csvParser = new Json2csvParser({ header: true });
const csv = json2csvParser.parse(jsonData);

fs.writeFile("bezkoder_postgresql_fs.csv", csv, function(error) {
  if (error) throw error;
  console.log("Write to bezkoder_postgresql_fs.csv successfully!");
});

Once we have the CSV-formatted string, we use the fs writeFile() function to write it to the CSV file.

The full implementation looks like this:

const Pool = require("pg").Pool;
const Json2csvParser = require("json2csv").Parser;
const fs = require("fs");

// Create a connection to the database
const pool = new Pool({
  host: "localhost",
  user: "postgres",
  database: "testdb",
  password: "123",
  port: 5432
});

// open the PostgreSQL connection
pool.connect((err, client, done) => {
  if (err) throw err;

  client.query("SELECT * FROM category", (err, res) => {
    done();

    if (err) {
      console.log(err.stack);
    } else {
      const jsonData = JSON.parse(JSON.stringify(res.rows));
      console.log("jsonData", jsonData);

      const json2csvParser = new Json2csvParser({ header: true });
      const csv = json2csvParser.parse(jsonData);

      fs.writeFile("bezkoder_postgresql_fs.csv", csv, function(error) {
        if (error) throw error;
        console.log("Write to bezkoder_postgresql_fs.csv successfully!");
      });
    }
  });
});
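
If you only need some of the columns, or want to rename them, json2csv’s Parser also accepts a fields option. A sketch reusing jsonData from the query above (the "category" label is our own example):

const json2csvParser = new Json2csvParser({
  fields: [
    "id",
    { label: "category", value: "name" }, // rename the "name" column
    "created_at"
  ]
});
const csv = json2csvParser.parse(jsonData);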

Conclusion

Let’s review what we’ve done today.

We know how to read a PostgreSQL table using the pg module.
Then we used 3 ways to export the PostgreSQL data to a CSV file: the fast-csv, csv-writer and fs modules.

You can find how to do the opposite in the previous tutorial:
Import CSV data into PostgreSQL using Node.js

Or build a REST API (just change the configuration from MySQL to PostgreSQL):
Node.js Download CSV file from Database example

Happy learning! See you again.

Source Code

You can find the complete source code for this example on GitHub.