How to Parse JSON Data in Node.js?

In this article, we'll look at some of the options available in Node.js for parsing JSON. The one you choose depends on your specific use case:

Parse a String Containing JSON Data

Node.js has a global JSON object (as defined in the ES5 specification) with a synchronous parse() method. This can be used to parse a JSON string like so:

const str = '{"name": "John Doe", "age": 30}';
const obj = JSON.parse(str); // { name: 'John Doe', age: 30 }

It has the following syntax:

JSON.parse(text [, reviver])

The parse method takes the following two arguments:

  1. text — the JSON string you wish to parse.
  2. reviver — an optional function that transforms the parsed value before it is returned.
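For instance, a reviver can be used to transform specific values as they're parsed. In the sketch below, the "birth" field and the date handling are just an illustrative assumption:

```javascript
const str = '{"name": "John Doe", "birth": "1990-01-01"}';

// Illustrative reviver: convert the (hypothetical) "birth" field into a Date object
const obj = JSON.parse(str, (key, value) => {
    if (key === 'birth') {
        return new Date(value);
    }
    return value;
});

console.log(obj.birth instanceof Date); // true
```

The reviver is called once for every key/value pair in the parsed data (innermost values first), so returning the value unchanged for keys you don't care about is important.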

When using JSON.parse(), remember:

  • It is good practice to wrap JSON.parse() in a try/catch block, especially when the JSON comes from the client side or another external source; JSON.parse() throws a SyntaxError on malformed input, and catching it keeps an invalid payload from crashing your application. (Unlike eval(), JSON.parse() does not execute code, which also makes it the right choice for untrusted input.)
  • JSON.parse() is synchronous, so parsing a very big JSON string will block the event loop and tie up the main thread. Therefore, you should consider using a streaming JSON parser when parsing big JSON payloads.
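A minimal sketch of such a guard:

```javascript
const input = '{"name": "John Doe", "age": }'; // malformed JSON

let parsed = null;
try {
    parsed = JSON.parse(input);
} catch (err) {
    // JSON.parse() throws a SyntaxError on invalid input
    console.error('Failed to parse JSON:', err.message);
}
```

After this runs, parsed is either the successfully parsed value or still null, which the rest of the code can check for.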

Parse a File Containing JSON Data

Using require() to Read & Parse JSON:

We can simply use the require() function to load and parse JSON data from a file. For example:

const parsedJSON = require('./config.json');

This would load the config.json file residing in the same directory as your source code file.

By design, the require function:

  • Only reads the file once; all subsequent requests for the same file are served from a cache. This makes it unsuitable for loading files that are frequently updated.
  • Is synchronous, so if you have a very big JSON file, it will block the event loop.

It is not wrong to use this approach. It entirely depends on your use case. For example, it may be a quick, easy and effective way of loading a configuration file.

Using the Methods in fs Module With JSON.parse():

Asynchronously Reading the File:

const fs = require('fs');

fs.readFile('/path/to/file.json', 'utf8', function (error, data) {
    if (error) {
        // handle the error, e.g. file not found
        throw error;
    }

    const obj = JSON.parse(data);

    // do something...
});

// v10+
const { readFile } = require('fs').promises;

(async () => {
    const data = await readFile('/path/to/file.json', 'utf8');
    const obj = JSON.parse(data);

    // do something...
})();

This may NOT be suitable for loading large JSON files: even though the file is read asynchronously, JSON.parse() itself is synchronous and CPU-intensive, and since Node.js runs JavaScript on a single thread, parsing a very large string can still block the event loop.

Synchronously Reading The File:

const fs = require('fs');
const json = JSON.parse(fs.readFileSync('/path/to/file.json', 'utf8'));

The same concerns about JSON.parse() apply here; additionally, fs.readFileSync() itself blocks the event loop while the file is being read.

Parsing a Large JSON File

There are a couple of ways you could go about parsing a large JSON file:

Breaking The Data Into Smaller Chunks:

Some ways this could be done are:

  • Splitting the large file into smaller files, which might speed things up if they're read asynchronously or in parallel (for example, by using worker threads).
  • Reading the file data in smaller chunks, for example by:
    • Buffering data line-by-line, e.g. by using the readline module or looking for the newline character;
    • Using a JSON stream-parsing module, for example:
      • JSONStream module — provides an alternative to JSON.parse() to stream-parse JSON data from a file;
      • split module — allows us to break up each line as a chunk of data and reassemble it as the data flows in;
      • stream-json module — streams an array of JSON objects individually, taking care of assembling them automatically as they're read in.

Processing the File in the Background:

By design, a single instance of Node runs in a single thread, and synchronously-run CPU-intensive tasks can block the event loop / main thread.

The following are a few ways we can allow the execution of CPU-intensive scripts in the background, outside of Node's event loop in an asynchronous / non-blocking way:

  • Using a worker thread (for example, the built-in worker_threads module, or packages such as node-webworker-threads).
  • Launching a cluster of Node processes (as per the need).
  • Spawning a child process from your main process; child processes are independent of the parent, run in parallel, and are scheduled separately by the OS (each has its own memory space, process id, and execution time).

This post was published by Daniyal Hamid. Daniyal currently works as the Head of Engineering in Germany and has 20+ years of experience in software engineering, design and marketing. Please show your love and support by sharing this post.