Reading and writing JSON files in Node.js: A complete tutorial

Editor’s note: This article was last updated on 28 September 2023 to add information about best practices with JSON files in Node.js, such as the importance of backups when writing to files; and a section about reading and writing large JSON files using streams.


JavaScript Object Notation (JSON) is one of the most popular formats for data storage and data interchange over the internet. The simplicity of the JSON syntax makes it very easy for humans and machines alike to read and write.

Despite its name, the use of the JSON data format is not limited to JavaScript. Most programming languages implement data structures that you can easily convert to a JSON string and vice versa. JavaScript, and therefore the Node.js runtime environment, is no exception. More often than not, this JSON data needs to be read from or written to a file for persistence. The Node runtime environment has the built-in fs module specifically for working with files.

This article is a comprehensive guide on how to use the built-in fs module to read and write data in JSON format. We’ll also look at some third-party npm packages that simplify working with data in the JSON format.

Jump ahead:

  • Serializing and deserializing JSON
  • Introduction to the fs module
    • Synchronous API
    • Callback API
    • Promise-based API
  • How to read JSON files in Node.js
    • Loading a JSON file using the require function
    • Reading a JSON file using the fs.readFile method
    • Reading a JSON file using the fs.readFileSync method
  • How to write to JSON files in Node.js
    • Using the fs.writeFile method
    • Using the fs.writeFileSync method
    • Appending a JSON file
  • Read/write to JSON files using third-party npm packages
    • Using the jsonfile npm package
    • Using the fs-extra npm package
    • Using the bfj npm package
  • Reading and writing large JSON files using streams
  • Best practices and common pitfalls when reading/writing JSON files
  • Handling circular references

Serializing and deserializing JSON

Serialization is the process of converting an object or data structure into a format that is easy to store or transfer over the internet. You can recover the serialized data by applying the reverse process.

Deserialization refers to transforming the serialized data back into its original format.

You will almost always need to serialize JavaScript objects to a JSON string in Node before writing them to a storage device or transmitting them over the internet. You can do so with the JSON.stringify method:

const config = { ip: '192.0.2.1', port: 3000 };
console.log(JSON.stringify(config));

On the other hand, after reading the JSON file, you will need to deserialize the JSON string to a plain JavaScript object using the JSON.parse method before accessing or manipulating the data:

const config = JSON.stringify({ ip: '192.0.2.1', port: 3000 });
console.log(JSON.parse(config));

JSON.stringify and JSON.parse are globally available methods in Node. You don’t need to install or require them before using them.

Introduction to the fs module

The fs module is built in, and it provides functions that you can use to read and write data in the JSON format and much more.

Each function exposed by the fs module comes in synchronous, callback, and promise-based forms. The synchronous and callback variants are accessible from the synchronous and callback APIs, and the promise-based variants from the promise-based API.

Synchronous API

The synchronous methods of the built-in fs module block the event loop and further execution of the remaining code until the operation has succeeded or failed. More often than not, blocking the event loop is not something you want to do.

The names of all synchronous functions end in Sync. For example, writeFileSync and readFileSync are both synchronous functions.


You can access the synchronous API by requiring fs:

const fs = require('fs');

// Blocks the event loop
fs.readFileSync(path, options);

Callback API

Unlike the synchronous methods that block the execution of the remaining code until the operation has succeeded or failed, the corresponding methods of the callback API are asynchronous. You’ll pass a callback function to the method as the last argument.

The callback function is invoked with an Error object as the first argument if an error occurs. The remainder of the arguments to the callback function depend on the fs method.

As with the synchronous API, you can access the methods of the callback API by requiring fs:

const fs = require('fs');

fs.readFile(path, options, callback);

Promise-based API

The promise-based API is asynchronous, like the callback API. It returns a promise, which you can manage via promise chaining or async/await.

You can access the promise-based API by requiring fs/promises:

const fs = require('fs/promises');

fs.readFile(path)
  .then((data) => {
    // Do something with the data
  })
  .catch((error) => {
    // Do something if error
  });
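Equivalently, you can consume the same promise with async/await:

const fs = require('fs/promises');

// The same read, using async/await instead of promise chaining
async function readData(path) {
  try {
    const data = await fs.readFile(path);
    // Do something with the data
  } catch (error) {
    // Do something if error
  }
}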

We used the CommonJS syntax for accessing the modules in the code snippets above. We’ll use the CommonJS syntax throughout this article. You can also use ES6 modules if you want.

According to the Node documentation, the callback API of the built-in fs module is more performant than the promise-based API. Therefore, most examples in this article will use the callback API.

How to read JSON files in Node.js

In the Node runtime environment, you can use the built-in require function and the fs module for loading or reading JSON files. Because require is globally available in every module, you don’t need to import anything to use it.

However, you will need to require the fs module before using it. I will discuss how to read JSON files using the built-in fs module and require function in the following sections.

Loading a JSON file using the require function

You can use the require function to synchronously load JSON files in Node. After you load a file with require, it is cached, so subsequent require calls for the same file return the cached version. In a server environment, the file will only be reloaded when the server restarts.

It is therefore advisable to use require only for loading static JSON files, such as configuration files, that do not change often. If the JSON file changes while your application is running, require will keep returning the cached version, and your latest changes will not be reflected.

Assuming you have a config.json file with the following contents:

{ "port": "3000", "ip": "127.00.12.3"}

You can load the config.json file in a JavaScript file using the code below. require will always load the JSON data as a JavaScript object:

const config = require('./config.json');
console.log(config);
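If you do need a fresh copy of a file that require has already cached, you can delete its entry from the module cache before requiring it again. Below is a minimal sketch of this escape hatch, though reading the file with fs is usually the better fix:

// require.resolve returns the absolute path Node uses as the cache key
const cacheKey = require.resolve('./config.json');

// Deleting the cache entry forces the next require to re-read the file
delete require.cache[cacheKey];

const freshConfig = require('./config.json');
console.log(freshConfig);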

Reading a JSON file using the fs.readFile method

You can use the readFile method to read JSON files. It asynchronously reads the entire contents of the file into memory, so it is not the most optimal method for reading large JSON files.

The readFile method takes three arguments. The code snippet below shows its function signature:

fs.readFile(path, options, callback);

The first argument, path, is the file name or a file descriptor. The second is an optional options object; you can pass a string instead, in which case it specifies the character encoding, such as 'utf8'. The third is a callback function.

The callback function takes two arguments. The first is an error object if an error occurs, and the second is the file contents as a serialized JSON string.

The code snippet below will read the JSON data in the config.json file and log it to the terminal:

const fs = require("fs");fs.readFile("./config.json", "utf8", (error, data) => { if (error) { console.log(error); return; } console.log(JSON.parse(data));});

Make sure to deserialize the JSON string passed to the callback function before you start working with the resulting JavaScript object.

Reading a JSON file using the fs.readFileSync method

readFileSync is another built-in method for reading files in Node, similar to readFile. The difference between the two is that readFile reads the file asynchronously, while readFileSync reads the file synchronously. Therefore, readFileSync blocks the event loop and the execution of the remaining code until all the data has been read.

Check out this article for more information about the difference between synchronous and asynchronous code.

Below is the function signature of fs.readFileSync:

fs.readFileSync(path, options);

path refers to the location of the JSON file you wish to read. Optionally, you can provide an object as the second argument.

In the code snippet below, we are reading JSON data from the config.json file using readFileSync:

const { readFileSync } = require('fs');

const data = readFileSync('./config.json');
console.log(JSON.parse(data));

How to write to JSON files in Node.js

Just like reading JSON files, the fs module provides built-in methods for writing to JSON files. You can use the writeFile and writeFileSync methods of the fs module. The difference between the two is that writeFile is asynchronous while writeFileSync is synchronous.

Before writing to a JSON file, make sure to serialize the JavaScript object to a JSON string using the JSON.stringify method. By default, JSON.stringify puts your JSON data on a single line; pass the optional space argument to control how the output is formatted.
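For example, passing 2 as the third argument pretty-prints the output with two-space indentation:

const config = { ip: '192.0.2.1', port: 3000 };

// Single line by default
console.log(JSON.stringify(config)); // {"ip":"192.0.2.1","port":3000}

// Pretty-printed with a two-space indent
console.log(JSON.stringify(config, null, 2));
// {
//   "ip": "192.0.2.1",
//   "port": 3000
// }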

Using the fs.writeFile method

The writeFile method takes four arguments. The code snippet below shows its function signature:

fs.writeFile(file, data, options, callback);

If the path you pass to the writeFile method is for an existing JSON file, the method will overwrite the data in the specified file. It will create a new file if the file does not exist:

const { writeFile } = require('fs');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

writeFile(path, JSON.stringify(config, null, 2), (error) => {
  if (error) {
    console.log('An error has occurred ', error);
    return;
  }
  console.log('Data written successfully to disk');
});

Using the fs.writeFileSync method

Unlike writeFile, writeFileSync writes to a file synchronously. If you use writeFileSync, you will block the execution of the event loop and the rest of the code until the operation is successful or an error occurs. It will create a new file if the path you pass doesn’t exist and overwrite it if it does.

In the code snippet below, we are writing to the config.json file. We wrap the code in a try...catch block so that we can catch any errors:

const { writeFileSync } = require('fs');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

try {
  writeFileSync(path, JSON.stringify(config, null, 2), 'utf8');
  console.log('Data successfully saved to disk');
} catch (error) {
  console.log('An error has occurred ', error);
}

Appending a JSON file

Node doesn’t have a built-in function for appending to or updating the fields of an existing JSON file. However, you can read the JSON file using the readFile method of the fs module, update it, and overwrite the JSON file with the updated JSON.

Below is a code snippet illustrating how to do this:

const { writeFile, readFile } = require('fs');

const path = './config.json';

readFile(path, (error, data) => {
  if (error) {
    console.log(error);
    return;
  }
  const parsedData = JSON.parse(data);
  parsedData.createdAt = new Date().toISOString();
  writeFile(path, JSON.stringify(parsedData, null, 2), (err) => {
    if (err) {
      console.log('Failed to write updated data to file');
      return;
    }
    console.log('Updated file successfully');
  });
});

Read/write to JSON files using third-party npm packages

In this section, we’ll look at the most popular third-party Node packages for reading and writing data in JSON format.

Using the jsonfile npm package

jsonfile is a popular npm package for reading and writing JSON files in Node. You can install it using the following command:

npm install jsonfile

It is similar to the readFile and writeFile methods of the built-in fs module, though jsonfile has some advantages over the built-in methods:

  • It serializes and deserializes JSON out of the box
  • It has a built-in utility for appending data to a JSON file (sketched after the examples below)
  • It supports promise chaining

You can see the jsonfile package in action in the code snippet below:

const jsonfile = require('jsonfile');

const path = './config.json';

jsonfile.readFile(path, (err, data) => {
  if (err) {
    console.log(err);
    return;
  }
  console.log(data);
});

You can also use promise chaining instead of passing a callback function like in the example above:

const jsonfile = require('jsonfile');

const path = './config.json';

jsonfile
  .readFile(path)
  .then((data) => {
    console.log(data);
  })
  .catch((err) => {
    console.log(err);
  });
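The appending utility mentioned earlier works by passing a flag option through to the underlying fs call. Below is a minimal sketch, assuming a hypothetical log.json file; note that repeated appends produce one JSON object per line rather than a single valid JSON document:

const jsonfile = require('jsonfile');

const path = './log.json'; // hypothetical file used for illustration
const entry = { event: 'login', at: new Date().toISOString() };

// { flag: 'a' } appends the stringified object instead of overwriting
jsonfile.writeFile(path, entry, { flag: 'a' }, (err) => {
  if (err) console.log(err);
});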

Using the fs-extra npm package

fs-extra is another popular Node package you can use to work with files. Though you can use this package for managing JSON files, it has methods that extend beyond just reading and writing JSON files.

As its name suggests, fs-extra has all the functionality provided by the fs module and more. According to the documentation, you can use the fs-extra package as a drop-in replacement for the fs module.

Before using it, you need to first install fs-extra from npm:

npm install fs-extra

The code below shows how you can read JSON files using the readJson method of the fs-extra package. You can use a callback function, promise chaining, or async/await:

const fsExtra = require('fs-extra');

const path = './config.json';

// Using callback
fsExtra.readJson(path, (error, config) => {
  if (error) {
    console.log('An error has occurred');
    return;
  }
  console.log(config);
});

// Using promise chaining
fsExtra
  .readJson(path)
  .then((config) => {
    console.log(config);
  })
  .catch((error) => {
    console.log(error);
  });

// Using async/await
async function readJsonData() {
  try {
    const config = await fsExtra.readJson(path);
    console.log(config);
  } catch (error) {
    console.log(error);
  }
}

readJsonData();

The code below illustrates how you can write JSON data using the writeJson method:

const { writeJson } = require('fs-extra');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

// Using callback
writeJson(path, config, (error) => {
  if (error) {
    console.log('An error has occurred');
    return;
  }
  console.log('Data written to file successfully');
});

// Using promise chaining
writeJson(path, config)
  .then(() => {
    console.log('Data written to file successfully');
  })
  .catch((error) => {
    console.log(error);
  });

// Using async/await
async function writeJsonData() {
  try {
    await writeJson(path, config);
    console.log('Data written to file successfully');
  } catch (error) {
    console.log(error);
  }
}

writeJsonData();

Just like the fs module, fs-extra has both asynchronous and synchronous methods. You don’t need to stringify your JavaScript object before writing to a JSON file.

Similarly, you don’t need to parse to a JavaScript object after reading a JSON file. The module does it for you out of the box.
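For example, readJsonSync and writeJsonSync are the synchronous counterparts of the methods shown above; the snippet below is a small sketch of both, with the spaces option controlling indentation:

const { readJsonSync, writeJsonSync } = require('fs-extra');

const path = './config.json';

// Serialization and parsing are still handled for you
writeJsonSync(path, { ip: '192.0.2.1', port: 3000 }, { spaces: 2 });

const config = readJsonSync(path);
console.log(config);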

Using the bfj npm package

bfj is another npm package you can use to handle data in JSON format. According to the documentation, it was created for managing large JSON datasets:

bfj implements asynchronous functions and uses pre-allocated fixed-length arrays to try and alleviate issues associated with parsing and stringifying large JSON or JavaScript datasets. – bfj documentation

To install bfj from the npm package registry, run the following code:

npm install bfj

You can read JSON data using the read method, which is asynchronous and returns a promise.

Assuming you have a config.json file, you can use the following code to read it:

const bfj = require('bfj');

const path = './config.json';

bfj
  .read(path)
  .then((config) => {
    console.log(config);
  })
  .catch((error) => {
    console.log(error);
  });

Similarly, you can use the write method to write data to a JSON file:

const bfj = require('bfj');

const path = './config.json';
const config = { ip: '192.0.2.1', port: 3000 };

bfj
  .write(path, config)
  .then(() => {
    console.log('Data has been successfully written to disk');
  })
  .catch((error) => {
    console.log(error);
  });

bfj was created specifically for handling large JSON datasets. It is slower than the built-in alternatives, so you should only use it if you are handling relatively large JSON datasets.

Reading and writing large JSON files using streams

As explained above, the built-in functions of the synchronous and asynchronous APIs read the entire file into memory. This is inefficient in terms of both time and memory.

You need to wait until the entire file is read into memory before processing. If you are dealing with a large JSON file, you may wait for a long time. Similarly, you may run out of memory while reading large JSON files.

To remedy these issues, you may want to use streams to read and process JSON data. The stream-json package comes in handy when streaming large JSON data. You need to first install it from npm like so:

npm install stream-json

Depending on the shape of your JSON data, you can use one of the built-in functions, like in the example below. This reduces your application’s memory footprint and enables you to process chunks of data immediately after they become available:

const StreamArray = require("stream-json/streamers/StreamArray");
const fs = require("fs");

const pipeline = fs
  .createReadStream("large-file.json")
  .pipe(StreamArray.withParser());

pipeline.on("data", (data) => console.log(data));
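Writing works similarly in reverse: instead of building one huge JSON string with JSON.stringify, you can serialize the data in chunks to a write stream so the complete string never has to exist in memory. Below is a minimal sketch using only a plain write stream; production code should also honor backpressure by checking the return value of write and waiting for the drain event:

const fs = require("fs");

// Stream a large array to disk element by element
const out = fs.createWriteStream("large-output.json");

out.write("[");
for (let i = 0; i < 1_000_000; i++) {
  if (i > 0) out.write(",");
  out.write(JSON.stringify({ id: i }));
}
out.write("]");
out.end();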

Best practices and common pitfalls when reading/writing JSON files

When dealing with file operations, it’s essential to first create a backup of your data to avoid losing or corrupting it. The require method loads the entire JSON file into memory and caches it, so for frequently changing JSON files, it’s advisable to avoid require and instead use functions from the fs module.

Error handling is vital, especially with the synchronous and promise-based APIs in conjunction with async/await, because it prevents application failures. The space parameter of JSON.stringify improves JSON string readability, but it’s best avoided for network transmissions to reduce payload size. Lastly, remember that Node.js’s promise-based fs API isn’t thread-safe, according to its documentation: concurrent operations on the same file might lead to issues, so use this API with caution.
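To illustrate the backup advice above, here is a minimal sketch; safeWriteJson is a hypothetical helper written for this example, not a library function:

const { copyFile, writeFile } = require('fs/promises');

const path = './config.json';

// Hypothetical helper: keep a .bak copy of the old file before overwriting
// (assumes the file already exists)
async function safeWriteJson(data) {
  await copyFile(path, `${path}.bak`);
  await writeFile(path, JSON.stringify(data, null, 2));
}

safeWriteJson({ ip: '192.0.2.1', port: 3000 })
  .then(() => console.log('Saved with backup'))
  .catch((error) => console.log(error));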

Handling circular references

It is not uncommon to encounter the TypeError: Converting circular structure to JSON error when serializing a JavaScript object using the JSON.stringify function. This error occurs when you attempt to stringify a JavaScript object that references itself, as in the example below:

const object = { a: 1 };
object.itself = object;

try {
  JSON.stringify(object);
} catch (e) {
  // TypeError: Converting circular structure to JSON
  console.log(e);
}

There is no straightforward fix for this error. However, you can manually find and replace the circular references with serializable values, as sketched below, or use a third-party library like cycle.js, which was created by Douglas Crockford, the brain behind the JSON format.
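To illustrate the manual approach, below is a small sketch of a replacer function that silently drops repeated references instead of throwing; getCircularReplacer is a name chosen here for illustration:

// Returns a replacer for JSON.stringify that drops repeated references
function getCircularReplacer() {
  const seen = new WeakSet();
  return (key, value) => {
    if (typeof value === 'object' && value !== null) {
      if (seen.has(value)) return undefined; // drop the circular reference
      seen.add(value);
    }
    return value;
  };
}

const object = { a: 1 };
object.itself = object;
console.log(JSON.stringify(object, getCircularReplacer())); // {"a":1}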

A fork of the cycle.js library is maintained on the npm package registry as cycle. You can install it like so:

npm install cycle

Then, you can use it in your application, as shown below:

const cycle = require("cycle");

const originalObj = { a: 1 };
originalObj.itself = originalObj;

const stringifiedObj = JSON.stringify(cycle.decycle(originalObj));
const originalObjCopy = cycle.retrocycle(JSON.parse(stringifiedObj));
console.log(originalObjCopy);

The decycle function of the cycle package highlighted above creates a copy of the object, looks for duplicate references, which might be circular references, and replaces them with objects of the form { "$ref": PATH }.

You can then stringify and parse the resulting object without encountering the TypeError mentioned above. After that, you can store the resulting object on disk or transfer it over the network.

You can use the retrocycle function of the cycle package to get a copy of the original object.

Conclusion

As explained in the above sections, JSON is one of the most popular formats for data exchange over the internet. The Node runtime environment has the built-in fs module you can use to work with files in general. The fs module has methods that you can use to read and write to JSON files using the callback API, promise-based API, or synchronous API.

Because methods of the callback API are more performant than those of the promise-based API, you are better off using the callback API.

In addition to the built-in fs module, several popular third-party packages such as jsonfile, fs-extra, and bfj exist. They have additional utility functions that make working with JSON files a breeze. On the flip side, you should evaluate the limitations of adding third-party packages to your application.
