fs-extra vs pify vs jsonfile vs node-fetch vs load-json-file
File and Network Operations Comparison
What's File and Network Operations?

File and Network Operations libraries in Node.js provide developers with tools to interact with the file system and perform network requests. These libraries simplify tasks such as reading and writing files, handling JSON data, and making HTTP requests, which are essential for building server-side applications, automation scripts, and APIs. They help streamline I/O operations, manage asynchronous tasks, and improve overall productivity in Node.js development.

  • fs-extra extends the built-in fs module with additional file system methods.
  • jsonfile focuses on reading and writing JSON files.
  • load-json-file simplifies loading JSON files with promises.
  • node-fetch provides a lightweight API for making HTTP requests.
  • pify converts callback-based functions into promise-based ones, enhancing async programming.

Stat Detail

Package         Downloads   Stars  Size     Issues  Publish      License
fs-extra        84,619,184  9,518  55.3 kB  16      18 days ago  MIT
pify            57,871,542  1,504  13.6 kB  0       -            MIT
jsonfile        49,694,758  1,206  -        5       4 years ago  MIT
node-fetch      48,302,078  8,824  107 kB   211     2 years ago  MIT
load-json-file  14,340,359  245    -        0       3 years ago  MIT
Feature Comparison: fs-extra vs pify vs jsonfile vs node-fetch vs load-json-file

File System Operations

  • fs-extra:

    fs-extra provides a wide range of file system operations, including reading, writing, copying, moving, and deleting files and directories. It also includes additional features like mkdirs (create directories recursively), copyFile (copy files with options), and remove (delete files and directories recursively).

  • pify:

    pify does not perform file system operations. Instead, it is a utility for converting callback-based functions into promise-based ones, allowing for easier integration with asynchronous code and improving the handling of I/O operations that rely on callbacks.

  • jsonfile:

    jsonfile focuses on reading and writing JSON files. It provides simple methods like jsonfile.readFile and jsonfile.writeFile for handling JSON data, but it does not offer extensive file system operations beyond that.

  • node-fetch:

    node-fetch is primarily focused on making HTTP requests rather than file system operations. It provides a simple API for fetching resources over the network, handling responses, and working with streams, but it does not interact with the file system directly.

  • load-json-file:

load-json-file specializes in loading JSON files. It exports a single function that returns a promise resolving with the parsed JSON data, making it easy to work with JSON files asynchronously. However, it does not provide methods for writing JSON data or performing other file system operations.

JSON Handling

  • fs-extra:

fs-extra includes dedicated JSON helpers such as readJson, writeJson, and outputJson (backed by jsonfile under the hood), so it can read and write JSON files directly. Beyond these convenience methods, however, it does not offer advanced JSON serialization or transformation features.

  • pify:

    pify does not handle JSON data directly. It is a utility for converting functions that use callbacks into promise-based functions, which can be used with JSON handling code but does not provide any JSON-specific functionality.

  • jsonfile:

    jsonfile excels at JSON handling, offering straightforward methods for reading and writing JSON files while preserving formatting and handling errors gracefully. It is designed specifically for working with JSON data, making it a reliable choice for JSON-related tasks.

  • node-fetch:

    node-fetch does not specialize in JSON handling, but it provides methods for working with JSON data in HTTP responses. You can easily parse JSON from a response using the response.json() method, making it compatible with JSON APIs and data exchange.

  • load-json-file:

    load-json-file is designed for loading JSON files efficiently. It handles parsing JSON data and returns a promise, making it easy to work with JSON files in an asynchronous manner. However, it does not provide writing capabilities or advanced JSON handling features.

Asynchronous Support

  • fs-extra:

    fs-extra supports asynchronous operations for all file system tasks, providing both callback and promise-based APIs. This makes it versatile and suitable for modern JavaScript applications that use async/await or traditional callback patterns.

  • pify:

    pify converts callback-based functions into promise-based ones, enhancing asynchronous support for any function that uses callbacks. This makes it a valuable tool for modernizing code and improving the handling of asynchronous operations.

  • jsonfile:

    jsonfile supports asynchronous reading and writing of JSON files. Its methods return promises, allowing for seamless integration with async/await syntax, making it easy to handle JSON data without blocking the event loop.

  • node-fetch:

    node-fetch is designed for asynchronous network requests. It uses promises to handle responses, making it compatible with async/await syntax. This is particularly useful for fetching data from APIs and handling responses in a non-blocking manner.

  • load-json-file:

    load-json-file is built around asynchronous file loading. Its main method returns a promise, making it ideal for modern applications that utilize async/await for handling I/O operations, particularly when working with JSON files.
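
The callback-to-promise conversion that pify performs can be sketched in a few lines (the real pify adds options such as multiArgs and can wrap whole modules; delayedDouble below is a hypothetical callback-style function used only for illustration):

```javascript
// A minimal sketch of pify's core idea: wrap a callback-last, error-first
// function so it returns a promise instead.
function promisify(fn) {
  return (...args) =>
    new Promise((resolve, reject) => {
      fn(...args, (err, result) => (err ? reject(err) : resolve(result)));
    });
}

// Usage with a hypothetical callback-style function:
const delayedDouble = (n, cb) => setTimeout(() => cb(null, n * 2), 10);
const delayedDoubleAsync = promisify(delayedDouble);

delayedDoubleAsync(21).then(result => console.log(result)); // 42
```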

HTTP Requests

  • fs-extra:

    fs-extra does not handle HTTP requests. It is focused on file system operations and does not provide any functionality for network communication or fetching resources from the web.

  • pify:

    pify does not handle HTTP requests. It is a utility for converting callback-based functions into promise-based ones, which can be used in conjunction with HTTP request functions but does not provide any networking capabilities.

  • jsonfile:

    jsonfile does not handle HTTP requests. It is solely focused on reading and writing JSON files and does not provide any functionality for network communication or interacting with APIs.

  • node-fetch:

    node-fetch is specifically designed for making HTTP requests. It provides a simple and flexible API for fetching resources, handling responses, and working with streams. It is ideal for interacting with APIs and retrieving data over the network.

  • load-json-file:

    load-json-file does not handle HTTP requests. It is designed for loading JSON files from the file system and does not provide any functionality for network communication or fetching data from remote sources.
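
node-fetch follows the WHATWG Fetch shape, so consuming a JSON response looks as sketched below. To keep the example runnable without a network call, it constructs a Response object directly (the global Response class requires Node 18+; with node-fetch v2 you would use its own Response export, and the payload here is invented for illustration):

```javascript
// Sketch: Response.json() reads the body text and JSON-parses it.
// A Response is constructed directly here so no network request is needed.
async function demo() {
  const response = new Response('{"status": "ok", "count": 3}');
  const data = await response.json();
  return data;
}

demo().then(data => console.log(data.status, data.count)); // ok 3
```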

Ease of Use: Code Examples

  • fs-extra:

    File System Operations with fs-extra

    const fs = require('fs-extra');
    
    // Copy a file
    fs.copy('./source.txt', './destination.txt')
      .then(() => console.log('File copied successfully!'))
      .catch(err => console.error(err));
    
    // Create a directory recursively
    fs.mkdirs('./path/to/directory')
      .then(() => console.log('Directory created!'))
      .catch(err => console.error(err));
    
    // Remove a file or directory
    fs.remove('./path/to/file-or-directory')
      .then(() => console.log('File or directory removed!'))
      .catch(err => console.error(err));
    
  • pify:

    Using pify to Convert Callback Functions to Promises

    const pify = require('pify');
    const fs = require('fs');
    const readFile = pify(fs.readFile);
    
    // Read a file using the promise-based readFile
    readFile('file.txt', 'utf8')
      .then(data => console.log('File content:', data))
      .catch(err => console.error(err));
    
  • jsonfile:

    JSON Handling with jsonfile

    const jsonfile = require('jsonfile');
    const file = 'data.json';
    
    // Write JSON data to a file
    const data = { name: 'Alice', age: 30 };
    jsonfile.writeFile(file, data)
      .then(() => console.log('JSON data written to file!'))
      .catch(err => console.error(err));
    
    // Read JSON data from a file
    jsonfile.readFile(file)
      .then(data => console.log('JSON data:', data))
      .catch(err => console.error(err));
    
  • node-fetch:

    Making HTTP Requests with node-fetch

    const fetch = require('node-fetch');
    
    // Fetch data from an API
    fetch('https://api.example.com/data')
      .then(response => response.json())
      .then(data => console.log('Fetched data:', data))
      .catch(err => console.error(err));
    
  • load-json-file:

    Loading JSON with load-json-file

    const loadJson = require('load-json-file');
    
    // Load JSON data from a file
    loadJson('data.json')
      .then(data => console.log('Loaded JSON data:', data))
      .catch(err => console.error(err));
    
How to Choose: fs-extra vs pify vs jsonfile vs node-fetch vs load-json-file
  • fs-extra:

    Choose fs-extra if you need a comprehensive solution for file system operations with additional features like recursive directory creation, file copying, and more. It is ideal for projects that require advanced file manipulation capabilities beyond the standard fs module.

  • pify:

    Choose pify if you want to convert callback-based functions into promise-based ones, making it easier to work with asynchronous code. It is a great utility for projects that want to leverage promises without rewriting existing callback functions.

  • jsonfile:

    Choose jsonfile if your primary focus is on reading and writing JSON files with a simple API. It is lightweight and easy to use, making it suitable for projects that need to handle JSON data without additional overhead.

  • node-fetch:

    Choose node-fetch if you need a lightweight and flexible solution for making HTTP requests in Node.js. It is especially useful for projects that require a Fetch API-like interface for server-side applications and APIs.

  • load-json-file:

    Choose load-json-file if you want a minimalistic and promise-based approach to loading JSON files. It is perfect for modern JavaScript applications that prioritize simplicity and async/await syntax when working with JSON data.

README for fs-extra

Node.js: fs-extra

fs-extra adds file system methods that aren't included in the native fs module and adds promise support to the fs methods. It also uses graceful-fs to prevent EMFILE errors. It should be a drop-in replacement for fs.


Why?

I got tired of including mkdirp, rimraf, and ncp in most of my projects.

Installation

npm install fs-extra

Usage

CommonJS

fs-extra is a drop-in replacement for native fs. All methods in fs are attached to fs-extra. All fs methods return promises if the callback isn't passed.

You don't ever need to include the original fs module again:

const fs = require('fs') // this is no longer necessary

you can now do this:

const fs = require('fs-extra')

or if you prefer to make it clear that you're using fs-extra and not fs, you may want to name your fs variable fse like so:

const fse = require('fs-extra')

you can also keep both, but it's redundant:

const fs = require('fs')
const fse = require('fs-extra')

ESM

There is also an fs-extra/esm import, that supports both default and named exports. However, note that fs methods are not included in fs-extra/esm; you still need to import fs and/or fs/promises separately:

import { readFileSync } from 'fs'
import { readFile } from 'fs/promises'
import { outputFile, outputFileSync } from 'fs-extra/esm'

Default exports are supported:

import fs from 'fs'
import fse from 'fs-extra/esm'
// fse.readFileSync is not a function; must use fs.readFileSync

but you probably want to just use regular fs-extra instead of fs-extra/esm for default exports:

import fs from 'fs-extra'
// both fs and fs-extra methods are defined

Sync vs Async vs Async/Await

Most methods are async by default. All async methods return a promise if the callback isn't passed.

Sync methods, on the other hand, throw if an error occurs.

Code using async/await will also throw if an error occurs.

Example:

const fs = require('fs-extra')

// Async with promises:
fs.copy('/tmp/myfile', '/tmp/mynewfile')
  .then(() => console.log('success!'))
  .catch(err => console.error(err))

// Async with callbacks:
fs.copy('/tmp/myfile', '/tmp/mynewfile', err => {
  if (err) return console.error(err)
  console.log('success!')
})

// Sync:
try {
  fs.copySync('/tmp/myfile', '/tmp/mynewfile')
  console.log('success!')
} catch (err) {
  console.error(err)
}

// Async/Await:
async function copyFiles () {
  try {
    await fs.copy('/tmp/myfile', '/tmp/mynewfile')
    console.log('success!')
  } catch (err) {
    console.error(err)
  }
}

copyFiles()

Methods

Async

Sync

NOTE: You can still use the native Node.js methods. They are promisified and copied over to fs-extra. See notes on fs.read(), fs.write(), & fs.writev()

What happened to walk() and walkSync()?

They were removed from fs-extra in v2.0.0. If you need the functionality, walk and walkSync are available as separate packages, klaw and klaw-sync.

Third Party

CLI

fse-cli allows you to run fs-extra from a console or from npm scripts.

TypeScript

If you like TypeScript, you can use fs-extra with it: https://github.com/DefinitelyTyped/DefinitelyTyped/tree/master/types/fs-extra

File / Directory Watching

If you want to watch for changes to files or directories, then you should use chokidar.

Obtain Filesystem (Devices, Partitions) Information

fs-filesystem allows you to read the state of the filesystem of the host on which it is run. It returns information about both the devices and the partitions (volumes) of the system.

Misc.

Hacking on fs-extra

Wanna hack on fs-extra? Great! Your help is needed! fs-extra is one of the most depended upon Node.js packages. This project uses JavaScript Standard Style - if the name or style choices bother you, you're gonna have to get over it :) If standard is good enough for npm, it's good enough for fs-extra.


What's needed?

  • First, take a look at existing issues. Those are probably going to be where the priority lies.
  • More tests for edge cases. Specifically on different platforms. There can never be enough tests.
  • Improve test coverage.

Note: If you make any big changes, you should definitely file an issue for discussion first.

Running the Test Suite

fs-extra contains hundreds of tests.

  • npm run lint: runs the linter (standard)
  • npm run unit: runs the unit tests
  • npm run unit-esm: runs tests for fs-extra/esm exports
  • npm test: runs the linter and all tests

When running unit tests, set the environment variable CROSS_DEVICE_PATH to the absolute path of an empty directory on another device (like a thumb drive) to enable cross-device move tests.
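
Concretely, that looks something like the following (the mount point is hypothetical; any empty directory on a different device works):

```shell
# Point CROSS_DEVICE_PATH at an empty directory on another device,
# then run the unit tests to enable the cross-device move tests.
CROSS_DEVICE_PATH=/mnt/usb-drive npm run unit
```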

Windows

If you run the tests on Windows and receive a lot of symbolic link EPERM permission errors, it's because on Windows you need elevated privileges to create symbolic links. You can grant this to your Windows account by following the instructions here: http://superuser.com/questions/104845/permission-to-make-symbolic-links-in-windows-7 However, I didn't have much luck doing this.

Since I develop on Mac OS X, I use VMWare Fusion for Windows testing. I create a shared folder that I map to a drive on Windows. I open the Node.js command prompt and run as Administrator. I then map the network drive by running the following command:

net use z: "\\vmware-host\Shared Folders"

I can then navigate to my fs-extra directory and run the tests.

Naming

I put a lot of thought into the naming of these functions, inspired by @coolaj86's request, so he deserves much of the credit for raising the issue. See discussion(s) here:

  • https://github.com/jprichardson/node-fs-extra/issues/2
  • https://github.com/flatiron/utile/issues/11
  • https://github.com/ryanmcgrath/wrench-js/issues/29
  • https://github.com/substack/node-mkdirp/issues/17

First, I believe that in as many cases as possible, the Node.js naming schemes should be chosen. However, there are problems with Node.js's own naming schemes.

For example, fs.readFile() and fs.readdir(): the F is capitalized in File and the d is not capitalized in dir. Perhaps a bit pedantic, but they should still be consistent. Also, Node.js has chosen a lot of POSIX naming schemes, which I believe is great. See: fs.mkdir(), fs.rmdir(), fs.chown(), etc.

We have a dilemma though. How do you consistently name methods that perform the following POSIX commands: cp, cp -r, mkdir -p, and rm -rf?

My perspective: when in doubt, err on the side of simplicity. A directory is just a hierarchical grouping of directories and files. Consider that for a moment. So when you want to copy it or remove it, in most cases you'll want to copy or remove all of its contents. When you want to create a directory, if the directory that it's supposed to be contained in does not exist, then in most cases you'll want to create that too.

So, if you want to remove a file or a directory regardless of whether it has contents, just call fs.remove(path). If you want to copy a file or a directory whether it has contents, just call fs.copy(source, destination). If you want to create a directory regardless of whether its parent directories exist, just call fs.mkdirs(path) or fs.mkdirp(path).

Credit

fs-extra wouldn't be possible without using the modules from the following authors:

License

Licensed under MIT

Copyright (c) 2011-2024 JP Richardson