These libraries facilitate communication between web applications and servers, each serving different protocols and use cases. They provide various methods for data transfer, including HTTP, gRPC, and WebSockets, allowing developers to choose the most appropriate technology for their application's needs. Understanding the differences in functionality, performance, and compatibility is crucial for selecting the right library for specific scenarios in web development.
Stat Detail
| Package | Weekly Downloads | Stars | Size | Issues | Last Publish | License |
| ------- | ---------------- | ----- | ---- | ------ | ------------ | ------- |
| axios | 59,317,329 | 106,405 | 2.14 MB | 672 | 3 days ago | MIT |
| protobufjs | 22,086,741 | 10,091 | 2.77 MB | 681 | 6 months ago | BSD-3-Clause |
| @grpc/grpc-js | 13,737,971 | 4,591 | 1.94 MB | 211 | 25 days ago | Apache-2.0 |
| socket.io | 6,765,214 | 61,691 | 1.41 MB | 184 | 4 months ago | MIT |
| grpc | 127,888 | 4,591 | - | 211 | 4 years ago | Apache-2.0 |
| grpc-web | 102,598 | 8,791 | 45.2 kB | 207 | a year ago | Apache-2.0 |
Feature Comparison: axios vs protobufjs vs @grpc/grpc-js vs socket.io vs grpc vs grpc-web
Communication Protocol
axios:
Axios uses the HTTP/1.1 protocol for making requests, which is the standard for RESTful APIs. It is designed for simplicity and ease of use when dealing with traditional web services.
protobufjs:
Protobuf.js is not a communication protocol but a library for encoding and decoding Protocol Buffers, which can be used with various protocols, including gRPC and HTTP.
@grpc/grpc-js:
@grpc/grpc-js uses the gRPC protocol, which is based on HTTP/2, allowing for multiplexed streams and efficient binary serialization. This makes it suitable for high-performance applications that require low latency and high throughput.
socket.io:
Socket.IO implements a WebSocket protocol with fallbacks to HTTP long polling, enabling real-time communication. It is designed for scenarios where low latency and immediate data transfer are critical.
grpc:
gRPC also utilizes HTTP/2, providing advantages like multiplexing and server push, making it ideal for microservices that need to communicate efficiently over a network.
grpc-web:
gRPC-Web is a thin layer that translates gRPC calls into HTTP/1.1 requests, allowing web clients to communicate with gRPC servers while conforming to browser limitations. It is designed to work seamlessly with existing gRPC services.
Data Serialization
axios:
Axios handles JSON data natively, automatically transforming request and response data to and from JSON format, making it easy to work with REST APIs that primarily use JSON.
protobufjs:
Protobuf.js provides tools for defining and working with Protocol Buffers, allowing developers to serialize and deserialize data efficiently, making it suitable for various applications beyond gRPC.
@grpc/grpc-js:
@grpc/grpc-js uses Protocol Buffers for data serialization, which is efficient and compact, reducing the amount of data transmitted over the network. This is particularly beneficial for performance-sensitive applications.
socket.io:
Socket.IO uses JSON for data transmission, allowing for easy integration with JavaScript applications. It supports sending and receiving various data types, including binary data.
grpc:
gRPC also uses Protocol Buffers for serialization, ensuring that data is transmitted in a compact binary format, which is faster and more efficient than JSON, especially for complex data structures.
grpc-web:
gRPC-Web uses Protocol Buffers for serialization, allowing web clients to send and receive data in a compact format, similar to gRPC, while ensuring compatibility with web standards.
Ease of Use
axios:
Axios is user-friendly and straightforward, making it easy for developers to perform HTTP requests with minimal setup. Its promise-based API aligns well with modern JavaScript practices, enhancing usability.
protobufjs:
Protobuf.js is relatively easy to use for those familiar with Protocol Buffers, providing a straightforward API for serialization and deserialization, but it requires knowledge of the Protocol Buffers syntax.
@grpc/grpc-js:
@grpc/grpc-js requires understanding of gRPC concepts and Protocol Buffers, which can add complexity, but it offers powerful features for those familiar with these technologies.
socket.io:
Socket.IO is designed to be easy to implement, providing a simple API for real-time communication. It abstracts many complexities of WebSockets, making it accessible for developers.
grpc:
gRPC has a steeper learning curve due to its reliance on Protocol Buffers and the need to define service contracts, which may be challenging for beginners but offers strong typing and validation.
grpc-web:
gRPC-Web simplifies the use of gRPC in web applications, but developers still need to understand gRPC concepts. It abstracts some complexities while ensuring compatibility with web standards.
Performance
axios:
Axios is performant for standard HTTP requests but may not match the efficiency of gRPC or WebSockets in real-time scenarios. It is best for applications where traditional RESTful communication suffices.
protobufjs:
Protobuf.js offers high performance in data serialization, allowing for quick encoding and decoding of Protocol Buffers, which is beneficial in data-intensive applications.
@grpc/grpc-js:
@grpc/grpc-js is optimized for performance, leveraging HTTP/2 features and Protocol Buffers to minimize latency and maximize throughput, making it suitable for high-performance applications.
socket.io:
Socket.IO provides good performance for real-time applications, but its reliance on fallbacks may introduce some latency compared to pure WebSocket implementations.
grpc:
gRPC excels in performance, especially in microservices architectures, due to its efficient binary serialization and support for streaming, making it ideal for high-load environments.
grpc-web:
gRPC-Web provides a performance bridge for web applications, allowing efficient communication with gRPC services, but it may not achieve the same performance as direct gRPC calls due to HTTP/1.1 limitations.
Use Cases
axios:
Axios is best suited for standard web applications that consume REST APIs, offering a straightforward way to handle HTTP requests and responses.
protobufjs:
Protobuf.js is useful for applications that require efficient data serialization, regardless of the communication protocol, making it versatile for various use cases.
@grpc/grpc-js:
@grpc/grpc-js is ideal for backend services that require efficient communication, such as microservices, where performance and strict contracts are crucial.
socket.io:
Socket.IO is ideal for applications that require real-time communication, such as chat applications, online gaming, or live notifications, where immediate data transfer is essential.
grpc:
gRPC is perfect for microservices that need to communicate efficiently across different languages and platforms, especially when strict contracts are necessary.
grpc-web:
gRPC-Web is designed for web applications that need to interact with gRPC services, allowing developers to leverage gRPC's benefits in a browser environment.
How to Choose: axios vs protobufjs vs @grpc/grpc-js vs socket.io vs grpc vs grpc-web
axios:
Choose Axios for standard HTTP requests in a promise-based manner. It is ideal for RESTful APIs and offers a simple API for handling requests and responses, along with built-in support for interceptors and automatic JSON data transformation.
protobufjs:
Choose Protobuf.js if you need a standalone library for encoding and decoding Protocol Buffers, especially when you want to work with Protocol Buffers without relying on gRPC. It is useful for data serialization in various applications.
@grpc/grpc-js:
Choose @grpc/grpc-js if you need a pure JavaScript implementation of gRPC for Node.js applications, especially when you require high performance and efficient binary serialization with Protocol Buffers.
socket.io:
Choose Socket.IO if you need real-time, bidirectional communication between clients and servers. It is perfect for applications that require instant updates, such as chat applications or live notifications.
grpc:
Choose gRPC if you need a high-performance RPC framework that supports multiple languages and is suitable for microservices architecture. It is best for applications that require efficient communication between services with strict contract definitions using Protocol Buffers.
grpc-web:
Choose gRPC-Web if you want to use gRPC in web applications that run in the browser. It allows you to communicate with gRPC services from the client-side while adhering to the limitations of web browsers, such as CORS and HTTP/1.1.
Similar Npm Packages to axios
axios is a popular promise-based HTTP client for both the browser and Node.js. It simplifies making HTTP requests and handling responses, providing a clean and intuitive API. Axios supports features such as request and response interception, automatic JSON data transformation, and the ability to cancel requests. It is widely used in web applications for interacting with RESTful APIs and is known for its ease of use and flexibility. However, there are several alternatives to axios that developers may consider based on their specific needs:
node-fetch is a lightweight module that brings window.fetch to Node.js. It is a simple and minimalistic implementation of the Fetch API, allowing developers to make HTTP requests using a familiar API. Node-fetch is particularly useful for server-side applications or when working with APIs in a Node.js environment. It supports promises and is a great choice for those who prefer the Fetch API's syntax and behavior over traditional XMLHttpRequest or other libraries.
request was once one of the most popular HTTP request libraries for Node.js. It provided a simple and flexible API for making HTTP requests and handling responses. However, it has been deprecated and is no longer actively maintained. While it may still be found in legacy projects, developers are encouraged to use more modern alternatives like axios or node-fetch for new applications.
superagent is another powerful HTTP request library for Node.js and browsers. It offers a flexible and expressive API for making HTTP requests, supporting features like chaining, automatic content type handling, and file uploads. Superagent is particularly useful for developers who need a more feature-rich alternative to axios or want to work with a library that provides a more expressive syntax.
protobufjs is a powerful library for working with Protocol Buffers in JavaScript. Protocol Buffers, developed by Google, are a language-agnostic way of serializing structured data, making it easier to communicate between services or store data efficiently. protobufjs allows developers to encode and decode Protocol Buffers messages, providing a straightforward API for defining message types and handling serialization and deserialization. This library is particularly useful for applications that require efficient data interchange, such as microservices or mobile applications.
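As an illustration, here is a minimal sketch of encoding and decoding with protobufjs (the User message is hypothetical):
```js
const protobuf = require("protobufjs");

// Parse an inline .proto definition (a hypothetical User message).
const { root } = protobuf.parse(`
  syntax = "proto3";
  message User {
    string name = 1;
    int32 id = 2;
  }
`);
const User = root.lookupType("User");

// Validate and encode a plain object into a compact binary buffer...
const payload = { name: "Ada", id: 42 };
const invalid = User.verify(payload);
if (invalid) throw Error(invalid);
const buffer = User.encode(User.create(payload)).finish();

// ...then decode it back into a plain object.
const decoded = User.toObject(User.decode(buffer));
console.log(decoded); // { name: 'Ada', id: 42 }
```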
While protobufjs is a popular choice, there are alternatives that developers might consider:
google-protobuf is the official JavaScript implementation of Protocol Buffers provided by Google. It offers a comprehensive set of features for working with Protocol Buffers, including support for both JavaScript and TypeScript. google-protobuf is well-suited for projects that require the latest features and updates directly from Google, making it a reliable choice for developers who want to leverage the full capabilities of Protocol Buffers in their applications.
ts-proto is a TypeScript-first implementation of Protocol Buffers that generates TypeScript code from .proto files. It aims to provide a more type-safe experience when working with Protocol Buffers in TypeScript projects. ts-proto is ideal for developers who want to take full advantage of TypeScript's type system while working with Protocol Buffers, ensuring better type safety and developer experience.
socket.io is a popular library for real-time web applications, enabling bidirectional communication between clients and servers. It abstracts the complexities of WebSocket connections and provides a simple API for building real-time features like chat applications, notifications, and live updates. Socket.io automatically falls back to other communication methods when WebSockets are not supported, making it a robust choice for various environments.
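As a rough sketch (the port, CORS setting, and event name are illustrative), a minimal chat relay looks like this:
```js
// Server (Node.js)
const { Server } = require("socket.io");
const io = new Server(3000, { cors: { origin: "*" } });

io.on("connection", (socket) => {
  // Relay each incoming chat message to every connected client.
  socket.on("chat message", (msg) => {
    io.emit("chat message", msg);
  });
});

// Client (using the socket.io-client package)
// const { io } = require("socket.io-client");
// const socket = io("http://localhost:3000");
// socket.emit("chat message", "hello");
// socket.on("chat message", (msg) => console.log(msg));
```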
While socket.io is widely used, there are alternative libraries that also facilitate real-time communication. Here are a couple of notable alternatives:
uws (µWebSockets) is a highly efficient WebSocket library designed for performance and scalability. It is known for its low latency and high throughput, making it an excellent choice for applications that require handling a large number of concurrent connections. uws is particularly suitable for scenarios where performance is critical, such as gaming or real-time data streaming. However, it is worth noting that uws is more low-level compared to socket.io, meaning developers may need to implement additional features like reconnection logic and event handling manually.
ws is another lightweight WebSocket library for Node.js that provides a simple and straightforward API for establishing WebSocket connections. It is designed to be easy to use and integrates well with existing Node.js applications. While ws does not offer the same level of abstraction and additional features as socket.io, it is a solid choice for developers looking for a minimalistic solution to implement WebSocket communication without the overhead of a more complex library.
grpc is a high-performance, open-source universal RPC (Remote Procedure Call) framework that can run in any environment. It allows developers to define services and message types using Protocol Buffers (protobuf), enabling efficient communication between clients and servers. While gRPC is a powerful choice for building distributed systems, there are several alternatives that cater to different use cases and preferences. Here are a few notable alternatives:
@grpc/grpc-js is the official JavaScript implementation of gRPC. It is designed to work seamlessly with Node.js and provides a pure JavaScript alternative to the native gRPC library. This package is particularly useful for developers who want to leverage gRPC in their Node.js applications without relying on native code, making it easier to install and use across different platforms.
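As a sketch only (the proto file, helloworld package, Greeter service, and address are assumptions), a unary call typically pairs @grpc/grpc-js with @grpc/proto-loader:
```js
const grpc = require("@grpc/grpc-js");
const protoLoader = require("@grpc/proto-loader");

// Load a hypothetical greeter.proto defining package helloworld and service Greeter.
const packageDefinition = protoLoader.loadSync("greeter.proto");
const proto = grpc.loadPackageDefinition(packageDefinition);

const client = new proto.helloworld.Greeter(
  "localhost:50051",
  grpc.credentials.createInsecure()
);

// Unary RPC: one request message in, one response message out.
client.sayHello({ name: "Ada" }, (err, response) => {
  if (err) return console.error(err);
  console.log(response.message);
});
```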
axios is a popular promise-based HTTP client for the browser and Node.js. While not a direct alternative to gRPC, axios is widely used for making HTTP requests in web applications. It provides a simple API for handling requests and responses, making it a great choice for RESTful APIs. If your application primarily communicates over HTTP rather than using RPC, axios is a solid option.
grpc-web is a JavaScript client library that allows gRPC to be used in web applications. It enables web clients to communicate with gRPC servers by translating gRPC calls into HTTP/1.1 requests. This is particularly useful for developers looking to integrate gRPC into their web applications while maintaining compatibility with existing web technologies.
protobufjs is a library for working with Protocol Buffers in JavaScript. It allows developers to encode and decode messages defined in .proto files, making it easier to work with gRPC services. While protobufjs is not a direct alternative to gRPC itself, it is an essential tool for developers who want to work with Protocol Buffers in their applications.
socket.io is a library for real-time communication between clients and servers. It enables bi-directional communication over WebSockets and falls under a different paradigm than gRPC. While gRPC is focused on RPC-style communication, socket.io is ideal for applications that require real-time updates, such as chat applications or live notifications.
grpc-web is a JavaScript library that allows web applications to communicate with gRPC services. It provides a way to make gRPC calls from the browser, enabling developers to build rich, interactive web applications that leverage gRPC for efficient communication with backend services. By using grpc-web, developers can take advantage of gRPC features such as Protocol Buffers and server-side streaming, while still adhering to the constraints of web browsers.
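A client-side sketch (the generated stub filenames, Greeter service, and proxy address are assumptions; the stubs come from protoc with the grpc-web plugin, and a gRPC-Web proxy such as Envoy sits in front of the gRPC server):
```js
const { GreeterClient } = require("./greeter_grpc_web_pb");
const { HelloRequest } = require("./greeter_pb");

// The browser talks to the gRPC-Web proxy rather than the gRPC server directly.
const client = new GreeterClient("http://localhost:8080");

const request = new HelloRequest();
request.setName("Ada");

// Unary call: request message, metadata object, Node-style callback.
client.sayHello(request, {}, (err, response) => {
  if (err) return console.error(err.code, err.message);
  console.log(response.getMessage());
});
```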
While grpc-web is a powerful tool for integrating gRPC with web applications, there are alternatives worth considering:
@grpc/grpc-js is a pure JavaScript implementation of gRPC that runs in Node.js. It is designed to be a more modern and flexible alternative to the original native grpc library. Unlike grpc-web, which is specifically tailored for browser applications, @grpc/grpc-js is aimed at server-side code and other Node.js environments. This library allows developers to use gRPC features without the need for a separate proxy, making it a suitable choice for projects that require a more comprehensive gRPC solution.
@improbable-eng/grpc-web is another library that enables gRPC communication in web applications. It provides a more flexible approach to gRPC-web integration and is designed to work seamlessly with existing gRPC services. This library focuses on performance and ease of use, allowing developers to quickly set up gRPC calls in their web applications. If you are looking for an alternative that emphasizes performance and simplicity, @improbable-eng/grpc-web may be the right choice for your project.
For some bundlers and some ES6 linters you may need to do the following:
import { default as axios } from 'axios';
If something went wrong when importing the module into a custom or legacy environment,
you can try importing the module package directly:
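For example, using the prebuilt CommonJS bundles (the paths below assume the standard axios dist layout):
```js
// Browser build
const axios = require('axios/dist/browser/axios.cjs');
// Node.js build
// const axios = require('axios/dist/node/axios.cjs');
```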
Note: CommonJS usage
In order to gain the TypeScript typings (for intellisense / autocomplete) while using CommonJS imports with require(), use the following approach:
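That is, require the default export:
```js
const axios = require('axios').default;
// axios.<method> will now provide autocomplete and parameter typings
```
Example: performing a GET request.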
import axios from 'axios';
//const axios = require('axios'); // legacy way
// Make a request for a user with a given ID
axios.get('/user?ID=12345')
.then(function (response) {
// handle success
console.log(response);
})
.catch(function (error) {
// handle error
console.log(error);
})
.finally(function () {
// always executed
});
// Optionally the request above could also be done as
axios.get('/user', {
params: {
ID: 12345
}
})
.then(function (response) {
console.log(response);
})
.catch(function (error) {
console.log(error);
})
.finally(function () {
// always executed
});
// Want to use async/await? Add the `async` keyword to your outer function/method.
async function getUser() {
try {
const response = await axios.get('/user?ID=12345');
console.log(response);
} catch (error) {
console.error(error);
}
}
Note: async/await is part of ECMAScript 2017 and is not supported in Internet
Explorer and older browsers, so use with caution.
The available instance methods are listed below. The specified config will be merged with the instance config.
axios#request(config)
axios#get(url[, config])
axios#delete(url[, config])
axios#head(url[, config])
axios#options(url[, config])
axios#post(url[, data[, config]])
axios#put(url[, data[, config]])
axios#patch(url[, data[, config]])
axios#getUri([config])
Request Config
These are the available config options for making requests. Only the url is required. Requests will default to GET if method is not specified.
{
// `url` is the server URL that will be used for the request
url: '/user',
// `method` is the request method to be used when making the request
method: 'get', // default
// `baseURL` will be prepended to `url` unless `url` is absolute and option `allowAbsoluteUrls` is set to true.
// It can be convenient to set `baseURL` for an instance of axios to pass relative URLs
// to methods of that instance.
baseURL: 'https://some-domain.com/api/',
// `allowAbsoluteUrls` determines whether or not absolute URLs will override a configured `baseURL`.
// When set to true (default), absolute values for `url` will override `baseURL`.
// When set to false, `baseURL` will always be prepended to `url`, even when `url` is absolute.
allowAbsoluteUrls: true,
// `transformRequest` allows changes to the request data before it is sent to the server
// This is only applicable for request methods 'PUT', 'POST', 'PATCH' and 'DELETE'
// The last function in the array must return a string or an instance of Buffer, ArrayBuffer,
// FormData or Stream
// You may modify the headers object.
transformRequest: [function (data, headers) {
// Do whatever you want to transform the data
return data;
}],
// `transformResponse` allows changes to the response data to be made before
// it is passed to then/catch
transformResponse: [function (data) {
// Do whatever you want to transform the data
return data;
}],
// `headers` are custom headers to be sent
headers: {'X-Requested-With': 'XMLHttpRequest'},
// `params` are the URL parameters to be sent with the request
// Must be a plain object or a URLSearchParams object
params: {
ID: 12345
},
// `paramsSerializer` is an optional config that allows you to customize serializing `params`.
paramsSerializer: {
// Custom encoder function which sends key/value pairs in an iterative fashion.
encode?: (param: string): string => { /* Do custom operations here and return transformed string */ },
// Custom serializer function for the entire parameter. Allows user to mimic pre 1.x behaviour.
serialize?: (params: Record<string, any>, options?: ParamsSerializerOptions ),
// Configuration for formatting array indexes in the params.
indexes: false // Three available options: (1) indexes: null (leads to no brackets), (2) (default) indexes: false (leads to empty brackets), (3) indexes: true (leads to brackets with indexes).
},
// `data` is the data to be sent as the request body
// Only applicable for request methods 'PUT', 'POST', 'DELETE', and 'PATCH'
// When no `transformRequest` is set, must be of one of the following types:
// - string, plain object, ArrayBuffer, ArrayBufferView, URLSearchParams
// - Browser only: FormData, File, Blob
// - Node only: Stream, Buffer, FormData (form-data package)
data: {
firstName: 'Fred'
},
// syntax alternative to send data into the body
// method post
// only the value is sent, not the key
data: 'Country=Brasil&City=Belo Horizonte',
// `timeout` specifies the number of milliseconds before the request times out.
// If the request takes longer than `timeout`, the request will be aborted.
timeout: 1000, // default is `0` (no timeout)
// `withCredentials` indicates whether or not cross-site Access-Control requests
// should be made using credentials
withCredentials: false, // default
// `adapter` allows custom handling of requests which makes testing easier.
// Return a promise and supply a valid response (see lib/adapters/README.md)
adapter: function (config) {
/* ... */
},
// Also, you can set the name of the built-in adapter, or provide an array with their names
// to choose the first available in the environment
adapter: 'xhr', // 'fetch' | 'http' | ['xhr', 'http', 'fetch']
// `auth` indicates that HTTP Basic auth should be used, and supplies credentials.
// This will set an `Authorization` header, overwriting any existing
// `Authorization` custom headers you have set using `headers`.
// Please note that only HTTP Basic auth is configurable through this parameter.
// For Bearer tokens and such, use `Authorization` custom headers instead.
auth: {
username: 'janedoe',
password: 's00pers3cret'
},
// `responseType` indicates the type of data that the server will respond with
// options are: 'arraybuffer', 'document', 'json', 'text', 'stream'
// browser only: 'blob'
responseType: 'json', // default
// `responseEncoding` indicates encoding to use for decoding responses (Node.js only)
// Note: Ignored for `responseType` of 'stream' or client-side requests
// options are: 'ascii', 'ASCII', 'ansi', 'ANSI', 'binary', 'BINARY', 'base64', 'BASE64', 'base64url',
// 'BASE64URL', 'hex', 'HEX', 'latin1', 'LATIN1', 'ucs-2', 'UCS-2', 'ucs2', 'UCS2', 'utf-8', 'UTF-8',
// 'utf8', 'UTF8', 'utf16le', 'UTF16LE'
responseEncoding: 'utf8', // default
// `xsrfCookieName` is the name of the cookie to use as a value for xsrf token
xsrfCookieName: 'XSRF-TOKEN', // default
// `xsrfHeaderName` is the name of the http header that carries the xsrf token value
xsrfHeaderName: 'X-XSRF-TOKEN', // default
// `undefined` (default) - set XSRF header only for the same origin requests
withXSRFToken: boolean | undefined | ((config: InternalAxiosRequestConfig) => boolean | undefined),
// `onUploadProgress` allows handling of progress events for uploads
// browser & node.js
onUploadProgress: function ({loaded, total, progress, bytes, estimated, rate, upload = true}) {
// Do whatever you want with the Axios progress event
},
// `onDownloadProgress` allows handling of progress events for downloads
// browser & node.js
onDownloadProgress: function ({loaded, total, progress, bytes, estimated, rate, download = true}) {
// Do whatever you want with the Axios progress event
},
// `maxContentLength` defines the max size of the http response content in bytes allowed in node.js
maxContentLength: 2000,
// `maxBodyLength` (Node only option) defines the max size of the http request content in bytes allowed
maxBodyLength: 2000,
// `validateStatus` defines whether to resolve or reject the promise for a given
// HTTP response status code. If `validateStatus` returns `true` (or is set to `null`
// or `undefined`), the promise will be resolved; otherwise, the promise will be
// rejected.
validateStatus: function (status) {
return status >= 200 && status < 300; // default
},
// `maxRedirects` defines the maximum number of redirects to follow in node.js.
// If set to 0, no redirects will be followed.
maxRedirects: 21, // default
// `beforeRedirect` defines a function that will be called before redirect.
// Use this to adjust the request options upon redirecting,
// to inspect the latest response headers,
// or to cancel the request by throwing an error
// If maxRedirects is set to 0, `beforeRedirect` is not used.
beforeRedirect: (options, { headers }) => {
if (options.hostname === "example.com") {
options.auth = "user:password";
}
},
// `socketPath` defines a UNIX Socket to be used in node.js.
// e.g. '/var/run/docker.sock' to send requests to the docker daemon.
// Only either `socketPath` or `proxy` can be specified.
// If both are specified, `socketPath` is used.
socketPath: null, // default
// `transport` determines the transport method that will be used to make the request.
// If defined, it will be used. Otherwise, if `maxRedirects` is 0,
// the default `http` or `https` library will be used, depending on the protocol specified in `protocol`.
// Otherwise, the `httpFollow` or `httpsFollow` library will be used, again depending on the protocol,
// which can handle redirects.
transport: undefined, // default
// `httpAgent` and `httpsAgent` define a custom agent to be used when performing http
// and https requests, respectively, in node.js. This allows options to be added like
// `keepAlive` that are not enabled by default.
httpAgent: new http.Agent({ keepAlive: true }),
httpsAgent: new https.Agent({ keepAlive: true }),
// `proxy` defines the hostname, port, and protocol of the proxy server.
// You can also define your proxy using the conventional `http_proxy` and
// `https_proxy` environment variables. If you are using environment variables
// for your proxy configuration, you can also define a `no_proxy` environment
// variable as a comma-separated list of domains that should not be proxied.
// Use `false` to disable proxies, ignoring environment variables.
// `auth` indicates that HTTP Basic auth should be used to connect to the proxy, and
// supplies credentials.
// This will set an `Proxy-Authorization` header, overwriting any existing
// `Proxy-Authorization` custom headers you have set using `headers`.
// If the proxy server uses HTTPS, then you must set the protocol to `https`.
proxy: {
protocol: 'https',
host: '127.0.0.1',
// hostname: '127.0.0.1' // Takes precedence over 'host' if both are defined
port: 9000,
auth: {
username: 'mikeymike',
password: 'rapunz3l'
}
},
// `cancelToken` specifies a cancel token that can be used to cancel the request
// (see Cancellation section below for details)
cancelToken: new CancelToken(function (cancel) {
}),
// an alternative way to cancel Axios requests using AbortController
signal: new AbortController().signal,
// `decompress` indicates whether or not the response body should be decompressed
// automatically. If set to `true` will also remove the 'content-encoding' header
// from the responses objects of all decompressed responses
// - Node only (XHR cannot turn off decompression)
decompress: true, // default
// `insecureHTTPParser` boolean.
// Indicates whether to use an insecure HTTP parser that accepts invalid HTTP headers.
// This may allow interoperability with non-conformant HTTP implementations.
// Using the insecure parser should be avoided.
// see options https://nodejs.org/dist/latest-v12.x/docs/api/http.html#http_http_request_url_options_callback
// see also https://nodejs.org/en/blog/vulnerability/february-2020-security-releases/#strict-http-header-parsing-none
insecureHTTPParser: undefined, // default
// transitional options for backward compatibility that may be removed in the newer versions
transitional: {
// silent JSON parsing mode
// `true` - ignore JSON parsing errors and set response.data to null if parsing failed (old behaviour)
// `false` - throw SyntaxError if JSON parsing failed (Note: responseType must be set to 'json')
silentJSONParsing: true, // default value for the current Axios version
// try to parse the response string as JSON even if `responseType` is not 'json'
forcedJSONParsing: true,
// throw ETIMEDOUT error instead of generic ECONNABORTED on request timeouts
clarifyTimeoutError: false,
},
env: {
// The FormData class to be used to automatically serialize the payload into a FormData object
FormData: window?.FormData || global?.FormData
},
formSerializer: {
visitor: (value, key, path, helpers) => {}; // custom visitor function to serialize form values
dots: boolean; // use dots instead of brackets format
metaTokens: boolean; // keep special endings like {} in parameter key
indexes: boolean; // array indexes format null - no brackets, false - empty brackets, true - brackets with indexes
},
// http adapter only (node.js)
maxRate: [
100 * 1024, // 100KB/s upload limit,
100 * 1024 // 100KB/s download limit
]
}
Response Schema
The response for a request contains the following information.
{
// `data` is the response that was provided by the server
data: {},
// `status` is the HTTP status code from the server response
status: 200,
// `statusText` is the HTTP status message from the server response
statusText: 'OK',
// `headers` the HTTP headers that the server responded with
// All header names are lowercase and can be accessed using the bracket notation.
// Example: `response.headers['content-type']`
headers: {},
// `config` is the config that was provided to `axios` for the request
config: {},
// `request` is the request that generated this response
// It is the last ClientRequest instance in node.js (in redirects)
// and an XMLHttpRequest instance in the browser
request: {}
}
When using then, you will receive the response as follows:
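For example:
```js
axios.get('/user/12345')
  .then(function (response) {
    console.log(response.data);
    console.log(response.status);
    console.log(response.statusText);
    console.log(response.headers);
    console.log(response.config);
  });
```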
When using catch, or passing a rejection callback as second parameter of then, the response will be available through the error object as explained in the Handling Errors section.
Config Defaults
You can specify config defaults that will be applied to every request.
Global axios defaults
axios.defaults.baseURL = 'https://api.example.com';
// Important: If axios is used with multiple domains, the AUTH_TOKEN will be sent to all of them.
// See below for an example using Custom instance defaults instead.
axios.defaults.headers.common['Authorization'] = AUTH_TOKEN;
axios.defaults.headers.post['Content-Type'] = 'application/x-www-form-urlencoded';
Custom instance defaults
// Set config defaults when creating the instance
const instance = axios.create({
baseURL: 'https://api.example.com'
});
// Alter defaults after instance has been created
instance.defaults.headers.common['Authorization'] = AUTH_TOKEN;
Config order of precedence
Config will be merged with an order of precedence. The order is library defaults found in lib/defaults/index.js, then defaults property of the instance, and finally config argument for the request. The latter will take precedence over the former. Here's an example.
// Create an instance using the config defaults provided by the library
// At this point the timeout config value is `0` as is the default for the library
const instance = axios.create();
// Override timeout default for the library
// Now all requests using this instance will wait 2.5 seconds before timing out
instance.defaults.timeout = 2500;
// Override timeout for this request as it's known to take a long time
instance.get('/longRequest', {
timeout: 5000
});
Interceptors
You can intercept requests or responses before they are handled by then or catch.
const instance = axios.create();
// Add a request interceptor
instance.interceptors.request.use(function (config) {
// Do something before request is sent
return config;
}, function (error) {
// Do something with request error
return Promise.reject(error);
});
// Add a response interceptor
instance.interceptors.response.use(function (response) {
// Any status code that lie within the range of 2xx cause this function to trigger
// Do something with response data
return response;
}, function (error) {
// Any status codes that falls outside the range of 2xx cause this function to trigger
// Do something with response error
return Promise.reject(error);
});
If you need to remove an interceptor later you can.
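The handle returned by use() can later be passed to eject():
```js
const myInterceptor = axios.interceptors.request.use(function (config) { /* ... */ return config; });
axios.interceptors.request.eject(myInterceptor);
```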
When you add request interceptors, they are presumed to be asynchronous by default. This can cause a delay
in the execution of your axios request when the main thread is blocked (a promise is created under the hood for
the interceptor, and your request is put at the bottom of the call stack). If your request interceptors are synchronous, you can add a flag
to the options object to tell axios to run the code synchronously and avoid any delay in request execution.
axios.interceptors.request.use(function (config) {
config.headers.test = 'I am only a header!';
return config;
}, null, { synchronous: true });
If you want to execute a particular interceptor based on a runtime check,
you can add a runWhen function to the options object. The request interceptor will be skipped if, and only if,
runWhen returns false. The function is called with the config
object (don't forget that you can bind your own arguments to it as well). This can be handy when you have an
asynchronous request interceptor that only needs to run at certain times.
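For example, to run a request interceptor only for GET requests (the header name is illustrative):
```js
function onGetCall(config) {
  return config.method === 'get';
}

axios.interceptors.request.use(function (config) {
  config.headers.test = 'special get headers';
  return config;
}, null, { runWhen: onGetCall });
```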
There are many different axios error messages that can appear, providing basic information about the specifics of the error and where debugging opportunities may lie.
The general structure of axios errors is as follows:
| Property | Definition |
| -------- | ---------- |
| message | A quick summary of the error message and the status it failed with. |
| name | Where the error originated from. For axios, it will always be 'AxiosError'. |
| stack | Provides the stack trace of the error. |
| config | The axios config object, with the instance configuration defined by the user when the request was made. |
| code | Represents an axios-identified error. The table below lists the definitions of the internal axios error codes. |
| status | HTTP response status code. |
Below is a list of the potential axios-identified error codes:
| Code | Definition |
| -------- | ---------- |
| ERR_BAD_OPTION_VALUE | Invalid or unsupported value provided in the axios configuration. |
| ERR_BAD_OPTION | Invalid option provided in the axios configuration. |
| ECONNABORTED | Request timed out due to exceeding the timeout specified in the axios configuration. |
| ETIMEDOUT | Request timed out due to exceeding the default axios time limit. |
| ERR_NETWORK | Network-related issue. |
| ERR_FR_TOO_MANY_REDIRECTS | Request was redirected too many times; exceeds the max redirects specified in the axios configuration. |
| ERR_DEPRECATED | Deprecated feature or method used in axios. |
| ERR_BAD_RESPONSE | Response cannot be parsed properly or is in an unexpected format. |
| ERR_BAD_REQUEST | Request has an unexpected format or is missing required parameters. |
| ERR_CANCELED | Feature or method was canceled explicitly by the user. |
| ERR_NOT_SUPPORT | Feature or method not supported in the current axios environment. |
| ERR_INVALID_URL | Invalid URL provided for the axios request. |
Handling Errors
The default behavior is to reject every response whose status code falls outside the range of 2xx and treat it as an error.
axios.get('/user/12345')
.catch(function (error) {
if (error.response) {
// The request was made and the server responded with a status code
// that falls out of the range of 2xx
console.log(error.response.data);
console.log(error.response.status);
console.log(error.response.headers);
} else if (error.request) {
// The request was made but no response was received
// `error.request` is an instance of XMLHttpRequest in the browser and an instance of
// http.ClientRequest in node.js
console.log(error.request);
} else {
// Something happened in setting up the request that triggered an Error
console.log('Error', error.message);
}
console.log(error.config);
});
Using the validateStatus config option, you can override the default condition (status >= 200 && status < 300) and define HTTP code(s) that should throw an error.
axios.get('/user/12345', {
validateStatus: function (status) {
return status < 500; // Resolve only if the status code is less than 500
}
})
Using toJSON you get an object with more information about the HTTP error.
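For example:
```js
axios.get('/user/12345')
  .catch(function (error) {
    console.log(error.toJSON());
  });
```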
Cancellation
The CancelToken API described below is deprecated since v0.22.0 and shouldn't be used in new projects; prefer the AbortController signal shown in the request config above.
You can create a cancel token using the CancelToken.source factory as shown below:
const CancelToken = axios.CancelToken;
const source = CancelToken.source();
axios.get('/user/12345', {
cancelToken: source.token
}).catch(function (thrown) {
if (axios.isCancel(thrown)) {
console.log('Request canceled', thrown.message);
} else {
// handle error
}
});
axios.post('/user/12345', {
name: 'new name'
}, {
cancelToken: source.token
})
// cancel the request (the message parameter is optional)
source.cancel('Operation canceled by the user.');
You can also create a cancel token by passing an executor function to the CancelToken constructor:
const CancelToken = axios.CancelToken;
let cancel;
axios.get('/user/12345', {
cancelToken: new CancelToken(function executor(c) {
// An executor function receives a cancel function as a parameter
cancel = c;
})
});
// cancel the request
cancel();
Note: you can cancel several requests with the same cancel token/abort controller.
If a cancellation token is already cancelled at the moment of starting an Axios request, then the request is cancelled immediately, without any attempts to make a real request.
During the transition period, you can use both cancellation APIs, even for the same request:
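For example, passing both a cancel token and an abort signal to the same request:
```js
const controller = new AbortController();
const source = axios.CancelToken.source();

axios.get('/user/12345', {
  cancelToken: source.token,
  signal: controller.signal
}).catch(function (thrown) {
  if (axios.isCancel(thrown)) {
    console.log('Request canceled', thrown.message);
  }
});

// Cancelling with either API aborts the request:
source.cancel('Operation canceled by the user.');
// controller.abort();
```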
When posting data in application/x-www-form-urlencoded format, if your backend body parser (like body-parser for express.js) supports nested object decoding, you will get the same object on the server side automatically:
const express = require('express');
const bodyParser = require('body-parser');
const app = express();
app.use(bodyParser.urlencoded({ extended: true })); // support encoded bodies
app.post('/', function (req, res, next) {
  // echo the parsed body back as JSON
  res.send(JSON.stringify(req.body));
});
const server = app.listen(3000);
Using multipart/form-data format
FormData
To send the data as multipart/form-data you need to pass a FormData instance as the payload.
Setting the Content-Type header is not required as Axios guesses it based on the payload type.
const formData = new FormData();
formData.append('foo', 'bar');
axios.post('https://httpbin.org/post', formData);
In node.js, you can use the form-data library as follows:
const FormData = require('form-data');
const fs = require('fs');
const form = new FormData();
form.append('my_field', 'my value');
form.append('my_buffer', Buffer.alloc(10));
form.append('my_file', fs.createReadStream('/foo/bar.jpg'));
axios.post('https://example.com', form);
🆕 Automatic serialization to FormData
Starting from v0.27.0, Axios supports automatic object serialization to a FormData object if the request Content-Type
header is set to multipart/form-data.
The following request will submit the data in a FormData format (Browser & Node.js):
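A minimal sketch (the endpoint and payload are illustrative):
```js
import axios from 'axios';

axios.post('https://httpbin.org/post', {
  user: {
    name: 'Fred'
  }
}, {
  headers: {
    'Content-Type': 'multipart/form-data'
  }
}).then(({ data }) => console.log(data));
```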
Axios FormData serializer supports some special endings to perform the following operations:
{} - serialize the value with JSON.stringify
[] - unwrap the array-like object as separate fields with the same key
Note: unwrap/expand operation will be used by default on arrays and FileList objects
FormData serializer supports additional options via config.formSerializer: object property to handle rare cases:
visitor: Function - user-defined visitor function that will be called recursively to serialize the data object
to a FormData object by following custom rules.
dots: boolean = false - use dot notation instead of brackets to serialize arrays and objects.
metaTokens: boolean = true - add the special ending (e.g. user{}: '{"name": "John"}') to the FormData key.
The back-end body-parser could potentially use this meta-information to automatically parse the value as JSON.
indexes: null|false|true = false - controls how indexes will be added to unwrapped keys of flat array-like objects.
Axios supports the following shortcut methods: postForm, putForm, patchForm
which are just the corresponding http methods with the Content-Type header preset to multipart/form-data.
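For example:
```js
// Shorthand for posting with the 'Content-Type: multipart/form-data' header set explicitly
axios.postForm('https://httpbin.org/post', { name: 'Ada' });
```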
Sending Blobs/Files as JSON (base64) is not currently supported.
🆕 Progress capturing
Axios supports both browser and node environments to capture request upload/download progress.
The frequency of progress events is limited to 3 times per second.
await axios.post(url, data, {
onUploadProgress: function (axiosProgressEvent) {
/*{
loaded: number;
total?: number;
progress?: number; // in range [0..1]
bytes: number; // how many bytes have been transferred since the last trigger (delta)
estimated?: number; // estimated time in seconds
rate?: number; // upload speed in bytes
upload: true; // upload sign
}*/
},
onDownloadProgress: function (axiosProgressEvent) {
/*{
loaded: number;
total?: number;
progress?: number;
bytes: number;
estimated?: number;
rate?: number; // download speed in bytes
download: true; // download sign
}*/
}
});
You can also track stream upload/download progress in node.js:
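A sketch (the file path and upload URL are placeholders):
```js
const axios = require('axios');
const fs = require('fs');

async function uploadWithProgress() {
  const filePath = './large-file.bin'; // placeholder path
  const { size } = fs.statSync(filePath);

  await axios.post('https://example.com/upload', fs.createReadStream(filePath), {
    onUploadProgress: ({ loaded, total }) => {
      console.log(`uploaded ${loaded}${total ? ` of ${total}` : ''} bytes`);
    },
    headers: { 'Content-Length': size },
    maxRedirects: 0 // see the warning below about follow-redirects buffering the stream
  });
}
```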
Note:
Capturing FormData upload progress is not currently supported in node.js environments.
⚠️ Warning
When uploading a stream in the node.js environment, it is recommended to disable redirects by setting maxRedirects: 0,
as the follow-redirects package will buffer the entire stream in RAM without applying backpressure.
🆕 Rate limiting
Download and upload rate limits can only be set for the http adapter (node.js):
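A sketch (the URL and payload are placeholders):
```js
const payloadBuffer = Buffer.alloc(1024 * 1024); // placeholder 1 MB payload

const { data } = await axios.post('https://example.com/upload', payloadBuffer, {
  onUploadProgress: ({ loaded, total }) => {
    console.log(`uploaded ${loaded}${total ? ` of ${total}` : ''} bytes`);
  },
  maxRate: [100 * 1024] // limit the upload rate to 100 KB/s
});
```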
AxiosHeaders
Axios has its own AxiosHeaders class to manipulate headers using a Map-like API that guarantees case-insensitive handling.
Although HTTP header names are case-insensitive, Axios retains the case of the original header for stylistic reasons
and as a workaround for servers that mistakenly treat header names as case-sensitive.
The old approach of manipulating the headers object directly is still available, but deprecated and not recommended for future usage.
Working with headers
An AxiosHeaders object instance can contain different types of internal values that control setting and merging logic.
The final headers object with string values is obtained by Axios by calling the toJSON method.
Note: By JSON here we mean an object consisting only of string values intended to be sent over the network.
The header value can be one of the following types:
string - normal string value that will be sent to the server
null - skip header when rendering to JSON
false - skip header when rendering to JSON, additionally indicates that set method must be called with rewrite option set to true
to overwrite this value (Axios uses this internally to allow users to opt out of installing certain headers like User-Agent or Content-Type)
undefined - value is not set
Note: The header value is considered set if it is not equal to undefined.
The headers object is always initialized inside interceptors and transformers:
axios.interceptors.request.use((request: InternalAxiosRequestConfig) => {
request.headers.set('My-header', 'value');
request.headers.set({
"My-set-header1": "my-set-value1",
"My-set-header2": "my-set-value2"
});
request.headers.set('User-Agent', false); // disable subsequent setting the header by Axios
request.headers.setContentType('text/plain');
request.headers['My-set-header2'] = 'newValue' // direct access is deprecated
return request;
}
);
You can iterate over an AxiosHeaders instance using a for...of statement:
const headers = new AxiosHeaders({
foo: '1',
bar: '2',
baz: '3'
});
for(const [header, value] of headers) {
console.log(header, value);
}
// foo 1
// bar 2
// baz 3
AxiosHeaders#get(header, parser?);
Returns the internal value of the header. It can take an extra argument to parse the header's value with RegExp.exec,
a matcher function, or the internal key-value parser.
AxiosHeaders#clear(matcher?);
Returns true if at least one header has been cleared.
AxiosHeaders#normalize(format);
If the headers object was changed directly, it can have duplicates with the same name but in different cases.
This method normalizes the headers object by combining duplicate keys into one.
Axios uses this method internally after calling each interceptor.
Set format to true to normalize the header names to lowercase with capitalized first letters (cOntEnt-type => Content-Type).
AxiosHeaders#concat(...targets);
Merges the instance with the targets into a new AxiosHeaders instance. If a target is a string, it will be parsed as raw HTTP headers.
Returns a new AxiosHeaders instance.
AxiosHeaders#toJSON(asStrings?)
toJSON(asStrings?: boolean): RawAxiosHeaders;
Resolves all internal header values into a new null-prototype object.
Set asStrings to true to resolve arrays into a string containing all elements, separated by commas.
AxiosHeaders.from(headers);
Returns a new AxiosHeaders instance created from the raw headers passed in,
or simply returns the given headers object if it is already an AxiosHeaders instance.
Fetch adapter
The fetch adapter was introduced in v1.7.0. By default, it is used if the xhr and http adapters are not available in the build
or not supported by the environment.
To use it by default, it must be selected explicitly:
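For example:
```js
const { data } = await axios.get('https://httpbin.org/get', {
  adapter: 'fetch' // instead of the default adapter priority ['xhr', 'http', 'fetch']
});
```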
The adapter supports the same functionality as xhr adapter, including upload and download progress capturing.
Also, it supports additional response types such as stream and formdata (if supported by the environment).
Semver
Until axios reaches a 1.0 release, breaking changes will be released with a new minor version. For example 0.5.1, and 0.5.4 will have the same API, but 0.6.0 will have breaking changes.
Promises
axios depends on a native ES6 Promise implementation to be supported.
If your environment doesn't support ES6 Promises, you can polyfill.
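For example, with the es6-promise package:
```js
require('es6-promise').polyfill();
```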
TypeScript
axios includes TypeScript definitions and a type guard for axios errors.
let user: User = null;
try {
const { data } = await axios.get('/user?ID=12345');
user = data.userDetails;
} catch (error) {
if (axios.isAxiosError(error)) {
handleAxiosError(error);
} else {
handleUnexpectedError(error);
}
}
Because axios dual publishes with an ESM default export and a CJS module.exports, there are some caveats.
The recommended setting is to use "moduleResolution": "node16" (this is implied by "module": "node16"). Note that this requires TypeScript 4.7 or greater.
If you use ESM, your settings should be fine.
If you compile TypeScript to CJS and you can't use "moduleResolution": "node16", you have to enable esModuleInterop.
If you use TypeScript to type-check CJS JavaScript code, your only option is to use "moduleResolution": "node16".
Online one-click setup
You can use Gitpod, an online IDE (which is free for open source), for contributing or running the examples online.
axios is heavily inspired by the $http service provided in AngularJS. Ultimately axios is an effort to provide a standalone $http-like service for use outside of AngularJS.