micro vs moleculer vs nestjs vs seneca
Node.js Microservices Frameworks Comparison
What are Node.js Microservices Frameworks?

Node.js microservices frameworks are designed to facilitate the development of microservices architectures, allowing developers to build, deploy, and manage small, independent services that can communicate over a network. These frameworks provide tools and abstractions that simplify the process of creating scalable and maintainable applications. They often include features such as service discovery, load balancing, and fault tolerance, which are essential for building robust microservices. The choice of framework can significantly impact the development speed, scalability, and maintainability of the application.

Stat Detail

Package     Weekly Downloads   Stars    Size      Issues   Last Publish    License
micro       954,437            10,599   42.1 kB   10       -               MIT
moleculer   41,798             6,256    1.22 MB   69       5 months ago    MIT
nestjs      10,465             -        -         -        -               ISC
seneca      5,762              3,965    680 kB    210      2 months ago    MIT
Feature Comparison: micro vs moleculer vs nestjs vs seneca

Architecture

  • micro:

    Micro adopts a minimalist approach, allowing developers to create HTTP-based microservices with minimal boilerplate code. It promotes a straightforward architecture that is easy to understand and implement.

  • moleculer:

    Moleculer follows a service-oriented architecture (SOA) model, where each service is a self-contained unit that can communicate with others. It includes built-in features like service discovery and event-driven communication, making it suitable for complex systems (see the sketch after this list).

  • nestjs:

    NestJS is built on top of Express (or Fastify) and promotes a modular architecture using decorators and dependency injection. This structure enhances maintainability and testability, making it suitable for large-scale applications.

  • seneca:

    Seneca provides a microservices architecture that emphasizes the separation of business logic from transport and infrastructure concerns. It allows developers to define business logic as reusable plugins, promoting code reuse and modularity.

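To make Moleculer's service model concrete, here is a minimal, hedged sketch of a broker hosting one service and calling one of its actions; the service and action names are illustrative and not taken from any of these projects' documentation.

const { ServiceBroker } = require('moleculer');

// One broker per node; in multi-node setups brokers discover each other via a transporter.
const broker = new ServiceBroker({ nodeID: 'node-1' });

// A service is a self-contained unit exposing actions (and, optionally, event handlers).
broker.createService({
  name: 'greeter',
  actions: {
    hello(ctx) {
      return `Hello, ${ctx.params.name}!`;
    },
  },
});

broker
  .start()
  // Actions are addressed as "<service>.<action>", wherever the service happens to run.
  .then(() => broker.call('greeter.hello', { name: 'world' }))
  .then(console.log);
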
Extensibility

  • micro:

    Micro is highly extensible due to its simple API, allowing developers to easily integrate additional middleware or functionality as needed. This flexibility makes it suitable for various use cases.

  • moleculer:

    Moleculer is designed to be extensible with a rich set of built-in features and the ability to create custom actions and services. Its plugin system allows developers to enhance functionality without modifying core code.

  • nestjs:

    NestJS is highly extensible, supporting a wide range of modules and libraries. Its architecture encourages the use of third-party packages, making it easy to add new features and functionalities to applications.

  • seneca:

    Seneca is extensible through its plugin system, allowing developers to create custom actions and transport layers. This flexibility lets teams tailor the framework to their specific needs; a minimal plugin sketch follows this list.

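As a rough illustration of that plugin model, Seneca business logic is registered as pattern-matched actions inside a plain plugin function; the role/cmd pattern and the numbers below are purely illustrative.

const Seneca = require('seneca');

// A plugin is just a function; `this` is the Seneca instance it is loaded into.
function math(options) {
  this.add({ role: 'math', cmd: 'sum' }, (msg, reply) => {
    reply(null, { answer: msg.left + msg.right });
  });
}

const seneca = Seneca();
seneca.use(math);

// Messages are matched against registered patterns, independently of the transport in use.
seneca.act({ role: 'math', cmd: 'sum', left: 1, right: 2 }, (err, result) => {
  if (err) throw err;
  console.log(result.answer); // 3
});
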
Learning Curve

  • micro:

    Micro has a low learning curve, making it easy for developers to get started with microservices. Its minimalist design means fewer concepts to grasp, which is beneficial for beginners.

  • moleculer:

    Moleculer has a moderate learning curve due to its rich feature set and concepts like service discovery and event-driven architecture. However, its documentation is comprehensive, aiding in the learning process.

  • nestjs:

    NestJS has a steeper learning curve, especially for those unfamiliar with TypeScript or Angular-like architecture. However, its structured approach and extensive documentation help mitigate this challenge.

  • seneca:

    Seneca has a moderate learning curve, as it introduces concepts related to messaging and service orchestration. Developers familiar with asynchronous programming will find it easier to adapt.

Performance

  • micro:

    Micro is designed for high performance with minimal overhead. Its lightweight nature allows for quick response times, making it suitable for high-throughput applications.

  • moleculer:

    Moleculer is optimized for performance, offering features like caching and load balancing out of the box (see the caching sketch after this list). It efficiently handles concurrent requests and scales well under load.

  • nestjs:

    NestJS performance is competitive, leveraging the underlying Express or Fastify framework. Its modular architecture allows for optimization and efficient resource management, suitable for large applications.

  • seneca:

    Seneca's performance can vary based on the complexity of the business logic and the number of services. While it provides powerful abstractions, developers need to be mindful of potential overhead in message passing.

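As an example of the caching mentioned above, Moleculer lets you enable a cacher on the broker and opt individual actions into it. The sketch below assumes the built-in in-memory cacher; the service, action, and loadProductsFromDb helper are hypothetical.

const { ServiceBroker } = require('moleculer');

// 'Memory' selects the built-in in-memory cacher; other cachers (e.g. Redis) can be configured instead.
const broker = new ServiceBroker({ cacher: 'Memory' });

broker.createService({
  name: 'products',
  actions: {
    list: {
      cache: true, // responses for this action are cached by action name and params
      handler(ctx) {
        return loadProductsFromDb(ctx.params); // hypothetical data-access helper
      },
    },
  },
});
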
Community and Ecosystem

  • micro:

    Micro has a smaller community compared to others, but it benefits from being part of the broader Node.js ecosystem. Its simplicity allows for quick adoption and integration with existing tools.

  • moleculer:

    Moleculer has a growing community and a rich ecosystem of plugins and integrations. Its active development and comprehensive documentation make it a robust choice for microservices.

  • nestjs:

    NestJS boasts a large and active community, with extensive resources, plugins, and modules available. Its alignment with TypeScript and modern development practices enhances its ecosystem.

  • seneca:

    Seneca has a dedicated community focused on microservices and messaging patterns. While its ecosystem may not be as extensive as others, it provides essential tools for building microservices.

How to Choose: micro vs moleculer vs nestjs vs seneca
  • micro:

    Choose Micro if you need a minimalistic and lightweight framework for building microservices quickly. It is ideal for small projects or when you want to create a simple API without the overhead of a full-fledged framework.

  • moleculer:

    Choose Moleculer if you require a powerful microservices framework with built-in features like service discovery, load balancing, and fault tolerance. It is suitable for complex applications that need to scale and manage multiple services effectively.

  • nestjs:

    Choose NestJS if you prefer a structured and opinionated framework that leverages TypeScript and follows the modular architecture pattern. It is great for building scalable server-side applications and offers a rich ecosystem of tools and libraries.

  • seneca:

    Choose Seneca if you want a microservices toolkit that focuses on business logic and provides a simple way to build and manage microservices. It is particularly useful for applications that require a strong emphasis on messaging and service orchestration.

README for micro

Micro — Asynchronous HTTP microservices

Features

  • Easy: Designed for usage with async and await
  • Fast: Ultra-high performance (even JSON parsing is opt-in)
  • Micro: The whole project is ~260 lines of code
  • Agile: Super easy deployment and containerization
  • Simple: Oriented for single purpose modules (function)
  • Standard: Just HTTP!
  • Explicit: No middleware - modules declare all dependencies
  • Lightweight: With all dependencies, the package weighs less than a megabyte

Disclaimer: Micro was created for use within containers and is not intended for use in serverless environments. For those using Vercel, this means that there is no requirement to use Micro in your projects as the benefits it provides are not applicable to the platform. Utility features provided by Micro, such as json, are readily available in the form of Serverless Function helpers.

Installation

Important: Micro is only meant to be used in production. In development, you should use micro-dev, which provides you with a tool belt specifically tailored for developing microservices.

To prepare your microservice for running in the production environment, firstly install micro:

npm install --save micro

Usage

Create an index.js file and export a function that accepts the standard http.IncomingMessage and http.ServerResponse objects:

module.exports = (req, res) => {
  res.end('Welcome to Micro');
};

Micro provides useful helpers but also handles return values – so you can write it even shorter!

module.exports = () => 'Welcome to Micro';

Next, ensure that the main property inside package.json points to your microservice (which is inside index.js in this example case) and add a start script:

{
  "main": "index.js",
  "scripts": {
    "start": "micro"
  }
}

Once all of that is done, the server can be started like this:

npm start

And go to this URL: http://localhost:3000 - 🎉

Command line

  micro - Asynchronous HTTP microservices

  USAGE

      $ micro --help
      $ micro --version
      $ micro [-l listen_uri [-l ...]] [entry_point.js]

      By default micro will listen on 0.0.0.0:3000 and will look first
      for the "main" property in package.json and subsequently for index.js
      as the default entry_point.

      Specifying a single --listen argument will overwrite the default, not supplement it.

  OPTIONS

      --help                              shows this help message

      -v, --version                       displays the current version of micro

      -l, --listen listen_uri             specify a URI endpoint on which to listen (see below) -
                                          more than one may be specified to listen in multiple places

  ENDPOINTS

      Listen endpoints (specified by the --listen or -l options above) instruct micro
      to listen on one or more interfaces/ports, UNIX domain sockets, or Windows named pipes.

      For TCP (traditional host/port) endpoints:

          $ micro -l tcp://hostname:1234

      For UNIX domain socket endpoints:

          $ micro -l unix:/path/to/socket.sock

      For Windows named pipe endpoints:

          $ micro -l pipe:\\.\pipe\PipeName

async & await

Examples

Micro is built for usage with async/await.

const sleep = require('then-sleep');

module.exports = async (req, res) => {
  await sleep(500);
  return 'Ready!';
};

Port Based on Environment Variable

When you want to set the port using an environment variable you can use:

micro -l tcp://0.0.0.0:$PORT

Optionally you can add a default if it suits your use case:

micro -l tcp://0.0.0.0:${PORT-3000}

${PORT-3000} will allow a fallback to port 3000 when $PORT is not defined.

Note that this only works in Bash.

Body parsing

Examples

For parsing the incoming request body, we include the async functions buffer, text, and json:

const { buffer, text, json } = require('micro');

module.exports = async (req, res) => {
  const buf = await buffer(req);
  console.log(buf);
  // <Buffer 7b 22 70 72 69 63 65 22 3a 20 39 2e 39 39 7d>
  const txt = await text(req);
  console.log(txt);
  // '{"price": 9.99}'
  const js = await json(req);
  console.log(js.price);
  // 9.99
  return '';
};

API

buffer(req, { limit = '1mb', encoding = 'utf8' })
text(req, { limit = '1mb', encoding = 'utf8' })
json(req, { limit = '1mb', encoding = 'utf8' })
  • Buffers and parses the incoming body and returns it.
  • Exposes an async function that can be run with await.
  • Can be called multiple times, as it caches the raw request body the first time.
  • limit is the maximum amount of data aggregated before parsing; if the body exceeds it, an Error is thrown with statusCode set to 413 (see Error Handling). It can be a Number of bytes or a string like '1mb'.
  • If JSON parsing fails, an Error is thrown with statusCode set to 400 (see Error Handling)

For other types of data check the examples
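
The limit and encoding options shown above can be passed per call. For instance, a hedged sketch that raises the body-size limit to an illustrative 8mb for one endpoint:

const { json } = require('micro');

module.exports = async (req) => {
  // Raise the default 1mb limit for this endpoint; oversized bodies still throw a 413 error.
  const body = await json(req, { limit: '8mb' });
  return { received: Object.keys(body).length };
};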

Sending a different status code

So far we have used return to send data to the client. return 'Hello World' is the equivalent of send(res, 200, 'Hello World').

const { send } = require('micro');

module.exports = async (req, res) => {
  const statusCode = 400;
  const data = { error: 'Custom error message' };

  send(res, statusCode, data);
};
send(res, statusCode, data = null)
  • Use require('micro').send.
  • statusCode is a Number with the HTTP status code, and must always be supplied.
  • If data is supplied it is sent in the response. Different input types are processed appropriately, and Content-Type and Content-Length are automatically set.
    • Stream: data is piped as an octet-stream. Note: it is your responsibility to handle the error event in this case (usually, simply logging the error and aborting the response is enough); see the sketch after this list.
    • Buffer: data is written as an octet-stream.
    • object: data is serialized as JSON.
    • string: data is written as-is.
  • If JSON serialization fails (for example, if a cyclical reference is found), a 400 error is thrown. See Error Handling.
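
A minimal sketch of the Stream case above, with the stream's error event handled by the caller as the note requires; the file path is illustrative.

const fs = require('fs');
const { send } = require('micro');

module.exports = (req, res) => {
  const stream = fs.createReadStream('./report.csv'); // illustrative path
  // Handling the error event is your responsibility when sending a Stream.
  stream.on('error', (err) => {
    console.error(err);
    res.end();
  });
  send(res, 200, stream);
};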

Programmatic use

You can use Micro programmatically by requiring Micro directly:

const http = require('http');
const serve = require('micro');
const sleep = require('then-sleep');

const server = new http.Server(
  serve(async (req, res) => {
    await sleep(500);
    return 'Hello world';
  }),
);

server.listen(3000);
serve(fn)
  • Use require('micro').serve.
  • Returns a function with the (req, res) => void signature that uses the provided function as the request handler.
  • The supplied function is run with await, so it can be async.
sendError(req, res, error)
  • Use require('micro').sendError.
  • Used as the default handler for errors thrown.
  • Automatically sets the status code of the response based on error.statusCode.
  • Sends the error.message as the body.
  • Stacks are printed out with console.error and during development (when NODE_ENV is set to 'development') also sent in responses.
  • Usually, you don't need to invoke this method yourself, as you can use the built-in error handling flow with throw.
createError(code, msg, orig)
  • Use require('micro').createError.
  • Creates an error object with a statusCode.
  • Useful for easily throwing errors with HTTP status codes, which are interpreted by the built-in error handling.
  • orig sets error.originalError which identifies the original error (if any).

Error Handling

Micro allows you to write robust microservices. This is accomplished primarily by bringing sanity back to error handling and avoiding callback soup.

If an error is thrown and not caught by you, the response will automatically be 500. Important: Error stacks will be printed as console.error and during development mode (if the env variable NODE_ENV is 'development'), they will also be included in the responses.

If the Error object that's thrown contains a statusCode property, that's used as the HTTP code to be sent. Let's say you want to write a rate limiting module:

const rateLimit = require('my-rate-limit');

module.exports = async (req, res) => {
  await rateLimit(req);
  // ... your code
};

If the API endpoint is abused, it can throw an error with createError like so:

if (tooMany) {
  throw createError(429, 'Rate limit exceeded');
}

Alternatively, you can create the Error object yourself:

if (tooMany) {
  const err = new Error('Rate limit exceeded');
  err.statusCode = 429;
  throw err;
}

The nice thing about this model is that the statusCode is merely a suggestion. The user can override it:

try {
  await rateLimit(req);
} catch (err) {
  if (429 == err.statusCode) {
    // perhaps send 500 instead?
    send(res, 500);
  }
}

If the error is based on another error that Micro caught, like a JSON.parse exception, then originalError will point to it. If a generic error is caught, the status will be set to 500.

In order to set up your own error handling mechanism, you can use composition in your handler:

const { send } = require('micro');

const handleErrors = (fn) => async (req, res) => {
  try {
    return await fn(req, res);
  } catch (err) {
    console.log(err.stack);
    send(res, 500, 'My custom error!');
  }
};

module.exports = handleErrors(async (req, res) => {
  throw new Error('What happened here?');
});

Testing

Micro makes tests compact and a pleasure to read and write. We recommend Node TAP or AVA, a highly parallel test framework with built-in support for async tests:

const http = require('http');
const { send, serve } = require('micro');
const test = require('ava');
const listen = require('test-listen');
const fetch = require('node-fetch');

test('my endpoint', async (t) => {
  const service = new http.Server(
    serve(async (req, res) => {
      send(res, 200, {
        test: 'woot',
      });
    }),
  );

  const url = await listen(service);
  const response = await fetch(url);
  const body = await response.json();

  t.deepEqual(body.test, 'woot');
  service.close();
});

Look at test-listen for a function that returns a URL with an ephemeral port every time it's called.

Contributing

  1. Fork this repository to your own GitHub account and then clone it to your local device
  2. Link the package to the global module directory: npm link
  3. Within the module you want to test your local development instance of Micro, just link it to the dependencies: npm link micro. Instead of the default one from npm, node will now use your clone of Micro!

You can run the tests using: npm test.

Credits

Thanks to Tom Yandell and Richard Hodgson for donating the name "micro" on npm!

Authors