feedparser, rss, and rss-parser are npm packages used for working with RSS and Atom feeds in JavaScript applications, but they serve fundamentally different purposes. feedparser and rss-parser are feed parsers that convert XML feed content into structured JavaScript objects, while rss is a feed generator that creates RSS or Atom XML from JavaScript data. Understanding this distinction is crucial: you cannot parse feeds with rss, nor can you generate feeds with feedparser or rss-parser. Each package offers different APIs—feedparser uses a streaming, event-driven model suitable for large-scale or memory-constrained scenarios; rss-parser provides a modern promise-based interface that simplifies fetching and parsing from URLs or XML strings; and rss uses a builder pattern to construct valid RSS 2.0 or Atom 1.0 feeds for publishing content.
When you need to consume or generate RSS/Atom feeds in a JavaScript application, three npm packages often come up: feedparser, rss, and rss-parser. But they serve very different purposes — two are for parsing feeds, and one is for generating them. Let’s cut through the confusion and compare them based on real-world usage.
First, it’s critical to understand what each package actually does:
- **feedparser**: A streaming XML parser for RSS and Atom feeds. It reads raw XML (as a stream or string) and emits structured JavaScript objects.
- **rss-parser**: A promise-based RSS/Atom feed parser that fetches and parses feeds from URLs or XML strings.
- **rss**: A feed generator, not a parser. It helps you create RSS 2.0 or Atom 1.0 XML from JavaScript objects.

⚠️ Important: You cannot use `rss` to parse incoming feeds. It only builds them. Confusing it with the others is a common mistake.
Let’s look at how each handles its role.
Both feedparser and rss-parser parse RSS/Atom, but their APIs and execution models differ significantly.
### feedparser: Streaming, Event-Driven Parsing

feedparser works with Node.js streams. It’s ideal when you’re dealing with large feeds or want fine-grained control over parsing (e.g., processing items as they arrive).
```js
import FeedParser from 'feedparser';
import { createReadStream } from 'fs';

const parser = new FeedParser();
const stream = createReadStream('feed.xml');

stream.pipe(parser);

parser.on('readable', function () {
  let item;
  while ((item = this.read())) {
    console.log(item.title);
  }
});

parser.on('error', (err) => {
  console.error('Parse error:', err);
});
```
You can also pipe HTTP responses directly:
```js
import https from 'https';
import FeedParser from 'feedparser';

https.get('https://example.com/feed.xml', (res) => {
  const parser = new FeedParser();
  res.pipe(parser);
  // ... handle 'readable' and 'error' events
});
```
This approach is memory-efficient for large feeds but requires managing streams and events.
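If you prefer promises, you can wrap the event model yourself. Here’s a minimal sketch; the helper name `collectFeedItems` is our own, not part of feedparser’s API:

```js
import FeedParser from 'feedparser';
import { createReadStream } from 'fs';

// Hypothetical helper: collect all items from a stream of feed XML
// and resolve once parsing finishes.
function collectFeedItems(xmlStream) {
  return new Promise((resolve, reject) => {
    const parser = new FeedParser();
    const items = [];
    parser.on('error', reject);
    parser.on('readable', function () {
      let item;
      while ((item = this.read())) items.push(item);
    });
    parser.on('end', () => resolve(items));
    xmlStream.on('error', reject);
    xmlStream.pipe(parser);
  });
}

const items = await collectFeedItems(createReadStream('feed.xml'));
console.log(`Parsed ${items.length} items`);
```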
### rss-parser: Simple Promise-Based API

rss-parser abstracts away streams and gives you a clean async/await interface. It can fetch from a URL or parse a raw XML string.
```js
import Parser from 'rss-parser';

const parser = new Parser();

// Parse from URL
const feed = await parser.parseURL('https://example.com/feed.xml');
console.log(feed.title);
feed.items.forEach(item => console.log(item.title));

// Or parse from XML string
const xmlString = `<rss version="2.0"><channel>...</channel></rss>`;
const feedFromXml = await parser.parseString(xmlString);
```
It normalizes fields across RSS and Atom formats (e.g., dates are additionally exposed in ISO 8601 form as `isoDate`), which reduces boilerplate.
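For example, the same loop works whether the source is an RSS or an Atom feed (the URL is a placeholder):

```js
import Parser from 'rss-parser';

const parser = new Parser();
const feed = await parser.parseURL('https://example.com/atom-or-rss.xml');

for (const item of feed.items) {
  // isoDate is the normalized ISO 8601 timestamp; pubDate keeps the raw value
  console.log(item.title, item.isoDate ?? item.pubDate, item.link);
}
```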
### Generating Feeds: Only rss Does This

If you need to create an RSS feed (e.g., for a blog or podcast), rss is your go-to. The other two cannot do this.
```js
import RSS from 'rss';

const feed = new RSS({
  title: 'My Blog',
  description: 'A tech blog',
  feed_url: 'https://example.com/rss.xml',
  site_url: 'https://example.com',
  pubDate: new Date(),
});

feed.item({
  title: 'New Post',
  description: 'Learn about RSS!',
  url: 'https://example.com/post1',
  date: new Date()
});

const xml = feed.xml(); // Returns XML string
```
This is straightforward and widely used in static site generators and CMS backends.
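As an illustration, a server can rebuild and serve the XML on each request. Here’s a minimal sketch using Node’s built-in http module (route, port, and feed contents are placeholders):

```js
import http from 'http';
import RSS from 'rss';

http.createServer((req, res) => {
  if (req.url === '/rss.xml') {
    // Rebuild the feed per request; a real app would cache this.
    const feed = new RSS({
      title: 'My Blog',
      site_url: 'https://example.com',
      feed_url: 'https://example.com/rss.xml',
    });
    feed.item({ title: 'New Post', url: 'https://example.com/post1', date: new Date() });
    res.writeHead(200, { 'Content-Type': 'application/rss+xml' });
    res.end(feed.xml());
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000);
```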
As of the latest checks:
- **feedparser**: Actively maintained. No deprecation notice. Works well in modern Node.js environments.
- **rss-parser**: Actively maintained. Regular updates and issue responses.
- **rss**: Actively maintained. Used by many production systems for feed generation.

None of these packages are deprecated, so all are safe for new projects, as long as you use them for their intended purpose.
❌ Never try to use `rss` to parse a feed; it will not work.
### Can You Use Them Together?

Yes! A common pattern is:
1. Use **rss-parser** to fetch and parse a third-party feed.
2. Use **rss** to generate your own aggregated feed.

```js
import Parser from 'rss-parser';
import RSS from 'rss';

const parser = new Parser();
const sourceFeed = await parser.parseURL('https://news.example.com/feed');

const myFeed = new RSS({
  title: 'My Aggregated News',
  site_url: 'https://myapp.com',
  feed_url: 'https://myapp.com/aggregated.rss'
});

sourceFeed.items.slice(0, 10).forEach(item => {
  myFeed.item({
    title: item.title,
    url: item.link,
    date: item.pubDate
  });
});

const xml = myFeed.xml();
```
| Package | Purpose | API Style | Fetches URLs? | Generates Feeds? | Streaming? |
|---|---|---|---|---|---|
| feedparser | Parser | Event-driven | ❌ (use with http) | ❌ | ✅ |
| rss-parser | Parser | Promise-based | ✅ | ❌ | ❌ |
| rss | Generator | Builder pattern | ❌ | ✅ | ❌ |
Don’t pick based on popularity — pick based on what you’re trying to do:
- **Parsing feeds?** Choose between feedparser (for performance and control) and rss-parser (for simplicity).
- **Generating feeds?** Use rss; it’s the standard tool for the job.

Mixing up parsers and generators is the #1 mistake developers make with these packages. Keep their roles clear, and you’ll save hours of debugging.
Choose rss-parser if you want a simple, promise-based API to fetch and parse RSS/Atom feeds from URLs or XML strings with minimal setup. It normalizes feed metadata across formats and is ideal for typical web apps (e.g., Next.js API routes) where you don’t need streaming. Avoid it if you’re processing very large feeds or require fine-grained control over parsing performance.
Choose rss only when you need to generate RSS 2.0 or Atom 1.0 feeds from your own content, such as for blogs, podcasts, or news sites. It cannot parse incoming feeds, so never use it for consumption. It’s the standard choice for feed publishing in Node.js backends and static site generators due to its straightforward builder API.
Choose feedparser if you need a streaming, event-driven parser for RSS/Atom feeds and are working in a Node.js environment where memory efficiency or real-time item processing matters (e.g., aggregating thousands of feeds). It integrates naturally with Node streams but requires handling events and errors manually. Avoid it if you prefer a simpler async/await API or need built-in HTTP fetching.
## rss-parser

A small library for turning RSS XML feeds into JavaScript objects.
### Installation

```
npm install --save rss-parser
```
You can parse RSS from a URL (`parser.parseURL`) or an XML string (`parser.parseString`).
Both callbacks and Promises are supported.
Here's an example in NodeJS using Promises with async/await:
```js
let Parser = require('rss-parser');
let parser = new Parser();

(async () => {
  let feed = await parser.parseURL('https://www.reddit.com/.rss');
  console.log(feed.title);

  feed.items.forEach(item => {
    console.log(item.title + ':' + item.link)
  });
})();
```
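Since callbacks are also supported, the same call in callback style looks like this:

```js
let Parser = require('rss-parser');
let parser = new Parser();

parser.parseURL('https://www.reddit.com/.rss', function(err, feed) {
  if (err) throw err;
  console.log(feed.title);
});
```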
When using TypeScript, you can set a type to control the custom fields:
```ts
import Parser from 'rss-parser';

type CustomFeed = {foo: string};
type CustomItem = {bar: number};

const parser: Parser<CustomFeed, CustomItem> = new Parser({
  customFields: {
    feed: ['foo', 'baz'],
    // ^ will error because `baz` is not a key of CustomFeed
    item: ['bar']
  }
});

(async () => {
  const feed = await parser.parseURL('https://www.reddit.com/.rss');
  console.log(feed.title); // feed will have a `foo` property, typed as a string

  feed.items.forEach(item => {
    console.log(item.title + ':' + item.link) // item will have a `bar` property, typed as a number
  });
})();
```
We recommend using a bundler like webpack, but we also provide pre-built browser distributions in the `dist/` folder. If you use the pre-built distribution, you'll need a polyfill for Promise support.
Here's an example in the browser using callbacks:
```html
<script src="/node_modules/rss-parser/dist/rss-parser.min.js"></script>
<script>
// Note: some RSS feeds can't be loaded in the browser due to CORS security.
// To get around this, you can use a proxy.
const CORS_PROXY = "https://cors-anywhere.herokuapp.com/"

let parser = new RSSParser();
parser.parseURL(CORS_PROXY + 'https://www.reddit.com/.rss', function(err, feed) {
  if (err) throw err;
  console.log(feed.title);
  feed.items.forEach(function(entry) {
    console.log(entry.title + ':' + entry.link);
  })
})
</script>
```
A few minor breaking changes were made in v3. Here's what you need to know:
- You need to construct a `new Parser()` before calling `parseString` or `parseURL`
- `parseFile` is no longer available (for better browser support)
- `options` are now passed to the Parser constructor
- `parsed.feed` is now just `feed` (top-level object removed)
- `feed.entries` is now `feed.items` (to better match RSS XML)

Check out the full output format in test/output/reddit.json:
```yaml
feedUrl: 'https://www.reddit.com/.rss'
title: 'reddit: the front page of the internet'
description: ""
link: 'https://www.reddit.com/'
items:
  - title: 'The water is too deep, so he improvises'
    link: 'https://www.reddit.com/r/funny/comments/3skxqc/the_water_is_too_deep_so_he_improvises/'
    pubDate: 'Thu, 12 Nov 2015 21:16:39 +0000'
    creator: "John Doe"
    content: '<a href="http://example.com">this is a link</a> & <b>this is bold text</b>'
    contentSnippet: 'this is a link & this is bold text'
    guid: 'https://www.reddit.com/r/funny/comments/3skxqc/the_water_is_too_deep_so_he_improvises/'
    categories:
      - funny
    isoDate: '2015-11-12T21:16:39.000Z'
```
Notes:

- The `contentSnippet` field strips out HTML tags and unescapes HTML entities
- The `dc:` prefix will be removed from all fields
- Both `dc:date` and `pubDate` will be available in ISO 8601 format as `isoDate`
- If `author` is specified, but not `dc:creator`, `creator` will be set to `author` (see article)
- Atom's `updated` becomes `lastBuildDate` for consistency

If your RSS feed contains fields that aren't currently returned, you can access them using the `customFields` option.
```js
let parser = new Parser({
  customFields: {
    feed: ['otherTitle', 'extendedDescription'],
    item: ['coAuthor','subtitle'],
  }
});

parser.parseURL('https://www.reddit.com/.rss', function(err, feed) {
  console.log(feed.extendedDescription);

  feed.items.forEach(function(entry) {
    console.log(entry.coAuthor + ':' + entry.subtitle);
  })
})
```
To rename fields, you can pass in an array with two items, in the format [fromField, toField]:
```js
let parser = new Parser({
  customFields: {
    item: [
      ['dc:coAuthor', 'coAuthor'],
    ]
  }
})
```
To pass additional flags, provide an object as the third array item. Currently there are two such flags:
- `keepArray` (false) - set to true to return all values for fields that can have multiple entries.
- `includeSnippet` (false) - set to true to add an additional field, `${toField}Snippet`, with HTML stripped out

```js
let parser = new Parser({
  customFields: {
    item: [
      ['media:content', 'media:content', {keepArray: true}],
    ]
  }
})
```
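With `keepArray: true` the field is returned as an array even when the feed has a single entry, so downstream code can iterate unconditionally. A minimal usage sketch (the feed URL is a placeholder):

```js
let Parser = require('rss-parser');
let parser = new Parser({
  customFields: {
    item: [['media:content', 'media:content', {keepArray: true}]]
  }
});

parser.parseURL('https://example.com/feed.xml').then(feed => {
  feed.items.forEach(item => {
    // every media:content entry is preserved, not just the first one
    (item['media:content'] || []).forEach(media => console.log(media));
  });
});
```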
If your RSS feed doesn't contain a `<rss>` tag with a version attribute, you can pass a `defaultRSS` option for the Parser to use:
```js
let parser = new Parser({
  defaultRSS: 2.0
});
```
rss-parser uses xml2js to parse XML. You can pass options to `new xml2js.Parser()` by specifying `options.xml2js`:
```js
let parser = new Parser({
  xml2js: {
    emptyTag: '--EMPTY--',
  }
});
```
You can set the amount of time (in milliseconds) to wait before the HTTP request times out (default 60 seconds):
```js
let parser = new Parser({
  timeout: 1000,
});
```
You can pass headers to the HTTP request:
```js
let parser = new Parser({
  headers: {'User-Agent': 'something different'},
});
```
By default, `parseURL` will follow up to five redirects. You can change this with `options.maxRedirects`:
```js
let parser = new Parser({maxRedirects: 100});
```
rss-parser uses the Node `http`/`https` modules to make requests. You can pass options to `http.get()`/`https.get()` by specifying `options.requestOptions`. For example, to allow an unauthorized certificate:
```js
let parser = new Parser({
  requestOptions: {
    rejectUnauthorized: false
  }
});
```
Contributions are welcome! If you are adding a feature or fixing a bug, please be sure to add a test case.
The tests run the RSS parser for several sample RSS feeds in test/input and outputs the resulting JSON into test/output. If there are any changes to the output files the tests will fail.
To check if your changes affect the output of any test cases, run:

```
npm test
```
To update the output files with your changes, run:

```
WRITE_GOLDEN=true npm test
```
To publish a new release:

```
npm run build
git commit -a -m "Build distribution"
npm version minor # or major/patch
npm publish
git push --follow-tags
```