feed, feedparser, rss, and rss-parser are npm packages that handle RSS and Atom feed operations in JavaScript applications. feed and rss focus on generating valid RSS, Atom, and JSON Feed output from structured data, while feedparser and rss-parser specialize in parsing existing XML-based feeds into usable JavaScript objects. These tools enable developers to syndicate content or consume external feeds in Node.js or browser environments, though their capabilities, maintenance status, and architectural approaches differ significantly.
When building applications that publish or consume syndicated content, you’ll likely encounter RSS, Atom, or JSON Feed formats. The four packages under review fall into two distinct categories: feed generators (feed, rss) and feed parsers (feedparser, rss-parser). Understanding their roles, capabilities, and current state is crucial for making the right architectural choice.
feed and rss help you produce standardized feed documents from your content.
```js
// Using `feed` to generate multiple formats
import { Feed } from 'feed';

const feed = new Feed({
  title: 'My Blog',
  description: 'Tech insights',
  id: 'https://example.com/',
  link: 'https://example.com/',
  language: 'en',
  feedLinks: {
    rss2: 'https://example.com/rss.xml',
    atom: 'https://example.com/atom.xml'
  }
});

feed.addItem({
  title: 'Post 1',
  id: 'https://example.com/post1',
  link: 'https://example.com/post1',
  description: 'Content here',
  date: new Date()
});

const rssFeed = feed.rss2();   // RSS 2.0 string
const atomFeed = feed.atom1(); // Atom 1.0 string
const jsonFeed = feed.json1(); // JSON Feed 1.0 string
```
```js
// Using `rss` (RSS 2.0 only)
import RSS from 'rss';

const feed = new RSS({
  title: 'My Blog',
  description: 'Tech insights',
  feed_url: 'https://example.com/rss.xml',
  site_url: 'https://example.com/'
});

feed.item({
  title: 'Post 1',
  description: 'Content here',
  url: 'https://example.com/post1',
  date: new Date()
});

const rssString = feed.xml(); // Only RSS 2.0
```
rss-parser and the deprecated feedparser help you read and extract data from existing feed URLs or XML strings.
```js
// Using `rss-parser` (modern, maintained)
import Parser from 'rss-parser';

const parser = new Parser();
const feed = await parser.parseURL('https://example.com/feed.xml');
console.log(feed.title);
feed.items.forEach(item => console.log(item.title));

// Also works with raw XML
const feedFromXml = await parser.parseString(xmlString);
```
```js
// Using `feedparser` (deprecated — do not use)
// const FeedParser = require('feedparser');
// const feedparser = new FeedParser();
// request('http://example.com/feed.xml')
//   .pipe(feedparser)
//   .on('readable', function() {
//     let item;
//     while (item = this.read()) {
//       console.log(item.title);
//     }
//   });
// This approach is outdated and unsupported.
```
⚠️ Critical Note:
feedparser is officially deprecated. Its npm page states: "This package is no longer maintained." Do not use it in new projects.
| Package | RSS 2.0 | Atom 1.0 | JSON Feed | RDF |
|---|---|---|---|---|
| feed | ✅ | ✅ | ✅ | ❌ |
| rss | ✅ | ❌ | ❌ | ❌ |
| rss-parser | ✅ | ✅ | ❌ | ✅ |
| feedparser | ✅ | ✅ | ❌ | ✅ |
If you need to generate Atom or JSON Feed, only feed supports them. If you’re parsing RDF feeds, both rss-parser and the deprecated feedparser handle them, but only rss-parser is safe to use.
feed offers a fluent, object-oriented API with strong TypeScript definitions. It validates inputs and escapes content automatically, reducing XSS risks.
rss uses a simpler, procedural style. It’s easy to get started but lacks built-in escaping — you must sanitize content yourself.
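Since rss leaves escaping to you, sanitize user-generated strings before passing them to feed.item(). A minimal sketch (the escapeXml helper here is hypothetical; neither package provides it):

```js
// Hypothetical helper: escape the five XML-significant characters.
// Order matters: ampersands must be escaped first, or the entities
// produced by later replacements would be double-escaped.
function escapeXml(str) {
  return str
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&apos;');
}

const safeTitle = escapeXml('Tips & tricks for <script> tags');
// safeTitle === 'Tips &amp; tricks for &lt;script&gt; tags'
```

For real applications, a vetted sanitization library is preferable to a hand-rolled helper.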
rss-parser provides a promise-based interface with consistent output regardless of input format. It normalizes fields like pubDate → isoDate and dc:creator → creator, making consumption predictable.
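The pubDate → isoDate normalization, for example, converts the RFC 822 dates used by RSS 2.0 into ISO 8601 strings. Plain JavaScript Date can illustrate the behavior (a sketch, not rss-parser's actual implementation):

```js
// RSS 2.0 <pubDate> values are RFC 822 dates; rss-parser exposes an
// ISO 8601 `isoDate` alongside them. Date performs the same conversion:
const pubDate = 'Thu, 12 Nov 2015 21:16:39 +0000';
const isoDate = new Date(pubDate).toISOString();
// isoDate === '2015-11-12T21:16:39.000Z'
```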
feedparser used a streaming, event-driven model based on Node.js streams. While memory-efficient for large feeds, this pattern is harder to use and doesn’t work in browsers.
Compatibility:

- feed: Works in Node.js and browsers (via bundlers). No external dependencies.
- rss: Pure JavaScript, works everywhere.
- rss-parser: Works in Node.js and browsers. In browsers, you must provide XML text (due to CORS); it can't fetch URLs directly.
- feedparser: Node.js only (relies on streams and HTTP modules).

Maintenance and security:

- feed: Actively maintained, auto-escapes HTML in titles/descriptions.
- rss: Minimal maintenance, but stable; you must escape user-generated content.
- rss-parser: Actively maintained, uses xml2js under the hood with secure defaults.
- feedparser: Deprecated, contains unpatched vulnerabilities, and uses outdated dependencies.

You run a Next.js blog and want to offer RSS, Atom, and JSON Feed.
Use feed:

```js
// Next.js API route
import { Feed } from 'feed';

export default function handler(req, res) {
  const feed = new Feed({ /* ... */ });
  posts.forEach(post => feed.addItem({ /* ... */ }));

  if (req.query.format === 'atom') {
    res.setHeader('Content-Type', 'application/atom+xml');
    res.send(feed.atom1());
  } else if (req.query.format === 'json') {
    res.setHeader('Content-Type', 'application/json');
    res.send(feed.json1());
  } else {
    res.setHeader('Content-Type', 'application/rss+xml');
    res.send(feed.rss2());
  }
}
```
You’re building a feed reader that pulls from various RSS/Atom endpoints.
Use rss-parser:

```js
import Parser from 'rss-parser';

const parser = new Parser();
const urls = ['https://a.com/feed', 'https://b.com/atom'];
const feeds = await Promise.all(
  urls.map(url => parser.parseURL(url))
);
const allItems = feeds.flatMap(feed =>
  feed.items.map(item => ({
    title: item.title,
    source: feed.title,
    link: item.link
  }))
);
```
You have a small Eleventy or Hugo-like site and only need basic RSS 2.0.
Use rss:

```js
import fs from 'fs';
import RSS from 'rss';

const feed = new RSS({
  title: 'My Site',
  site_url: 'https://mysite.com',
  feed_url: 'https://mysite.com/rss.xml'
});

posts.forEach(post => feed.item({
  title: post.title,
  description: post.excerpt,
  url: `https://mysite.com${post.url}`,
  date: post.date
}));

fs.writeFileSync('rss.xml', feed.xml());
```
| Feature | feed | rss | rss-parser | feedparser |
|---|---|---|---|---|
| Status | ✅ Active | ✅ Stable | ✅ Active | ❌ Deprecated |
| Primary Role | Generator | Generator | Parser | Parser |
| RSS 2.0 | ✅ | ✅ | ✅ | ✅ |
| Atom 1.0 | ✅ | ❌ | ✅ | ✅ |
| JSON Feed | ✅ | ❌ | ❌ | ❌ |
| Browser Support | ✅ | ✅ | ✅ (with XML text) | ❌ |
| Auto-Escaping | ✅ | ❌ | N/A | ❌ |
| TypeScript | ✅ | ❌ | ✅ | ❌ |
- Generating feeds: use feed for full format support and safety, or rss for minimal RSS-only needs.
- Parsing feeds: use rss-parser. Never use feedparser.

These tools solve different halves of the syndication problem. Pick the right one for your side of the equation, and always verify that your chosen package is actively maintained and secure.
Choose rss-parser if your primary need is parsing RSS, Atom, or RDF feeds from external sources into consistent JavaScript objects. It works in both Node.js and browsers, normalizes fields across feed types, and supports custom XML parsing options. Best for aggregators, readers, or any app consuming third-party feeds.
Avoid feedparser in new projects — it is officially deprecated per its npm page and GitHub repository. While it was once a streaming SAX-based parser for RSS and Atom feeds in Node.js, it hasn't been updated in years and lacks support for modern JavaScript features, security patches, and feed format variations. Use rss-parser instead for parsing needs.
Choose feed if you need a modern, well-maintained library for generating RSS 2.0, Atom 1.0, and JSON Feed 1.0 from JavaScript objects. It provides a clean, fluent API with strong TypeScript support and handles edge cases like escaping and date formatting correctly. Ideal for blogs, news sites, or any application that needs to publish machine-readable content feeds.
Choose rss if you only need to generate RSS 2.0 feeds (not Atom or JSON Feed) and prefer a minimal, dependency-free solution. It’s a lightweight generator with a straightforward API, but it doesn’t support newer feed standards or advanced metadata. Suitable for simple use cases where bundle size is critical and Atom/JSON Feed isn’t required.
A small library for turning RSS XML feeds into JavaScript objects.
```sh
npm install --save rss-parser
```
You can parse RSS from a URL (parser.parseURL) or an XML string (parser.parseString).
Both callbacks and Promises are supported.
Here's an example in NodeJS using Promises with async/await:
```js
let Parser = require('rss-parser');
let parser = new Parser();

(async () => {
  let feed = await parser.parseURL('https://www.reddit.com/.rss');
  console.log(feed.title);

  feed.items.forEach(item => {
    console.log(item.title + ':' + item.link)
  });
})();
```
When using TypeScript, you can set a type to control the custom fields:
```ts
import Parser from 'rss-parser';

type CustomFeed = {foo: string};
type CustomItem = {bar: number};

const parser: Parser<CustomFeed, CustomItem> = new Parser({
  customFields: {
    feed: ['foo', 'baz'],
    //            ^ will error because `baz` is not a key of CustomFeed
    item: ['bar']
  }
});

(async () => {
  const feed = await parser.parseURL('https://www.reddit.com/.rss');
  console.log(feed.title); // feed will have a `foo` property, typed as a string

  feed.items.forEach(item => {
    console.log(item.title + ':' + item.link) // item will have a `bar` property, typed as a number
  });
})();
```
We recommend using a bundler like webpack, but we also provide pre-built browser distributions in the dist/ folder. If you use the pre-built distribution, you'll need a polyfill for Promise support.
Here's an example in the browser using callbacks:
```html
<script src="/node_modules/rss-parser/dist/rss-parser.min.js"></script>
<script>
// Note: some RSS feeds can't be loaded in the browser due to CORS security.
// To get around this, you can use a proxy.
const CORS_PROXY = "https://cors-anywhere.herokuapp.com/"

let parser = new RSSParser();
parser.parseURL(CORS_PROXY + 'https://www.reddit.com/.rss', function(err, feed) {
  if (err) throw err;
  console.log(feed.title);
  feed.items.forEach(function(entry) {
    console.log(entry.title + ':' + entry.link);
  })
})
</script>
```
A few minor breaking changes were made in v3. Here's what you need to know:
- You now need to construct a `new Parser()` before calling `parseString` or `parseURL`
- `parseFile` is no longer available (for better browser support)
- `options` are now passed to the `Parser` constructor
- `parsed.feed` is now just `feed` (top-level object removed)
- `feed.entries` is now `feed.items` (to better match RSS XML)

Check out the full output format in test/output/reddit.json
```yaml
feedUrl: 'https://www.reddit.com/.rss'
title: 'reddit: the front page of the internet'
description: ""
link: 'https://www.reddit.com/'
items:
  - title: 'The water is too deep, so he improvises'
    link: 'https://www.reddit.com/r/funny/comments/3skxqc/the_water_is_too_deep_so_he_improvises/'
    pubDate: 'Thu, 12 Nov 2015 21:16:39 +0000'
    creator: "John Doe"
    content: '<a href="http://example.com">this is a link</a> & <b>this is bold text</b>'
    contentSnippet: 'this is a link & this is bold text'
    guid: 'https://www.reddit.com/r/funny/comments/3skxqc/the_water_is_too_deep_so_he_improvises/'
    categories:
      - funny
    isoDate: '2015-11-12T21:16:39.000Z'
```
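The contentSnippet value seen above is derived from content by stripping tags and unescaping entities. A simplified sketch of that derivation (not rss-parser's actual implementation, which handles far more entities):

```js
// Simplified sketch: strip tags, then unescape a few common entities
// (ampersand last, to avoid double-unescaping).
function toSnippet(html) {
  return html
    .replace(/<[^>]*>/g, '')
    .replace(/&lt;/g, '<')
    .replace(/&gt;/g, '>')
    .replace(/&quot;/g, '"')
    .replace(/&amp;/g, '&');
}

const snippet = toSnippet(
  '<a href="http://example.com">this is a link</a> & <b>this is bold text</b>'
);
// snippet === 'this is a link & this is bold text'
```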
- The `contentSnippet` field strips out HTML tags and unescapes HTML entities
- The `dc:` prefix will be removed from all fields
- Both `dc:date` and `pubDate` will be available in ISO 8601 format as `isoDate`
- If `author` is specified, but not `dc:creator`, `creator` will be set to `author` (see article)
- `updated` becomes `lastBuildDate` for consistency

If your RSS feed contains fields that aren't currently returned, you can access them using the `customFields` option.
```js
let parser = new Parser({
  customFields: {
    feed: ['otherTitle', 'extendedDescription'],
    item: ['coAuthor','subtitle'],
  }
});

parser.parseURL('https://www.reddit.com/.rss', function(err, feed) {
  console.log(feed.extendedDescription);

  feed.items.forEach(function(entry) {
    console.log(entry.coAuthor + ':' + entry.subtitle);
  })
})
```
To rename fields, you can pass in an array with two items, in the format [fromField, toField]:
```js
let parser = new Parser({
  customFields: {
    item: [
      ['dc:coAuthor', 'coAuthor'],
    ]
  }
})
```
To pass additional flags, provide an object as the third array item. Currently there are two such flags:

- `keepArray` (false) - set to true to return all values for fields that can have multiple entries.
- `includeSnippet` (false) - set to true to add an additional field, `${toField}Snippet`, with HTML stripped out

```js
let parser = new Parser({
  customFields: {
    item: [
      ['media:content', 'media:content', {keepArray: true}],
    ]
  }
})
```
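Similarly, includeSnippet adds a `${toField}Snippet` field with HTML stripped out. A sketch of the option's shape (the source and target field names here are hypothetical):

```js
// Hypothetical mapping: copy `content:encoded` into `summary`; with
// includeSnippet, each matching item also gets `summarySnippet`
// (the same text with HTML stripped).
const options = {
  customFields: {
    item: [
      ['content:encoded', 'summary', {includeSnippet: true}],
    ]
  }
};
// let parser = new Parser(options);
```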
If your RSS Feed doesn't contain a <rss> tag with a version attribute,
you can pass a defaultRSS option for the Parser to use:
```js
let parser = new Parser({
  defaultRSS: 2.0
});
```
rss-parser uses xml2js
to parse XML. You can pass these options
to new xml2js.Parser() by specifying options.xml2js:
```js
let parser = new Parser({
  xml2js: {
    emptyTag: '--EMPTY--',
  }
});
```
You can set the amount of time (in milliseconds) to wait before the HTTP request times out (default 60 seconds):
```js
let parser = new Parser({
  timeout: 1000,
});
```
You can pass headers to the HTTP request:
```js
let parser = new Parser({
  headers: {'User-Agent': 'something different'},
});
```
By default, parseURL will follow up to five redirects. You can change this
with options.maxRedirects.
```js
let parser = new Parser({maxRedirects: 100});
```
rss-parser uses http/https module
to do requests. You can pass these options
to http.get()/https.get() by specifying options.requestOptions:
For example, to allow an unauthorized certificate:

```js
let parser = new Parser({
  requestOptions: {
    rejectUnauthorized: false
  }
});
```
Contributions are welcome! If you are adding a feature or fixing a bug, please be sure to add a test case.
The tests run the RSS parser for several sample RSS feeds in test/input and outputs the resulting JSON into test/output. If there are any changes to the output files the tests will fail.
To check if your changes affect the output of any test cases, run
```sh
npm test
```
To update the output files with your changes, run
```sh
WRITE_GOLDEN=true npm test
```
```sh
npm run build
git commit -a -m "Build distribution"
npm version minor # or major/patch
npm publish
git push --follow-tags
```