Convert CSV files to JSON with no dependencies. Supports Node.js (sync and async) and browser environments, with full RFC 4180 compliance.
Transform CSV data into JSON with a simple, chainable API. Choose your implementation style:
- **RFC 4180 Compliant** - Proper handling of quoted fields, delimiters, newlines, and escape sequences
- **Zero Dependencies** - No external packages required
- **Full TypeScript Support** - Type definitions included for all APIs
- **Flexible Configuration** - Custom delimiters, encoding, trimming, and more
- **Method Chaining** - Fluent API for readable code
- **Large File Support** - Stream processing for memory-efficient handling
- **Comprehensive Error Handling** - Detailed, actionable error messages with solutions (see ERROR_HANDLING.md)
RFC 4180 is the IETF standard specification for CSV (Comma-Separated Values) files. This library is fully compliant with RFC 4180, ensuring proper handling of:
| Aspect | RFC 4180 Specification |
|---|---|
| Default Delimiter | Comma (,) |
| Record Delimiter | CRLF (\r\n) or LF (\n) |
| Quote Character | Double-quote (") |
| Quote Escaping | Double quotes ("") |
```csv
firstName,lastName,email
"Smith, John",Smith,john@example.com
Jane,Doe,jane@example.com
"Cooper, Andy",Cooper,andy@company.com
```
Note that quoted fields containing commas are handled correctly. See RFC4180_MIGRATION_GUIDE.md for breaking changes and migration details.
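The quoting rules from the table above can be illustrated with a small standalone parser for a single record. This is a simplified sketch of the RFC 4180 algorithm, not this library's actual implementation, and it assumes one record with no embedded newlines:

```js
// Split one CSV record into fields per RFC 4180 quoting rules.
// Simplified sketch: assumes a single record with no embedded newlines.
function splitRecord(line, delimiter = ',') {
  const fields = [];
  let field = '';
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"') {
        if (line[i + 1] === '"') { field += '"'; i++; } // "" -> literal quote
        else inQuotes = false;                          // closing quote
      } else {
        field += ch;
      }
    } else if (ch === '"') {
      inQuotes = true; // opening quote
    } else if (ch === delimiter) {
      fields.push(field);
      field = '';
    } else {
      field += ch;
    }
  }
  fields.push(field);
  return fields;
}

console.log(splitRecord('"Smith, John",Smith,john@example.com'));
// [ 'Smith, John', 'Smith', 'john@example.com' ]
```

The key detail is that a delimiter only splits fields while outside quotes, and a doubled quote inside a quoted field decodes to a literal quote character.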
```shell
npm install convert-csv-to-json
```
```js
// Sync
const csvToJson = require('convert-csv-to-json');
const json = csvToJson.getJsonFromCsv('input.csv');
```

```js
// Async
const csvToJson = require('convert-csv-to-json');
const json = await csvToJson.getJsonFromCsvAsync('input.csv');
```

```js
// Browser
const convert = require('convert-csv-to-json');
const json = await convert.browser.parseFile(file);
```
| Implementation | Use Case | Learn More |
|---|---|---|
| Sync API | Simple, blocking operations | Read SYNC.md |
| Async API | Concurrent operations, large files | Read ASYNC.md |
| Browser API | Client-side file parsing | Read BROWSER.md |
```js
const json = csvToJson.csvStringToJson('name,age\nAlice,30');
```
```js
const json = csvToJson
  .fieldDelimiter(';')
  .getJsonFromCsv('input.csv');
```
```js
const json = csvToJson
  .formatValueByType()
  .getJsonFromCsv('input.csv');
// Converts "30" → 30, "true" → true, etc.
```
```js
const json = csvToJson
  .supportQuotedField(true)
  .getJsonFromCsv('input.csv');
```
```js
const files = ['file1.csv', 'file2.csv', 'file3.csv'];
const results = await Promise.all(
  files.map(f => csvToJson.getJsonFromCsvAsync(f))
);
```
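`Promise.all` starts every conversion at once, which may be undesirable with hundreds of files. A generic concurrency-limiting helper can cap how many run at a time; this sketch is independent of the library and uses a stubbed async task in place of a real call such as `getJsonFromCsvAsync`:

```js
// Run an async task over items with at most `limit` tasks in flight at once.
// Generic helper, not part of convert-csv-to-json.
async function mapWithConcurrency(items, limit, task) {
  const results = new Array(items.length);
  let next = 0;
  async function worker() {
    while (next < items.length) {
      const i = next++;           // claim the next index
      results[i] = await task(items[i]);
    }
  }
  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    worker
  );
  await Promise.all(workers);
  return results;                 // results in original input order
}

// Usage sketch: replace the stub with a real parse call.
const files = ['file1.csv', 'file2.csv', 'file3.csv'];
mapWithConcurrency(files, 2, async (f) => `parsed:${f}`)
  .then((r) => console.log(r));
```

Because JavaScript is single-threaded, the shared `next` counter needs no locking: each worker claims an index synchronously before awaiting its task.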
All APIs (Sync, Async and Browser) support the same configuration methods:
- `fieldDelimiter(char)` - Set field delimiter (default: `,`)
- `formatValueByType()` - Auto-convert numbers, booleans
- `supportQuotedField(bool)` - Handle quoted fields with embedded delimiters
- `indexHeader(num)` - Specify header row (default: 0)
- `trimHeaderFieldWhiteSpace(bool)` - Remove spaces from headers
- `parseSubArray(delim, sep)` - Parse delimited arrays
- `mapRows(fn)` - Transform, filter, or enrich each row
- `utf8Encoding()`, `latin1Encoding()`, etc. - Set file encoding

**`fieldDelimiter(char)` - Set field delimiter (default: `,`)**

```js
// Semicolon-delimited
csvToJson.fieldDelimiter(';').getJsonFromCsv('data.csv');

// Tab-delimited
csvToJson.fieldDelimiter('\t').getJsonFromCsv('data.tsv');

// Pipe-delimited
csvToJson.fieldDelimiter('|').getJsonFromCsv('data.psv');
```
**`formatValueByType()` - Auto-convert numbers, booleans**

```js
// Input: name,age,active
//        John,30,true
csvToJson.formatValueByType().getJsonFromCsv('data.csv');
// Output: { name: 'John', age: 30, active: true }
```
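The coercion rule can be shown in isolation. This is a hypothetical sketch of the idea, not the library's exact logic:

```js
// Convert a raw CSV string value to a typed JS value:
// booleans and numbers are recognized, everything else stays a string.
function coerceValue(raw) {
  if (raw === 'true') return true;
  if (raw === 'false') return false;
  if (raw.trim() !== '' && !Number.isNaN(Number(raw))) return Number(raw);
  return raw;
}

console.log(coerceValue('30'));   // 30
console.log(coerceValue('true')); // true
console.log(coerceValue('John')); // 'John'
```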
**`supportQuotedField(bool)` - Handle quoted fields with embedded delimiters**

```js
// Input: name,description
//        "Smith, John","He said ""Hello"""
csvToJson.supportQuotedField(true).getJsonFromCsv('data.csv');
// Output: { name: 'Smith, John', description: 'He said "Hello"' }
```
**`indexHeader(num)` - Specify header row (default: 0)**

```js
// If headers are in row 2 (3rd line):
csvToJson.indexHeader(2).getJsonFromCsv('data.csv');
```
**`trimHeaderFieldWhiteSpace(bool)` - Remove spaces from headers**

```js
// Input: " First Name ", " Last Name "
csvToJson.trimHeaderFieldWhiteSpace(true).getJsonFromCsv('data.csv');
// Output: { FirstName: 'John', LastName: 'Doe' }
```
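Note that, judging from the example output, interior spaces are removed as well as leading and trailing ones (`" First Name "` becomes `FirstName`). Under that assumption, the effect on a single header name is roughly:

```js
// Remove all whitespace from a header field (sketch, assuming the
// semantics shown in the example: " First Name " -> "FirstName").
function trimHeader(header) {
  return header.replace(/\s+/g, '');
}

console.log(trimHeader(' First Name ')); // 'FirstName'
```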
**`parseSubArray(delim, sep)` - Parse delimited arrays**

```js
// Input: name,tags
//        John,*javascript,nodejs,typescript*
csvToJson.parseSubArray('*', ',').getJsonFromCsv('data.csv');
// Output: { name: 'John', tags: ['javascript', 'nodejs', 'typescript'] }
```
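The transformation behind this option can be sketched for a single value. This is an illustration under the semantics shown above, not the library's code:

```js
// If a value is wrapped in the array delimiter (e.g. '*'),
// split its contents on the separator (e.g. ',') into an array;
// otherwise return the value unchanged.
function parseSubArrayValue(raw, delim = '*', sep = ',') {
  if (
    raw.length >= delim.length * 2 &&
    raw.startsWith(delim) &&
    raw.endsWith(delim)
  ) {
    return raw.slice(delim.length, -delim.length).split(sep);
  }
  return raw;
}

console.log(parseSubArrayValue('*javascript,nodejs,typescript*'));
// [ 'javascript', 'nodejs', 'typescript' ]
console.log(parseSubArrayValue('John')); // 'John'
```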
**`mapRows(fn)` - Transform, filter, or enrich each row**

```js
// Filter out rows that don't match a condition
const result = csvToJson
  .fieldDelimiter(',')
  .mapRows((row) => {
    // Only keep rows where age >= 30
    if (parseInt(row.age) >= 30) {
      return row;
    }
    return null; // Filters out this row
  })
  .getJsonFromCsv('input.csv');
```
See mapRows Feature - Usage Guide.
**`utf8Encoding()`, `latin1Encoding()`, etc. - Set file encoding**

```js
// UTF-8 encoding
csvToJson.utf8Encoding().getJsonFromCsv('data.csv');

// Latin-1 encoding
csvToJson.latin1Encoding().getJsonFromCsv('data.csv');

// Custom encoding
csvToJson.customEncoding('ucs2').getJsonFromCsv('data.csv');
```
See SYNC.md, ASYNC.md or BROWSER.md for complete configuration details.
```js
const csvToJson = require('convert-csv-to-json');

async function processCSV() {
  const data = await csvToJson
    .fieldDelimiter(',')
    .formatValueByType()
    .supportQuotedField(true)
    .getJsonFromCsvAsync('data.csv');

  console.log(`Parsed ${data.length} records`);
  return data;
}
```
Install dependencies:

```shell
npm install
```

Run tests:

```shell
npm test
```

Debug tests:

```shell
npm run test-debug
```
See CI/CD GitHub Action.
When pushing to the master branch:
- `[MAJOR]` in the commit message for a major release (e.g., v1.0.0 → v2.0.0)
- `[PATCH]` in the commit message for a patch release (e.g., v1.0.0 → v1.0.1)

CSVtoJSON is licensed under the MIT License.
Found a bug or need a feature? Open an issue on GitHub.
Follow me and consider starring the project to show your support.
If you find this project helpful and would like to support its development:
BTC: 37vdjQhbaR7k7XzhMKWzMcnqUxfw1njBNk