
Top 5 Javascript CSV Parsers

Javascript CSV Parsers allow you to ingest a CSV file and work with its JSON representation for advanced typing and manipulation. This deep dive compares the top 5 parsers and helps you decide when to reach for each one.

May 13, 2024

Comma-Separated Values (CSV) are a streamlined text file format that encapsulates tabular data in plain text, where each line represents a single data record. This tabular representation makes them ideal for importing and exporting to spreadsheet applications like Microsoft Excel and Google Sheets or for storing relational data more broadly.

Unlike JSON, CSVs lack inherent support for complex data types and do not support operations such as filtering and direct data manipulation in their native form. To solve this problem, we have CSV parsers, which take our CSV files and turn them into a JSON representation more suitable for manipulation. 
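To make the CSV-to-JSON idea concrete, here is a deliberately naive sketch of the transformation every library below performs. It splits on commas and ignores quoting and escaping, which real parsers handle correctly:

```javascript
// Naive illustration only: splits on commas and assumes no quoted fields.
// Real parsers (papaparse, csv-parse, etc.) handle quoting, escaping, and streaming.
function naiveCsvToJson(text) {
  const [headerLine, ...lines] = text.trim().split('\n');
  const headers = headerLine.split(',');
  return lines.map(line => {
    const values = line.split(',');
    // Pair each header with the value in the same column
    return Object.fromEntries(headers.map((h, i) => [h, values[i]]));
  });
}

const csvText = 'name,city\nAda,London\nGrace,New York';
console.log(naiveCsvToJson(csvText));
// [ { name: 'Ada', city: 'London' }, { name: 'Grace', city: 'New York' } ]
```

Once rows are plain objects like these, filtering and manipulation become ordinary array operations.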

Today, we’re focusing on Javascript CSV Parsers, which allow you to ingest a CSV file and work with its JSON representation for advanced typing and manipulation.

Note: We will be referencing performance numbers found in this excellent repository dedicated to CSV performance benchmarking. While a great reference, your performance may vary depending on configurations like chunk size. In this article, we will focus on how the parsers handle large CSVs (10 columns, 1 million rows) for both quoted and unquoted CSVs.

1. Papaparse

Papaparse has a simple API, but don’t let that fool you. It’s one of the fastest, most feature-rich open-source CSV parsers. 

Feature Highlights

Papaparse includes a lot of “nice-to-haves,” like auto-detecting delimiters and dynamic typing for booleans and numbers. It also features advanced functionality, like multi-threading (via the Web Workers API), and file streaming, both of which make it ideal for processing large CSVs.

Performance

Papaparse performs particularly well when parsing CSVs with quotes, taking just 5.5 seconds to parse a CSV file with ten columns and 1 million rows. When parsing CSVs without quotes, performance was considerably worse, taking a whopping 18 seconds.

Edge Cases

This is a strength of papaparse. The package features a comprehensive set of configuration options, a forgiving parser, and solid default settings. Malformed CSV files that cause errors in the parsing process are straightforward to handle gracefully. 

Best For

  • When you need a full-featured CSV parsing library

Weaknesses

  • As we saw in the performance section, parsing CSVs without quotes had a major impact on the performance of papaparse on very large CSVs. 
  • Another caveat to watch out for is the automatic type casting, which can sometimes incorrectly guess the data type.
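The type-casting caveat is easy to demonstrate with a hypothetical inference function that mimics what dynamicTyping-style options do (this is an illustration, not papaparse's actual code). Values like ZIP codes with leading zeros are silently mangled:

```javascript
// Hypothetical sketch of dynamicTyping-style inference (not papaparse's actual code)
function inferType(value) {
  if (value === 'true') return true;
  if (value === 'false') return false;
  // Number('') is 0, so guard against empty strings
  if (value !== '' && !Number.isNaN(Number(value))) return Number(value);
  return value;
}

console.log(inferType('42'));    // 42 (number, as intended)
console.log(inferType('true'));  // true (boolean, as intended)
console.log(inferType('01234')); // 1234 -- the leading zero of a ZIP code is lost
```

If a column holds identifiers that merely look numeric, disable automatic typing for it and cast explicitly.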

Usage Example

  
Papa.parse(file, {
  delimiter: ";",        // Custom delimiter
  newline: "",           // Auto-detect newline
  header: true,          // First row is headers
  dynamicTyping: true,   // Automatically convert numbers and booleans
  skipEmptyLines: true,  // Skip empty lines
  step: function(result) {
    // Called for each row in the file
    processRow(result.data);
  },
  complete: function() {
    console.log("All rows successfully processed.");
    // Perform a final operation if needed
    finalizeProcessing();
  },
  error: function(error) {
    // Handle any errors
    console.error("Parsing error:", error.message);
  }
});

Package Stats

GitHub Stars: 11.7k

Package size (Unpacked): 260 kB

Weekly downloads: 1.3 million 

To install papaparse, use the following command in your CLI:

  
npm install papaparse

2. fast-csv

A parser that combines the @fast-csv/format and @fast-csv/parse packages, fast-csv is a popular, lightweight package for parsing and formatting CSVs.

Feature Highlights

Fast-csv is laser-focused on performance. It parses large CSVs through the Node.js Streams API, but also supports promises if that’s your pattern of choice. It is incredibly lightweight, at just 8.5 kB unpacked, and as its constituent packages suggest, it supports both parsing and formatting.

Performance

At 16 seconds for the quoted benchmark CSV and 14 seconds for the unquoted one, fast-csv trails papaparse on quoted parsing but edges it out on unquoted parsing.

Edge Cases

Fast-csv provides flexible error handling options. For example, when encountering lines that don’t have the right number of fields, it can be configured to skip these lines or abort the operation. The library can also apply transformations on the data as it’s being parsed, which could be leveraged to handle data inconsistencies during parsing.
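The skip-or-abort choice can be pictured with a plain-JS stand-in (a hypothetical illustration of the concept, not fast-csv's actual API):

```javascript
// Hypothetical illustration of 'skip' vs 'abort' strategies for short rows
function handleRows(rows, expectedFields, strategy) {
  const valid = [];
  for (const row of rows) {
    if (row.length === expectedFields) {
      valid.push(row);
    } else if (strategy === 'abort') {
      throw new Error(`Row has ${row.length} fields, expected ${expectedFields}`);
    }
    // strategy === 'skip': silently drop the malformed row
  }
  return valid;
}

const rows = [['a', 'b'], ['only-one'], ['c', 'd']];
console.log(handleRows(rows, 2, 'skip')); // [ [ 'a', 'b' ], [ 'c', 'd' ] ]
```

Skipping is convenient for dirty exports; aborting is safer when a short row signals real corruption.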

Best For

  • Node.js environments, stream processing, custom delimiters, and unquoted parsing.
  • If package size is a major concern, this footprint is hard to beat.

Weaknesses

Not best-in-class on performance, particularly on parsing quoted CSVs. Not a good option for parsing CSVs client-side, and lacks concurrent processing capabilities.

Usage Example

  
const fs = require('fs');
const path = require('path');
const fastcsv = require('fast-csv');

let stream = fs.createReadStream(path.resolve(__dirname, 'data.csv'));
let csvData = [];

let csvStream = fastcsv
  .parse()
  .on('data', function(row) {
    console.log(row);
    csvData.push(row);
  })
  .on('end', function() {
    console.log('Read finished');
    // Do something with csvData
  })
  .on('error', function(error) {
    console.error(error);
  });

stream.pipe(csvStream);

Package Stats

GitHub Stars: 11.7k

Package size (Unpacked): 8.5 kB

Weekly downloads: 960,000

To install fast-csv, run the following command in your CLI:

  
npm i -S fast-csv

3. SheetJS

SheetJS is a popular, full-featured library that focuses on reading, editing, and exporting spreadsheets for use with programs like Microsoft Excel, with CSV parsing as part of the package.

Feature Highlights

SheetJS runs in the browser, on servers (Node.js and Deno), and in Electron. It natively supports a range of Excel formats, can evaluate Excel formulas, merge cells, and can even read and write Excel chart information.

It can work with raw binary data and base64-encoded strings, which is great for handling files client-side.

Performance

While we don’t have specific benchmarks for SheetJS, generally speaking, it handles small to midsize CSVs exceptionally well. There are reports of some performance issues on larger files, including excessive memory usage and long processing times for particularly complex spreadsheets.

Edge Cases

SheetJS will try to read as much as it can from a file, even if it’s malformed. For corrupted files, SheetJS has modes that attempt to recover as much data as possible. In many cases, when SheetJS encounters an unexpected value or a missing expected element, it will use a sensible default or fallback to avoid throwing an error.

Best For

  • If you’re working with Excel files, this one is a no-brainer.
  • If you need a parser on the client and/or server.

Weaknesses

  • Potential fine-tuning required to stay performant if dealing with large, complex Excel files
  • Some functionality requires paying for the premium option.
  • It is the biggest package in our top 5, at 7.5MB.

Usage Example

  
const XLSX = require('xlsx');

// Sample data set
const data = [
  { Name: "John", City: "Seattle", Salary: 70000 },
  { Name: "Mike", City: "Los Angeles", Salary: 58000 },
  { Name: "Samantha", City: "New York", Salary: 65000 }
];

// Create a new workbook
const workbook = XLSX.utils.book_new();

// Convert the data to a worksheet
const worksheet = XLSX.utils.json_to_sheet(data);

// Append the worksheet to the workbook
XLSX.utils.book_append_sheet(workbook, worksheet, "Employees");

// Write the workbook to a file
XLSX.writeFile(workbook, 'output.xlsx');

console.log('Excel file created successfully.');

Package Stats

GitHub Stars: 33.8k

Package size (Unpacked): 7.5 MB

Weekly downloads: 1,400,000

To install SheetJS, run the following command in your CLI:

  
npm install xlsx

4. csv-parser

csv-parser is an efficient, streamlined library optimized for parsing CSV data quickly. It adds minimal overhead and is designed to be simple and lightweight, built on Node.js streams.

Feature Highlights

Like other parsers on this list, csv-parser is implemented as a transform stream in Node.js, which enables it to process data as it is being read, reducing memory overhead. It is also compliant with the RFC 4180 CSV standard and passes the csv-spectrum acid test suite, ensuring compatibility and correctness across a wide range of CSV variations.
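To see why RFC 4180 compliance matters, here is a minimal sketch of a field splitter that honors quoted fields (commas inside quotes, doubled quotes as escapes). This is the kind of input a naive `split(',')` gets wrong, and it is an illustration of the rules, not csv-parser's internals:

```javascript
// Minimal RFC 4180-style field splitter for a single record (illustration only)
function splitRecord(line) {
  const fields = [];
  let field = '';
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"') {
        if (line[i + 1] === '"') { field += '"'; i++; } // doubled quote = literal "
        else inQuotes = false;                          // closing quote
      } else {
        field += ch; // commas inside quotes are data, not separators
      }
    } else if (ch === '"') {
      inQuotes = true;
    } else if (ch === ',') {
      fields.push(field);
      field = '';
    } else {
      field += ch;
    }
  }
  fields.push(field);
  return fields;
}

console.log(splitRecord('a,"hello, world","she said ""hi"""'));
// [ 'a', 'hello, world', 'she said "hi"' ]
```

A full parser also has to handle line breaks inside quoted fields, which is where streaming implementations earn their keep.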

Performance

csv-parser goes toe-to-toe with papaparse on quoted CSVs, parsing the largest benchmark dataset in 5.5 seconds. On unquoted CSVs, csv-parser really shines. At ~5.5 seconds, it is nearly identical to the quoted benchmark, and is over 3x faster than papaparse.

Edge Cases

Basic features like skipping bad lines (with proper configuration) or handling unconventional delimiters come out of the box, but this library may lack some of the advanced configurability offered by other parsers on this list.

Best For

  • Applications that require fast and efficient parsing of CSV data, especially when working with Node.js streams
  • Performant processing of large CSV files

Weaknesses

  • The minimalistic design means it might lack some of the high-level features and conveniences offered by more full-featured CSV libraries, such as built-in support for complex transformations or integrations with external data sources.
  • It does not provide functionality for writing CSV files or manipulating CSV data beyond parsing.

Usage Example

  
const csv = require('csv-parser');
const fs = require('fs');

const results = [];

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', (data) => results.push(data))
  .on('end', () => {
    console.log(results);
    // handle the parsed data here
  });

Package Stats

GitHub Stars: 1.4k

Package size (Unpacked): 27.7 kB

Weekly downloads: 1,600,000

To install csv-parser, use the following command in your CLI:

  
npm install csv-parser

5. csv

The csv package is a project that provides CSV generation, parsing, transformation, and serialization for Node.js. It bundles four packages: csv-generate, csv-parse, csv-stringify, and stream-transform. The actual parsing is done by csv-parse, but as the names suggest, the project also includes a transformation framework and a CSV generator.

Feature Highlights

csv comes with a rich set of options to customize the parser’s behavior, including automatic column detection, custom column delimiters, and handling of quotes / escape characters. Data can be transformed synchronously or asynchronously, which is useful for manipulating data during the parsing/stringifying process. It can handle large files like other parsers on this list through its streaming interface.

Performance

The csv-parse portion of this library performed well on quoted CSVs, parsing the largest benchmark dataset in 10.3 seconds. You can expect similar performance on unquoted CSVs, clocking in at 9.5 seconds.

Edge Cases

csv includes detailed error messages out of the box, making troubleshooting straightforward. The library can skip empty lines or irregular records, and can be configured to handle other edge cases that occur in CSV files.

Best For

  • Developers who need a comprehensive toolkit for handling CSV data within the Node.js ecosystem.
  • Use cases where the modular nature of the library can be leveraged to minimize the package’s size.

Weaknesses

  • Relatively large (when the full library is imported) at ~2MB.
  • Because its functionality is spread out across a few different packages, there may be a steeper learning curve, and your code will likely be more verbose.

Usage Example

  
  
const fs = require('fs');
const { parse } = require('csv-parse');
const { stringify } = require('csv-stringify');

// Input and output file paths
const inputFilePath = 'users.csv';
const outputFilePath = 'copy_users.csv';

// Create a parser that assumes the first line of the CSV file contains column names
const parser = parse({ columns: true, trim: true });

// Create a stringifier to convert objects back into CSV format
const stringifier = stringify({ header: true });

// Create read and write streams
const inputStream = fs.createReadStream(inputFilePath);
const outputStream = fs.createWriteStream(outputFilePath);

// Pipe the streams: parse the input CSV, stringify it, and write it to the output file
inputStream
  .pipe(parser)
  .pipe(stringifier)
  .pipe(outputStream)
  .on('finish', () => {
    console.log('CSV file has been copied to copy_users.csv');
  });

// Handle stream errors
parser.on('error', error => console.error(error));
stringifier.on('error', error => console.error(error));
outputStream.on('error', error => console.error(error));

Package Stats

GitHub Stars: 3.7k

Package size (Unpacked): 2.01 MB

Weekly downloads: 800,000

To install csv, use the following command in your CLI:

  
npm install csv

Honorable Mentions

If none of the above libraries fit the bill, here are some honorable mentions that didn’t make our top 5.

  1. json-2-csv: A library that supports sub-documents and custom ordering of columns.
  2. csvtojson: Includes a command line utility, making it easy to parse local CSV files.
  3. nest-csv-parser: Nest.js wrapper for csv-parser. Pick this one up if you’re using the Nest framework.
  4. ya-csv: Event-based API with zero dependencies.
  5. skipper-csv: Parser adapter that supports multipart file uploads.

Choosing The Best Javascript CSV Parser

As with all libraries, it’s important to first evaluate your requirements and understand your use case. Here is a rough guide to aid in your decision-making process. 

First, assess the nature and volume of the data you’re dealing with. For small to medium-sized files, a simple and lightweight parser like csv-parser or fast-csv is a good fit. They require minimal setup and have straightforward APIs.

If your datasets are large or complex, with nested structures, varying encodings, or messy files, turn to a more robust solution like papaparse or csv. Their streaming capability will make processing large files memory-efficient, and they include built-in support for character encoding, custom delimiters, and more nuanced error handling. 

Second, assess your edge case requirements. If you’re expecting to see malformed data, make sure to ask the following questions before deciding on your parser:

  1. How does the parser handle incorrectly formatted lines or missing data?
  2. Is it tolerant of extra or missing delimiters, line breaks within cells, or unclosed quotes?
  3. How detailed are the error messages?
  4. How extensive is the documentation / How large is the community?

Which CSV parser is “best” is a subjective, contextual question. Define the needs of your project, the nature of your data, and the environments in which the parser will run. This will get you 90% of the way there. Lastly, consider how your team writes code. Is there a pattern or interface that aligns better with your existing codebase? Striving for codebase consistency will ensure your team can work efficiently and scale effectively.

Want a CSV Experience Users Will Actually Enjoy?

If performance and user experience are top concerns, OneSchema offers a powerful option for CSV parsing and importing. Validate and transform files up to 4GB in under a second with plug-and-play components for vanilla JavaScript, React, Angular and Vue projects. OneSchema goes beyond an exceptional developer experience by providing a rich CSV error correction toolset, empowering end users to clean and validate their data in a streamlined workflow.

Increase your onboarding conversion rates, save on engineering cycles, and prevent excessive support tickets with OneSchema. Get started for free.

Continue reading

Human with plants and dashboards

Make data imports effortless and efficient

Ready to get started?

November 15, 2023

Top 5 Javascript CSV Parsers

Javascript CSV Parsers allow you to ingest a CSV file and work with its JSON representation for advanced typing and manipulation. This deep dive focuses on comparing the top 5 parsers and guides you in making a decision on when to use one over the other.

Comma-Separated Values (CSV) are a streamlined text file format that encapsulates tabular data in plain text, where each line represents a single data record. This tabular representation makes them ideal for importing and exporting to spreadsheet applications like Microsoft Excel and Google Sheets or for storing relational data more broadly.

Unlike JSON, CSVs lack inherent support for complex data types and do not support operations such as filtering and direct data manipulation in their native form. To solve this problem, we have CSV parsers, which take our CSV files and turn them into a JSON representation more suitable for manipulation. 

Today, we’re focusing on Javascript CSV Parsers, which allow you to ingest a CSV file and work with its JSON representation for advanced typing and manipulation.

Note: We will be referencing performance numbers found in this excellent repository dedicated to CSV performance benchmarking. While a great reference, your performance may vary depending on configurations like chunk size. In this article, we will focus on how the parsers handle large CSVs (10 columns, 1 million rows) for both quoted and unquoted CSVs.

1. Papaparse

Papaparse has a simple API, but don’t let that fool you. It’s one of the fastest, most feature-rich open-source CSV parsers. 

Feature Highlights

Papaparse includes a lot of “nice-to-haves,” like auto-detecting delimiters and dynamic typing for booleans and numbers. It also features advanced functionality, like multi-threading (via the Web Workers API), and file streaming, both of which make it ideal for processing large CSVs.

Performance

Papaparse performs particularly well when parsing CSVs with quotes, taking just 5.5 seconds to parse a CSV file with ten columns and 1 million rows. When parsing CSVs without quotes, performance was considerably worse, taking a whopping 18 seconds.

Edge Cases

This is a strength of papaparse. The package features a comprehensive set of configuration options, a forgiving parser, and solid default settings. Malformed CSV files that cause errors in the parsing process are straightforward to handle gracefully. 

Best For

  • When you need a full-featured CSV parsing library

Weaknesses

  • As we saw in the performance section, parsing CSVs without quotes had a major impact on the performance of papaparse on very large CSVs. 
  • Another caveat to watch out for is the automatic type casting, which can sometimes incorrectly guess the data type.

Usage Example

  
Papa.parse(file, {    delimiter: ";",
   

  // Custom delimiter    
  newline: "",
   

  // Auto-detect newline    
  header: true,
   

  // First row is headers    
  dynamicTyping: true,

  // Automatically convert numbers and booleans    
  skipEmptyLines: true,

  // Skip empty lines    
  step: function(result) {

    // Called for each row in the file     
    processRow(result.data);    
  },
      

  complete: function() {      console.log("All rows successfully processed.");  

    // Perform a final operation if needed      
    finalizeProcessing();    
  },
      error: function(error) { 

    // Handle any errors      
    console.error("Parsing error:", error.message);    
  }  
});
}

Package Stats

Github Stars:  11.7k stars

Package size (Unpacked): 260 kB

Weekly downloads: 1.3 million 

To install papaparse, use the following command in your CLI:

  
npm install papaparse

2. fast-csv

A parser that combines the @fast-csv/format  and @fast-csv/parse packages, fast-csv is a popular, lightweight package for parsing and formatting CSVs.

Feature Highlights

Fast-csv is laser-focused on performance. It handles parsing large CSVs through the Node.js Streams API, but also supports promises if that’s your pattern of choice. It is incredibly lightweight, at just 8.5kb, and as its dependencies suggest, it supports parsing and formatting. 

Performance

At 16 seconds and 14 seconds for processing the benchmark CSV quoted and unquoted, it edges out papaparse for unquoted parsing. 

Edge Cases

Fast-csv provides flexible error handling options. For example, when encountering lines that don’t have the right number of fields, it can be configured to skip these lines or abort the operation. The library can also apply transformations on the data as it’s being parsed, which could be leveraged to handle data inconsistencies during parsing.

Best For

  • Use in Node.js environments, stream processing, custom delimiters and unquoted parsing. 
  • If package size is a major concern, this footprint is hard to beat.

Weaknesses

Not best-in-class on performance, particularly on parsing quoted CSVs. Not a good option for parsing CSVs client-side, and lacks concurrent processing capabilities.

Usage Example

  
const fs = require('fs');

const path = require('path');

const fastcsv = require('fast-csv');‍

let stream = fs.createReadStream(path.resolve(__dirname, 'data.csv'));

let csvData = [];‍let csvStream = fastcsv  .parse()  .on('data',

function(row) {    console.log(row);    csvData.push(row);  

})  

.on('end',

function() {    
    console.log('Read finished');
    
    // Do something with csvData  })  .on('error', function(error) {
    
    console.error(error);  });
    
    stream.pipe(csvStream);

Package Stats

Github Stars:  11.7k stars

Package size (Unpacked): 8.5kB

Weekly downloads: 960,000

To install fast-csv, run the following command in your CLI:

  
npm i -S fast-csv

3. SheetJS

SheetJS is a popular and full-featured CSV parser that focuses on reading, editing, and exporting spreadsheets for use with spreadsheet programs like Microsoft Excel.

Feature Highlights

SheetJS runs in the browser, on servers (Node.js and Deno), and Electron. It natively supports a range of Excel formats, can evaluate Excel formulas, merge cells, and can even read and write Excel chart information. 

It can work with raw binary data and base64 encoded strings, which is great for handling files client side. 

Performance

While we don’t have specific benchmarks for SheetJS, generally speaking, it handles small to midsize CSVs exceptionally well. There are reports of some performance issues on larger files, including excessive memory usage and long processing times for particularly complex spreadsheets.

Edge Cases

SheetJS will try to read as much as it can from a file, even if it’s malformed. For corrupted files, SheetJS has modes that attempt to recover as much data as possible. In many cases, when SheetJS encounters an unexpected value or a missing expected element, it will use a sensible default or fallback to avoid throwing an error.

Best For

  • If you’re working with excel files, this one is a no-brainer
  • If you need a parser on the client and/or server.

Weaknesses

  • Potential fine-tuning required to stay performant if dealing with large, complex Excel files
  • Some functionality requires paying for the premium option.
  • It is the biggest package in our top 5, at 7.5MB.

Usage Example

  
const XLSX = require('xlsx');  // Sample data set

const data = [
  {Name : "John", City : "Seattle", Salary : 70000},

  {Name : "Mike", City : "Los Angeles", Salary : 58000},

  {Name : "Samantha", City : "New York", Salary : 65000}
];

// Create a new workbook
const workbook = XLSX.utils.book_new();

// Convert the data to a worksheet
const worksheet = XLSX.utils.json_to_sheet(data);

// Append the worksheet to the workbook
XLSX.utils.book_append_sheet(workbook, worksheet, "Employees");

// Write the workbook to a file
XLSX.writeFile(workbook, 'output.xlsx');

console.log('Excel file created successfully.');

Package Stats

Github Stars:  33.8k stars

Package size (Unpacked): 7.5 MB

Weekly downloads: 1,400,000

To install SheetJS, run the following command in your CLI:

  
npm install xlsx

4. csv-parser

csv-parser is an efficient and streamlined library optimized for parsing CSV data quickly and effectively. It provides minimal overhead and is designed to be simple and lightweight for Node.js streams.

Feature Highlights

Like other parsers on this list, csv-parser is implemented as a transform stream in Node.js, which enables it to process data as it is being read, reducing memory overhead. It is also compliant with the RFC 4180 CSV standard and passes the csv-spectrum acid test suite, ensuring compatibility and correctness across a wide range of CSV variations.

Performance

csv-parser goes toe-to-toe with papaparse on quoted CSVs, parsing the largest benchmark dataset in 5.5 seconds. On unquoted CSVs, csv-parser really shines. At ~5.5 seconds, it is nearly identical to the quoted benchmark, and is over 3x faster than papaparse.

Edge Cases

Basic features like skipping bad lines (with proper configuration) or handling unconventional delimiters come out of the box, but this library may lack some of the advanced configurability offered by other parsers on this list.

Best For

  • Applications that require fast and efficient parsing of CSV data, especially when working with Node.js streams
  • Performant processing of large CSV files

Weaknesses

  • The minimalistic design means it might lack some of the high-level features and conveniences offered by more full-featured CSV libraries, such as built-in support for complex transformations or integrations with external data sources.
  • It does not provide functionality for writing CSV files or manipulating CSV data beyond parsing.

Usage Example

  
const csv = require('csv-parser');

const fs = require('fs');

const results = [];

fs.createReadStream('data.csv')  .pipe(csv())

.on('data', (data) = >results.push(data)) 

.on('end', () = >{
  console.log(results);

// handle the parsed data here  });

Package Stats

Github Stars: 1.4k

Package size (Unpacked): 27.7 kB

Weekly downloads: 1,600,000

To install csv-parser, use the following command in your CLI:

  
npm install csv-parser

5. csv

The csv package is a project that provides CSV generation, parsing, transformation and serialization for Node.js. It includes four packages - csv-generate (GitHub), csv-parse (GitHub), csv-stringify (GitHub), stream-transform (GitHub). The actual parsing is done by csv-parse, but as you can see, the package includes a transformation framework and CSV generator. 

Feature Highlights

csv comes with a rich set of options to customize the parser’s behavior, including automatic column detection, custom column delimiters, and handling of quotes / escape characters. Data can be transformed synchronously or asynchronously, which is useful for manipulating data during the parsing/stringifying process. It can handle large files like other parsers on this list through its streaming interface.

Performance

The csv-parse portion of this library performed well on quoted CSVs, parsing the largest benchmark dataset in 10.3 seconds. You can expect similar performance on unquoted CSVs, clocking in at 9.5 seconds.

Edge Cases

csv includes detailed error messages out of the box, making troubleshooting straightforward. The library can skip empty lines or irregular records, and can be configured to handle other edge cases that occur in CSV files.

Best For

  • Developers who need a comprehensive toolkit for handling CSV data within the Node.js ecosystem.
  • Use cases where the modular nature of the library can be leveraged to minimize the package’s size.

Weaknesses

  • Relatively large (when the full library is imported) at ~2MB.
  • Because its functionality is spread out across a few different packages, there may be a steeper learning curve, and your code will likely be more verbose.

Usage Example

  
  
const fs = require('fs');

const parse = require('csv-parse');

const stringify = require('csv-stringify'); 

// Input and Output file paths
const inputFilePath = 'users.csv';
const outputFilePath = 'copy_users.csv';

// Create a parser that assumes the first line of the CSV file contains column names
const parser = parse({  columns: true,  trim: true});

// Create a stringifier to convert objects back into CSV format
const stringifier = stringify({  header: true});

// Create read and write streams
const inputStream = fs.createReadStream(inputFilePath);
const outputStream = fs.createWriteStream(outputFilePath);

// Pipe the streamsinputStream  
.pipe(parser)      

// Parse the input CSV file  
.pipe(stringifier)  

// Stringify the data into CSV format  
.pipe(outputStream)  

// Write the CSV data to the output file  
.on('finish', () => {  

    console.log('CSV file has been copied to copy_users.csv'); 
    });
    
    // Handle stream errors
    parser.on('error', error => console.error(error));
    stringifier.on('error', error => console.error(error));
    outputStream.on('error', error => console.error(error));

Package Stats

Github Stars: 3.7k

Package size (Unpacked): 2.01 MB

Weekly downloads: 800,000

To install csv, use the following command in your CLI:

  
npm install csv

Honorable Mentions

If none of the above libraries fit the bill, here are some honorable mentions that didn’t make our list.

  1. json-2-csv: A library that supports sub-documents and custom ordering of columns.
  2. csvtojson: Includes a command line utility, making it easy to parse local CSV files.
  3. nest-csv-parser: Nest.js wrapper for csv-parser. Pick this one up if you’re using the Nest framework.
  4. ya-csv: Event-based API with zero dependencies.
  5. skipper-csv: Parser adapter that supports multipart file uploads.

Choosing The Best Javascript CSV Parser

Performance

Papaparse performs particularly well when parsing CSVs with quotes, taking just 5.5 seconds to parse a CSV file with ten columns and 1 million rows. When parsing CSVs without quotes, performance is considerably worse, taking a whopping 18 seconds.

Edge Cases

This is a strength of papaparse. The package features a comprehensive set of configuration options, a forgiving parser, and solid default settings. Malformed CSV files that cause errors in the parsing process are straightforward to handle gracefully. 

Best For

  • When you need a full-featured CSV parsing library

Weaknesses

  • As we saw in the performance section, parsing CSVs without quotes had a major impact on the performance of papaparse on very large CSVs. 
  • Another caveat to watch out for is the automatic type casting, which can sometimes incorrectly guess the data type.

Usage Example

  
Papa.parse(file, {
  delimiter: ";",        // Custom delimiter
  newline: "",           // Auto-detect newline
  header: true,          // First row is headers
  dynamicTyping: true,   // Automatically convert numbers and booleans
  skipEmptyLines: true,  // Skip empty lines
  step: function(result) {
    // Called for each row in the file
    processRow(result.data);
  },
  complete: function() {
    console.log("All rows successfully processed.");
    // Perform a final operation if needed
    finalizeProcessing();
  },
  error: function(error) {
    // Handle any errors
    console.error("Parsing error:", error.message);
  }
});

Package Stats

Github Stars:  11.7k stars

Package size (Unpacked): 260 kB

Weekly downloads: 1.3 million 

To install papaparse, use the following command in your CLI:

  
npm install papaparse

2. fast-csv

Combining the @fast-csv/format and @fast-csv/parse packages, fast-csv is a popular, lightweight library for parsing and formatting CSVs.

Feature Highlights

Fast-csv is laser-focused on performance. It handles parsing large CSVs through the Node.js Streams API, but also supports promises if that’s your pattern of choice. It is incredibly lightweight at just 8.5 kB, and as its constituent packages suggest, it supports both parsing and formatting.

Performance

At 16 seconds for quoted and 14 seconds for unquoted processing of the benchmark CSV, fast-csv edges out papaparse on unquoted parsing.

Edge Cases

Fast-csv provides flexible error handling options. For example, when encountering lines that don’t have the right number of fields, it can be configured to skip these lines or abort the operation. The library can also apply transformations on the data as it’s being parsed, which could be leveraged to handle data inconsistencies during parsing.

Best For

  • Node.js environments, stream processing, custom delimiters, and unquoted parsing.
  • If package size is a major concern, this footprint is hard to beat.

Weaknesses

  • Not best-in-class performance, particularly when parsing quoted CSVs.
  • Not a good option for parsing CSVs client-side, and it lacks concurrent processing capabilities.

Usage Example

  
const fs = require('fs');
const path = require('path');
const fastcsv = require('fast-csv');

let stream = fs.createReadStream(path.resolve(__dirname, 'data.csv'));
let csvData = [];

let csvStream = fastcsv
  .parse()
  .on('data', function(row) {
    console.log(row);
    csvData.push(row);
  })
  .on('end', function() {
    console.log('Read finished');
    // Do something with csvData
  })
  .on('error', function(error) {
    console.error(error);
  });

stream.pipe(csvStream);

Package Stats

Github Stars: 1.7k stars

Package size (Unpacked): 8.5kB

Weekly downloads: 960,000

To install fast-csv, run the following command in your CLI:

  
npm i -S fast-csv

3. SheetJS

SheetJS is a popular and full-featured CSV parser that focuses on reading, editing, and exporting spreadsheets for use with spreadsheet programs like Microsoft Excel.

Feature Highlights

SheetJS runs in the browser, on servers (Node.js and Deno), and in Electron. It natively supports a range of Excel formats, can evaluate Excel formulas, merge cells, and even read and write Excel chart information.

It can work with raw binary data and base64-encoded strings, which is great for handling files client-side.

Performance

While we don’t have specific benchmarks for SheetJS, generally speaking, it handles small to midsize CSVs exceptionally well. There are reports of some performance issues on larger files, including excessive memory usage and long processing times for particularly complex spreadsheets.

Edge Cases

SheetJS will try to read as much as it can from a file, even if it’s malformed. For corrupted files, SheetJS has modes that attempt to recover as much data as possible. In many cases, when SheetJS encounters an unexpected value or a missing expected element, it will use a sensible default or fallback to avoid throwing an error.

Best For

  • If you’re working with Excel files, this one is a no-brainer.
  • If you need a parser on the client and/or server.

Weaknesses

  • Potential fine-tuning required to stay performant if dealing with large, complex Excel files
  • Some functionality requires paying for the premium option.
  • It is the biggest package in our top 5, at 7.5MB.

Usage Example

  
const XLSX = require('xlsx');

// Sample data set
const data = [
  { Name: "John", City: "Seattle", Salary: 70000 },
  { Name: "Mike", City: "Los Angeles", Salary: 58000 },
  { Name: "Samantha", City: "New York", Salary: 65000 }
];

// Create a new workbook
const workbook = XLSX.utils.book_new();

// Convert the data to a worksheet
const worksheet = XLSX.utils.json_to_sheet(data);

// Append the worksheet to the workbook
XLSX.utils.book_append_sheet(workbook, worksheet, "Employees");

// Write the workbook to a file
XLSX.writeFile(workbook, 'output.xlsx');

console.log('Excel file created successfully.');

Package Stats

Github Stars:  33.8k stars

Package size (Unpacked): 7.5 MB

Weekly downloads: 1,400,000

To install SheetJS, run the following command in your CLI:

  
npm install xlsx

4. csv-parser

csv-parser is an efficient and streamlined library optimized for parsing CSV data quickly and effectively. It provides minimal overhead and is designed to be simple and lightweight for Node.js streams.

Feature Highlights

Like other parsers on this list, csv-parser is implemented as a transform stream in Node.js, which enables it to process data as it is being read, reducing memory overhead. It is also compliant with the RFC 4180 CSV standard and passes the csv-spectrum acid test suite, ensuring compatibility and correctness across a wide range of CSV variations.

Performance

csv-parser goes toe-to-toe with papaparse on quoted CSVs, parsing the largest benchmark dataset in 5.5 seconds. On unquoted CSVs, csv-parser really shines. At ~5.5 seconds, it is nearly identical to the quoted benchmark, and is over 3x faster than papaparse.

Edge Cases

Basic features like skipping bad lines (with proper configuration) or handling unconventional delimiters come out of the box, but this library may lack some of the advanced configurability offered by other parsers on this list.

Best For

  • Applications that require fast and efficient parsing of CSV data, especially when working with Node.js streams
  • Performant processing of large CSV files

Weaknesses

  • The minimalistic design means it might lack some of the high-level features and conveniences offered by more full-featured CSV libraries, such as built-in support for complex transformations or integrations with external data sources.
  • It does not provide functionality for writing CSV files or manipulating CSV data beyond parsing.

Usage Example

  
const csv = require('csv-parser');
const fs = require('fs');

const results = [];

fs.createReadStream('data.csv')
  .pipe(csv())
  .on('data', (data) => results.push(data))
  .on('end', () => {
    console.log(results);
    // handle the parsed data here
  });

Package Stats

Github Stars: 1.4k

Package size (Unpacked): 27.7 kB

Weekly downloads: 1,600,000

To install csv-parser, use the following command in your CLI:

  
npm install csv-parser

5. csv

The csv package is a project that provides CSV generation, parsing, transformation, and serialization for Node.js. It comprises four packages: csv-generate, csv-parse, csv-stringify, and stream-transform. The actual parsing is done by csv-parse, but as the list suggests, the project also includes a transformation framework and a CSV generator.

Feature Highlights

csv comes with a rich set of options to customize the parser’s behavior, including automatic column detection, custom column delimiters, and handling of quotes / escape characters. Data can be transformed synchronously or asynchronously, which is useful for manipulating data during the parsing/stringifying process. It can handle large files like other parsers on this list through its streaming interface.

Performance

The csv-parse portion of this library performed well on quoted CSVs, parsing the largest benchmark dataset in 10.3 seconds. You can expect similar performance on unquoted CSVs, clocking in at 9.5 seconds.

Edge Cases

csv includes detailed error messages out of the box, making troubleshooting straightforward. The library can skip empty lines or irregular records, and can be configured to handle other edge cases that occur in CSV files.

Best For

  • Developers who need a comprehensive toolkit for handling CSV data within the Node.js ecosystem.
  • Use cases where the modular nature of the library can be leveraged to minimize the package’s size.

Weaknesses

  • Relatively large (when the full library is imported) at ~2MB.
  • Because its functionality is spread out across a few different packages, there may be a steeper learning curve, and your code will likely be more verbose.

Usage Example

  
  
const fs = require('fs');
const { parse } = require('csv-parse');
const { stringify } = require('csv-stringify');

// Input and output file paths
const inputFilePath = 'users.csv';
const outputFilePath = 'copy_users.csv';

// Create a parser that assumes the first line of the CSV file contains column names
const parser = parse({ columns: true, trim: true });

// Create a stringifier to convert objects back into CSV format
const stringifier = stringify({ header: true });

// Create read and write streams
const inputStream = fs.createReadStream(inputFilePath);
const outputStream = fs.createWriteStream(outputFilePath);

// Pipe the streams: parse the input CSV, stringify it back, and write the copy
inputStream
  .pipe(parser)
  .pipe(stringifier)
  .pipe(outputStream)
  .on('finish', () => {
    console.log('CSV file has been copied to copy_users.csv');
  });

// Handle stream errors
parser.on('error', (error) => console.error(error));
stringifier.on('error', (error) => console.error(error));
outputStream.on('error', (error) => console.error(error));

Package Stats

Github Stars: 3.7k

Package size (Unpacked): 2.01 MB

Weekly downloads: 800,000

To install csv, use the following command in your CLI:

  
npm install csv

Honorable Mentions

If none of the above libraries fit the bill, here are some honorable mentions that didn’t make our list.

  1. json-2-csv: A library that supports sub-documents and custom ordering of columns.
  2. csvtojson: Includes a command line utility, making it easy to parse local CSV files.
  3. nest-csv-parser: Nest.js wrapper for csv-parser. Pick this one up if you’re using the Nest framework.
  4. ya-csv: Event-based API with zero dependencies.
  5. skipper-csv: Parser adapter that supports multipart file uploads.

Choosing The Best Javascript CSV Parser

As with all libraries, it’s important to first evaluate your requirements and understand your use case. Here is a rough guide to aid in your decision-making process. 

First, assess the nature and volume of the data you’re dealing with. For small to medium-sized files, a simple and lightweight parser like csv-parser or fast-csv is a good fit. They require minimal setup and have straightforward APIs.

If your datasets are large or complex, with nested structures, varying encodings, or messy files, turn to a more robust solution like papaparse or csv. Their streaming capability will make processing large files memory-efficient, and they include built-in support for character encoding, custom delimiters, and more nuanced error handling. 

Second, assess your edge case requirements. If you’re expecting to see malformed data, make sure to ask the following questions before deciding on your parser:

  1. How does the parser handle incorrectly formatted lines or missing data?
  2. Is it tolerant of extra or missing delimiters, line breaks within cells, or unclosed quotes?
  3. How detailed are the error messages?
  4. How extensive is the documentation / How large is the community?

Which CSV parser is “best” is a subjective, contextual question. Define the needs of your project, the nature of your data, and the environments in which the parser will run; this will get you 90% of the way there. Lastly, consider how your team writes code. Is there a pattern or interface that aligns better with your existing codebase? Striving for codebase consistency will ensure your team can work efficiently and scale effectively.

Want a CSV Experience Users Will Actually Enjoy?

If performance and user experience are top concerns, OneSchema offers a powerful option for CSV parsing and importing. Validate and transform files up to 4GB in under a second with plug-and-play components for vanilla JavaScript, React, Angular and Vue projects. OneSchema goes beyond an exceptional developer experience by providing a rich CSV error correction toolset, empowering end users to clean and validate their data in a streamlined workflow.

Increase your onboarding conversion rates, save on engineering cycles, and prevent excessive support tickets with OneSchema. Get started for free.
