How to convert CSV to JSON in Node.js

I am trying to convert a CSV file to JSON. I am using .

Example CSV:

a,b,c,d
1,2,3,4
5,6,7,8
...

Desired JSON:

{"a": 1,"b": 2,"c": 3,"d": 4},
{"a": 5,"b": 6,"c": 7,"d": 8},
...

I tried the node-csv parser library, but the output is an array, not the objects I expected.

I'm using Node 0.8 and Express.js, and would like a recommendation on how to accomplish this easily.

The Node.js csvtojson module is a comprehensive CSV parser. It can be used as a library in a Node.js app, as a command-line tool, or in the browser with the help of browserify or webpack.

The source code can be found at: https://github.com/Keyang/node-csvtojson

It is fast, has low memory consumption, and is powerful enough to cover most parsing needs, with a rich API and easy-to-read documentation.

The detailed documentation can be found on the GitHub page above.

Here are some code examples:

Use it as a library in your Node.js application (csvtojson@2.0.0 +):

  1. Install it through npm:

npm install --save csvtojson@latest

  2. Use it in your node.js app:
// require csvtojson
var csv = require("csvtojson");

// Convert a csv file with csvtojson
csv()
  .fromFile(csvFilePath)
  .then(function(jsonArrayObj){ // when parsing finishes, the result is emitted here
     console.log(jsonArrayObj);
   });

// Parse large csv with stream / pipe (low mem consumption)
csv()
  .fromStream(readableStream)
  .subscribe(function(jsonObj){ // a single json object is emitted for each csv line
     // process each json object asynchronously
     return new Promise(function(resolve,reject){
         asyncStoreToDb(jsonObj,function(){resolve()});
     });
  });

// Use async / await (inside an async function)
const jsonArray = await csv().fromFile(filePath);
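
As a quick check against the sample data in the question, the same parser can also read a CSV string directly. This is a small sketch rather than part of the original answer; it assumes the fromString method and the checkType option documented for csvtojson v2 (checkType turns "1" into the number 1):

// Sketch: parse the question's sample CSV from a string.
// Assumes csvtojson v2's fromString() and its checkType option.
const csvtojson = require("csvtojson");

const csvString = "a,b,c,d\n1,2,3,4\n5,6,7,8";

csvtojson({ checkType: true })
  .fromString(csvString)
  .then(function (rows) {
    console.log(rows);
    // [ { a: 1, b: 2, c: 3, d: 4 }, { a: 5, b: 6, c: 7, d: 8 } ]
  });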

Use it as a command-line tool:

sh# npm install csvtojson
sh# ./node_modules/csvtojson/bin/csvtojson ./yourCsvFile.csv

-or-

sh# npm install -g csvtojson
sh# csvtojson ./yourCsvFile.csv

For advanced usage:

sh# csvtojson --help

You can find more details from the github page above.


You can try to use underscore.js

First split each CSV line into an array of values:

var letters = "a,b,c,d".split(",");
var numbers = "1,2,3,4".split(",");

Then zip the two arrays together into an object using the _.object function:

var json = _.object(letters, numbers);

The json variable should then contain something like:

{"a": 1,"b": 2,"c": 3,"d": 4}


Here is a solution that does not require a separate module. However, it is very crude, and does not implement much error handling. It could also use more tests, but it will get you going. If you are parsing very large files, you may want to seek an alternative. Also, see this solution from Ben Nadel.

Node Module Code, csv2json.js:
/*
 * Convert a CSV String to JSON
 */
exports.convert = function(csvString) {
    var json = [];
    var csvArray = csvString.split("\n");

    // Remove the column names from csvArray into csvColumns.
    // Also replace single quote with double quote (JSON needs double).
    var csvColumns = JSON
            .parse("[" + csvArray.shift().replace(/'/g, '"') + "]");

    csvArray.forEach(function(csvRowString) {

        var csvRow = csvRowString.split(",");

        // Here we work on a single row.
        // Create an object with all of the csvColumns as keys.
        var jsonRow = {};
        for ( var colNum = 0; colNum < csvRow.length; colNum++) {
            // Remove beginning and ending quotes since stringify will add them.
            var colData = csvRow[colNum].replace(/^['"]|['"]$/g, "");
            jsonRow[csvColumns[colNum]] = colData;
        }
        json.push(jsonRow);
    });

    return JSON.stringify(json);
};
Jasmine Test, csv2jsonSpec.js:
var csv2json = require('./csv2json.js');

var CSV_STRING = "'col1','col2','col3'\n'1','2','3'\n'4','5','6'";
var JSON_STRING = '[{"col1":"1","col2":"2","col3":"3"},{"col1":"4","col2":"5","col3":"6"}]';

/* jasmine specs for csv2json */
describe('csv2json', function() {

    it('should convert a csv string to a json string.', function() {
        expect(csv2json.convert(CSV_STRING)).toEqual(
                JSON_STRING);
    });
});


Had to do something similar, hope this helps.

// Node packages for file system
var fs = require('fs');
var path = require('path');


var filePath = path.join(__dirname, 'PATH_TO_CSV');
// Read CSV
// readFileSync is synchronous and does not take a callback
var f = fs.readFileSync(filePath, {encoding: 'utf-8'});

// Split on row
f = f.split("\n");

// Get first row for column headers
var headers = f.shift().split(",");

var json = [];    
f.forEach(function(d){
    // Loop through each row
    var tmp = {};
    var row = d.split(",");
    for(var i = 0; i < headers.length; i++){
        tmp[headers[i]] = row[i];
    }
    // Add object to list
    json.push(tmp);
});

var outPath = path.join(__dirname, 'PATH_TO_JSON');
// Convert object to string, write json to file
fs.writeFileSync(outPath, JSON.stringify(json), 'utf8');
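
The snippet above keeps every value as a string. If you want numeric values, as in the desired output in the question, a small helper (hypothetical, not part of the original answer) can coerce them while each row is built:

// Hypothetical helper: convert numeric strings to numbers so the output
// matches {"a": 1, ...} from the question.
function coerce(value) {
  var trimmed = value.trim();
  return trimmed !== "" && !isNaN(trimmed) ? Number(trimmed) : value;
}

// Then, inside the forEach above:
//     tmp[headers[i]] = coerce(row[i]);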


I haven't tried the csv package (https://npmjs.org/package/csv), but according to the documentation it looks like a quality implementation: http://www.adaltas.com/projects/node-csv/
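
I have not verified this against the Node 0.8-era API, but with the current csv-parse package (part of that csv project) the columns option returns one object per row, which is roughly what the question asks for. Treat this as a sketch based on recent versions:

// Sketch using the modern csv-parse sync API (assumed, not tested on Node 0.8).
// columns: true uses the first row as object keys; cast: true converts "1" to 1.
const { parse } = require("csv-parse/sync");

const records = parse("a,b,c,d\n1,2,3,4\n5,6,7,8", {
  columns: true,
  cast: true
});

console.log(records);
// [ { a: 1, b: 2, c: 3, d: 4 }, { a: 5, b: 6, c: 7, d: 8 } ]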


Comments
  • apievangelist.com/2013/09/24/… and kinlane.github.io/csv-converter looks impressive.
  • I wrote a small blog post on a similar solution as proposed by brnrd: thinkingonthinking.com/scripting-a-csv-converter
  • Code has been added. See more detailed documentation here github.com/Keyang/node-csvtojson
  • Since version 0.3.0, csvtojson does not depend on any other lib. It will behave like a proper Stream object.
  • The link to the blog is dead.
  • Updated. Thanks for letting me know.
  • I don't know if this is happening only to me, but for a large CSV file this is too slow. Like 10 seconds slower than d3.
  • This one didn't work as well as csvtojson. When I had "Aug 23, 2016", it split "Aug 23" and "2016" into different fields.
  • Bummer. You could wrap the date in quotes to fix it?
  • Or even var json = f.map(function(d, i){ ... return tmp; })
  • async and underscore were too much for you?
  • @Spencer, at the time that I posted, the dependencies were different: github.com/Keyang/node-csvtojson/blob/… Pulling in express for a csv conversion felt unnatural
  • Oh, yeah that is a crazy dependency. My bad.
  • The link to this library is dead - perhaps it was moved to somewhere else on Github (or forked?). Please update link.
  • Thank you @RohitParte. This is one of my first modules in Node.js. While some features work fine, it is missing a lot of features. I became extremely busy with other things (Reliability Engineering, DevOps, and so on).