Easier way to update data with node-postgres?


I'm using the superb plugin node-postgres, https://github.com/brianc/node-postgres

I have this update REST call. I have about 30 columns in my table. Is there an easier way to update these than this?

/*
 Post /api/project/products/:pr_id HTTP/1.1
 */
exports.updateProduct = function(req, res){
  pg.connect(cs, function(err, client, done) {
    var query = "UPDATE products SET pr_title = ($1), pr_usercode = ($2) WHERE pr_id=($3)";
    client.query(query, [req.body.pr_title, req.body.pr_usercode, req.params.pr_id], function(err, result) {
      if (handleErr(err, done)) return;
      done();
      sendResponse(res, result.rows[0]);
    })
  });
};

I only have three columns here. It will be messy and hard to maintain when I write out all 30 columns. There must be a way to update all the columns in req.body with just a simple line?

Any ideas?

You could always roll out a function like so:

function updateProductByID (id, cols) {
  // Setup static beginning of query
  var query = ['UPDATE products'];
  query.push('SET');

  // Create another array storing each set command
  // and assigning a number value for parameterized query
  var set = [];
  Object.keys(cols).forEach(function (key, i) {
    set.push(key + ' = ($' + (i + 1) + ')');
  });
  query.push(set.join(', '));

  // Add the WHERE statement to look up by id, parameterized as well
  // so the id cannot be used for SQL injection
  query.push('WHERE pr_id = ($' + (Object.keys(cols).length + 1) + ')');

  // Return a complete query string
  return query.join(' ');
}

And then use it as such:

/*
 Post /api/project/products/:pr_id HTTP/1.1
 */
exports.updateProduct = function(req, res){
  pg.connect(cs, function(err, client, done) {

    // Setup the query
    var query = updateProductByID(req.params.pr_id, req.body);

    // Turn req.body into an array of values, with the id appended
    // last to match the final placeholder
    var colValues = Object.keys(req.body).map(function (key) {
      return req.body[key];
    });
    colValues.push(req.params.pr_id);

    client.query(query, colValues, function(err, result) {
      if (handleErr(err, done)) return;
      done();
      sendResponse(res, result.rows[0]);
    });
  });
};
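To see concretely what this kind of dynamic query builder produces, here is a self-contained sketch (the `buildUpdateQuery` name, table, and sample values are illustrative, not from node-postgres itself):

```javascript
// Standalone sketch of the dynamic-UPDATE idea: build "col = $n" pairs
// from an object's keys and parameterize the id as the last placeholder.
function buildUpdateQuery(table, idColumn, id, cols) {
  var keys = Object.keys(cols);
  var set = keys.map(function (key, i) {
    return key + ' = $' + (i + 1);
  });
  var text = 'UPDATE ' + table +
    ' SET ' + set.join(', ') +
    ' WHERE ' + idColumn + ' = $' + (keys.length + 1);
  var values = keys.map(function (k) { return cols[k]; });
  values.push(id);
  return { text: text, values: values };
}

var q = buildUpdateQuery('products', 'pr_id', 7, {
  pr_title: 'Widget',
  pr_usercode: 'W-1'
});
console.log(q.text);
// UPDATE products SET pr_title = $1, pr_usercode = $2 WHERE pr_id = $3
console.log(q.values); // [ 'Widget', 'W-1', 7 ]
```

Both the text and the values array line up by placeholder number, so the driver handles all escaping.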

Or, if a query builder or ORM is something you need because you'll be doing a lot like the above, you should check out modules like Knex.js.


Good answers have already been given, but IMHO they all lack good abstraction in one respect. I will try to provide a more abstracted way of updating your data in Postgres using node-postgres.

It is always good practice to follow the official documentation; the following code structure was taken from node-postgres, and you can extend it however you like:

Here is mine; this is the module where you interact with your database:

const { Pool } = require("pg");
const connection = require("./connection.json");
const pool = new Pool(connection);
const { insert, select, remove, update } = require("./helpers");


/**
 * The main mechanism to avoid SQL Injection is by escaping the input parameters.
 * Any good SQL library should have a way to achieve this.
 * PG library allows you to do this by placeholders `($1, $2)`
 */
module.exports = {
  query: (text, params, callback) => {
    const start = Date.now();

    return pool.query(text, params, (err, res) => {
      const duration = Date.now() - start;
      console.log("executed query", { text, duration, rows: res.rowCount });
      callback(err, res);
    });
  },

  getClient: callback => {
    pool.connect((err, client, done) => {
      const query = client.query;
      // monkey patch the query method to keep track of the last query executed
      client.query = (...args) => {
        client.lastQuery = args;
        return query.apply(client, args);
      };
      // set a timeout of 5 seconds, after which we will log this client's last query
      const timeout = setTimeout(() => {
        console.error("A client has been checked out for more than 5 seconds!");
        console.error(
          `The last executed query on this client was: ${client.lastQuery}`
        );
      }, 5000);
      const release = err => {
        // call the actual 'done' method, returning this client to the pool
        done(err);
        // clear our timeout
        clearTimeout(timeout);
        // set the query method back to its old un-monkey-patched version
        client.query = query;
      };
      callback(err, client, release);
    });
  },

  /**
   * Updates data
   *
   * entity: table name, e.g, users 
   * conditions: { id: "some-unique-user-id", ... }
   * fields: list of desired columns to update { username: "Joe", ... }
   */
  updateOne: async (entity, conditions, fields) => {
    if (!entity) throw new Error("no entity table specified");
    if (Utils.isObjEmpty(conditions))
      throw new Error("no conditions specified");

    const { text, values } = update(entity, conditions, fields);

    try {
      const rs = await pool.query(text, values);
      return rs.rows[0];
    } catch (err) {
      console.error(err);
      throw err;
    }
  },

  createOne: async (entity, data) => {
  },

  deleteOne: async (entity, conditions, data) => {
  },

  findAll: async (entity, conditions, fields) => {
  },

  // ... other methods
};

Here are the helper methods for CRUD operations; they prepare the query text with numbered placeholders and a matching values array:

/**
 * tableName: `users`
 * conditions: { id: 'joe-unique-id', ... }
 * data: { username: 'Joe', age: 28, status: 'active', ... }
 *
 *  "UPDATE users SET field_1 = $1, field_2 = $2, field_3 = $3, ... ( WHERE ...) RETURNING *";
 */
exports.update = (tableName, conditions = {}, data = {}) => {
  const dKeys = Object.keys(data);
  const dataTuples = dKeys.map((k, index) => `${k} = $${index + 1}`);
  const updates = dataTuples.join(", ");
  const len = dKeys.length;

  let text = `UPDATE ${tableName} SET ${updates}`;

  if (!Utils.isObjEmpty(conditions)) {
    const keys = Object.keys(conditions);
    const condTuples = keys.map((k, index) => `${k} = $${index + 1 + len}`);
    const condPlaceholders = condTuples.join(" AND ");

    text += ` WHERE ${condPlaceholders} RETURNING *`;
  }

  const values = dKeys.map(key => data[key])
    .concat(Object.keys(conditions).map(key => conditions[key]));

  return { text, values };
};
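To check what such a helper generates, here is a condensed, self-contained version together with a sample call (the table name and values are made up for illustration):

```javascript
// Condensed, standalone version of an update() helper: SET placeholders
// are numbered first, WHERE placeholders continue the numbering.
const update = (tableName, conditions = {}, data = {}) => {
  const dKeys = Object.keys(data);
  const updates = dKeys.map((k, i) => `${k} = $${i + 1}`).join(", ");

  let text = `UPDATE ${tableName} SET ${updates}`;

  const cKeys = Object.keys(conditions);
  if (cKeys.length > 0) {
    const where = cKeys
      .map((k, i) => `${k} = $${i + 1 + dKeys.length}`)
      .join(" AND ");
    text += ` WHERE ${where} RETURNING *`;
  }

  const values = dKeys.map(k => data[k]).concat(cKeys.map(k => conditions[k]));
  return { text, values };
};

const q = update("users", { id: "u1" }, { username: "Joe", age: 28 });
console.log(q.text);
// UPDATE users SET username = $1, age = $2 WHERE id = $3 RETURNING *
console.log(q.values); // [ 'Joe', 28, 'u1' ]
```

The `{ text, values }` pair can be passed straight to `pool.query(text, values)`.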

exports.select = (tableName, conditions = {}, data = ["*"]) => {...}
exports.insert = (tableName, conditions = {}) => {...}
exports.remove = (tableName, conditions = {}, data = []) => {...}

And finally, you can use this in your route handlers without cluttering your codebase:

const db = require("../db");

/**
 *
 */
exports.updateUser = async (req, res) => {
  try {
    console.log("[PUT] {api/v1/users}");
    const fields = {
      name: req.body.name,
      description: req.body.description,
      info: req.body.info
    };
    const userId = req.params.id;

    const conditions = { id: userId };
    const updatedUser = await db.updateOne("users", conditions, fields);

    if (updatedUser) {
      console.log(`user ${updatedUser.name} updated successfully`);
      return res.json(updatedUser);
    }
    res.status(404).json({ msg: "User not found" });
  } catch (err) {
    console.error(err);
    res.status(500).send({ msg: "Server error" });
  }
};

Convenient utilities:

const Utils = {};
Utils.isObject = x => x !== null && typeof x === "object";
Utils.isObjEmpty = obj => Utils.isObject(obj) && Object.keys(obj).length === 0;


I like to use Knex.js, which works with Postgres. It's also a fun JavaScript way to write queries (without all that nasty SQL-string manipulation).

Take for example this method, that stores some contact information. The JSON schema of that contact information is defined elsewhere (also useful when I validate). The result is a code-generated query, which contains only columns passed in.

function saveContactInfo( inputs, callback ) {
  var setObj = {};
  for( var property in inputs.contact )
  {
    //assumes properties are same as DB columns, otherwise need to use some string-mapping lookup.
    setObj[ property ] = inputs.contact[property];
  }
  setObj[ "LastModified" ] = new Date();

  var query = knex( "tblContact" ).update( setObj ).where( "contactId", inputs.contact.contactId );
  //log.debug("contactDao.saveContactInfo: " + query.toString());
  query.exec( function(err, results ){
    if(err) return callback(err);
    //Return from DB is usually an array, so return the object, not the array.
    callback( null, results[0] );
  });    
}

Knex.js also has some nifty Postgres-only options (which would have been useful for me, had I not been using MySQL).


Quick example from me:

async update(objectToSave) {
    const args = Object.values(objectToSave);
    const keys = Object.keys(objectToSave).join(',');
    const argKeys = Object.keys(objectToSave).map((obj, index) => {
      return "$" + (index + 1);
    }).join(',');

    // Parameterize the id as the final placeholder instead of
    // concatenating it into the string, to avoid SQL injection
    args.push(objectToSave.id);
    const query = "UPDATE table SET (" + keys + ") = (" + argKeys + ")"
                + " WHERE id = $" + args.length;

    try {
        const res = await client.query(query, args);
        return true;
    } catch (err) {
        console.log(err.stack);
        return false;
    }
}
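As a quick sanity check, the string-building part of this approach can be run on its own without a live client (the sample object and the table name `table` are invented):

```javascript
// Standalone check of the query text such a method builds
const objectToSave = { id: 7, name: "Joe" };

const args = Object.values(objectToSave);
const keys = Object.keys(objectToSave).join(',');
const argKeys = Object.keys(objectToSave)
  .map((k, index) => "$" + (index + 1))
  .join(',');

args.push(objectToSave.id); // id as the final, parameterized placeholder
const query = "UPDATE table SET (" + keys + ") = (" + argKeys + ")"
            + " WHERE id = $" + args.length;

console.log(query); // UPDATE table SET (id,name) = ($1,$2) WHERE id = $3
console.log(args);  // [ 7, 'Joe', 7 ]
```

Note that the id appears twice in the values array (once for the SET list, once for the WHERE clause), which is harmless.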


Create Insert Query

exports.createInsertQuery = (tablename, obj) => {
    let keys = Object.keys(obj);
    let dollar = keys.map(function (item, idx) { return '$' + (idx + 1); });
    let values = keys.map(function (k) { return obj[k]; });
    return {
        query: 'insert into ' + tablename + ' (' + keys.join(',') + ') values (' + dollar.join(',') + ')',
        params: values
    };
}

Usage

let data = {firstname : 'hie' , lastname : 'jack', age : 4}
let yo = createInsertQuery('user',data) 

client.query(yo.query, yo.params ,(err,res) =>{
 console.log(res)
})

Likewise, you can create update and delete queries.
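For example, an update-query builder in the same style might look like this (a sketch; `createUpdateQuery` and its arguments are illustrative names, not from any library):

```javascript
// Matching update-query builder in the same style as createInsertQuery:
// numbered placeholders for each column, with the id as the last one.
const createUpdateQuery = (tablename, obj, idColumn, idValue) => {
    let keys = Object.keys(obj);
    let assignments = keys.map(function (item, idx) { return item + ' = $' + (idx + 1); });
    let values = keys.map(function (k) { return obj[k]; });
    values.push(idValue);
    return {
        query: 'update ' + tablename + ' set ' + assignments.join(', ')
             + ' where ' + idColumn + ' = $' + (keys.length + 1),
        params: values
    };
};

let data = { firstname: 'hie', lastname: 'jack', age: 4 };
let yo = createUpdateQuery('user', data, 'id', 9);
console.log(yo.query);
// update user set firstname = $1, lastname = $2, age = $3 where id = $4
console.log(yo.params); // [ 'hie', 'jack', 4, 9 ]
```

As with the insert helper, you then pass `yo.query` and `yo.params` to `client.query`.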


Comments
  • Please keep in mind that id in updateProductByID should instead be a parameterized value. Instead of i + 1, use i + 2, then unshift the product ID value to colValues to ensure that no SQL injection can be performed by inserting a malicious ID.
  • The filter function returns a boolean expression, not a value. You should replace 'filter' with 'map', which is the correct function to return values.
  • @NirO. Good catch — this was written as pseudo code and not actually executed. Will update.
  • Great answer as well. I approved the other one since his solution was without plugins. But I will definitely try out knexjs. Seems to be very convenient Thx!