Control flow issue with node/redis and callbacks?

Please could I ask for some advice on a control flow issue with node and redis? (aka Python coder trying to get used to JavaScript)

I don't understand why client.smembers and client.get (Redis lookups) need to be callbacks rather than simply being statements - it makes life very complicated.

Basically I'd like to query a set, and then when I have the results for the set, I need to carry out a get for each result. When I've got all the data, I need to broadcast it back to the client.

Currently I do this inside two callbacks, using a global object, which seems messy. I'm not even sure if it's safe (will the code wait for one client.get to complete before starting another?).

The current code looks like this:

var all_users = [];
// Get all the users for this page.
client.smembers("page:" + current_page_id, function (err, user_ids) {
  // Now get the name of each of those users.
  for (var i = 0; i < user_ids.length; i++) {
    client.get('user:' + user_ids[i] + ':name', function(err, name) {
      var myobj = {};
      myobj[user_ids[i]] = name;
      all_users.push(myobj);
      // Broadcast when we have got to the end of the loop,
      // so all users have been added to the list -
      // is this the best way? It seems messy.
      if (i === (user_ids.length - 1)) {
        socket.broadcast('all_users', all_users);
      }
    });
  }
});

But this seems very messy. Is it really the best way to do this? How can I be sure that all lookups have been performed before calling socket.broadcast?

*scratches head* Thanks in advance for any advice.

See also: Chapter 7, "Control flow", in Mixu's Node book (books at mixu.net), which discusses nested callbacks and control flow in Node, covers libraries for controlling async code flow (Async.js, Step.js, etc.), and suggests that if spaghetti-style code is what you consider messy, you can define named functions to process results and pass those as parameters instead of anonymous ones.

If you totally dislike writing stuff callback-style, you might want to try streamlinejs:

var all_users = [];
// Get all the users for this page.
var user_ids = client.smembers("page:" + current_page_id, _);
// Now get the name of each of those users.
for (var i = 0; i < user_ids.length; i++) {
  var name = client.get('user:' + user_ids[i] + ':name', _);
  var myobj = {};
  myobj[user_ids[i]] = name;
  all_users.push(myobj);  
}
socket.broadcast('all_users', all_users);

Note that a disadvantage of this variant is that only one username will be fetched at a time. Also, you should still be aware of what this code really does.
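
For comparison, the same thing can be written in plain callback style with no extra library by counting the outstanding lookups and broadcasting only when the counter reaches zero. This is just a sketch (error handling omitted), assuming the same client, socket and current_page_id objects from the question; here the gets run in parallel:

var all_users = [];
client.smembers("page:" + current_page_id, function (err, user_ids) {
  var pending = user_ids.length;
  if (pending === 0) {
    return socket.broadcast('all_users', all_users);
  }
  user_ids.forEach(function (user_id) {
    client.get('user:' + user_id + ':name', function (err, name) {
      var myobj = {};
      myobj[user_id] = name;
      all_users.push(myobj);
      // Only broadcast once every single get has called back,
      // regardless of the order they complete in.
      if (--pending === 0) {
        socket.broadcast('all_users', all_users);
      }
    });
  });
});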

Async is a great library and you should take a look at it. Why? Clean code, a clear process, easy to track, etc.

Also, keep in mind that all your async callbacks will run after your for loop has finished. In your example, that can result in the wrong "i" value inside the callbacks. Use a closure:

for (var i = 0; i < user_ids.length; i++) {
  (function(i) {
    client.get('user:' + user_ids[i] + ':name', function(err, name) {
      var myobj = {};
      myobj[user_ids[i]] = name;
      all_users.push(myobj);
      // Broadcast when we have got to the end of the loop,
      // so all users have been added to the list -
      // is this the best way? It seems messy.
      if (i === (user_ids.length - 1)) {
        socket.broadcast('all_users', all_users);
      }
    });
  })(i);
}

To know when everything has finished, what you should do is use a recursive pattern like the one async (I think) uses. It's much simpler than doing it yourself.

var async = require('async');

async.series({
  getMembers: function(callback) {
    client.smembers("page:" + current_page_id, callback);
  }
}, function(err, results) {
  var all_users = [];
  async.forEachSeries(results.getMembers, function(user_id, cb) {
    client.get('user:' + user_id + ':name', function(err, name) {
      var myobj = {};
      myobj[user_id] = name;
      all_users.push(myobj);
      cb(err);
    });
  }, function(err) {
    socket.broadcast('all_users', all_users);
  });
});

This code may need a tweak or two, but you should be able to figure out how to adapt it.

The Step library is good too (and only ~30 lines of code, I think).
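
For illustration, a rough (untested) sketch of the same flow with Step might look like this; it leans on Step's this.group(), which hands out one callback per get and collects the results, in order, for the next step. Again, client, socket and current_page_id are assumed to be the same objects as in the question:

var Step = require('step');

Step(
  function getMembers() {
    client.smembers("page:" + current_page_id, this);
  },
  function getNames(err, user_ids) {
    if (err) throw err;
    var group = this.group();
    user_ids.forEach(function (user_id) {
      // Each get reports back through its own group() callback.
      client.get('user:' + user_id + ':name', group());
    });
  },
  function broadcast(err, names) {
    if (err) throw err;
    // names is an array of names, in the same order as user_ids.
    socket.broadcast('all_users', names);
  }
);

Note that this broadcasts a plain array of names rather than the {user_id: name} objects from the original code; you would need to keep user_ids in an outer variable to rebuild those.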

I don't understand why client.smembers and client.get (Redis lookups) need to be callbacks rather than simply being statements - it makes life very complicated.

Right, so everyone agrees callback hell is no bueno. As of this writing, callbacks are a dying feature of Node. Unfortunately, the Redis library does not have native support for returning Promises.

But there is a module you can require in like so:

const util = require("util");

This is a standard library module included in the Node runtime that has a bunch of utility functions we can use, one of them being "promisify": https://nodejs.org/api/util.html#util_util_promisify_original

Now of course when you asked this question seven years ago, util.promisify(original) did not exist, as it was only added in the release of Node v8.0.0, so this question can now be given an updated answer.

So promisify is a function: we can pass it a function like client.get() and it will return a new function that takes the nasty callback behavior and wraps it up nice and neat so that it returns a Promise instead.

So promisify takes any function that accepts an error-first callback as its last argument and makes it return a Promise instead, which sounds like exactly the behavior you wanted seven years ago and that we are afforded today.

const util = require("util");
client.get = util.promisify(client.get);

So we are passing a reference to the .get() function to util.promisify().

This takes your function and wraps it up so that, instead of taking a callback, it returns a Promise. So util.promisify() returns a new function that has been promisified.

So you can take that new function and override the existing client.get() with it. Nowadays, you do not have to use a callback for a Redis lookup. So now you can use the async/await syntax like so:

const cachedMembers = await client.get('user:' + user_ids[i] + ':name');

So we wait for this to be resolved and whatever it resolves with will be assigned to cachedMembers.

The code can be cleaned up even further by using an ES6 array helper method instead of your for loop, as in the sketch below. I hope this answer is useful for current readers, even though the original question is quite old.
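
As a sketch (not tested against your setup), the whole flow from the question could end up looking something like this, with the gets kicked off in parallel via .map and Promise.all; client, socket and current_page_id are assumed to be the same objects as in your original code:

const util = require("util");

// Promisify both Redis calls we need.
client.smembers = util.promisify(client.smembers);
client.get = util.promisify(client.get);

async function broadcastAllUsers() {
  // Get all the user ids for this page.
  const user_ids = await client.smembers("page:" + current_page_id);

  // Kick off all the name lookups in parallel and wait for every one.
  const all_users = await Promise.all(
    user_ids.map(async (user_id) => {
      const name = await client.get("user:" + user_id + ":name");
      return { [user_id]: name };
    })
  );

  socket.broadcast("all_users", all_users);
}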

Comments
  • Totally agree with the questioner's observations, I am struggling with these concepts too. That's just what Node is; it seems a bit snobby, sorry.
  • @SSHThis: right on the main page of Node's website it explicitly says so: async I/O. I can only speculate on the reasons that made you choose a technology that relies on concepts you don't (and aren't willing to) understand. Try to get it and you'll love it; if not, choose something different.
  • Good answer, thanks. I'd like to know though: what is the correct way to write it callback-style? Just so I know the right way before doing it the simple way :)
  • For the loop, he could also just do .forEach