Hot questions for Using ZeroMQ in socket.io

Question:

I have data coming in at around 30 messages per second, and it's being fed to my app via the event handlers shown below. I'm using btford's angular-socket-io for that, and 0MQ is responsible for getting the data to my Node.js server.

I've traced the data coming from the Node server and confirmed it arrives correctly, so the duplication is actually happening on the Angular side.

What basically happens is that every time I leave a view and come back to it, all the events fire twice as many times as before.

In the btford API reference, the author suggests using

socket.forward('dataUpdate', $scope);
$scope.$on('socket:dataUpdate', function(ev, data) {
   $scope.someVar1 = data;
   $scope.someVar2 = data;
   $scope.someVar3 = data;
   $scope.someVar4 = data;
});

which helped in another situation. But, as I said, in this case the event fires 30 times a second and its handler does the work four times over, since there are 4 variables. So each time I navigate back to the route showing these variables, instead of doing the work 4 times it does it 8 times, then 12, 16, and so on. With that, I'm getting a memory leak and eventually the browser crashes.

Does anybody have any ideas on how I could make it better?


Answer:

Event listeners never get removed automatically; you need to remove them yourself. Deregister the listener when the controller's scope is destroyed.

Code

socket.forward('dataUpdate', $scope);
var socketEvent = $scope.$on('socket:dataUpdate', function(ev, data) {
   $scope.someVar1 = data;
   $scope.someVar2 = data;
   $scope.someVar3 = data;
   $scope.someVar4 = data;
});

$scope.$on('$destroy', function(){
   socketEvent(); // deregister the listener when the controller scope is destroyed
});
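
If you register the handler directly with socket.on instead of forwarding, the same pattern applies: keep a reference to the handler and remove it on $destroy. A minimal sketch, assuming the btford socketFactory exposes removeListener (it delegates to the underlying socket.io client):

var onDataUpdate = function(data){
   // same work as before; the factory triggers a digest for you
   $scope.someVar1 = data;
   $scope.someVar2 = data;
   $scope.someVar3 = data;
   $scope.someVar4 = data;
};
socket.on('dataUpdate', onDataUpdate);

$scope.$on('$destroy', function(){
   socket.removeListener('dataUpdate', onDataUpdate); // stop the stacking across route changes
});

Either way, the key is that exactly one deregistration runs per registration, so revisiting the route no longer multiplies the handlers.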

Question:

I have started building a PHP and Node.js app for real-time sports score updates. My PHP and Node servers are working fine with the node modules express, socket.io and zeromq. I receive a large amount of data from an API; in the PHP file I package it up and send the JSON to the Node server (through ZeroMQ), where it is received in the Node.js server file and sent on to the client side. Now, the setup works fine for small sets of data, but with a large payload the Node server fails with the error listed below.

This is the error I get while trying to send to the client through socket.io in the Node server:

node: ../node_modules/nan/nan.h:822: Nan::MaybeLocal Nan::NewBuffer(char*, size_t, node::Buffer::FreeCallback, void*): Assertion `length <= imp::kMaxLength && "too large buffer"' failed. Aborted (core dumped)

This is the main node_socket.js

var express = require('express');
var app = express();
var fs = require('fs');

var options = {
  key: fs.readFileSync('/etc/letsencrypt/live/example.com/privkey.pem'),
  cert: fs.readFileSync('/etc/letsencrypt/live/example.com/fullchain.pem'),
  ca: fs.readFileSync('/etc/letsencrypt/live/example.com/chain.pem')
};

var https = require('https').Server(options, app);

var zmq = require('zeromq')
  , sock = zmq.socket('pull');
sock.bind('tcp://10.150.0.6:1111');

var io = require('socket.io')(https);

io.on('connection', function(socket){

  socket.on('disconnect', function(){
    console.log("client disconnected");
  });

  sock.on('message', function(msg){
    console.log('work: %s', msg.toString());
    socket.emit('latest_score', msg.toString());
  });

});

https.listen(3333);
sock.on('connect', function(fd, ep){ console.log('connect, endpoint:', ep); });

console.log('App connected to port 3333');

Please note the app works fine with small data but just won't handle the large JSON being sent from the PHP file. I have tried a few different things over the past few days, to no avail. I also hired a few Node.js developers from fiverr.com, but they couldn't solve the problem either. I am hoping someone here will guide me in the right direction.


Answer:

From the Node.js documentation on buffer (https://nodejs.org/api/buffer.html):

buffer.constants.MAX_LENGTH
Added in: v8.2.0
<integer> The largest size allowed for a single Buffer instance.
On 32-bit architectures, this value is (2^30)-1 (~1GB). On 64-bit architectures, this value is (2^31)-1 (~2GB).

This value is also available as buffer.kMaxLength.

So, assuming you're on a 64-bit system, you seem to be sending more than 2GB of data at once. Assuming you have a big JSON array of data, the easiest fix is to split the array into chunks and send them through the socket individually.

So here:

sock.on('message', function(msg){
  console.log('work: %s', msg.toString());
  socket.emit('latest_score', msg.toString());
});

you need to JSON.parse() the data, split it up, JSON.stringify() each chunk, and send the chunks individually through the socket, as sketched below.

See here how to split the array: Split array into chunks
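
A minimal sketch of that approach, assuming the payload is a JSON array (the chunk size of 500 is arbitrary; tune it so each emit stays well under the buffer limit):

sock.on('message', function(msg){
  var scores = JSON.parse(msg.toString()); // assumes the PHP side sends a JSON array
  var CHUNK_SIZE = 500;

  for (var i = 0; i < scores.length; i += CHUNK_SIZE) {
    var chunk = scores.slice(i, i + CHUNK_SIZE);
    socket.emit('latest_score', JSON.stringify(chunk));
  }
});

The client then concatenates the chunks (or renders them incrementally) as they arrive.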

Update (because of a comment): if you absolutely can't split the data, you could store it in PHP (in a database or a file) and build a REST API to query the data. Then just send the id of the file/row through ZeroMQ and Node.js; the client calls the REST API to get the actual data.
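
A sketch of that indirection on the Node side (the event name and the REST route in the comment are hypothetical):

sock.on('message', function(msg){
  var rowId = msg.toString(); // PHP now sends only the id, not the payload
  socket.emit('latest_score_id', rowId); // clients fetch e.g. GET /api/scores/<rowId>
});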

Question:

I'm using PHP over IIS 7.5 on Windows Server 2008.

My web application repeatedly requests 3 different JSON pages with Ajax in the background:

  • Page 1: every 6 seconds
  • Page 2: every 30 seconds
  • Page 3: every 60 seconds

They retrieve data related to the current state of some tables. This is how I keep the view up to date.

Usually I don't have much trouble with it, but lately I have seen my server saturated with hundreds of unanswered requests, and I believe the problem may be due to a delay in one of the requests.

If page 1, which is requested every 6 seconds, takes 45 seconds to respond (due to slow database queries or whatever), then it seems to me that the requests start piling up one after another. If multiple users are connected to the web application at the same time (or with multiple tabs), things can turn bad.

Any suggestions on how to avoid this kind of problem?

I was thinking about using something such as ZMQ together with Socket.IO on the client side, but as the data I'm requesting isn't triggered by any user action, I don't see how this could be triggered from the server side.


Answer:

I was thinking about using something such as ZMQ together with Socket.IO on the client side...

This is almost definitely the best option for long-running requests.

...but as the data I'm requesting isn't triggered by any user action, I don't see how this could be triggered from the server side.

In this case, the 'user action' in question is connecting to the socket.io server. This cut-down example is taken from one of the socket.io getting started docs:

var http = require('http').createServer();
var io = require('socket.io')(http);

io.on('connection', function(socket) {
  console.log('a user connected');
});

http.listen(3000);

When the 'connection' event fires, you can start listening for messages on your ZMQ message queue. If necessary, you can also kick off the long-running queries there, as sketched below.
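
A minimal sketch of that shape, reusing the pull-socket style from the second question (the endpoint, port and event name are assumptions):

var http = require('http').createServer();
var io = require('socket.io')(http);

var zmq = require('zeromq');
var sock = zmq.socket('pull');
sock.bind('tcp://127.0.0.1:5555'); // assumed endpoint; PHP pushes table-state updates here

// Register the ZMQ handler once and broadcast each update to every connected client.
sock.on('message', function(msg){
  io.emit('table_state', msg.toString()); // 'table_state' is a made-up event name
});

io.on('connection', function(socket){
  console.log('a user connected');
  // if needed, kick off the long-running queries here
});

http.listen(3000);

Registering the ZMQ handler once, rather than inside the connection callback, also avoids stacking a duplicate handler per client, the pitfall visible in the second question's code.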