
I am having trouble serving concurrent requests from my simple Node.js server backed by MongoDB.

What I am doing is sending queries in mongo shell format, parsing them, and sending back data from MongoDB.

But it seems it cannot serve multiple requests at the same time for some reason. (BTW, I am running it locally.)

EDIT: I sometimes get this error in my client app:

"Stream Error. URL: http://localhostlink:1337"

which I also get when the server is not running.

EDIT2: I removed the db.close statements.

Is cluster really the right solution here? It would just upgrade the app to serve 4 concurrent requests, which will still not be enough.

Should I completely refactor the server in some other way? I just want to be able to serve multiple requests for data from MongoDB.

EDIT3:

Is it OK that I first create the server and then connect to Mongo, or should I create the server inside the MongoClient.connect(...) callback?

This is the code of my (unoptimised) server:

    var http = require('http');
    var qs = require('querystring');
    var mongo = require('mongodb');
    var MongoClient = mongo.MongoClient;
    var ObjectId = require('mongodb').ObjectID;
    var result;
    var response;
    var myDb;

    http.createServer(function (request, res) {
        console.log("creating server...");
        MongoClient.connect("mongodb://127.0.0.1:27017/lalal", function (err, db) {
            if (err) {
                return console.dir(err);
            }
            if (request.method == 'POST') {
                var body = '';
                response = res;
                request.on('data', function (data) {
                    body += data;
                    // 1e6 === 1 * Math.pow(10, 6) === 1 * 1000000 ~~~ 1MB
                    if (body.length > 1e6) {
                        // FLOOD ATTACK OR FAULTY CLIENT, NUKE REQUEST
                        request.connection.destroy();
                    }
                });
                request.on('end', function () {
                    var clientData = qs.parse(body);
                    var parts = clientData.data.split(".");
                    var collectionName = parts.shift();
                    var queryBig = parts.join(".");
                    var queryParts = queryBig.split("(");
                    var method = queryParts[0];
                    var query = queryParts.join("(");
                    console.log("query:" + query);
                    console.log("method:" + method);
                    console.log("collection:" + collectionName);
                    var callback;
                    switch (method) {
                        case 'find':
                            callback = '.toArray(findCallback);';
                            break;
                        case 'insert':
                            query = query.substring(0, query.length - 1);
                            callback = ',insertCallback);';
                            break;
                        case 'remove':
                            query = query.substring(0, query.length - 1);
                            callback = ',removeCallback);';
                            break;
                        case 'save':
                            query = query.substring(0, query.length - 1);
                            callback = ',saveCallback);';
                            break;
                        case 'update':
                            query = query.substring(0, query.length - 1);
                            callback = ',updateCallback);';
                            break;
                    }
                    if (query.indexOf('"_id"') != -1) {
                        var indexHelper = query.indexOf('"_id"') + 7;
                        var s = query.substring(indexHelper, query.length);
                        var indexOfQuote = s.indexOf('"');
                        var restOfQuery = s.substring(indexOfQuote + 1, s.length);
                        var key = s.substring(0, indexOfQuote);
                        query = query.substring(0, indexHelper - 1) + 'new ObjectId("' + key + '")' + restOfQuery;
                    }
                    // Connect to the db
                    // myDb = db;
                    var collection = db.collection(collectionName);
                    var command = 'collection.' + query + callback;
                    console.log("command:" + command);
                    eval(command);

                    function findCallback(err, items) {
                        console.log(items);
                        response.writeHead(200, {'Content-Type': 'text/plain'});
                        response.end(JSON.stringify(items));
                    }

                    function insertCallback(err, objects) {
                        console.log(objects);
                        if (err) console.warn(err.message);
                        if (err && err.message.indexOf('E11000 ') !== -1) {
                            response.writeHead(200, {'Content-Type': 'text/plain'});
                            response.end('"error":"_id already exists"');
                        } else {
                            response.writeHead(200, {'Content-Type': 'text/plain'});
                            response.end(JSON.stringify(objects));
                        }
                    }

                    function removeCallback(err, numberOfRemovedDocs) {
                        response.writeHead(200, {'Content-Type': 'text/plain'});
                        response.end(JSON.stringify(numberOfRemovedDocs));
                    }

                    function saveCallback(err, result) {
                        response.writeHead(200, {'Content-Type': 'text/plain'});
                        response.end(JSON.stringify(result));
                    }

                    function updateCallback(err, numberOfUpdatedDocs) {
                        response.writeHead(200, {'Content-Type': 'text/plain'});
                        response.end(JSON.stringify(numberOfUpdatedDocs));
                    }
                });
            }
        });
    }).listen(1337, '127.0.0.1');

    console.log('Server running at http://127.0.0.1:1337/');
3 Comments

  • While there's a lot of code there and I certainly haven't reviewed it all, you should only open the DB connection once and leave it open for the lifetime of the Node process. Commented Sep 14, 2013 at 11:40
  • Could it be that the myDb = db; part inside MongoClient.connect is where the problem is coming from? It is no longer async, right? I am passing the reference to a globally accessible variable, so when a new request comes in it takes over. Commented Sep 14, 2013 at 11:50
  • db.close(): why are you closing the connection? Commented Sep 14, 2013 at 14:55

1 Answer


The problem you are seeing is because Node.js is single-threaded: it dispatches one request at a time (this is actually good, since it helps avoid bugs caused by global variable handling). If you sent the response before executing your queries, you would see parallel query execution. However, given the way you structured your program, you might be better off using the 'cluster' module. The code below will start four concurrent processes.

    var cluster = require('cluster');

    if (cluster.isMaster) {
        for (var i = 0; i < 4; i++) {
            cluster.fork();
        }
        cluster.on('exit', function (worker, code, signal) {
            cluster.fork(); // replace a worker that died
        });
    } else {
        // run your Node.js + MongoDB code here
    }

PS. You don't need to close the db connection when using MongoClient.connect, as this API uses a connection pool that manages your connections for you.


5 Comments

Is cluster really the right solution here? Won't I just be upgrading my server to serve 4 concurrent requests? That is still not enough for me.
@deloki Each cluster instance has its own connection pool so there can be up to 4*(pool size) MongoDB queries in progress at a given time. Pool size defaults to 5 but you can make it however big you want.
@JohnnyHK OK, thank you. So is this the best-practice way of achieving concurrent DB requests while sending response data as the payload? If not, what would the best-practice way be?
I would say it is good practice. BTW, you can start more processes if you want; 2 to 4 times the number of cores in your system is a good rule (I have seen this concurrency number work fine in thread-pool implementations based on I/O completion ports, which normally do intensive I/O).
@ruiz Can we use forky instead of cluster? If yes, can you please give me an example?
