Last year (2015), Raygun tested Node performance against Hapi, Express.js, and Restify, benchmarking some common Node frameworks. (Read the latest Node.js performance test here.)
In this article, we’ve run a new round of tests for 2016. We’ve broken the results down and by popular request have included how to reproduce the test.
This new round of tests is intended to show how the performance of these frameworks has changed since last year. Tests were performed on a fresh Debian install running within VirtualBox on Windows 10. Because these tests exercise only the most basic capabilities of the frameworks in question, the main goal was to show the relative overhead each framework adds to the handling of a request, rather than the absolute performance and utility they provide; that will vary greatly depending on the environment, network conditions, and the applications built on top of them.
However, it's worth noting that this test isn't designed to tell you which framework you should use to build a website. These frameworks offer far more to anyone building rich web applications, and the features they offer are likely more important than raw throughput.
Let’s take a look at what we found last year. (You can see the full test from last year here.)
Last year’s results:
As you can see from the graph, in 2015 we found Restify was the clear winner. (However, it’s important to interpret these results in context.)
As in the previous node performance test, we are looking at the simplest functionality the frameworks provide.
In this test, we are receiving a request and returning a static string (“Hello World”). While this use case won’t compare exactly to any real-world use, it is still useful to know the theoretical maximum number of requests each framework can handle per second.
At Raygun this is interesting to us, as our ingestion endpoints have to deal with a large volume of incoming requests, most of which are quick to handle and are answered with either a status code or a simple error message.
This test is similar enough to our real world use case that the performance implications discovered may impact future development of specific parts of our infrastructure.
Requests per second is the main piece of data we want to focus on, as it is the most useful metric for our service. The same is true for many web applications, where millisecond-level latency matters less than absolute throughput.
To perform the test we used the Apache HTTP server benchmarking tool (known as ab, after its executable name) to generate requests to each server. We chose relatively standard ab parameters: 50,000 requests in total, 100 concurrent connections, and a 20-second timeout per request. The entire test run was executed 5 times and the results were averaged to produce the final graphed result set.
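With those parameters, the ab invocation looks like the following (the URL is an assumption; point it at wherever the server under test is listening):

```shell
# 50,000 requests total, 100 concurrent, 20-second timeout per request
ab -n 50000 -c 100 -s 20 http://localhost:3000/
```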
Because the frameworks handled connections differently, we also set the Connection header on all responses to ‘close’ to ensure no connections were left open.
Specific Versions Tested
OS: Debian 8.5
Runtime: Node.js 6.2.2
Frameworks: Express 4.14.0, Hapi 13.4.1, Koa 1.2.0, Restify 4.1.1
The results of the node performance test
Unsurprisingly, the raw Node.js application was fastest, but by a smaller margin than expected in this particular node performance test. Koa and Restify were within 15% of its requests per second, and only Hapi was significantly slower, at just above 50% of the speed of the raw Node app.
These results are quite similar to the last set, with Restify the fastest framework, Express just behind, and Hapi still the slowest.
The relative difference in performance, however, has decreased: the full result set now spans a factor of roughly 2x, compared to the 4x difference seen in the previous tests.
This could be caused by optimisations made within each framework as they have matured, or by the Node.js runtime having improved over the past year to better optimise all running code through advances in its JIT compilation.
Given the smaller performance differences compared to last time, we can say that choosing a lightweight framework is probably more appropriate than ever, even for high-performance applications: the range of features and the structure they bring come at a relatively small performance cost.
As with anything, we’d recommend diving deeper into the performance considerations of each framework before switching or committing to one. Hopefully this gives some idea of how well each stacks up, especially for simple tasks.
Here’s what our results mean for you:
1. Restify wins performance-wise again this year, especially if you’re building a service that receives lots of requests from the same clients and you want to move quickly. You also get more bang for your buck than with raw Node, since features like DTrace support are baked in.
2. If the cost of your servers is becoming a problem due to volume, rewriting in raw Node.js will help you and your team handle more requests, but be wary of what you lose feature-wise (e.g. DTrace support, logging features, etc.).
3. If you’re building a richer web app, you’ll want to review the offerings based on feature set. Restify is not designed for heavy browser apps, whereas Hapi and Express are.
4. If you’re building a web app that needs to handle thousands of requests per second, you likely have a high-quality problem and should dive deeper yourself by cloning the repository (below).
As an example, here at Raygun, we use Express for our API endpoints which ingest data for both Raygun Pulse and Raygun Crash Reporting. Our company has peaked in the past at more than 110,000 requests per second coming through to Raygun, so performance of our API is important.
We’ve found Express to be satisfactory, but the other frameworks I tested seemed to have similar capacity in terms of performance and features. Each of the apps built for these tests was less than 30 lines of code, so picking a framework on these test results alone is probably not wise.
Hopefully this article can act as one of many datapoints that helps you decide on the framework best for you.
Reproduction of the node performance test
These tests can be replicated by cloning this repository and running:
The script installs all the required tooling to run the test, as well as npm and the latest version of Node. It then runs all the servers and benchmarks in sequence, outputting the results to a file. We highly recommend reviewing the script to ensure it will not interfere with any other processes on your machine before running it.
As pointed out by Eduard in the comments, there was an issue with the script that caused the Restify test to be conducted on an unintended route. This kept connections alive, improving Restify’s apparent performance due to reduced overhead.
I reran the tests using node v7.0.0 and the latest versions of all four frameworks at the time.
The new results are:
This shows that indeed Restify is slower than reported in my initial test. It also shows that Hapi has become slower relative to the other frameworks since the initial run.