Experiment:
When it comes to web performance, study after study has proven: fast and scalable wins the race. But with thousands of WordPress hosting providers, how do you know which one is fast and scalable?
That is where ReviewSignal.com comes in. Their business is all about helping people identify which hosting provider is the best choice for them. Kevin Ohashi from ReviewSignal has been working with LoadStorm to run a series of load tests on some of the top WordPress hosting providers to determine which is best for companies that need scalable websites.
Our performance engineers have teamed up with Kevin to analyze the data and provide this report on the top WordPress hosting providers for web performance. Providers included in this study are: A Small Orange, FlyWheel, GoDaddy, Kinsta, LightningBase, MediaTemple, Nexcess, Pagely, Pantheon, and WebSynthesis. These providers were included in the 2,000-user load test because they didn’t struggle with the earlier test of 1,000 concurrent users.
This analysis only looks at the final load test of 2,000 concurrent users, but Kevin’s article analyzes the results of both tests and looks at long term up-time reliability. Check out Review Signal’s report of the full study here.
Parameters:
All tests were performed on identical WordPress dummy websites hosted on 10 different hosting services. All sites were tested with the same plugins except in cases where hosts added extra plugins. The websites used identical scripts that included browsing and login. The load tests were run in LoadStorm PRO for 30 minutes, with a linear 20-minute ramp up from 500 to 2,000 virtual users, then holding at the peak for the remaining 10 minutes.
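To make the load shape concrete, here is a small sketch (not the actual LoadStorm configuration, just an illustration of the profile described above) that computes the target number of virtual users at any minute of the 30-minute test:

```python
# Illustrative only: the ramp profile described in the Parameters section,
# not LoadStorm's configuration format.
RAMP_START_USERS = 500
PEAK_USERS = 2_000
RAMP_MINUTES = 20
HOLD_MINUTES = 10

def virtual_users(minute: float) -> int:
    """Target number of concurrent virtual users at a given minute of the test."""
    if minute <= RAMP_MINUTES:
        fraction = minute / RAMP_MINUTES
        return round(RAMP_START_USERS + fraction * (PEAK_USERS - RAMP_START_USERS))
    return PEAK_USERS  # hold at the peak for the remaining 10 minutes

if __name__ == "__main__":
    for m in range(0, RAMP_MINUTES + HOLD_MINUTES + 1, 5):
        print(f"minute {m:2d}: {virtual_users(m)} virtual users")
```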

Scoring:
In order to rank the top providers, we have broken our analysis down by the key web performance metrics:
- Error Rate
- Average Response Time
- Peak Response Time
- Average Page Completion
- Throughput
We ranked each provider on each performance metric at the 20-minute mark of the test, when all sites were under the full load of 2,000 users. For each metric, the providers were ranked (1st through 10th) according to their performance, and a point value was assigned to each position. The final ranking was then determined by each provider’s total score, the sum of its points across all metrics.
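The sketch below illustrates this rank-and-sum scoring with made-up readings for three hypothetical hosts. The article does not state the exact point values per rank, so the “best rank earns the most points” scheme here is an assumption:

```python
# Rank-and-sum scoring sketch with hypothetical hosts and readings.
# The point scheme (best rank earns the most points) is an assumption.
from collections import defaultdict

metrics = {
    "error_rate_pct":       {"lower_is_better": True,  "values": {"HostA": 0.00, "HostB": 0.02, "HostC": 0.05}},
    "avg_response_time_ms": {"lower_is_better": True,  "values": {"HostA": 410,  "HostB": 395,  "HostC": 520}},
    "throughput_kb_per_s":  {"lower_is_better": False, "values": {"HostA": 900,  "HostB": 860,  "HostC": 700}},
}

NUM_PROVIDERS = 3  # 10 in the actual study
scores = defaultdict(int)

for metric in metrics.values():
    ordered = sorted(metric["values"].items(),
                     key=lambda kv: kv[1],
                     reverse=not metric["lower_is_better"])
    for rank, (provider, _) in enumerate(ordered, start=1):
        scores[provider] += NUM_PROVIDERS - rank + 1  # ties are broken arbitrarily here

final_ranking = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
print(final_ranking)  # with these sample numbers: HostA (8), HostB (7), HostC (3)
```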
Test Data:
To view the full test results with interactive graphs in LoadStorm PRO, click on each hosting provider below:

Metrics:
Error Rate
Error rate is probably the most important metric for businesses wanting to be certain that a website won’t crash under high traffic. High error rates mean one thing: Lost customers.
Surprisingly, we had a 7-way tie for first place with 0% error rates. Overall, this speaks volumes about the scalability of all the websites included in the study. Flywheel started to fail at around 1,500 concurrent users and began returning 502 errors, which explains its high error rate.
Average Response Time
Average Response Time is very significant because it directly affects the user experience and perceived load time. This metric measures the time each request takes “round trip”: the browser sends the request, the server processes it, and the response travels back to the browser. The Average Response Time takes into consideration every request/response cycle within each one-minute interval and calculates the mathematical mean of all those response times.
Peak Response Time
This metric also measures the same “round trip” that the Average Response Time does, but instead of averaging the time for all requests, Peak Response Time is simply the single longest (slowest) time for a single request.
Average Page Completion
Average Page Completion Time is a metric that measures the amount of time from the start of the first request to the end of the final request on a page.
The tests in this study show unusually fast Average Page Completion times. After investigating why the pages were loading so quickly, we found that some of the pages on the dummy website were very simple, with very few requests each. While users with real websites on these providers should expect slower average page completion times, the tests are still valid because all providers served the same simple pages.
Throughput
Throughput is measured as the number of kilobytes per second being transferred. This measurement shows how data is flowing back and forth from the server(s). High throughput is a mark of good web performance under load because it shows that there aren’t any bottlenecks blocking and slowing the data transfer. Low throughput, as seen with WebSynthesis, signifies that the server is overwhelmed and struggling to move data in and out.
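Putting the five metric definitions above together, here is a minimal sketch of how each could be computed from a request log. The field names and sample values are hypothetical; this is not LoadStorm’s internal calculation:

```python
# Hypothetical request log: times are seconds from test start, sizes in kilobytes.
requests = [
    {"page": "/",      "started": 0.00, "finished": 0.42, "size_kb": 35,  "error": False},
    {"page": "/",      "started": 0.05, "finished": 0.61, "size_kb": 120, "error": False},
    {"page": "/login", "started": 1.10, "finished": 1.95, "size_kb": 18,  "error": True},  # e.g. a 502
]

durations = [r["finished"] - r["started"] for r in requests]

error_rate = 100.0 * sum(r["error"] for r in requests) / len(requests)  # percent of failed requests
avg_response_time = sum(durations) / len(durations)                     # mean "round trip" time
peak_response_time = max(durations)                                     # single slowest request

# Page completion: time from the first request's start to the last request's end, per page.
pages = {r["page"] for r in requests}
page_completion = {
    p: max(r["finished"] for r in requests if r["page"] == p)
       - min(r["started"] for r in requests if r["page"] == p)
    for p in pages
}
avg_page_completion = sum(page_completion.values()) / len(page_completion)

# Throughput: kilobytes transferred per second over the measured window.
window = max(r["finished"] for r in requests) - min(r["started"] for r in requests)
throughput_kb_per_s = sum(r["size_kb"] for r in requests) / window

print(error_rate, avg_response_time, peak_response_time, avg_page_completion, throughput_kb_per_s)
```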
Interestingly, GoDaddy pushed triple the amount of data through because their admin screen had more resources being loaded, which is why their average throughput is so high. That extra data also meant they had significantly higher average response times than most of the other providers. Any time a site makes more requests, performance slows, so it is fair to say that without the extra data GoDaddy could possibly have been faster than all the others.
Ranking
From the final point tallies, we can see that there are three clear sections.
Top Performers: Pantheon, MediaTemple, GoDaddy, and Kinsta.
Good Performers: Nexcess, LightningBase, A Small Orange, and Pagely.
Fair Performers: FlyWheel and WebSynthesis.
Conclusion:
Overall, most of the providers did surprisingly well under the full load of 2,000 concurrent users. Even though we wanted to rank them in a definitive order, the fact is that most providers did not produce any failures at all during the test. So while we were able to rank them, there were several metrics where the difference between points was negligible (e.g., a 1 ms difference in average response time between GoDaddy and Kinsta) but was still counted in our scores.
Additionally, the test utilized in our report is only part of the full ReviewSignal study. ReviewSignal ran tests at 1,000 users and the providers that crashed were not included in the tests at 2,000. Therefore, all of the providers included in this ranking should be considered great choices for scalable WordPress hosting.
This level of high performance in all 10 providers was unexpected with such a heavy load and we were very impressed by the results.
Comments:
Does WP Engine not fit this category anymore?
Nick,
I’m the original source of this testing. LoadStorm analyzed the data I collected from testing using their service.
So WPEngine was included in the tests; however, their performance wasn’t up to par.
BlueHost, CloudWays, DreamHost, SiteGround, and WPEngine were all eliminated from the final round of testing. All of them failed to handle the load at lower levels (with the exception of SiteGround, which ran into unresolvable security issues that blocked testing). If you’re interested in seeing the earlier tests, take a look at http://reviewsignal.com/blog/2014/11/03/wordpress-hosting-performance-benchmarks-november-2014/
Gotcha! Thanks for answering.
It’s interesting that the companies that don’t make the cut anymore are the ones that have been at it the longest (i.e., the ones with the least to prove). I guess that’s the sad reality with hosting… Once they’re well established, they focus less and less on making a splash in these kinds of roundups.
I guess it would be nice for my ego to have the company I’ve got all my sites hosted with show up atop these lists… But I’ll settle for my anecdotally great results and customer experience after several years with them.
With WP Engine it’s a bit interesting. These reviews don’t cover our Mercury platform and some other factors. But those WP Engine products are super new so might have been missed.
Hey Thomas! I do have one major complaint.
Austin (who I don’t think works there anymore) had promised me some WPE swag for my other instances of being an unbiased and friendly brand ambassador of yours… like over a year ago.
This makes me a sad brand ambassador 🙁
Haha Nick! Email my first name at WP Engine and I got you rocked out through the Labs team. Super excited to have conversations here though.
Lumping every WP Engine product offering into a single comparison is a bit misleading. It’s like saying hey every Mercedes car performs the same. We have stuff that’s multi-geo insane performance that delivers 70 million + uniques a day. We have some stuff that’s not as insane. But I get the thought process in some ways.
The fact that you’re here, and not being all defensive and shitty, is a pretty big statement in itself.
Tomas, I completely agree that the testing doesn’t touch on the larger offerings the companies have. I know Kevin at ReviewSignal did his best to try to compare apples to apples, but there is always something else that we can test and analyze.
I would love to chat with you and Kevin about doing some additional experiments on WP Engine. We could even compare performance on the different levels you offer!
Hey Arwen,
Feel free to email me (it’s up higher in the thread) and let’s chat.
Hi Kevin,
Would it be possible to get the WordPress website that you used for these tests? I would guess that it’s a dummy website used just for this purpose, so I would love to see the modifications and plugins on that website.
It was the Twenty Fourteen theme, no special plugins, just some lorem ipsum pages. Creating a representative site with different themes/plugins was something I attempted in round 1, and I don’t think it added much value. Each site is different, and figuring out what should and shouldn’t be included seemed arbitrary and asking for trouble. So it was just testing the most basic of WP sites, under the assumption that the relative performance should scale across providers.
I am wondering the same thing. I have had great success moving clients over to WP Engine. Would love to see statistics on them. Great job Kevin!
You definitely have a valid point, Nick! However, our analysis only included the final round of testing with LoadStorm at 2,000 concurrent users. Of the 16 providers that Kevin started with, only 10 did not struggle with the 500 or 1,000 concurrent user tests and were then included in the 2,000-user test. Unfortunately, WP Engine did not make it to the final round of testing, so it was not included in our analysis. But you can check out their stats in ReviewSignal’s article: http://reviewsignal.com/blog/2014/11/03/wordpress-hosting-performance-benchmarks-november-2014/.
From Kevin’s article: “WPEngine had some issues. Uptime was not one of them, they were perfect or upwards of 99.9% in that department. However, their performance shortcomings became apparent during the load tests.” They didn’t even make it to the final round of LoadStorm testing.
Understood. Thanks for addressing the question! I got a similar answer from Kevin, above 🙂
As I stated above I’d be interested to see this with the newer WP Engine technologies being used.
Awesome to once again see Kinsta performing so well!
This test is very misleading when looking at the final result.
LightningBase, for example, got 17 fewer points than number one.
Seven of those points came from having a failure rate of 0.02%, another 4 points from being 17 ms slower on page completion, and finally 2 points from having a throughput 40 KB/s lower than number two.
What happens if you put an acceptable range on your grading?
Also, the fact that you say:
“Interestingly, GoDaddy pushed triple the amount of data through because their admin screen had more resources being loaded, which is why their average throughput is so high.”
and still rate GoDaddy as number one is just a complete joke. More resources needed on a page does not equal better performance.
Aleksander, you bring up an excellent point, which is why I stated in the conclusion: “Even though we wanted to rank them in a definitive order, the fact is that most providers did not produce any failures at all during the test. So while we were able to rank them, there were several metrics where the difference between points was negligible (e.g., a 1 ms difference in average response time between GoDaddy and Kinsta) but was still counted in our scores.”
If you take a look at the ReviewSignal analysis of the full testing done (http://reviewsignal.com/blog/2014/11/03/wordpress-hosting-performance-benchmarks-november-2014/), Kevin does an excellent job of categorizing the providers based upon industry standards for acceptable performance. Kevin’s results are divided into providers without any major issues, providers with minor issues, and providers with major issues. All 10 of the providers included in our study were ranked in the “providers without any major issues” category. So really, you are correct that LightningBase did perform well. But the point of our analysis was to go one step further and rank them definitively and mathematically.
It is my personal opinion that all 10 providers ranked in this study performed admirably and were overall much better than we expected. For any provider to handle 2,000 concurrent users without any performance tuning is very impressive and I feel that all of the providers that made it to the final round of testing (the 2,000 user test) deserve commendation.
Thanks for the great post and sharing all this data!
Could you please include the WP configuration tested? All I saw was “All sites were tested with the same plugins except in cases where hosts added extra plugins,” which is kind of assumed… It’s helpful, however, to know the active theme, plugins, and some description of the DB (size at least).
One can’t really assume a host that screams with TwentyFourteen, Akismet and a nearly empty database performs the same when confronted with a “feature rich” (aka bloated) theme, a dozen plugins (including related posts, share counts, etc) and a 1GB database full of postmeta.
It’d also be pretty cool to leave a demo site up as an example for people to see (just on one host, not all of them)
Jon,
It was a dummy site running the Twenty Fourteen theme. No special plugins were added, and the database would have had dummy users and lorem ipsum posts with images. Very small. I attempted to create a ‘representative’ site in my first round of tests, but really that did not add much value. The issue is, every theme, plugin, and addition adds more complexity and may behave differently on different hosts. It’s impossible for me to create a good ‘representative’ site, so I stuck with the baseline. What can probably be assumed is that no site will outperform these benchmarks in a real-world scenario. I would expect the relative performance to scale between hosts, though, more or less.
Thanks for sharing your results. Shame LoadStorm is too expensive for me to test, as I want to compare my WordPress test blog at http://wordpress7.centminmod.com/74/wordpress-super-cache-benchmarks-blitz-io-load-test-237-million-hitsday/ (CentOS 7.0 + PHP 7.0.0-dev + Centmin Mod Nginx + MariaDB 10.0.x MySQL on a 2GB DigitalOcean KVM VPS) with the above results. Would love to see a comparison using Blitz.io, as you get 10 free credits per month, enough for 1,000 to 10,000 user tests. Cheers
George,
We appreciate that you want to compare your results with ours, and I’m sorry that you feel LoadStorm is too expensive. LoadStorm LITE offers a free subscription of 25 VUsers over an hour-long test every month, and LoadStorm PRO offers a one-time free trial of 50 concurrent VUsers.
One thing to note is that the 10 free credits for Blitz only cover a 2-minute test run. We highly recommend that all load tests run for no less than half an hour to allow for appropriate ramp times and test durations. This allows for realistic test results that represent what real users would see during times of heavy traffic. It is a recommended best practice to run tests for a full hour with either linear or step-up ramp patterns to get the most valuable results. A test duration of 2 minutes is not really comparable to the testing done in the LoadStorm section of this experiment.
However, if you take a look at the article published by ReviewSignal (our partner in this study) you will see that they did do some one minute testing with Blitz. You can view their full article at: http://reviewsignal.com/blog/2014/11/03/wordpress-hosting-performance-benchmarks-november-2014/
Great article. I’m trying to decide which host to migrate my sites over to, but am still unclear after reading lots of reviews and pages like this. First off, are these tests based on shared hosting, VPS, or SSD hosting? (I see these offered by different web hosts and am unsure how to choose or why one matters over the other.)
Also, how does one determine how much RAM and space they even need for their sites? (actually, space makes sense, but the RAM I’m not sure)
If you have some articles to help this, that’d be great.
We have about 6 WP sites, 4-5 of which mostly get fewer than 500 hits per day, and one of which gets 5,000-10,000 hits per day. They are mostly blogs, but we also run business funnels through them and take orders, so we need to be sure we’ve got stellar load time and uptime.
A very interesting list. I would never have thought that Pagely and Synthesis would end up near the bottom of a performance-based list, with WP Engine missing entirely! Anyhow, it goes to show that you cannot rest on past achievements but have to improve continuously. Here’s a list I recently updated on Managed WordPress Hosting under $30: http://tech-vise.com/vise-feature-best-managed-wordpress-hosting-platforms/
@Kevin, if this is just a re-presentation of your original study, then the Flywheel account you tested was their $15/mo. “Tiny” plan, which is known to be throttled at a maximum of 1,500 concurrent users, right? So the error rates that came up are from that and do not reflect an inability to scale system-wide on other accounts. Is it a similar story with WebSynthesis? This seems to merit the kind of prominent note here that was included in the original report.