WordPress Hosting Providers Study: Web Performance & Scalability – LoadStorm

When it comes to web performance, study after study has proven: fast and scalable wins the race. But with thousands of WordPress hosting providers, how do you know which one is fast and scalable?

That is where ReviewSignal.com comes in. Their business is all about helping people identify which hosting provider is the best choice for them. Kevin Ohashi from ReviewSignal has been working with LoadStorm to run a series of load tests on some of the top WordPress hosting providers to determine which is the best for companies that need scalable websites.

Our performance engineers have teamed up with Kevin to analyze the wealth of data and provide this report of the top WordPress hosting providers for web performance. Providers included in this study are: A Small Orange, FlyWheel, GoDaddy, Kinsta, LightningBase, MediaTemple, Nexcess, Pagely, Pantheon, and WebSynthesis. These providers were included in the 2,000-user load test because they didn’t struggle with the first test of 1,000 concurrent users.

This analysis only looks at the final load test of 2,000 concurrent users, but Kevin’s article analyzes the results of both tests and looks at long-term uptime reliability. Check out Review Signal’s report of the full study here.

Parameters:

All tests were performed on identical WordPress dummy websites hosted on 10 different hosting services. All sites were tested with the same plugins except in cases where hosts added extra plugins. The websites ran identical scripts that included browsing and login. The load tests were run in LoadStorm PRO for 30 minutes, with a linear 20-minute ramp-up from 500 to 2,000 virtual users, holding at the peak for the remaining 10 minutes.
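The load profile above can be sketched as a simple function of elapsed time (a minimal illustration of the stated ramp, not LoadStorm's actual scheduler):

```python
# Virtual-user load profile from the test parameters: a linear ramp from
# 500 to 2,000 users over the first 20 minutes, then a hold at 2,000
# for the remaining 10 minutes of the 30-minute test.

def virtual_users(minute: float) -> float:
    """Return the target number of virtual users at a given test minute."""
    if minute <= 20:
        return 500 + (2000 - 500) * (minute / 20)  # linear ramp-up
    return 2000                                    # hold at peak

print(virtual_users(0), virtual_users(10), virtual_users(20), virtual_users(25))
# → 500 at the start, 1250 halfway up the ramp, 2000 from minute 20 onward
```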

Scoring:

In order to rank the top providers, we have broken our analysis down by the key web performance metrics:

  • Error Rate
  • Average Response Time
  • Peak Response Time
  • Average Page Completion
  • Throughput

We ranked each provider on each performance metric at the 20-minute mark in the test, when all sites were under the full load of 2,000 users. For each metric, the providers were ranked (1st through 10th) according to their performance, and a point value was assigned to each rank. We then determined the final ranking from each provider's total score, the sum of its points across all the metrics.
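A minimal sketch of this points-based scheme, using invented provider names and metric values (the study's own data and exact points-per-rank scale are not reproduced here):

```python
# Hypothetical scoring sketch: rank providers on each metric, award points
# by rank (1st of N gets N points, last gets 1), then sum across metrics.
# All names and values below are illustrative, not the study's data.

metrics = {
    # metric name -> (provider -> value, lower_is_better)
    "error_rate":   ({"HostA": 0.0,  "HostB": 2.5,  "HostC": 0.0},  True),
    "avg_response": ({"HostA": 410,  "HostB": 980,  "HostC": 350},  True),
    "throughput":   ({"HostA": 5200, "HostB": 1900, "HostC": 6100}, False),
}

scores = {provider: 0 for provider in next(iter(metrics.values()))[0]}
for name, (values, lower_is_better) in metrics.items():
    # Best performer first; ties fall back to insertion order (naive handling).
    ranked = sorted(values, key=values.get, reverse=not lower_is_better)
    for rank, provider in enumerate(ranked):
        scores[provider] += len(ranked) - rank  # 1st gets N points, last gets 1

final = sorted(scores, key=scores.get, reverse=True)
print(final)  # providers ordered by total score
```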

Test Data:

Full test results, with interactive graphs in LoadStorm PRO, are available for each hosting provider tested:

Metrics:

Error Rate

Error rate is probably the most important metric for businesses wanting to be certain that a website won’t crash under high traffic. High error rates mean one thing: lost customers.

Surprisingly, we had a 7-way tie for first place with 0% error rates. Overall, this speaks volumes about the scalability of the websites included in the study. FlyWheel, however, started to fail at around 1,500 concurrent users and began returning 502 errors, which explains its high error rate.
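The metric itself is simple: failed responses as a percentage of all requests in the interval. A small sketch with invented counts:

```python
# Error rate: failed responses (e.g. HTTP 502s under load) as a percentage
# of all requests in the measurement interval. Counts are illustrative.

total_requests = 48_000
error_responses = 1_920   # e.g. 502 Bad Gateway responses

error_rate = 100 * error_responses / total_requests
print(error_rate)  # → 4.0 (percent)
```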

Average Response Time

Average Response Time is very significant because it directly affects the user experience and perceived load time. This metric measures the “round trip” each request takes: the browser sends the request, the server processes it, and the server sends the response back to the browser. The Average Response Time takes every request/response cycle in a given one-minute interval and calculates the mathematical mean of all response times.

Peak Response Time

This metric measures the same “round trip” that the Average Response Time does, but instead of averaging the times for all requests, Peak Response Time is simply the longest (slowest) time recorded for any single request.
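The contrast between the two metrics can be shown in a few lines (the sample timings are invented): the average smooths over outliers, while the peak surfaces the single worst request.

```python
# Average vs. peak response time over a set of round-trip request
# timings, in milliseconds. Timings are illustrative, not test data.

response_times_ms = [120, 95, 310, 88, 2450, 140, 200]

avg_response = sum(response_times_ms) / len(response_times_ms)  # mean of all requests
peak_response = max(response_times_ms)                          # single slowest request

print(round(avg_response, 1), peak_response)
# One slow outlier (2450 ms) dominates the peak but only nudges the average.
```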

Average Page Completion

Average Page Completion Time is a metric that measures the amount of time from the start of the first request to the end of the final request on a page.

As for the specific times in this study, the tests show unusually fast Average Page Completion times. After investigating why the pages were loading so quickly, we found that some of the pages on the dummy website were very simple, with very few requests each. While users with real websites on these providers should expect slower average page completion times, the tests are still valid because all providers served the same simple pages.
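Given the definition above, the metric reduces to the span between the earliest request start and the latest request end on a page. A sketch with invented timestamps:

```python
# Page completion time: from the start of the first request to the end of
# the last request on the page. Timestamps (seconds) are illustrative.

requests = [
    # (start_time, end_time) for each request on the page
    (0.00, 0.42),   # HTML document
    (0.45, 0.61),   # stylesheet
    (0.46, 0.88),   # script
    (0.50, 0.75),   # image
]

page_completion = max(end for _, end in requests) - min(start for start, _ in requests)
print(page_completion)  # → 0.88 seconds for this simple page
```

A page with only a handful of requests, as on the dummy sites, finishes quickly regardless of provider, which is why the absolute times in this study ran fast.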

Throughput

Throughput is measured as the number of kilobytes transferred per second. This measurement shows how data is flowing back and forth between the browser and the server(s). High throughput under load is a mark of good web performance because it shows that no bottlenecks are blocking or slowing the data transfer. Low throughput, as seen with WebSynthesis, signifies that the server is overwhelmed and struggling to move data.
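The unit arithmetic is straightforward; a sketch with invented byte counts:

```python
# Throughput as kilobytes transferred per second over a measurement
# interval. The byte count is illustrative, not test data.

bytes_transferred = 45_000_000   # bytes moved during the interval
interval_seconds = 60            # one-minute measurement window

throughput_kbps = (bytes_transferred / 1024) / interval_seconds
print(round(throughput_kbps, 1))  # → 732.4 KB/s
```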

Interestingly, GoDaddy pushed through roughly triple the amount of data because its admin screen loaded more resources, which is why its average throughput is so high. GoDaddy did have significantly higher average response times than most of the other providers, but any time a site makes more requests, performance slows. Without all that extra data to process, GoDaddy may well have been as fast as, or faster than, the others.

Ranking

From the final point tallies, we can see that there are three clear sections. Top Performers: Pantheon, MediaTemple, GoDaddy, and Kinsta. Good Performers: Nexcess, LightningBase, A Small Orange, and Pagely. Fair Performers: FlyWheel and WebSynthesis.
