You have probably been wondering why I’ve posted so infrequently over the past year. We have been bombarded with emails and phone calls demanding more blogging that includes my extremely dry, obtuse humor. So, in the interest of global stability and national security, I must acquiesce to the will of the masses.

Right. That’s a joke. I’m only slightly more popular than Justin Bieber. If you are a serious tech geek, you’ll need to look up that Bieber reference.

Web performance is why you read our blog, and web performance is my life. I want our blog to contain the most interesting information about load testing, page speed, and application scalability. To deliver on that goal, we came up with the concept of using LoadStorm and other tools to gather real performance data on as many web applications as possible.

Thus, the Web Performance Lab was born.

Why Create a Web Performance Lab?

Amazon EC2 General Purpose Server Instance Types

The Web Performance Lab is designed to be a virtual sandbox where we install software and run experiments to find out how the software performs. Why? Because I sit around and wonder whether an AWS EC2 m3.2xlarge instance will run a web app four times faster than an m3.large. It should, since it has 8 virtual CPUs compared to 2, and 30 GB of memory compared to 7.5. That’s simple math, but linear scaling rarely works out in web performance. Having 4x the horsepower in hardware does NOT deliver 4x the speed or scalability.
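
If you want to see how skewed that simple math usually is, here is a minimal sketch of the arithmetic. The vCPU and memory figures are the published EC2 specs; the requests-per-second numbers are hypothetical placeholders (not lab results) included only to show how a measured ratio can fall well short of the theoretical 4x.

```python
# Back-of-the-envelope check: theoretical resource ratios vs. a hypothetical
# measured result. The throughput numbers below are made up for illustration;
# only the vCPU/memory specs come from the EC2 instance type documentation.

specs = {
    "m3.large":   {"vcpus": 2, "memory_gb": 7.5},
    "m3.2xlarge": {"vcpus": 8, "memory_gb": 30.0},
}

# Hypothetical requests-per-second observed in a load test (illustrative only).
measured_rps = {
    "m3.large": 120.0,
    "m3.2xlarge": 310.0,
}

cpu_ratio = specs["m3.2xlarge"]["vcpus"] / specs["m3.large"]["vcpus"]
mem_ratio = specs["m3.2xlarge"]["memory_gb"] / specs["m3.large"]["memory_gb"]
rps_ratio = measured_rps["m3.2xlarge"] / measured_rps["m3.large"]

print(f"Theoretical CPU ratio:    {cpu_ratio:.1f}x")  # 4.0x
print(f"Theoretical memory ratio: {mem_ratio:.1f}x")  # 4.0x
print(f"Hypothetical RPS ratio:   {rps_ratio:.1f}x")  # ~2.6x -- scaling is rarely linear
```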

WPL (I’ll abbreviate because we geeks love acronyms) gives us a playground to satisfy our curiosity. I want to know:

  • What is the most scalable ecommerce application?
  • How many concurrent users can WordPress handle on a small server? (See the rough probe sketched after this list.)
  • Which cloud platform gives me the biggest bang for the buck?
  • Does Linux or Windows provide a better stack for scalability?
  • Can I fairly compare open source software performance to commercial solutions?
  • Who will help me create statistically accurate load testing scenarios for experiments?
  • Are other performance engineers as curious as me about these comparison test results?
  • How can I involve other application experts to achieve more useful performance results?
  • What other tools (not LoadStorm) should I put in the WPL toolkit for testing?
  • Can we identify the top 10 best performance optimizations that apply to each web application?
  • Who are the top experts on each of the most important web applications?
  • Do they know anything about tuning their apps for scalability?
  • Will they want to participate?
  • Are they able to share their knowledge in an objective set of experiments?
  • What are the decision criteria in picking the best server monitoring tool?
  • Nginx vs. Apache – what are the real numbers to compare?
  • How many types of caching exist?
  • What type of caching gives the highest level of increase to scalability?
  • Can we empirically conclude anything from running a WebPageTest or PageSpeed or YSlow analysis?
  • Are the advertised “free” server monitoring tools actually free?
  • Will most web applications get memory bound before being CPU-constrained?
  • New Relic vs. AppDynamics – which is best for what type of architecture?
  • How do open source load testing tools such as JMeter compare to cloud solutions such as LoadStorm?
  • Drupal vs. WordPress vs. Joomla vs. Redaxscript vs. concrete5 vs. pimcore?
  • Magento vs. osCommerce vs. OpenCart vs. WooCommerce vs. Drupal Commerce vs. VirtueMart?
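
For the WordPress question above, here is the kind of crude probe I mean before a real load test ever runs. It is a minimal sketch, not how LoadStorm works: it assumes a throwaway test install at a hypothetical URL and simply ramps up thread counts from a single client to watch average response time climb.

```python
# Crude concurrency probe -- NOT a substitute for a real load test in
# LoadStorm or JMeter, just a quick way to see where response times start
# to climb. The URL is a placeholder for a disposable test install.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

TARGET = "http://wordpress-test.example.com/"  # hypothetical test server

def fetch(_):
    """Request the target once and return the elapsed seconds, or None on failure."""
    start = time.time()
    try:
        with urlopen(TARGET, timeout=30) as resp:
            resp.read()
        return time.time() - start
    except Exception:
        return None

for workers in (5, 10, 25, 50, 100):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(fetch, range(workers * 10)))
    ok = [r for r in results if r is not None]
    errors = len(results) - len(ok)
    avg = sum(ok) / len(ok) if ok else float("nan")
    print(f"{workers:>3} workers: avg {avg:.2f}s per request, {errors} errors")
```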

There are hundreds of similar questions that I ponder with Elijah Craig. He and I are very productive in the evenings, and he helps me plan experiments to solve the riddles of web performance that appear to me in visions after a long day of work in the load testing salt mines.

Ecommerce Application Performance is High Priority

U.S. Online Retail Sales 2012-2017

That last question in the list deserves the highest priority in the Web Performance Lab. Online retail is a great place to start because it has so much at stake, it offers so many good test scenarios to build into experiments, and ecommerce provides excellent examples of business processes that are important to measure.

With over $250 billion of online sales in the U.S. alone during 2013, and with over 15% annual growth, how could we ignore ecommerce? It’s the biggest market possible for our Web Performance Lab to address. The stakes are enormous. My hope is that other people will be as interested as I am.

Cyber Monday 2013 generated $1.7 billion in sales in a single day! Which ecommerce applications are generating the most money? I doubt we will ever know, nor will the WPL answer that question. However, some of you reading this blog will want to get your share of that $300 billion this year, and the $340 billion in 2015, so I’m certain you need to understand which online retail platform will perform best. You need to know, right?

Cyber Monday sales data

We will run experiments to gather objective performance measurements. How many combinations and permutations can we try? (A quick back-of-the-envelope sketch after the list shows how fast the matrix grows.)

  • Each app with out-of-the-box configuration (no tuning) on an EC2 large instance, c3.8xlarge, r3.8xlarge, hs1.8xlarge, m1.small
  • Each app (no tuning) running on a similar server in Google Compute Engine, Rackspace, Azure, Joyent, Bluelock, Savvis
  • Each app (no tuning) running on same hardware using Ubuntu, Fedora, CentOS, Debian, Red Hat, Gentoo, Windows
  • Each app tuned with web application caching such as memcached, Varnish, JCS
  • Each app tuned with a CDN such as Cloudfront, Akamai, Edgecast, Cloudflare, MaxCDN
  • Each app tuned with different database configurations
  • Each app tuned with different web server configurations
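
Here is that back-of-the-envelope sketch. The lists mirror the bullets above but are deliberately trimmed and not exhaustive; the point is simply how quickly the full matrix outgrows what anyone could test by hand.

```python
# Rough size of the experiment matrix sketched in the list above.
# These lists are trimmed samples of the bullets, not a complete inventory.
from itertools import product

apps      = ["Magento", "osCommerce", "OpenCart", "WooCommerce", "Drupal Commerce", "VirtueMart"]
instances = ["m1.small", "m3.large", "c3.8xlarge", "r3.8xlarge", "hs1.8xlarge"]
platforms = ["EC2", "Google Compute Engine", "Rackspace", "Azure", "Joyent"]
oses      = ["Ubuntu", "Fedora", "CentOS", "Debian", "Red Hat", "Windows"]
caching   = ["none", "memcached", "Varnish"]

combos = list(product(apps, instances, platforms, oses, caching))
print(len(combos), "combinations before we even touch CDN, database, or web server tuning")
# 6 * 5 * 5 * 6 * 3 = 2,700
```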

We have been running some of these experiments during the past few months. Esteban shared some of his results in blog posts earlier. My problem with his work is that some of the conclusions aren’t as solid as I would prefer. I spent some time reviewing his results with him in a conference room recently, and I poked some holes in his logic.

Now don’t get me wrong, I am an Esteban fan. He is a friend and a high-character guy. That said, we all learn from experiments. We try, we ponder, we learn. That’s how humans gain understanding. As a child you figure out the world by putting your hand on a hot stove. You register that learning experience in your brain, and you don’t do it again. You find out the best ways to accomplish objectives by failing. Just ask Edison. He figured out 1,000 ways NOT to build a functional lightbulb before he found the one that worked. So it is with the WPL. We are learning by trying.

Therefore, we are beginning a new series of experiments on ecommerce platforms. We will publish the results more quickly and with less filtering. We hope you find them useful and interesting. Please feel free to comment and make suggestions. Also, if you disagree with our statistical approach or calculations, please let us know. Recommendations for improving the scientific method we employ in these experiments are also welcome.
