Take a moment to ask a friend to visit your web site, then ask how long they believe it took to pull up the home page. If your friend is like the average person, they will overestimate by more than a third. According to Stoyan Stefanov, the architect of YSlow 2.0 and Smush.it, users have faulty perceptions of time: when recalling how long a page took to load, they remember it taking about 35% longer than it really did.

Recently, Tom Cagley of the Software Process and Measurement Cast interviewed me about load testing. Tom’s blogs and podcasts focus on interviews, essays, facts, and tips about process improvement and measurement in the Information Technology arena.

Tom is Vice President and Director of Process Improvement and Measurement at the David Consulting Group, and the author of the hit book Mastering Software Project Management: Best Practices, Tools and Techniques. The book has received all 5-star ratings on Amazon.com.

Our load testing interview not only covers web performance and stress testing, but also touches on several ancillary subjects of interest to Tom from his perspective in Agile software development. I have worked with Lean manufacturing processes, and Tom applies many best practices from that successful discipline to his approach to software development. One mantra of Lean is, “You can’t control what you can’t measure.” Tom places a keen emphasis on measuring any metric that can help a team deliver better software.

In fact, he has written several blog posts about the Seven Deadly Sins of Measurement Programs that could be quite beneficial to developers studying load testing, because measuring the right metrics is critical to all web system performance.

Tomorrow is a holiday here, and it puts me in a good mood. Thanksgiving forces me to take some quiet time to ponder how truly blessed I am. So today I’m in an appreciative frame of mind. It also makes me feel somewhat child-like because I remember the smell of huge Thanksgiving dinners with my grandfather’s family.

This atmosphere has put me in kid mode. That’s the only way I can explain why I found it so fun to get wrapped up in cool web statistics that I found at StatCounter.com. Am I the last geek on the planet to find this site? StatCounter, I’m grateful for you.

What first caught my eye was the percentage of usage for each web browser. Knowing how many people are using Firefox vs. IE vs. Chrome has value to me, since the browser affects website performance from the user’s perspective. It’s also good to know the split among IE 6, IE 7, and IE 8. When Firefox came out about six years ago, I started using it primarily because it performed better. Now, Chrome and Firefox seem to be implementing the best new functionality and speed improvements, so I fire them up almost interchangeably – which is problematic for my bookmarks.

However, I was surprised to see the Mashable article declaring Microsoft Internet Explorer Loses Browser War. It seems to me IE still has the most share, but the drop from 99% to 49% over the past few years is noteworthy. Perhaps there are millions of nerds celebrating from a feeling of schadenfreude as they watch MS lose ground somewhere in the software industry. Come to think of it, I hold a bit of thinly veiled residual grudge from the hundreds of “blue screen moments from hell”.

The team over at OakLeaf posted another summary of some load testing they are doing with LoadStorm. Apparently, their first round concluded that:

“These tests were sufficiently promising to warrant instrumenting the project with Windows Azure Diagnostics.”

I found an interesting blog post today through a tweet that HootSuite found referencing LoadStorm. It is cool to see someone blogging about using our load testing tool.

I’ll go out on a limb and share what I see in the tea leaves. My research department is slightly smaller and has less funding than Gartner, but I feel very confident sharing this with the world of web developers, testing professionals, and software architects:

A couple of months ago, Adron and I connected on Twitter. He fits the perfect profile of people I like in social media: a web developer, software architect, cloud computing advocate, public speaker, adrenaline junkie that is into heavy metal, transit & logistics, economics, and beautiful things. He also believes that load testing is often overlooked and performance can make or break a project (see below).

I’ve been aware of Gartner for many years because no matter what IT company I worked for, every executive swore allegiance to the Magic Quadrant. Seems like most execs believe in Gartner’s prediction of the future. It also seems to me their predictions are based on quite a bit of historical fact and good research. Well, a few weeks ago they released information on their hottest technologies for 2011.

I’m not surprised that Cloud Computing is at the top. The Amazon Elastic Cloud is what enables LoadStorm to provide the biggest bang for the buck when it comes to load testing tools. We are also developing some specific capabilities that allow load testing from virtual mobile devices. Those two are the biggest focus of Gartner, so I think we are setting the right product strategy.

I bring you a blog post I found today about performance testing and improvement, but first….

On this day in 1895, physicist Wilhelm Conrad Röntgen (1845-1923) becomes the first person to observe X-rays. Like most other tremendous breakthroughs, Röntgen’s discovery occurred accidentally in his Würzburg, Germany, lab, where he was testing whether cathode rays could pass through glass when he noticed a glow coming from a nearby chemically coated screen. He dubbed the rays that caused this glow X-rays because of their unknown nature.

X-rays are not useful for increasing the performance of your web sites, but it sure would be helpful if we had a way to easily see inside our web system architecture to spot the bottlenecks and inefficiencies. Wouldn’t it be cool to run a scan on a site and quickly see a performance problem like the big bump on this guy’s hand? Maybe that was Wilhelm’s wedding ring. Or maybe he should have stopped cracking his knuckles when his mom told him to “quit that nasty habit!”

This blog post, published today, provides some quick suggestions for improving web site performance. How To Enhance The Online Performance Of Your Website by Kabir Bedi presents some basic tips that are useful.

Here are a few good insights about software performance from Robert Read in his eBook entitled, How to be a Programmer: A Short, Comprehensive, and Personal Summary. Robert dedicated the book to his colleagues at Hire.com.

My favorite parts are excerpted below:

  • Bottlenecks in performance can be an example of counting cows by counting legs and dividing by four instead of counting heads.
  • The purpose of stress testing is to figure out where the wall is, and then figure out how to move the wall further out.
  • If the wall is too close to satisfy your needs, figure out which resource is the bottleneck (there is usually a dominant one). Is it memory, processor, I/O, network bandwidth, or data contention?
  • Performance is a part of usability.
  • Most software can be made (with relatively little effort) 10 to 100 times faster than it is at the time of its first release.
  • If a well-isolated algorithm that uses a slightly fancy algorithm can decrease hardware cost or increase performance by a factor of two across an entire system, then it would be criminal not to consider it.
  • A plan for stress testing should be developed early in the project, because it often helps to clarify exactly what is expected. Is two seconds for a web page request a miserable failure or a smashing success? Is 500 concurrent users enough?
  • I’ve made errors such as failing to provide a relational database system with a proper index on a column I look up a lot, which probably made it at least 20 times slower.
  • Other examples include doing unnecessary I/O in inner loops, leaving in debugging statements that are no longer needed, and unnecessary memory allocation.
  • Stress testing is fun.
  • Who has particular knowledge about a component also constantly changes and can have an order of magnitude effect on performance.
  • Finding the expensive I/O and the expensive 10% of the code is a good first step.
  • There is not much sense in optimizing a function that accounts for only 1% of the computation time.
  • Each change brings a test burden with it, so it is much better to have a few big changes.
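Read’s missing-index mistake is easy to reproduce. Here is a minimal sketch using Python’s built-in sqlite3 module (my own illustration, not an example from the book); the table size and timings are arbitrary, but the gap between a full table scan and an indexed lookup is exactly the kind of 20x-style slowdown he describes:

```python
import sqlite3
import time

# Build an in-memory table with 100,000 rows.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE users (id INTEGER, email TEXT)")
cur.executemany("INSERT INTO users VALUES (?, ?)",
                ((i, f"user{i}@example.com") for i in range(100_000)))

def lookup(n):
    # Time n point lookups by email.
    start = time.perf_counter()
    for i in range(n):
        cur.execute("SELECT id FROM users WHERE email = ?",
                    (f"user{i}@example.com",)).fetchone()
    return time.perf_counter() - start

before = lookup(200)  # no index: each query scans the whole table
cur.execute("CREATE INDEX idx_email ON users(email)")
after = lookup(200)   # with index: each query is a B-tree lookup
print(f"no index: {before:.3f}s  with index: {after:.3f}s")
```

Exact numbers vary by machine, but the indexed run should be dramatically faster.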

It’s no secret that Google likes speed. They have made several announcements about the importance of speed on the web and go so far as to describe themselves as “obsessed with web speed”. In April 2010, they announced that Google search was including a new signal in their search ranking algorithms: site speed.

“Historically, we haven’t had to use it in our search rankings, but a lot of people within Google think that the web should be fast,” says Matt Cutts, Google Software Engineer. “It should be a good experience, and so it’s sort of fair to say that if you’re a fast site, maybe you should get a little bit of a bonus. If you really have an awfully slow site, then maybe users don’t want that as much.”

To this end, Google has released a web performance tool commonly called “Page Speed“. There are actually several tools related to performance profiling of web pages. According to the Google overview:

“The Page Speed family consists of several products. Web developers can use the Page Speed extension for Firefox/Firebug to analyze performance issues while developing web pages. Apache web hosters can use mod_pagespeed, a module for the Apache™ HTTP Server that automatically optimizes web pages and their resources at serving time. “

Google Page Speed is an open source add-on for Firefox and Firebug. This add-on helps you deeply analyze your website or blog in order to improve its performance and crawlability. The tool not only evaluates the performance of web pages, it also provides suggestions for improvement. It performs several tests on a site’s web server configuration and front-end code, based on a set of best practices known to enhance web page performance. Webmasters who run Page Speed on their pages get a set of scores for each page, as well as helpful suggestions on how to improve its performance. It is also Google’s preferred environment for introducing new performance best practices.
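Several of Page Speed’s server-configuration checks boil down to inspecting HTTP response headers. As a rough sketch of that idea (my own simplification, not Page Speed’s actual rule set), here are two such checks run against a hard-coded sample header dict rather than a live request:

```python
def check_headers(headers):
    """Two header checks in the spirit of Page Speed's rules:
    is the response compressed, and is it cacheable?"""
    return {
        "gzip_enabled": "gzip" in headers.get("Content-Encoding", ""),
        "cacheable": "max-age" in headers.get("Cache-Control", ""),
    }

# Sample response headers standing in for a real fetch.
sample = {"Content-Encoding": "gzip",
          "Cache-Control": "public, max-age=3600"}
print(check_headers(sample))
```

In practice you would feed this the headers from a real response; the point is only that these best-practice checks are mechanical and cheap to automate.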

Roger Moore turns 83 today. As an actor, he is best remembered as James Bond in several movies during the 1970s where he always got his man (and his girl). James was an inspiration to all of us hot-blooded boys, and his exploits of saving the free world from evil villains was the epitome of performance.

On to today’s performance testing news. Apparently IE9 is faster with HTML 5 than Firefox 3.6. Who knew?! Is this the first time in history that a Microsoft browser is the fastest at anything? Read the article below to find out.

Are you getting ready for the holiday season? Can your website handle the huge influx of traffic your marketing department is about to send your way? Tis the season to be load testing. If you aren’t hammering your site and performance tuning it in the next month, it will probably be too late to prevent lost revenue.

Neil Ashizawa has written two articles in one entitled How to Test Your Web Apps to Avoid Website Downtime This Holiday Season with a second part that is called, How to remove testing headaches from holiday shopping season.

Today we had a customer push the limits for a high measurement of requests per second in our performance testing tool. Their test hit a peak of nearly 1,800 requests per second! Wow, that’s pretty good.

We are still making some improvements to our own system bottlenecks in AWS, but our team is obviously making some great progress. Yeah, I know. The irony is not lost on me. 😉

So I sit here in my office on a beautiful Saturday afternoon, and I’m trying to find more good blogs on load & performance testing. Lots of sources, but many of them aren’t posting very often. I’m trying to find a steady stream of good content. Dynatrace seems to really have the best and most active web performance blogging posts.

Then I came across MSDN blogs. Seemed like a great place to find posts about load testing. Ah…here it is, just what I was looking for: VSTS Load Test Team Blog.

I ran into a tweet today that said, “Is #loadtesting to know *where* it breaks, not *if*. That it does isn’t the issue; that I know where and roughly when is #load #testing”. The tweet caught my eye because of the hash tag for loadtesting.

The implication, as I interpreted the 140 character bit of wisdom, is that load testing answers the questions:

  1. How much load breaks a system?
  2. Where does the system break under load?
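A ramp test answers the first question by stepping load up until a response-time budget is busted. Here is a toy sketch of that search; the simulated server (a made-up capacity of 500 requests per second and an invented latency curve) stands in for a real system under test:

```python
def response_time(concurrent_users, capacity=500, base_ms=120):
    # Toy model: latency is flat at light load, then degrades
    # sharply as load approaches and passes capacity.
    load = concurrent_users / capacity
    if load < 0.8:
        return base_ms
    return base_ms * (1 + (load - 0.8) * 25)

def find_breaking_point(budget_ms=2000, step=50, limit=5000):
    # Step concurrency up until the response-time budget is exceeded.
    users = step
    while users <= limit:
        if response_time(users) > budget_ms:
            return users  # first load level that busts the budget
        users += step
    return None  # never broke within the tested range

print(find_breaking_point())  # → 750 for this toy model
```

With a real load testing tool the same ramp tells you *where* the wall is; correlating that point with server metrics (CPU, memory, I/O) tells you *what* broke.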

Novell’s site has a great article about load testing describing how The Washington Post Company implemented an identity access management system by Novell.

The article posts this as a primary challenge of their project: How to quickly, reliably and easily conduct system load testing for Novell Access Manager 3.x (Hereafter referred to as “NAM”). You might be surprised at their solution, but I think it is very, very cool.

Have you ever had a Krispy Kreme Burger? It’s definitely over the top. Too much of a good thing.

Identity Access Management (IAM) is a very complex aspect of IT. Finding someone that really understands it is a challenge. I have found one of those guys.

Corbin Links is an expert in the implementation of various forms of access management systems – commonly called Single Sign On by those of us that need to simplify. He is the author of a trilogy of books entitled “IAM Success Tips Volume 1-3”.

Corbin has gotten involved with load testing because his projects for large enterprises often require his team to ensure the performance of IAM. We at LoadStorm are fortunate to have Corbin as a user of our load testing tool, and he has been a delightful customer. Recently, he asked if I would come on his podcast to share some thoughts about performance testing relative to IAM.

I read a good article this morning that presents a case study of scaling a web site. 6 Ways to Kill Your Servers – Learning How to Scale the Hard Way by Steffen Konerow presents some excellent points about how to avoid system crashes in your web application. Surprisingly, there is no direct mention of load testing the re-launched site before going live. Load testing is implied throughout, yet never specifically addressed.

I have received questions from customers about load testing reports that show their server doing some unexpected things. For example, a customer sent server monitoring data that showed a pattern of large peaks of CPU utilization followed by a precipitous drop to a low level of usage. He wanted to know why LoadStorm wasn’t applying a consistent load to his system, as evidenced by the CPU spikes.
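One common cause of that sawtooth shape (my illustration here, not necessarily this customer’s diagnosis) is a periodic background job on the server itself, such as garbage collection, log rotation, or a cron task, superimposed on an otherwise perfectly steady load. A toy model of that effect, with invented numbers:

```python
def cpu_percent(second, steady=35, job_period=60, job_cost=55, job_len=5):
    # Constant load contributes a steady 35% CPU; every 60 seconds a
    # 5-second background job adds a 55% spike on top of it.
    spike = job_cost if second % job_period < job_len else 0
    return min(100, steady + spike)

trace = [cpu_percent(s) for s in range(120)]
print(max(trace), min(trace))  # peaks at 90, settles back to 35
```

The applied load never changes, yet the CPU graph shows exactly the spike-then-drop pattern the customer described, which is why it pays to correlate CPU charts with server-side schedules before blaming the load generator.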
