This is the third installment of an email interview with James Christie, a software testing professional from Scotland. In the first part, James talks about his views on usability testing. In the second part, he discusses leaders in usability testing, KPIs, test automation tools, cloud computing, and testing blogs.

James mainly discusses load testing in this last post. It’s my favorite segment. 😉

What would you say is the difference between load testing and performance testing?

I’d say performance testing is a general term that covers load testing too. Performance testing is a rather vague term covering response times and the application’s ability to cope with heavy loads. Load testing is a more specific term: I think of it as a technique that lets you carry out performance testing effectively. You either work your way up through a series of load levels, establishing how the application performs at each, or hit it with the maximum loads it will have to cope with (plus a bit more to allow a margin for error).
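To make the stepped approach concrete, here’s a minimal sketch in Python. The target URL, the load levels, and the helper functions are all invented for illustration; this is a toy, not a replacement for a proper load tool, but it shows the idea of working up through load levels with some headroom beyond the expected peak.

```python
# A minimal, hypothetical sketch of stepped load testing: ramp through
# increasing levels of concurrent users and record response times at each.
# The URL, load levels, and assumed peak are placeholders.
import time
import statistics
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

TARGET_URL = "http://example.com/"  # placeholder system under test

def timed_request(url: str) -> float:
    """Issue one request and return its response time in seconds."""
    start = time.perf_counter()
    with urlopen(url) as response:
        response.read()
    return time.perf_counter() - start

def run_load_level(users: int) -> float:
    """Fire `users` concurrent requests; return the median response time."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        times = list(pool.map(timed_request, [TARGET_URL] * users))
    return statistics.median(times)

# Work up through a series of load levels, finishing above the assumed
# peak of 100 users to leave a margin for error (here 25% headroom).
for level in [10, 25, 50, 100, 125]:
    median = run_load_level(level)
    print(f"{level:>4} users: median response {median:.2f}s")
```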

Do you see any intersection points between usability and load testing?

Definitely! Usability testing isn’t usually thought of alongside load or performance testing, though both are lumped together under the non-functional banner. I guess that amongst the IT profession load testing has had a rather more glamorous and exciting image. It appeals to the kid in all of us: “Hey, let’s see just how far we can go with this baby before we break something.” You get to tweak the system and see just what it can do, and what you can get away with. It’s all firm, tangible and respectably technical. It’s a satisfying and responsible thing to be doing.

Usability, on the other hand, is horribly wishy-washy. It’s all about what happens outside the machine: what that unpredictable, irrational user is doing, thinking and, worst of all, feeling! Usability is subjective and extremely difficult to measure in ways that are useful to system designers and testers.


The link between usability and load testing is easy to overlook, but once you see it, it’s obvious. In fact, when I started in IT the only form of usability testing I ever saw was testing for response times. That was also by far the most important part of performance testing, apart from some cursory checking of batch run times.

As a developer I was obsessed with maintaining slick response times. The mantra we worked to was “sub-second response time”; it was continually running through our heads while we were developing. It was only partly a question of delivering a pleasant experience to the users. It was more a matter of personal pride. It would have been humiliating to deliver an application that ran like a sick old dog. Fast was cool. However, the users certainly benefited: applications I designed were much easier to use than those I saw when I started out.
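That old mantra translates naturally into an automated check. Below is a hypothetical Python sketch; the test_sub_second_responses function, the sample size, and the URL are all invented, and the point is simply that a usability-flavoured performance requirement like “sub-second response” can be pinned down as a pass/fail test.

```python
# A hypothetical response-time check in the spirit of the "sub-second
# response time" mantra: fail if any sampled request exceeds one second.
import time
from urllib.request import urlopen

SUB_SECOND = 1.0   # the mantra, expressed as a threshold in seconds
SAMPLE_SIZE = 20   # arbitrary number of sampled requests

def test_sub_second_responses(url: str = "http://example.com/") -> None:
    """Sample the URL repeatedly and assert the slowest response is sub-second."""
    slowest = 0.0
    for _ in range(SAMPLE_SIZE):
        start = time.perf_counter()
        with urlopen(url) as response:
            response.read()
        slowest = max(slowest, time.perf_counter() - start)
    assert slowest < SUB_SECOND, f"slowest response {slowest:.2f}s breaks the mantra"

test_sub_second_responses()  # run the check directly
```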

The need for speed dictated my design. With VM/CMS, the operating system could find files incredibly quickly. However, the utilities that searched for and retrieved data from within files ran very slowly once files got too big, and you hit the wall at surprisingly low volumes. You didn’t need a stopwatch to spot the difference between 100 records and 1,000. So we’d build applications that defined the input and output files dynamically, with key values embedded in the file names. An application could have hundreds or even thousands of files that were identical in structure but held different parcels of data.

You could get blisteringly fast response times that way because, even though the application held a large amount of data, any individual transaction for a single user would be accessing files with fewer than 100 records for each screenful of data. We didn’t have a database for these applications; they were just flat sequential files. The approach greatly reduced the problem of file contention, but maintainability suffered. The complexity of these applications, especially the batch runs processing vast numbers of files, was scary for newcomers.
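A rough modern sketch of that keyed-file scheme might look like the Python below. The CLAIMS.&lt;key&gt;.DAT naming, the claims_data directory, and the helper functions are all invented for illustration; the original ran on VM/CMS flat files, not a modern filesystem, but the principle of partitioning data into small per-key files is the same.

```python
# A hypothetical sketch of the design described above: flat sequential
# files partitioned by a key value embedded in the file name, so any one
# transaction only ever reads a small file.
from pathlib import Path

DATA_DIR = Path("claims_data")  # placeholder directory

def file_for_key(key: str) -> Path:
    """Derive the file name for one parcel of data from its key value."""
    return DATA_DIR / f"CLAIMS.{key}.DAT"

def append_record(key: str, record: str) -> None:
    """Append a record to the small flat file for this key."""
    DATA_DIR.mkdir(exist_ok=True)
    with file_for_key(key).open("a") as f:
        f.write(record + "\n")

def read_records(key: str) -> list[str]:
    """Read one parcel of data. Even with thousands of files overall,
    each read touches only this key's few records, so it stays fast."""
    path = file_for_key(key)
    return path.read_text().splitlines() if path.exists() else []

# Usage: records for one branch live in their own file, CLAIMS.GLASGOW.DAT,
# entirely separate from every other branch's parcel of data.
append_record("GLASGOW", "policy=123;claim=fire")
print(read_records("GLASGOW"))
```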

Please share with us an interesting story about your experience with usability testing and load testing.

The large insurance company I trained with developed an application to handle claims. It was written in Mantis with DB2, a combination that, as far as we knew, no-one had tried before. Still, it worked fine in development. After 80 person-years of effort the application had to be ditched in operational acceptance testing, when it was discovered that response times for most offices around the UK would be nearer a minute than a second, making it impossible for people to do their jobs.

Note that the problem was found in operational acceptance testing, not user acceptance testing. Usability was so far down the list of priorities that response times weren’t tested realistically in user acceptance testing. Nor had there been any attempt to benchmark the technology mix we were going to use, or even to carry out realistic proof-of-concept testing. The team had forgotten the lessons the rest of us had had drummed into us: it was madness to commit yourself to a final design and build in all the complex functionality before establishing whether the solution could work at all. Just like usability testing, in other words. I didn’t work on that development!

It was a humiliation for the IT Division, and I think it was the final straw for the company. It lost confidence in its ability to manage IT effectively and outsourced us all to IBM. I had some great experience working for IBM, so I guess the disaster was quite lucky for me.

Please provide a link to your favorite Claro Testing blog post and tell us what you like about that one.

I don’t write blogs, though I do intend to start. The trouble is that I prefer either to write something quick that gets immediate feedback, like on the Software Testing Club, or to write something considered in more depth, like the articles I’ve written for Testing Experience magazine. If I wrote a blog and no-one commented on it, it probably wouldn’t seem worthwhile.

The article I wrote about the V Model for the last issue gave me the chance to have a rant and say some things I’d wanted to talk about for a while. It wasn’t just about the V Model; it also expressed my concern about the number of testers who aren’t interested in extending their knowledge and skills. The full text is on my website. It’s been great getting feedback from all around the world. I’ve submitted another article for the next issue, about the implications of off-shoring for usability.

And that concludes the interview.

Our thanks to James for his time invested, his candor, and his insight. It has been a pleasure to work with him on this interview, and we expect to work with him again in the future as opportunities allow. Hopefully, you have found some of this information useful and will reach out to him.

More information about James, his company, his services, and his expertise can be found at http://www.clarotesting.com/.

Twitter: james_christie
