Load Testing Interview: Performance Engineer Shares Tips


My thanks to Lawrence Nuanez for sharing his insights and testing expertise with us. In this email interview, Lawrence talks about his views on software testing, load testing, test automation, and off-shoring.

As a Senior Consultant for ProtoTest, Lawrence focuses mainly on load and performance testing. He has several years of experience helping both SME and Fortune 500 clients by designing custom test plans, and he considers both proprietary and open-source tools to ensure the best fit for each customer.


Lawrence has over 15 years of experience in software development. He has worked as a developer, tester, QA manager, and software consultant. He has worked with IT and business executives from many diverse industries including healthcare, financial services, environmental engineering, space imaging, and commodities exchange.

He is also a Certified Scrum Master and holds the Certified Tester Foundation Level certification.

How much involvement do you have with load or performance testing?

I do lots of load and performance testing for our clients. In fact, I have had more load and performance testing engagements over the last year than any other type.

What would you say is the difference between load testing and performance testing?

I am sure that every person you ask will come up with a different answer. My view is that performance testing is the artistic side of this type of testing. You are interested in how the application responds to various levels and types of load. You are tracking response times, throughput, time to load, etc.

Load testing is more about running the application at a high level of load and seeing how it responds. You look for performance degradation, dropped sessions, poor garbage collection, etc.

Stress testing is purposely trying to find the breaking point, then seeing how the application recovers and what effect the failure has on customers who were connected when it fell over. You are concerned with hung sessions, corrupted data, and whether the application becomes responsive again once the load drops sufficiently or whether human intervention is required.

Reliability testing is seeing how the application runs for extended periods of time under average or above-average load. You are looking for memory leaks, stale sessions, and slowly degrading performance.
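
To make those distinctions a little more concrete, here is a minimal sketch of the load-testing side (an illustration, not a tool or script from the interview): it drives a hypothetical URL with a fixed number of concurrent virtual users and reports the throughput and response-time figures described above. The target URL and load levels are placeholders; a real engagement would use a dedicated load tool, but the measurements are the same in spirit.

```python
# Minimal load-test sketch: fire concurrent requests at a URL and report
# the response-time and throughput numbers discussed above.
# The target URL and load level are placeholders, not from the interview.
import time
import statistics
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "http://localhost:8080/"   # hypothetical system under test
CONCURRENT_USERS = 25                   # virtual users firing requests
REQUESTS_PER_USER = 40

def one_request(_):
    """Issue a single request and return its response time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(TARGET_URL, timeout=30) as resp:
        resp.read()
    return time.perf_counter() - start

def run():
    started = time.perf_counter()
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        timings = list(pool.map(one_request,
                                range(CONCURRENT_USERS * REQUESTS_PER_USER)))
    elapsed = time.perf_counter() - started
    timings.sort()
    print(f"requests:   {len(timings)}")
    print(f"throughput: {len(timings) / elapsed:.1f} req/s")
    print(f"median:     {statistics.median(timings) * 1000:.0f} ms")
    print(f"95th pct:   {timings[int(len(timings) * 0.95)] * 1000:.0f} ms")

if __name__ == "__main__":
    run()
```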

How does cloud computing affect the future of automated testing?


This is still a little hazy to me. It has great potential to affect the future of automated testing, but applications have to be built that take full advantage of the cloud's computing power. The potential is enormous to have access to the kind of hardware that would previously have made most geeks wet their pants, but at times it seems like a Ferrari behind glass: if only you could get your hands on it.

If distributed applications can be developed that give testers full control over what they test while utilizing the power of 'the cloud', I think many people would be interested, especially if the costs can be kept reasonable.

What are the KPIs you track for load testing?

The key performance indicators I track for load testing vary according to the system I am testing. For web applications, CPU and memory utilization are extremely important, along with disk reads and writes and the size of the HTTP messages going back and forth. I also like to track expensive SQL transactions.
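
For a rough sense of what collecting those counters looks like in practice, here is a small sketch (again an illustration, not Lawrence's actual tooling) that samples CPU, memory, and disk I/O on the machine under test and appends them to a CSV for later correlation with response times. It assumes the third-party psutil library; the interval, duration, and file name are arbitrary choices.

```python
# Sketch of sampling server-side KPIs mentioned above (CPU, memory,
# disk reads/writes) during a load test. Uses the third-party psutil
# library; the sample interval and output format are arbitrary choices.
import csv
import time
import psutil

def sample_kpis(duration_s=300, interval_s=5, outfile="kpi_samples.csv"):
    """Append one row of resource counters roughly every interval_s seconds."""
    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "cpu_pct", "mem_pct",
                         "disk_read_bytes", "disk_write_bytes"])
        end = time.time() + duration_s
        while time.time() < end:
            disk = psutil.disk_io_counters()
            writer.writerow([
                int(time.time()),
                psutil.cpu_percent(interval=interval_s),  # blocks for the interval
                psutil.virtual_memory().percent,
                disk.read_bytes,
                disk.write_bytes,
            ])
            f.flush()

if __name__ == "__main__":
    sample_kpis(duration_s=60, interval_s=5)
```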

How do you use them?

All of these KPIs are indications of how the application is or is not utilizing resources. You begin to see whether applications are 'CPU hogs' or 'memory pigs'. Slow SQL transactions are usually the result of poorly planned stored procedures. The great thing is that you never know how applications will react under load until you actually stress them.
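
The "expensive SQL transactions" point lends itself to a similar sketch: a small wrapper that times individual queries and flags any that exceed a threshold. The version below uses sqlite3 purely so it runs standalone; the 100 ms threshold and the example table are made up, and in practice the same wrapper idea would sit around whatever database driver the application uses.

```python
# Illustrative only: a tiny wrapper that times individual SQL statements
# and flags the "expensive" ones. Uses sqlite3 so it runs standalone;
# the 100 ms threshold and the example query are arbitrary assumptions.
import sqlite3
import time

SLOW_QUERY_THRESHOLD_S = 0.100  # flag anything slower than 100 ms

def timed_query(conn, sql, params=()):
    """Run a query, return its rows, and print a warning if it was slow."""
    start = time.perf_counter()
    rows = conn.execute(sql, params).fetchall()
    elapsed = time.perf_counter() - start
    if elapsed > SLOW_QUERY_THRESHOLD_S:
        print(f"SLOW QUERY ({elapsed * 1000:.0f} ms): {sql}")
    return rows

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, total REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [(i, i * 1.5) for i in range(100_000)])
    # An unindexed full-table scan stands in for a poorly planned query.
    timed_query(conn, "SELECT * FROM orders WHERE total > ?", (10.0,))
```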

Please share with us an interesting story about your experience with performance testing or load testing.

I had an interesting experience last year with a client that has worldwide offices. It is a very large client with thousands of employees in countries all over the world, and they were rolling out a new system that was going to be used by people in those offices. We started to observe a very unusual phenomenon: users who came into the application (hosted in Chicago) from Pune, India or Christchurch, NZ had better performance than people coming in from Chicago. This was very odd, since those countries have some of the worst network latency of any industrialized country. It made no sense. It was some backwards, bizarro type of result.


After a whole bunch of research we figured out the problem was a very poorly written SQL query. It was so slow that simply delaying each request by 500 ms gave the database enough time to process requests so they did not queue up as badly as they did for users who had no latency and could hit the server much more quickly.

This was a result that at first glance made no sense. It was the exact opposite of what it should have been, yet the results were clear. Figuring out the problem was the fun part.

What are the most critical elements of software testing? Why?

To me, the most critical elements of software testing are understanding why you are testing in the first place and having the right people doing the testing.

Understanding why you are testing is huge. If you are testing just to find bugs, you are in the wrong business. Testing is not just finding bugs but raising the overall quality: being an advocate for the users and always asking the questions 'Why are we doing it this way? Can we do it better? What if we changed this?' It also means asking those questions early in the development process so users don't ask them of us later.

Having the right people is critical. At times developers view testers as either developers who couldn't hack it (if you can't develop, you test) or as a necessary evil that just cramps their style. But in reality testing is a very unique skill. I believe there is a reason that most universities don't teach much about software testing: they don't know how to teach the subject. Running the tests is easy. The real skill comes in developing the tests. A good tester will develop tests that reveal flaws and expose weaknesses. That is a skill that is tough to come by. If you find a good tester, you will hang on to them.

How have you seen testing evolve over the past few years? How do you see it evolving over the next few years?

Testing has made great strides over the last few years. The move to agile development has created a great shift in how testing is done. The mindset now is ‘just enough, just in time’. This has caused testers to evaluate their tests and really put great effort into creating short, efficient tests. This has also caused a greater push for automation.


This push will also drive the future of testing. I believe the days of the black-box tester are numbered. Testers will have to become more technical to survive. Agile development breaks down the role barrier between developer and tester; people on agile teams are expected to contribute in many ways, so testers will need to improve their skill sets to be key contributors.

Who are the top 5 testing experts that you know?

I really don't like the term 'experts'. Dupuey says, "An expert is someone who has stopped thinking…they already know." I don't know who Dupuey is; I just read the quote and it stuck with me. I don't ever want to stop thinking and believe I know everything.

However, there are some very smart people whose thoughts I do enjoy reading from time to time. Elisabeth Hendrickson has some wonderful insights into automation and agile testing. Bret Pettichord is a very bright guy who is on the cutting edge of open-source automation tools. Cem Kaner, of course, has been around forever and helped shape testing in the early days and even today. James Bach puts himself in the 'expert' category, but his arrogance turns me off even when he has something insightful to say.

I think there is a big need for more people to speak about load and performance testing. There don't seem to be many out there who claim to be knowledgeable about the field. Hopefully more will step into a leadership role in the future.

What is your specialty? Why?

Not sure I have a specific specialty. I would guess that I tend to play more in the automation and load and performance pools. I enjoy architecting automation and load and performance frameworks because they can be highly challenging. I have also learned over the years that I would rather spend my days on a river fly fishing than in a cube hunched over a laptop. So I need to have work that is challenging and interesting.

I have used many different automation and load and performance tools, from high-end proprietary tools to open-source alternatives. I have done work for an extremely diverse group of clients, so I have seen quite a bit over the years, which helps me tackle just about any situation.

What do you think is the most important aspect of web application testing?

Understanding how your users use your web application is huge. You may put tons of content out there, but why are users utilizing some of it and not the rest? Understanding user behavior and UI aesthetics is so important. How many sites have you gone to and said, 'Yuck – this site is horrible. It is so confusing'? The UI makes a big impression on people.

Also, understanding how web applications work on the back end is very important. IIS, WebSphere, Apache, etc. can all be configured in a myriad of ways, and each configuration may have a big impact on performance and usability. Implementing a 'cool' feature can have a huge negative performance impact if you don't understand its effect on the application. This happens all the time.

In what ways have you seen test automation tools evolve in the last few years?


There have been great strides made in test automation. For years your options were to mortgage your business and buy from Mercury, Rational, Segue, etc. at highway-robbery rates, or to build your own. The tools you could buy were clunky, poorly developed, used proprietary languages that were weak at best, and were so expensive that most shops could only afford a few licenses.

Open-source and lower-cost alternatives have greatly changed the landscape. I believe this is why nearly every original test vendor has now merged with a larger parent company: Mercury to HP, Segue to Borland, Rational to IBM, etc. I think they saw the writing on the wall and realized the demand for extremely expensive, poorly made tools was going to wane.


According to Opensourcetesting.org, there are 89 functional, 39 performance, 16 test management, and 41 defect tracking open-source applications available. Most are just as poorly developed as the big boys' tools, but some are quite good and are getting better all the time. There are so many lower-priced alternatives now that automation is much easier to implement.

There is also a change in thinking with regard to automation. Automation is no silver bullet. The vast majority of automation projects fail because they are implemented for the wrong reason at the wrong time. As time passes, automation as a part of the overall testing solution is becoming better understood.

Any thoughts on how the economic downturn will specifically affect the application testing business? (e.g. less testing, more offshoring, etc.)

I do think offshoring has reached its peak and will only decline. So many companies I have dealt with have had such negative experiences with offshore teams. The majority that I’ve seen are pulling everything back onshore.


The economic downturn will be very challenging for so many companies. It seems like they are all staring at each other waiting for the other to blink first and spend money. I would hope that companies see the value of holding onto every customer they have now and understand that having a high quality product is the best way to ensure that. You don’t want to give your customers a reason to look for a cheaper alternative or a reason to just dump your service altogether. I am hopeful that enough companies will see the value in application testing to keep good testers working.

What are your favorite testing blogs? Should we subscribe?

I enjoy Elisabeth Hendrickson's blog.

Our blog (ProtoTest) posts some interesting things.

I would recommend you subscribe to both.

What is your technical background?

I went to school for computer science. I had been interested in computers since I was young; when I was a child, schools were just starting to get Apple IIe computers, so I started to play with them. In school I learned all the prerequisite computer science languages: COBOL, Fortran, Pascal. I did COBOL programming for a small company in Albuquerque. Nothing high tech, just creating simple school forms that were used by school districts in NY. So my training is mostly as an OO developer.

When and why did you get into software testing?

I ended up in software testing kind of by accident. I was living in Brooklyn, NY, circa 1995 and was part of an R&D team developing new hardware and software for control systems. I was put on the team as a development resource, but they really didn't have enough work for two developers, so I was asked to do the testing. It was a foreign concept to me. I had to read books and figure out what in the world 'testing' and 'QA' meant, since I hadn't learned anything about that in school. But as I started to build little test harnesses to test the software and hardware, I learned that I really liked testing more than development.

I then ended up working at ILX Systems for Bernie Berger. He was, and still is, a great mentor. He taught me so much about what testing really means, and I owe him a great debt.

I’ve been hooked ever since.
