Web Performance Optimization, Part 6: IIS Performance Tuning

In our last article on performance tuning, we examined how to squeeze the most performance out of an Apache server. In this installment, we’ll take a look at how to apply some of the same principles to Microsoft’s Internet Information Services (IIS), which ships as part of Windows Server. While its share of the Web server market has declined in recent years relative to Apache, IIS remains the second most deployed Web server on the Internet. Its deep integration with Windows and its host of management utilities make it a great choice for anyone hosting Web content and applications […]

Web Performance Optimization, Part 5: Apache Server

So far in our series of Web Performance articles, we’ve addressed the three major types of caching that Web server application developers can employ: server file caching, application caching, and data caching. We’ve looked at additional performance enhancements that Web server administrators can activate, such as HTTP compression, file consolidation, and connection pipelining. In this latest installment of our series, we’re going a little deeper and focusing on Apache. The world’s most popular Web server, Apache currently powers over 63% of sites on the World Wide Web. While Apache runs decently out of the box, development teams and system […]
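As a minimal sketch of the HTTP compression idea mentioned above — gzip the response only when the client advertises support for it — here is an illustrative Python helper (the function name and return shape are hypothetical, not Apache’s actual implementation):

```python
import gzip

def maybe_compress(body: bytes, accept_encoding: str):
    # Compress the response body only if the client's Accept-Encoding
    # header says it can handle gzip; otherwise send it as-is.
    if "gzip" in accept_encoding:
        return gzip.compress(body), {"Content-Encoding": "gzip"}
    return body, {}
```

In a real server this decision is made per request by a module such as mod_deflate; the sketch just shows the negotiation logic.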

Web Performance Optimization, Part 4: Non-Caching Strategies

Welcome to the fourth part of our Web Performance Optimization series. We hope these suggestions and best practices help you make your site faster and able to handle higher traffic. We view performance engineering as an iterative process in which developers and testers run load tests, analyze the measurements, and tune the system incrementally. We’ve talked extensively on this blog about how server applications can use caching to improve Web server performance, often dramatically. In addition to caching, Web site developers and administrators can employ other […]

Web Performance Optimization, Part 3: Data Caching

In the previous installments of our Web performance series, we’ve examined how developers can employ Web server caching and application caching to speed up their Web applications. In this installment, we’ll see how caching entire data sets can increase platform efficiency. What is Data Caching? Data caching is a species of application caching that caches the result of one or more database queries in the application server’s memory. For our purposes, we use application caching to refer to caching any component in an application server’s memory, and data caching to refer to caching a collection of data for future […]

Web Performance Optimization, Part 2: Application Caching

In our first installment of this series, we examined how Web server caching is implemented. Caching stores critical pieces of information in memory or on a local hard drive for subsequent rapid retrieval. In this installment, we’ll look at how Web application environments can use application caching to reduce load and increase performance. What is Application Caching? Application caching stores calculated components in memory for future use, either by the same user or multiple users. Because of the complex architecture of modern Web applications, application caching can produce huge gains in Web site response times. A Web application is […]
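To illustrate, application caching can be as simple as memoizing a computed page component in memory so it is built once and then shared. This sketch uses Python’s `functools.lru_cache`; the component name and markup are hypothetical:

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def render_sidebar(user_tier):
    # Expensive assembly of a page component; computed once per tier,
    # then served from memory to every user in that tier.
    return f"<aside>Deals for {user_tier} members</aside>"
```

The first request per tier pays the rendering cost; subsequent requests are memory lookups.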

Web Performance Optimization, Part 1: Web Server Caching

As we’ve discussed previously, Web site optimization directly affects a company’s bottom line. A sudden traffic spike that swamps a Web site’s capacity can cost a company thousands or even tens of thousands of dollars per hour. Web servers and Web applications should be built and deployed from day one with performance at the forefront of everyone’s mind. Web site administrators and Web application developers have a host of tricks and techniques they can employ to deliver Web pages more quickly. In my experience, I’ve seen 1,000% performance improvements from simple Web application optimization techniques. Caching is the #1 tuning trick […]

Load and Stress Testing – The New Paradigm Part 2

Yesterday, we examined how the new availability of elastic cloud computing has changed load and stress testing for the better: it lowers costs, increases scalability, and makes tests easier to run efficiently. An article on load-testing.org about load and stress testing got me thinking about how our paradigms in this industry have changed and continue to evolve. Today, we look at the reasons why we would want to run tests more frequently and earlier in the software product lifecycle. To borrow a Chicago axiom about voting – test early and test often. It’s the new paradigm. Wide-spread Adoption […]

Load and Stress Testing – The New Paradigm Part 1

I recently read a good blog post about load and stress testing that made some excellent points, and I agree with it strongly enough that this post offers my own view of the subject. The load-testing.org article states that old thinking dictates buying your own dedicated servers for generating load and licensing legacy software to run test scripts. That approach is expensive, slow, cumbersome, inefficient, and wasteful. Not only are tools like LoadRunner expensive, they require staff with considerable training and experience. Another key point in the article concerns infrastructure costs: “The advent of Cloud Computing offers a new model – one […]

Realistic Load Testing Scenarios – Part 2

I’ll probably take some heat for this post. Most professional testers get deep into the arcane details of the science of load testing because that is their job. It’s important to them, they have studied it for years, and they are immersed in it. I understand. It helps to differentiate them from other testers who are not as knowledgeable; thus, it is a competitive advantage to incorporate as many functions of performance as possible. Consultants certainly need to show off the advanced skills gained from decades of load testing so that the customer can be assured of […]

Realistic Load Testing Scenarios – Part 1

Response times and other performance measurements are greatly affected by which pages are requested, which forms are submitted, and which buttons are clicked during a load test. Thus, a key skill for a good load tester is the ability to craft realistic test scenarios. How should you develop test cases? Hopefully this post will give you some useful suggestions you can put into practice. The primary purpose of load testing is usually to find the bottlenecks that degrade performance, and then mitigate or eliminate them. CEOs, CTOs, Vice Presidents of Marketing, and Product Managers want to make sure […]