2011 was an exceptional year for articles and information about web application performance. At LoadStorm, we read a lot of great articles on the subject because it interests us deeply. Since you are reading this, you must be a perf geek too – and we are glad you are here.
We went back to pick some of our favorite sources from last year and selected 10 really good ones. They may not be the most high-profile, because the high-profile pieces tend to be about (or for) the big corporations and are often paid for by sponsors (e.g. HP, IBM). Those sponsorships influence whom the authors interview, which technologies they recommend, and the authors’ perspective. We prefer digging around the smaller, more technical sites to find conclusions supported by data.
I hope you find the following resources helpful. They cover the importance of performance, benchmarking, and the mechanics of performance optimization for mobile computing.
The Importance of Web Performance
How important is good performance? The folks at Web Performance Today ran an informal analysis of how consumers on Twitter responded to under-performing sites. Answer: not well at all!
Companies whose sites lagged were often savaged on social media. The lesson is clear: poor performance can have an instant impact on a company’s reputation. WPT’s ad hoc analysis echoes a more formal study from 2010, in which Foviance and CA EMEA hooked users up to EEG skull caps and measured their stress levels in response to slow sites. Foviance found that stress and agitation increase dramatically when users are dealing with poorly performing sites, and 71% of such users end up blaming the Web site owner or Web host for their pain and suffering. Additionally, if consumers encounter problems online, 40% will go to a rival website and 37% will abandon the transaction entirely. Only 18% said they would report a problem to the company.
Benchmarking and Performance Testing
By far, our favorite benchmarking articles of the year came courtesy of Nicholas Bonvin, who tested how well each of the major Linux-based Web servers on the market serves static files. His tests included Apache, Nginx, G-WAN, Varnish, and Cherokee; in each case, Bonvin used the Web server’s default settings. The clear winner of that first round of testing was G-WAN, which served 2.25 times more requests per second than Cherokee and left Apache in the dust by fulfilling 9 to 13.5 times more requests per second. Wowza!
In a follow-up post, Bonvin tossed Lighttpd into the mix and fine-tuned each server’s settings for optimal performance. In this second test battery, G-WAN still came out on top in terms of responses per second served, and it also demonstrated low CPU utilization. However, Nginx demonstrated superb memory management; its memory utilization remained stable while serving multiple concurrent clients in the course of fulfilling one million requests.
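To make those “requests per second” figures concrete, here is a minimal, hypothetical Python sketch of that kind of measurement. It is not the harness Bonvin used – real benchmarks rely on dedicated load generators – and the URL, request count, and concurrency level are assumptions chosen purely for illustration.

```python
# Minimal sketch: measure requests per second for a single static file.
# NOT the harness used in the benchmarks above; the URL and settings are
# illustrative assumptions only.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost:8080/static/index.html"  # hypothetical static file
TOTAL_REQUESTS = 10_000
CONCURRENCY = 50

def fetch(_):
    with urllib.request.urlopen(URL) as resp:
        resp.read()  # drain the body so the request fully completes

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    list(pool.map(fetch, range(TOTAL_REQUESTS)))
elapsed = time.perf_counter() - start

print(f"{TOTAL_REQUESTS / elapsed:.0f} requests/second over {elapsed:.1f}s")
```

Even a toy loop like this hints at why benchmark design is hard: the client machine, the network, and the thread pool all become part of what you are measuring.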
On the Windows side of the equation, the folks at WebPerformance caused a stir among their readers when they ran a benchmark testing Apache, Nginx, Lighttpd, and G-WAN on a CentOS 6.0 quad-core workstation, as well as IIS 7 on a separate Windows Server 2008 SP2 partition. The result: IIS 7 came out ahead in CPU utilization, peak throughput, and average response time. The authors admitted that this stellar performance may be due to the fact that they disabled most of IIS’s extensions, as they were not relevant to serving a small number of static files. And the numbers on G-WAN are questionable, particularly when you compare their CPU utilization results to the results obtained by Bonvin. Still, the article was a good reminder that the Windows Server platform can scale when it counts.
The problem with benchmarks is that they are notoriously difficult to design. There are so many hardware and networking layers involved in Web performance, and a small change or inefficiency anywhere along the stack can skew one’s results. In a bid to help benchmarkers achieve more relevant numbers, Mark Nottingham published a comprehensive article on how to benchmark correctly. His tips include being consistent from test to test, using separate machines for the server and the load generator, gauging network capacity beforehand, and using timelines and histograms instead of just generating averages, among many other morsels of advice.
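Nottingham’s point about histograms and percentiles deserves a small illustration. The sketch below uses made-up response times (not numbers from any of the benchmarks above) to show how a handful of slow outliers can leave the average looking healthy while the 95th and 99th percentiles reveal the latency users actually feel.

```python
# Illustrative sketch: why percentiles beat a bare average.
# The response times below are invented, not taken from any benchmark above.
import statistics

# 90 fast responses, 5 middling, 5 very slow (all values in milliseconds)
response_times = [20] * 90 + [200] * 5 + [1500] * 5

def percentile(samples, pct):
    """Nearest-rank percentile: the value below which pct% of samples fall."""
    ordered = sorted(samples)
    rank = max(int(len(ordered) * pct / 100) - 1, 0)
    return ordered[rank]

print(f"mean: {statistics.mean(response_times):.0f} ms")  # ~103 ms, looks acceptable
print(f"p95:  {percentile(response_times, 95)} ms")       # 200 ms
print(f"p99:  {percentile(response_times, 99)} ms")       # 1500 ms, the tail users notice
```

The mean of roughly 100 ms hides the fact that one request in twenty takes a second and a half – exactly the kind of detail an average-only report throws away.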
The Mobile Experience Comes Front and Center
2011 was the Year of Mobile Computing – the year when consumers en masse began using iPhones, iPads, and a potpourri of Android-powered devices (such as the Samsung Galaxy and Kindle Fire) as their primary means of staying connected to the world, both at home and on the go. The continued adoption of mobile technology means that Web site developers need to pay particular attention to how their sites perform on the iOS, Android, and Windows Phone platforms.
Web Performance Today had several spotlight articles on mobile performance. In July, the site published a survey that revealed how consumers’ attitudes had changed regarding mobile performance since 2009. Back then, 58% of users expected mobile sites to perform efficiently. In 2011, that number climbed to 71%. Users are also much less tolerant of slow sites than they were in 2009, when only 20% of users said they would give up on a site that took longer than 5 seconds to load. That number now stands at an astounding 74%.
How can Web site operators improve mobile performance? WPT did some additional legwork and found that 97% of the performance cost of mobile Web sites resides in the front end. The numbers varied between devices and operating systems, with some performing better than others. (The outlier was the iPad running iOS 4.0, which incurred a “mere” 85% of its performance cost in client-side rendering and script execution.)
How does one optimize front end performance? Over at the Google Code Blog, Ramki Krishnan reported on a talk by Blaze.io CEO Guy Podjarny, who discussed improving front end performance through improvements in software, hardware, and the mobile network. (The full talk is available on YouTube.)
Of course, let’s not forget that there’s a new kid on the mobile block. Like its competitors, Windows Phone 7 supports both mobile browsing on the Web and a local app experience for developers who need to provide rich interactivity to their users. Microsoft engineer Allan Murphy made the rounds last year with a PowerPoint presentation – now available as a free download – that explores the basics of app performance on the Windows Phone platform, including garbage collection, compilation, and frame rate refresh.
Finally, at the San Francisco/Silicon Valley Web Performance Meetup, performance guru Steve Souders shared his tips and tricks for accelerating mobile performance.
Conclusion
There was more good information on performance than we could include in a single installment. In our next round-up, we’ll include a few more articles on the hot topics in performance in 2011.