Yottaa is a customer of LoadStorm’s load testing tool, and they are focused on helping companies speed up their websites. They test and monitor thousands of web apps, and they recently compiled stats regarding web performance that I found interesting, so I’m sharing them with you.
The data sample comprised 14,000 different sites. They measured several aspects of web performance using both front-end and back-end metrics. The infographic below is well-designed and hopefully will be of value to you.
Why are Apple & Google winning?
- They understand the impact of performance on users’ decision making.
- Google is passionate about fast response times and good performance of all of their systems.
- Apple is consumed with producing the best quality and highest performing devices.
Why is Microsoft losing?
- MS software performs poorly.
- They accept poor performance as “good enough”.
- MS’s culture is hanging onto outmoded user expectations regarding performance.
Today’s Tech Leaders Understand Performance
Google is winning because they place a very high priority on performance. They are impressive because they are obsessed with being fast.
Their search engine has always been a speed demon, and that appeals to all of us – not just geeks; even my mom can tell the difference. She recently made Chrome her default browser because it runs so much faster.
I’ve been impressed with Google’s engineers because they invest in speeding up everything. They even give away cool performance tools like PageSpeed in order to help the whole world make sites faster.
They not only provide tools for Chrome, but they also have PageSpeed Insights for Firefox. It sure seems to me that high performance is a pervasive part of the Google culture.
Have you heard of Google Now? It’s the Android competitor to Apple’s Siri, combining the latest advances in voice recognition with functionality that makes it genuinely useful. Well, here is another example of Google’s passion for performance. Google Now is fast. BusinessInsider interviewed Hugo Barra, Google’s Android product manager, who reported that his team spent months shaving seconds off the response time. The early reports from users indicate it’s very fast – “nearly instantaneous” according to BusinessInsider.
Would you argue that performance is not key to Google’s DNA? Isn’t it a key reason they are so successful? Performance = Google.
I’m convinced their engineers are some of the best and brightest on the planet, and they understand that speed sells. Performance = business success. Performance = technology leadership.
Apple has surpassed MS in company value and is the leader in technology. Apple has always taken a different approach and has shown how to be a driver in the global tech industry by tightly integrating its software and hardware. Performance is greatly enhanced by optimizing all the parts of a product (e.g. the iPhone) together.
Sales of smartphones and tablets are exceeding those of traditional computers, and the tightly constrained resources of mobile devices demand performance tuning that is enabled by hardware/software integration.
Not only the device itself, but also the websites and cloud applications behind it, must run fast with few glitches. That’s one of the key reasons why Apple’s iCloud has been such a big success with the masses. It works well. It is fast. It performs up to expectations for a world full of impatient users.
The aim of this post is to outline how to determine and prioritise the key performance requirements within a project. I’ve already covered how important it is to have good performance requirements. These are the items that drive and determine the quality of the performance testing – but how do we actually best identify, assess and manage performance requirements?
Managing Performance Requirements
Let’s take a step back first. I’ve often found that the person who best defines the performance requirements is usually the performance tester, rather than the business analysts or the stakeholders. Why? A number of reasons – the main ones being time and accuracy.
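One lightweight way to keep those requirements honest is to write them down as structured data that the tester, analysts and stakeholders can all review, and that a test harness can later check results against. Here is a minimal sketch of that idea in Python; the transaction names, load levels and thresholds are made up purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class PerformanceRequirement:
    """One testable performance requirement (illustrative fields only)."""
    transaction: str       # business transaction being measured
    concurrent_users: int  # load level at which the target applies
    percentile: int        # e.g. 95 = 95th percentile response time
    max_response_ms: int   # response-time target in milliseconds
    priority: str          # e.g. "must-have" or "nice-to-have"

# Hypothetical requirements a performance tester might draft and walk
# through with the business before any test scripts are written.
requirements = [
    PerformanceRequirement("Login", 500, 95, 2000, "must-have"),
    PerformanceRequirement("Search results", 500, 95, 3000, "must-have"),
    PerformanceRequirement("Report export", 50, 90, 10000, "nice-to-have"),
]

def meets(req: PerformanceRequirement, measured_ms: float) -> bool:
    """True if a measured percentile response time satisfies the requirement."""
    return measured_ms <= req.max_response_ms

for req in requirements:
    print(f"{req.transaction}: p{req.percentile} <= {req.max_response_ms} ms "
          f"at {req.concurrent_users} users ({req.priority})")
```

Whatever form the list takes, the point is that each requirement is specific enough to be either met or missed by a test run, which is what makes it worth prioritising.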
Here’s a typical conversation:
Over at his blog Spoot!, Nicolas Bonvin recently posted two summaries of the great work he’s done benchmarking how well various open-source and free Web servers dish up static content under high loads. Bonvin, a PhD student at the École polytechnique fédérale de Lausanne (EPFL) in Switzerland, specializes in high-volume distributed data systems, and brings considerable expertise and real-world experience to bear in designing his tests.
First Round of Performance Testing
In his first post, Bonvin laid out the evidence he’d accumulated by running benchmark tests against six Web servers: Apache MPM-worker, Apache MPM-event, Nginx, Varnish, G-WAN, and Cherokee, all running on a 64-bit Ubuntu build. (All Web servers used, save for G-WAN, were 64-bit.) This first set of benchmarks was run without any server optimization; each server was deployed with its default settings. Bonvin measured minimum, maximum, and average requests per second (RPS) for each server. All tests were performed locally, eliminating network latency from the equation.
On this initial test battery, G-WAN was the clear winner on every conceivable benchmark, with Cherokee placing second, Nginx and Varnish close to tied, and both strains of Apache coming in dead last. As Bonvin notes, it wasn’t even close. G-WAN, a small Web server built for high performance, completed 2.25 times as many requests per second as Cherokee (its closest competitor), and served a whopping 9 to 13.5 times as many requests per second as the two versions of Apache.
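If you want a feel for this kind of comparison on your own machine, the sketch below shows the general shape of a local requests-per-second measurement. To be clear, this is not Bonvin’s harness (he used dedicated benchmarking tools and also reported minimum and maximum RPS, not just an average), and the URL, thread count and duration here are hypothetical; a pure-Python client will also drive far less load than a tuned benchmarking tool.

```python
import threading
import time
import urllib.request

# Hypothetical target: a small static file served by the web server under test,
# fetched over localhost so network latency stays out of the measurement.
URL = "http://127.0.0.1:8080/index.html"
THREADS = 8        # concurrent client workers (hypothetical)
DURATION_S = 10    # how long to keep requesting

counts = [0] * THREADS

def worker(idx: int, stop_at: float) -> None:
    """Issue requests in a tight loop until the deadline, counting completions."""
    while time.time() < stop_at:
        with urllib.request.urlopen(URL) as resp:
            resp.read()
        counts[idx] += 1

def main() -> None:
    stop_at = time.time() + DURATION_S
    workers = [threading.Thread(target=worker, args=(i, stop_at)) for i in range(THREADS)]
    start = time.time()
    for t in workers:
        t.start()
    for t in workers:
        t.join()
    elapsed = time.time() - start
    total = sum(counts)
    print(f"{total} requests in {elapsed:.1f}s -> {total / elapsed:.0f} req/s on average")

if __name__ == "__main__":
    main()
```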
Your website is a little slow – so what? Well, it is probably costing you money. I have been researching published facts about web performance because we are always trying to understand our industry better. This post should help you realize that improving your web application performance can directly impact your bottom line by 10% or more. Don’t believe me? Read on…
I read an article today on the E-commerce Times site, “Web Performance Metrics That Matter.” It was the first result in a Google search on “web performance metrics”, and the title sounded like a perfect hit. My intent was to see what other people think are the best performance metrics.
Here is a paragraph that really surprised me:
There are some excellent stats gathered about performance testing at this resource site.
I especially like the one that states a 5-second response time is the cutoff point for the business doing well. If the app takes longer than 5 seconds to respond, the company’s employees start to get frustrated. It makes sense to me that productivity will go down. That not only leads to unhappy employees, it invariably leads to lost customers and lost profits.
An article caught my attention because I follow the author on Twitter. Two days ago, Jeffrey Way published 15+ Tips to Speed Up Your Website, and Optimize Your Code!
For most developers, optimization is not a concern when writing SQL queries until performance becomes a problem in the application. Although SQL query optimization is important, it can be seen as a tedious process that some programmers fail to follow through on. Some best practices can be observed during development that will help maximize performance for future application scalability.
Use SELECT with Specific Columns Instead of SELECT *
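To make that guideline concrete, here is a minimal sketch using Python’s built-in sqlite3 module; the table and column names are invented for illustration. The same idea applies to any SQL database: naming only the columns you need cuts down the data read and sent over the wire, and gives the optimizer a better chance of satisfying the query from a narrow index.

```python
import sqlite3

# Hypothetical customers table in an in-memory database, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customers (
        id      INTEGER PRIMARY KEY,
        name    TEXT,
        email   TEXT,
        address TEXT,
        notes   TEXT   -- wide free-text column most screens never display
    )
""")
conn.execute(
    "INSERT INTO customers (name, email, address, notes) "
    "VALUES ('Ada', 'ada@example.com', '1 Main St', 'a very long note...')"
)

# Wasteful: SELECT * drags back every column, including the wide ones.
all_columns = conn.execute("SELECT * FROM customers").fetchall()

# Better: list only the columns the calling code actually uses.
needed_columns = conn.execute("SELECT id, name, email FROM customers").fetchall()
print(needed_columns)
```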
SQL Server performance tuning is usually left until problems arise. Most of the time, developers and even database administrators only focus on tuning their MSSQL servers when performance starts to degrade and adversely affects users. Although slowdowns typically stem from configuration or development problems (such as poor stored procedure design), occasionally hardware can be the issue.
Even with the most bandwidth possibly available, an inefficient content delivery design can bring a Web 2.0 application to its knees. Nothing turns off potential users like a page that loads too slowly. This is a critical concern when designing any application, but it is a requirement that must be carefully balanced against the need to provide a content rich environment. The two goals are often opposed, but with careful design, both requirements can be successfully met.
This is quite a list of open source performance testing tools. While there are definitely some excellent products on this list, there are also many (perhaps most) which fall into the category of poor to mediocre. As is the case with any open source category, these were generally developed by programmers building something for their own use, with little documentation and a very specific application in mind. The most popular according to this page are:
- Apache JMeter
- OpenSTA
- WebLOAD
For large companies, load balancing is an important technique for increasing performance. Companies with the available resources will use web farms, which give your applications the ability to draw on the resources of multiple servers, more commonly known as resource pooling. Load balancing distributes user requests through a dispatch application that redirects them to the different web farm servers.
If your web server is starting to lose performance and resources are lacking, it may be time to implement a web farm. A web farm will pool resources and allow your web server to share the work with other high-powered machines. It can greatly increase performance and ease some of the end-user frustration of timeouts and slow responses.
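As a rough illustration of the dispatch idea (and nothing more than that – real load balancers also track server health, sessions and weights), the sketch below hands each incoming request to the next server in a hypothetical web farm in round-robin order.

```python
import itertools

# Hypothetical pool of web farm servers sitting behind the dispatcher.
SERVERS = ["web1.internal", "web2.internal", "web3.internal"]

# Round-robin dispatch: each request goes to the next server in the cycle,
# spreading the load evenly across the farm.
_pool = itertools.cycle(SERVERS)

def dispatch(request_path: str) -> str:
    """Return the URL of the farm server that should handle this request."""
    server = next(_pool)
    return f"http://{server}{request_path}"

# Ten simulated requests end up spread evenly across the three servers.
for i in range(10):
    print(dispatch(f"/orders/{i}"))
```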
Microsoft SQL Server 2000 and Microsoft SQL Server 2005 do not inherently support load balancing out of the box, as many information technology experts might assume. They have intrinsic tools that allow for scalability and load distribution; however, they do not have the defined load balancing functions that are commonly known among network administrators. Instead, Microsoft SQL Server 2005 comes with some tools to help with scalability and performance for your enterprise organization.
Microsoft Clustering Services (MSCS)
LoadRunner (now called HP LoadRunner, since HP bought Mercury Interactive) is one of the most popular software automation and testing programs on the market. It is a comprehensive testing program that gives webmasters and developers the ability to test both the functionality and the load-handling ability of their websites. There are many reasons why LoadRunner is so popular and why it is a good choice as a software testing tool for your website.