The post Super Bowl Site Performance appeared first on LoadStorm.
While we may love football, we all know the game is only half of the competition. At $4.5 million apiece, Super Bowl commercials get only a few seconds to vie for our attention, beat the competition, and make a valuable impression. After notorious site crashes in previous years drew harsh social media backlash, we wondered whether the companies willing to spend so much for 30 seconds of airtime had also invested in the site performance and scalability needed to handle the traffic those $4.5 million ads would drive. We tested each advertiser’s site with 2 virtual users, requesting the home page every 30 seconds over the duration of the game. Here’s what we found:
The post WebPageTest Public vs Private Instance appeared first on LoadStorm.
WebPageTest (WPT) is a free, open source tool used to evaluate web page performance. The great thing about WPT is that it can be used by people at all levels of expertise, and it has many options for advanced settings and customization. The public WPT instances cover many scenarios, but private instances can be useful, especially when testing sites that aren’t yet publicly accessible. We decided to install our own private instance of WPT on an Amazon EC2 micro instance. However, as the WebPerfLab, we couldn’t just leave it at that. Come on, you should know us by now! We had to know: how would this affect our performance test results?
To determine this, we ran similar tests using the private and public instances and compared the results. The target pages and the browsers used were the same, and the tests were run from similar locations: the public WPT agent was set to Dulles, VA, and our EC2 instance is in Amazon’s US East datacenter in Northern Virginia.
As you can see, the private instance loaded our target pages one second slower on average than the public instance. Our initial guess was that the difference in load time could be attributed to a difference in bandwidth, so we decided to test that theory.
We ran several speed tests from our micro EC2 using SpeedTest.net and compared the results. It’s clear from our results that even though our EC2 experienced large fluctuations in bandwidth, the average speed was still much faster than what we needed for the performance tests (5 Mb/s download and 1 Mb/s upload). A difference in bandwidth wasn’t the answer.
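The bandwidth check above can be sketched as a quick comparison of measured throughput against the test's requirements. The sample numbers here are illustrative, not our actual SpeedTest.net results; only the 5 Mb/s download threshold comes from the text:

```python
from statistics import mean

# Hypothetical SpeedTest.net download samples (Mb/s) from the micro EC2;
# illustrative numbers showing large fluctuation around a high average.
download_samples = [48.0, 12.5, 95.0, 30.2, 70.8]
required_download = 5.0  # Mb/s needed for the performance tests

avg = mean(download_samples)
print(f"average download: {avg:.1f} Mb/s")
# Even the worst sample clears the requirement, ruling bandwidth out.
print("bandwidth sufficient:", min(download_samples) > required_download)
```

Even with wide fluctuation, the slowest sample comfortably exceeds what the tests need, which is why bandwidth was ruled out as the cause.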
After doing some more digging, we found information that corroborated the differing results we were experiencing. An older article reports similar test results showing a difference between a micro EC2 and a small EC2 as the host server. Patrick Meenan, the head honcho of WebPageTest, was quoted in the article stating that a micro EC2 is simply not good for performance testing: with its limited CPU and memory, the results fluctuate too wildly.
This leaves us with an important lesson: taking a single measurement point as representative of a site’s performance is not enough. Even a platform with a consistent back end can yield significant variation between runs. Pairing LoadStorm test results with WebPageTest results gives much more meaningful insight and perspective on how a site is really performing.
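The point about single measurements can be made concrete with a small sketch. Given repeated load-time measurements (the numbers below are hypothetical, not real WPT results), the median and spread tell you far more than any one run:

```python
from statistics import mean, median, pstdev

# Hypothetical load times (seconds) from ten repeated runs of the same
# page; two outlier runs show how a single sample can mislead.
runs = [3.1, 2.8, 4.6, 2.9, 3.0, 5.2, 2.7, 3.2, 2.9, 3.1]

print(f"a single run could report anywhere from {min(runs)}s to {max(runs)}s")
print(f"mean   = {mean(runs):.2f}s")
print(f"median = {median(runs):.2f}s")   # less sensitive to outlier runs
print(f"stdev  = {pstdev(runs):.2f}s")   # quantifies run-to-run variation
```

A tester who happened to catch the 5.2 s run would reach a very different conclusion than one who caught the 2.7 s run; the median (3.05 s here) is the more honest summary.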
The post Web Performance News for the Week of February 3, 2014 appeared first on LoadStorm.
National Signing Day Takes Rivals Website Down

National Signing Day for high school athletes is like Christmas and birthdays all wrapped into one. Websites like 247Sports.com and Scout.com had their writers covering stories and posting updates every minute. On one of the biggest days for the recruiting industry, the top dog, Rivals, lagged behind. National Signing Day brought a flood of activity to its message boards, causing a bottleneck that took down the site. The Rivals.com team diagnosed the problem and decided the best way to stabilize the site was to temporarily shut down the message boards while making Premium content available to all paid subscribers.
While some users were still able to access content, the head of Rivals.com, Eric Winter, immediately addressed the problem and directed readers to the mobile website.
Rivals.com has an estimated 200,000 subscribers, twice as many as second- and third-place Scout.com and 247Sports.com. While Rivals was still repairing its website, 247Sports drew 1 million unique visitors and close to 15 million pageviews on National Signing Day. It’s safe to assume some of that traffic would otherwise have gone to Rivals. Any big event like this will bring down websites that are left unprepared. Eric Winter told AL.com before National Signing Day that the technology the site was running on was out of date. As a frequent visitor to Rivals.com, I’m optimistic that this event will prompt some changes in the future.
There’s a saying that goes, “On the internet, nobody knows you’re a dog.” How true is this? Can dogs really surf the internet? New data shows that humans account for only 38.5% of all web traffic. This can only mean your dog is, in fact, surfing the net. Right!? Sadly, no: the other 61.5% of web traffic actually comes from bots. In 2012, 49% of web traffic came from humans, while 51% came from bots.
You might think the majority of these bots are bad, but most of them are actually good bots. The good bots are out there indexing websites to help users find more accurate and relevant information. The bad bots, like spam bots, are actually declining: spam bots now account for only 1% of internet activity. Next time you’re on Google looking for an answer, think of the bots bringing back information that’s high quality and up to date.
Nonetheless, it’s still important to play it safe. Besides malicious spam bots, there are scrapers, hacking tools, and impersonators. Scrapers steal and duplicate content; they will also harvest email addresses for spam purposes. Hacking tools are used to hijack servers and steal credit cards. Finally, impersonators flood a site with requests, consuming bandwidth in hopes of causing downtime.
Just like humans, bots can act for good or ill. With humans accounting for a shrinking share of web traffic, the future will have some interesting consequences if the pattern continues.
The post Importance of Mobile Web Performance appeared first on LoadStorm.
Mobile web browsing and shopping is growing at a remarkable rate. Sites that aren’t optimized to perform well on mobile devices will miss out on a large part of the market! Check out some surprising statistics on why mobile web performance is so important:
The post Web Performance News for the Week of December 23, 2013 appeared first on LoadStorm.
The Importance of the Critical Rendering Path

In the field of web performance, the critical rendering path is a key concept. What is it? Basically, it’s the code and resources required to render the initial view of a web page — the chain of events that must occur before the page appears in the browser. The “critical” part narrows this down to only the components needed to render the initial view. To display even a simple web page, the browser goes through a series of steps:
Browser downloads the HTML file
Browser reads the HTML and sees one CSS file, one JavaScript file, and one image
Browser starts downloading the image
Browser decides it cannot display the webpage without first getting the CSS and JavaScript
Browser downloads the CSS file and reads it to make sure nothing else is being called
Browser decides it still cannot display the webpage until it has the JavaScript
Browser downloads the JavaScript file and reads it to make sure nothing else is being called
Browser now decides it can display the webpage
Although the path is simplified, many websites out there have numerous social buttons, several CSS files, several javascript files, many images, and maybe a few audio or videos. This can result in the render path being huge and complicated. Most websites have absolutely terrible render paths because they are calling so many things that the browser must load before the webpage can be displayed.
Two types of resources generally block rendering: CSS files and JavaScript files. No matter how many of these you have, the browser must download and parse every one of them before it can show anything at all to the user.
To better optimize a web page, focus on the “critical” aspect of what makes your site important. If you have over 1,000 pictures and 200 JavaScript files, concentrate on the resources required to render the initial view of your page.
Start by prioritizing the content on your web page, then defer non-essential JavaScript until after the page load. By optimizing the critical path, you can reduce the initial render to a single request: all the browser needs to render the page is the HTML file.
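The steps above can be sketched in markup. This is an illustrative page skeleton (file names are hypothetical) that inlines the above-the-fold styles and defers everything else, so the first paint requires only the HTML response:

```html
<!DOCTYPE html>
<html>
<head>
  <style>
    /* Inlined "above the fold" styles only — no blocking CSS request */
    body { margin: 0; font-family: sans-serif; }
    .hero { height: 100vh; background: #234; color: #fff; }
  </style>
  <!-- Full stylesheet requested without blocking the first render -->
  <link rel="preload" href="site.css" as="style"
        onload="this.rel='stylesheet'">
  <!-- defer: download in parallel, execute only after parsing finishes -->
  <script src="site.js" defer></script>
</head>
<body>
  <div class="hero">Above-the-fold content renders immediately.</div>
</body>
</html>
```

Compared with the step list above, the browser never stalls waiting for `site.css` or `site.js` before showing the first screen.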
For an in depth view, read more at http://bit.ly/1hJDsLY
Responsive Web Design is becoming an established technique, yet its adoption often falls short in practice. When responsive sites are designed, the approach is usually primarily visual, with little attention paid to performance. Fortunately, there are techniques for building a high-performance site that scales visually while delivering code and assets tuned for mobile.
14KB on Mobile
First impressions do count. While it’s easy to throw jQuery, a framework like Bootstrap or Foundation, and web fonts onto a page, you trade ease of use for performance. It’s increasingly a mobile world, and your main aim should be to engage users as quickly as possible.
At a recent Velocity conference, Ilya Grigorik highlighted what it takes to deliver a mobile page within one second: you get one HTTP request and only 14 KB of HTML to get content in front of users. One of the only ways to achieve this is by inlining the “above the fold” CSS and JS required to render the first screen of content.
Compressing CSS and JS
Tools like mod_pagespeed and CloudFlare can help with front-end optimization. Responsive sites generally combine the desktop and mobile CSS and JS into one set of files, which means serving code poorly suited to the width users are actually viewing. This can be optimized by using JavaScript to detect the width of the page and then requesting the styles and scripts appropriate to that width.
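The width-detection idea can be sketched as below. The breakpoint and file names are hypothetical; the decision logic is kept separate from the DOM code so it can run (and be tested) outside a browser:

```javascript
// Illustrative breakpoint — not from the article.
const MOBILE_BREAKPOINT = 640; // px

// Pure decision logic: pick the asset set for a given viewport width.
function assetsForWidth(width) {
  return width <= MOBILE_BREAKPOINT
    ? { css: "mobile.css", js: "mobile.js" }
    : { css: "desktop.css", js: "desktop.js" };
}

// In a browser, the choice would be applied by injecting the tags:
function loadAssets(doc, width) {
  const { css, js } = assetsForWidth(width);
  const link = doc.createElement("link");
  link.rel = "stylesheet";
  link.href = css;
  doc.head.appendChild(link);
  const script = doc.createElement("script");
  script.src = js;
  script.defer = true;
  doc.head.appendChild(script);
}

console.log(assetsForWidth(480).css); // → mobile.css on small screens
```

In a real page you would call `loadAssets(document, window.innerWidth)` early in the head, so only the appropriate bundle is fetched.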
Shrinking images
Images are usually the largest assets on a page. Remember to serve the right image for each width, and art-direct images for smaller screens by cropping in on the focal point. Another option is to eliminate an image altogether by using an icon font or SVG. Both use vector data, so they are small in file size and scale efficiently across all sizes, including high-resolution Retina screens.
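Both techniques can be illustrated with markup; the file names below are hypothetical. The `<picture>` element handles the art-directed crop, and an inline SVG replaces a raster icon entirely:

```html
<!-- The browser picks the smallest adequate source for the viewport. -->
<picture>
  <!-- Art-directed crop focused on the subject for narrow screens -->
  <source media="(max-width: 640px)"
          srcset="hero-crop-320.jpg 320w, hero-crop-640.jpg 640w">
  <!-- Full image for wider screens, with a 2x variant for retina -->
  <img src="hero-960.jpg"
       srcset="hero-960.jpg 960w, hero-1920.jpg 1920w"
       sizes="100vw" alt="Hero image">
</picture>

<!-- An inline SVG icon: a few bytes of vector data, sharp at any size -->
<svg width="24" height="24" viewBox="0 0 24 24" aria-hidden="true">
  <circle cx="12" cy="12" r="10" fill="currentColor"/>
</svg>
```
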
Case study: Guardian Site
“A great example using the above techniques is the new Guardian newspaper website, currently in alpha. On average its start render time is 3 seconds faster than the current Guardian site, and the mobile width inlines CSS and is 42% smaller overall, leading to a 1 second saving over the desktop width.”
Read more at http://bit.ly/19QlW6K
The HTTP Archive has published its end-of-year technology statistics, which collect information from 300,000 of the web’s most popular websites. The average page weight has increased by 32%. Some of the increase can be explained by growing ecommerce activity and advertising as people hunt for gifts. However, few websites lose weight in January; most continue to gorge themselves throughout the year.
The report analyzes publicly accessible content and shopping web sites. Here is the breakdown by technology:
| Technology | End 2012 | End 2013 | Increase |
|------------|----------|----------|----------|
| HTML       | 54 KB    | 57 KB    | +6%      |
| CSS        | 35 KB    | 46 KB    | +31%     |
| JavaScript | 211 KB   | 276 KB   | +31%     |
| Images     | 793 KB   | 1,030 KB | +30%     |
| Flash      | 92 KB    | 87 KB    | -5%      |
| Other      | 101 KB   | 205 KB   | +103%    |
| Total      | 1,286 KB | 1,701 KB | +32%     |
The Reasons
Bloated CMS Templates
Typical WordPress themes are crammed full of features. Many will be third-party styles and widgets the author has added to make the theme more useful or attractive to buyers. Many features will not be used but the files are still present.
HTML5 Boilerplates
A boilerplate may save time, but it’s important to understand that it is a generic template. The styles and scripts contain features you’ll never use, and the HTML can be verbose, with deeply nested elements and long-winded, descriptive class names. Few developers bother to remove the redundant code.
Carelessness
Developers are inherently lazy; we write software to make tasks easier. However, if you’re not concerned about the consequences of page weight, you should have your web license revoked.
Read more at http://bit.ly/1cBOLpT
The post WordPress Performance – Version Comparison appeared first on LoadStorm.
Recently we decided to upgrade our WordPress core to version 3.7.1, and we figured we would run some tests to see how the change affected us. I took our baseline of 11 performance tests (using webpagetest.org) and a single load test (using LoadStorm) to compare against a new set of data gathered after upgrading. It was fun to see how things would perform, but the results were not shocking: there was virtually no difference between the performance tests. Page load time was on average 1.22% slower, and we’re talking tiny fractions of a second, so the difference is indistinguishable to a human being. However, that percentage change could be noticeable on a slower site.
Our load test data was a little more interesting, although a bit less thorough since we’re only comparing one test to another. The results were still quite similar, but we saw a bit more variance on the HTML requests: the average response was nearly half a second slower for HTML pages, while the other measurements were reasonably close to one another. For non-HTML requests we saw some errors due to timeouts and an increase in requests per second, with average response times a tenth of a second slower.
In conclusion, the minuscule differences showed hardly any change from the older version to the upgrade. Without knowing the full details of the new version, the results suggest it contained few changes that would affect page performance. All in all, WordPress is a solid content management system that you can have running very fast with a caching plugin and a few changes. See our earlier post about the W3 Total Cache plugin and how it helped our site’s performance and scalability through page caching.
If you’re interested in seeing the full set of test results you can view them from this PDF.
The post Web Performance News for the Week of December 2, 2013 appeared first on LoadStorm.
Numerous articles are published on the internet every day, and not everyone has time to read every article of interest. To help keep readers up to date on the web performance world, here is a series of summaries.

Akamai will acquire all of the outstanding equity of Prolexic in exchange for a net cash payment of about $370 million. The transaction is expected to close in the first half of 2014. Although Akamai already has a service to protect online companies from DDoS (distributed denial-of-service) attacks, CEO Tom Leighton noted that Prolexic focuses on protecting all manner of enterprise computing systems, such as corporate data centers.
“By joining forces with Prolexic, we intend to combine Akamai’s leading security and performance platform with Prolexic’s highly-regarded DDoS mitigation solutions for data center and enterprise applications protection. We believe that Prolexic’s solutions and team will help us achieve our goal of making the Internet fast, reliable, and secure.” said Leighton.
The deal will continue to grow Akamai’s Internet security business and extend its reach into customers’ data centers and non-web applications, such as e-mail and tools for staff to share digital files.
The FCC has recently unveiled the FCC Speed Test App for Android smartphones, available for free through Google Play. The app is a stepping stone toward evaluating mobile broadband network performance, and it aims to equip consumers with the information to make fact-based, informed decisions when choosing and evaluating their mobile providers.
Aggregated data collected from the app will help inform the FCC’s future policy decisions. The program relies on consumer volunteers, the Federal Trade Commission, wireless service providers, and others to produce accurate information on mobile broadband services. Extending measurement to mobile services will provide valuable information on broadband networks across the nation to the public, industry, and policymakers.
Regarding concerns for the app, the FCC stated that consumer privacy is the top priority. The FCC Speed Test page stated, “We used open, public meetings and worked with a diverse team of privacy experts, including federal partners, academia, and industry stakeholders, to develop our privacy policy and procedures. Simply put, NO personal information or unique identifiers are collected, data is collected anonymously, and it may be further processed to ensure that consumer privacy is protected.”
Limelight Networks announced the results of a recent survey, conducted by TechValidate, on how businesses now see web performance as a top business priority. Businesses are becoming more aware of the financial impact of high performance, and online businesses are beginning to integrate performance testing into their development cycle. Below are some statistics from Limelight Networks.
Based on responses from over 230 Limelight Platform customers, the survey shows that 62% of companies view high performance delivery of content as one of their highest priorities for businesses today.
Among the top challenges companies face are unintended traffic spikes. According to the survey, about 46% of respondents experience unplanned traffic spikes more than twice a year, and half of those companies expect this to happen many times a year.
Companies are identifying and monitoring the key drivers of their business, and they perceive web performance as a necessary component for boosting profitability. 48% of customers surveyed stated that their primary reason for addressing the web performance challenge is to retain and grow the customer base, while 25% cited increasing visitor engagement with content as the primary driver.
The post Web Performance Affects Revenue appeared first on LoadStorm.
Yes, the holiday shopping season is still far in the future. Wait… Cyber Monday is only 8 weeks away. Yes, that gives you plenty of time before you need to start performance testing your e-commerce site. Or does it?

Each year at the beginning of October, we see a distinct increase in emails and phone calls asking for help with website performance. This year is no exception. So we thought we would share a few important statistics about web performance with our readers.
Consider these stats from Kissmetrics:
According to Rackspace:
By the way, Google has stated that site performance also affects search engine rankings. Faster sites move up. Slower sites move down.
Website visitors expect fast response times. Duh.
The post Comparing Page Views to Load Test Estimates appeared first on LoadStorm.
In some cases a customer may use Google Analytics to look for spikes in the number of page views per hour over a week or month, and then want to know how their load test results compare in terms of page views. There are a couple of ways to calculate the number of page views per hour from a set of LoadStorm load test results. (These examples assume the results came from load tests that ran for 60 minutes.) Also keep in mind that some pages put more strain on your server than others, due to database interactions, large file sizes, or numerous requests, which can affect the actual number of page views per hour your server can handle.
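One such calculation can be sketched as simple scaling: count the page requests completed during the test and extrapolate to an hourly rate. The totals below are hypothetical, and this ignores the per-page load differences noted above:

```python
def page_views_per_hour(total_page_requests, test_minutes):
    """Scale page views observed during a load test to an hourly rate."""
    return total_page_requests * (60 / test_minutes)

# A 60-minute test needs no scaling: the total is already the hourly rate.
print(page_views_per_hour(45_000, 60))   # hypothetical: 45,000 views/hour
# A shorter test is extrapolated; 20,000 views in 30 minutes doubles.
print(page_views_per_hour(20_000, 30))   # hypothetical: 40,000 views/hour
```

The extrapolation assumes load stays steady across the hour, so it is most trustworthy for the 60-minute tests this section describes.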
The post Cost of Fixing Software Defects appeared first on LoadStorm.
I love this graphic on Altom Consulting’s home page that shows the relationship between when a bug is found and the cost of resolving the problem.
Their tagline for the company that relates to this graphic: “We believe in testing as early as possible to minimize the impact and cost of fixing defects.”