Feed aggregator
Web Performance Optimization Use Cases – Part 3 Automation
Webinar about the Transformation of APM on October 28th (held in German)
Web Performance Optimization Use Cases – Part 2 Optimization
Presenting Top Web 2.0 Performance Problems at WebTechCon 2010 in Mainz, Germany
Clutch Time – A New Web Performance Metric?
Web Performance Optimization Use Cases – Part 1 Benchmarking
Integrated Cloud based Load Testing and Performance Management from Keynote and dynaTrace
End-to-End Monitoring and Load Testing with Keynote and dynaTrace
Week 38 – Transactions in a JPA World
Quick Start Package for dynaTrace AJAX Edition
Top 3 Performance Problems in Custom Microsoft CRM Applications
Top 10 Client-Side Performance Problems in Web 2.0
Starting anew
After 18 years at Sun (the last 6 at Oracle), I finally called it quits. Last week, I began anew at Yahoo! I will still be focused on performance, but this time on end user performance.
At Sun, we were extremely focused on server-side performance. Our primary metric was throughput. We worried about scalability. Does Solaris scale on the maximum number of CPUs/threads that our largest system had? What about the JVM? And the appserver, and the webserver … you get the idea.
In the web world, things are quite different. The primary metric is response time. One couldn't care less what the throughput on a particular server is – tens of thousands of servers are being deployed anyway. This mindset and architecture fascinate me. How do these large internet sites handle performance? So I decided to find out. What better way than to be part of one of the sites that sees the most traffic on the internet (see the ComScore report)?
I am part of the Exceptional Performance Team at Yahoo! This is the team that first brought YSlow to the community and is responsible for a whole host of tools to measure and analyze performance. I hope to contribute to this effort as well and of course, I will continue to blog about interesting performance issues that I encounter. Please do let me know if there are particular topics you would like to see on the Exceptional Performance blog.
Webinar: Application Performance Testing: From Conception to Gravestone
Gomez Webinar: Application Performance Testing: From Conception to Gravestone, with Scott Barber and Imad Mouline. Register Here
Parameterizing Your Tests – Part 2
BrowserMob supports the ability to pass in parameterized data in your load test scripts and some of the ways of handling this have been explained in an earlier post.
But many times we come across scenarios that require a unique login per transaction, such as a university site where a unique user ID is required to fill out a college admission form. A login used once cannot be reused in these situations. In this post, we'll explain one way of handling parameterization for these types of scenarios.
Consider the following basic RBU script that uses a fixed username and password of ‘joe’ and ‘password’:
var selenium = browserMob.openBrowser();
browserMob.beginTransaction();
browserMob.beginStep("Step 1");
selenium.open("http://example.com/login");
selenium.type("username", "joe");
selenium.type("password", "password");
selenium.clickAndWait("login");
browserMob.endStep();
browserMob.endTransaction();

Let's look at how we can parameterize the username and password fields, ensuring that the script uses a unique username/password combination every time it runs.
To start with, we need to estimate the number of logins needed per concurrent user, or in other words, the estimated number of transactions per concurrent user during the course of the test. This, of course, depends on many factors such as average page load time, average think time between steps, number of steps in the scenario, etc. Below is a sample illustration of how to calculate this number.
Number of steps in scenario = 10
Total number of intervals = 10
Length of each interval = 360 seconds
Average page load time = 3 seconds
Average think time between steps = 5 seconds
Total transaction length = Number of steps * (Average page load time + Average think time between steps) = 10 * (3 + 5) = 80 seconds
Transactions per interval per concurrent user = Length of each interval / Total transaction length = 360/80 = 4.5 (rounding up to 5)
Total number of transactions per concurrent user for the entire test = Transactions per interval per concurrent user * Total number of intervals = 5 * 10 = 50
Since this number is based purely on the above assumptions, and because load times could vary during the actual test, it is always better to bump this number up a bit to ensure you have enough unique IDs to cover all possible transactions.
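The arithmetic above can be sketched in a few lines of JavaScript; the input values are the same sample assumptions used in the illustration, and the variable names are ours, not part of the BrowserMob API:

```javascript
// Estimate how many unique logins each concurrent user will need,
// using the sample numbers from the illustration above.
var steps = 10;            // number of steps in the scenario
var intervals = 10;        // total number of intervals in the test
var intervalLength = 360;  // length of each interval, in seconds
var avgPageLoad = 3;       // average page load time, in seconds
var avgThinkTime = 5;      // average think time between steps, in seconds

// Total transaction length: 10 * (3 + 5) = 80 seconds
var txLength = steps * (avgPageLoad + avgThinkTime);

// Transactions per interval per concurrent user: 360 / 80 = 4.5, rounded up to 5
var txPerInterval = Math.ceil(intervalLength / txLength);

// Total transactions per concurrent user for the entire test: 5 * 10 = 50
var txPerUser = txPerInterval * intervals;

console.log(txPerUser); // 50
```

Rounding up with Math.ceil (rather than rounding to the nearest integer) already builds in a small buffer, but as noted above, it is safer to pad the final number further.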
Hence, assuming we are testing with 50 concurrent users and 100 transactions per concurrent user during the course of the test, here is how we can go about parameterizing the script:
var loginsPerUser = 100;
var userNum = browserMob.getUserNum(); // starts from index 0
var txCount = browserMob.getTxCount(); // starts from index 1
var id = (userNum * loginsPerUser) + txCount;
// Since 'userNum' starts from index 0, for the first user we just append 'txCount',
// whereas the 'id' from above is appended for the remaining users
if (userNum == 0) {
    var userId = 'test_' + txCount;
} else {
    var userId = 'test_' + id;
}

This way 'test_1' to 'test_100' will be utilized for user 1, 'test_101' to 'test_200' will be utilized for user 2, and so on. The biggest advantage of this method is that there is no need to reference an external CSV file for the parameterization, although you do need to ensure that these logins exist in your database ahead of the test.
Here is the parameterized script:
var selenium = browserMob.openBrowser();
browserMob.beginTransaction();
browserMob.beginStep("Step 1");
selenium.open("http://example.com/login");
selenium.type("username", userId);
selenium.type("password", "password"); // we are assuming that the password is the same for all users
selenium.clickAndWait("login");
browserMob.endStep();
browserMob.endTransaction();
Be prepared for Christmas Traffic
Christmas season is usually the busiest and most profitable period for online retailers and travel-related sites. According to comScore, in 2009, November internet traffic to e-commerce sites grew by up to 47% against the base period (Aug 31st to Nov 1st). With a fast-recovering economy, website owners can expect an even greater increase in internet traffic during this year-end holiday season.
This might sound like good news to e-retailers, but a sudden increase in traffic puts stress on servers and slows a website down. Moreover, poorly designed web pages that are not optimized for speed further prolong load times. Even a one-second increase in load time, from three to four seconds, can result in a dramatic decrease in customer conversion rate. Key findings from the "Consumer Response to Travel Site Performance" study conducted by PhoCusWright and Akamai showed that 57% of online shoppers will wait three seconds or less before abandoning a site, and that 65% of 18-24 year olds expect a site to load in two seconds or less. This means e-retailers may be losing many potential customers without even realizing it.
In the e-commerce market, where profit margins are thin and competition is tough, minor tweaks that make a website load faster can give retailers an edge over their competitors. Webmasters can choose from a variety of free online tools to check their website's performance and get useful tips. Google's Page Speed and Yahoo!'s YSlow are two of the more popular choices; they analyze web pages and suggest ways to improve performance based on a set of rules for high-performance web pages. Both tools assign the website a grade, which webmasters can use to benchmark their site against their competitors'. Good practices as well as suggestions are also given, and both tools are free to use.
The good news is that there is still some time left before the Christmas traffic hits, and webmasters can further optimize their websites to prevent a "winner takes all" scenario in which a few fast-loading e-retailers account for the bulk of online sales.
P.S.: We have a Christmas promotion going on here
Python - Shorten a URL Using Google's Shortening Service (goo.gl)
Using Python to shorten a URL with Google's shortening service (goo.gl):
#!/usr/bin/python
# Corey Goldberg - 2010

import json
import urllib
import urllib2

def shorten(url):
    gurl = 'http://goo.gl/api/url?url=%s' % urllib.quote(url)
    req = urllib2.Request(gurl, data='')
    req.add_header('User-Agent', 'toolbar')
    results = json.load(urllib2.urlopen(req))
    return results['short_url']

if __name__ == '__main__':
    print shorten('http://www.goldb.org/')
    print shorten('www.yahoo.com')

You give it a URL to shorten: shorten('http://www.goldb.org/long_url')
... and it returns a shortened URL for you: 'http://goo.gl/jh4W'