Methods & Tools Software Development Magazine

Software Development Magazine - Project Management, Programming, Software Testing


This article was originally published in the Summer 2003 issue of Methods & Tools

Modeling the Real World for Load Testing Web Sites - Page 2

Steven Splaine

4. Usage Patterns

Unlike typical mainframe or client-server applications, Web sites often experience large swings in usage depending on the type of visitors that come to the site. U.S. retail customers, for example, typically use a Web site in the evenings (7:00 p.m. EST to 11:00 p.m. PST). Business customers typically use a Web site during regular working hours (9:00 a.m. EST to 4:00 p.m. PST). The functionality of a Web site can also have a significant impact on usage patterns. U.S. stock quotes, for example, are typically requested during market trading hours (9:30 a.m. EST to 4:00 p.m. EST).

When attempting to model the real world, you should conduct some research to determine peak usage ramp-up and ramp-down, peak usage duration, and whether any other load profile parameters vary by time of day, day of the week, or another time increment. Once this research is done, schedule tests that will run over the real Internet at the appropriate times of the day and week.
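As a concrete starting point, a researched load profile like the one described above can be encoded as a simple schedule. The sketch below is a minimal Python illustration; the peak windows and user counts are hypothetical placeholders, and the real figures should come from your own usage research and server logs.

```python
# A minimal sketch of a time-of-day load profile. The windows and user
# counts below are assumptions for illustration only; replace them with
# the figures produced by your own peak-usage research.
def concurrent_users(hour):
    """Return the target number of virtual users for a given hour (0-23, EST)."""
    if 19 <= hour <= 23:        # hypothetical evening retail peak (7-11 p.m.)
        return 500
    if 9 <= hour <= 16:         # hypothetical business-hours peak
        return 300
    return 50                   # overnight baseline

# A test scheduler can then ramp the virtual-user count up and down
# hour by hour to follow the profile instead of applying a flat load.
```

A day-of-week dimension can be added the same way, e.g. by passing a weekday argument and returning a lower baseline for weekends.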

5. Client Platforms

Different client-side products (e.g., browsers and operating systems) will cause slightly different HTTP traffic to be sent to the Web server. More importantly, if the Web site has been designed to serve up different content based on the client-side software being used (a technique commonly referred to as browser sniffing), then the Web site will have to perform different operations with correspondingly different workloads.

Some browsers allow users to change certain client-side network settings (threads, version of HTTP, and buffer sizes) that affect the way the browser communicates, and thus the corresponding workload that a browser puts on a Web server. Although few users ever change these defaults, different browsers and versions ship with different default values, so a more accurate test load would vary them.
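One way to vary these values is to drive the test load from a weighted mix of client profiles. The Python sketch below uses hypothetical, era-appropriate user-agent strings and an assumed traffic split; both are illustrative and should be replaced with the browser mix observed in your own server logs.

```python
import random

# Hypothetical client mix: (traffic share, user agent, HTTP version,
# parallel connections). The shares and agents are assumptions for
# illustration; derive the real mix from your Web server's access logs.
CLIENT_PROFILES = [
    (0.60, "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)", "1.1", 2),
    (0.25, "Mozilla/5.0 (Windows; U; Windows NT 5.0) Netscape/7.0", "1.1", 4),
    (0.15, "Mozilla/4.0 (compatible; MSIE 5.5; Windows 98)", "1.0", 2),
]

def pick_profile(rng=random):
    """Pick a client profile at random, weighted by its traffic share."""
    r = rng.random()
    cumulative = 0.0
    for share, agent, http_version, connections in CLIENT_PROFILES:
        cumulative += share
        if r < cumulative:
            return agent, http_version, connections
    return CLIENT_PROFILES[-1][1:]   # guard against floating-point rounding
```

Each simulated virtual user would then call pick_profile() once and keep that identity (user agent, HTTP version, connection count) for its whole session, so the overall load reflects the researched mix.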

6. Client Preferences

Most browsers also allow users to change client-side preferences, but again, few users actually change their default settings. However, different products and versions of a browser may have different defaults. For example, a browser with cookies disabled will reduce the amount of network traffic, because cookies are not sent back and forth between the Web site and the browser; however, it might increase the resource requirements of the application server as it struggles to maintain a session with the user without the convenience of the cookie.
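The traffic difference is easy to see at the header level. The Python sketch below builds request headers for a hypothetical site and session cookie (the host name and cookie value are illustrative, not from the article) and measures the extra bytes a cookie adds to every request.

```python
# Hypothetical session cookie, for illustration only.
SESSION_COOKIE = "JSESSIONID=3F2504E04F8911D39A0C0305E82C3301"

def request_headers(cookies_enabled):
    """Build a minimal set of HTTP request headers for a simulated client."""
    headers = {
        "Host": "www.example.com",      # hypothetical site
        "User-Agent": "Mozilla/4.0",
    }
    if cookies_enabled:
        headers["Cookie"] = SESSION_COOKIE
    return headers

def header_bytes(headers):
    """Approximate the on-the-wire size of the headers ('Name: value\\r\\n')."""
    return sum(len(name) + 2 + len(value) + 2 for name, value in headers.items())
```

The byte difference per request is small, but multiplied across every request of every session it shifts load between the network and the application server, which is exactly the trade-off described above.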

If encryption is going to be used to send and receive secure Web pages, the strength (or key size) of the encryption used to transfer the data depends on a negotiation that takes place between the Web server and the browser. Stronger encryption consumes more network bandwidth and increases the processing requirements of the CPUs that perform the encrypting and deciphering (typically on the Web server). Therefore, users with low settings (e.g., 40-bit keys) will put less of a strain on a Web server than users with high settings (e.g., 128-bit keys).

By indicating that they do not want graphics or applets downloaded, Web site visitors will not only speed up the delivery of a Web page that contains these files, but will also consume a smaller portion of the Web site’s bandwidth and fewer Web server connections. If present in significant numbers, these clients can have a noticeable effect upon the performance of the Web site.
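The bandwidth effect of such visitors can be estimated with simple arithmetic. The Python sketch below uses hypothetical page sizes and an assumed share of graphics-off visitors; plug in your own page weights and visitor mix.

```python
def avg_bytes_per_page(html_bytes, image_bytes, no_graphics_share):
    """Average bytes served per page view when a fraction of visitors
    has disabled graphics (they download only the HTML)."""
    with_images = html_bytes + image_bytes
    return (1 - no_graphics_share) * with_images + no_graphics_share * html_bytes

# Hypothetical example: 30 KB of HTML, 70 KB of images, 10% of visitors
# with graphics off:
#   avg_bytes_per_page(30_000, 70_000, 0.10) -> 93_000.0 bytes per view
```

A load test that assumes every virtual user downloads every image will therefore overstate bandwidth use, and understate the number of page views the site can actually sustain, if graphics-off visitors are present in significant numbers.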

7. Client Internet Access Speeds

The transmission speed or bandwidth that your Web application will use can have a significant impact on the overall design, implementation, and testing of your Web site. In the early days of the Web (circa mid-1990s), 14.4 Kbps was the most common (i.e., standard) communications speed available. Hence, 14.4 Kbps became the lowest common denominator for Internet access. When 28.8 Kbps modems were introduced, however, they offered a significant performance improvement over 14.4 Kbps modems and quickly surpassed them in popularity. When 56.6 Kbps modems were introduced, the performance improvement wasn't as significant. Consequently, 28.8 Kbps modems are still in use and, unlike 14.4 Kbps modems (which have nearly vanished), still comprise a significant (although decreasing) proportion of the Internet population. Many companies therefore use the 28.8 Kbps transmission speed when specifying the performance requirements of their Web site.
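This is why back-of-the-envelope download-time arithmetic is useful when specifying performance requirements. The Python sketch below estimates how long a page takes to transfer at each modem speed mentioned above; it ignores protocol overhead and line quality, which reduce real-world throughput further, and the 50 KB page size is a hypothetical example.

```python
def download_seconds(page_bytes, link_kbps):
    """Ideal transfer time for a page over a link of the given speed.
    Kbps means kilobits per second, so convert bytes to bits first."""
    return (page_bytes * 8) / (link_kbps * 1000)

# A hypothetical 50 KB (50,000-byte) page works out to roughly:
#   27.8 s at 14.4 Kbps, 13.9 s at 28.8 Kbps, 7.1 s at 56.6 Kbps
```

Running virtual users at the bandwidth of the target population (e.g., throttled to 28.8 Kbps) rather than at the test lab's LAN speed keeps both the measured response times and the server's connection-holding behavior realistic.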
