Methods & Tools Software Development Magazine


This article was originally published in the Summer 2007 issue of Methods & Tools


An Agile Tool Selection Strategy for Web Testing Tools

Lisa Crispin, http://lisa.crispin.home.att.net/

Selecting a test automation tool has always been a daunting task. Let’s face it, just the thought of automating tests can be daunting! The selection of tools available today, especially open source tools, is positively dazzling. In the past several years, "test-infected" developers, not finding what they need in the vendor tool selections, have created their own tools. Fortunately for the rest of us, many are generous enough to share them as open source. Between open source tools and commercial tools, we have an amazing variety from which to choose.

To avoid that deer-in-the-headlights feeling, consider taking an ‘agile’ approach to selecting web testing tools. Plan an automation strategy before you consider the possible tool solutions. Start simple, and make changes based on your evolving situation. Here are some ideas based on experiences I’ve had with different agile (and not so agile!) development teams. Even if your team doesn’t use agile development practices, you’ll get some useful tips.

An Agile Test Automation Strategy

First of all, your team should consider your testing approach. When I say ‘team’, I’m thinking of everyone involved in developing and delivering the software, which in your case might be a virtual team. When do you write tests? Who writes them? How should the test results be delivered? Who needs to be able to look at the test results, and what should they be able to learn from them? What kind of tests need to be automated, and when? Do you have other tedious tasks, such as populating test data or looking through version control system output, that you’d love to automate?

Back in 2003, my current team had no test automation at all, and a buggy legacy web-based J2EE application. We desperately needed to automate our regression tests, since the manual regression tests took the whole team a couple of days to complete, and we were delivering new code to production every two weeks. We had decided to start rewriting the system, developing new features in a new architecture, while maintaining the old code, but this would be impossible without a safety net of tests.

We committed to test-driven development for a number of reasons, one being that automated unit tests have the highest return on investment of any automated test. We went a step further, and decided to also use ‘customer-facing’ tests and examples to help drive development. We’ve found that one example is worth pages of narrative requirements! We wanted to be able to write high-level, big-picture test cases before development starts, and then write detailed executable test cases concurrently with development, so that when coding is finished, all the tests are passing.
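For instance, a customer-facing example can be captured as a tiny executable check written before the production code exists. This sketch is purely illustrative; the discount rule and the `discounted_total` function are invented for this article, not taken from our application:

```python
# A "customer-facing" executable example: the concrete case below states the
# expected behavior in domain terms, and the code is written to satisfy it.
# (Hypothetical domain rule, used only to illustrate the test-first idea.)

def discounted_total(prices, discount_percent):
    """Return the order total after applying a percentage discount."""
    subtotal = sum(prices)
    return round(subtotal * (1 - discount_percent / 100.0), 2)

# One concrete example can replace pages of narrative requirements:
# "a $10 item plus a $20 item with a 10% discount costs $27.00"
assert discounted_total([10.00, 20.00], 10) == 27.00
```

In practice such examples would live in a test framework or a table-driven tool rather than bare assertions, but the principle is the same: the example is agreed on first, and coding is done when it passes.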

Meanwhile, we required some kind of ‘smoke test’ regression suite for the legacy application, to make sure that critical parts kept working. Due to the old code’s architecture, we decided these would have to be done through the GUI. We wanted all of our tests to run during our continuous build process, which was automated using CruiseControl, so we’d have quick feedback on any regression failures.
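As a hedged sketch of what such a smoke suite boils down to, the routine below fetches a few critical pages and fails fast if any is broken. The page paths, marker strings, and function names are hypothetical, and a real GUI suite would drive the application through a browser-based tool rather than a plain page fetch:

```python
# Minimal smoke-test sketch for a continuous build: check that a handful of
# critical pages still respond and contain an expected marker string.
# The fetch function is injected so the checking logic stays testable.

def smoke_test(fetch, pages):
    """Return a list of (url, problem) tuples; an empty list means success."""
    failures = []
    for url, expected_text in pages:
        try:
            body = fetch(url)
        except Exception as exc:
            failures.append((url, "fetch failed: %s" % exc))
            continue
        if expected_text not in body:
            failures.append((url, "missing expected text %r" % expected_text))
    return failures

# A build script would pass a real HTTP fetch; a fake shows the reporting shape.
pages = [("/login", "Sign In"), ("/search", "Results")]
fake_fetch = lambda url: "Sign In" if url == "/login" else "oops"
failures = smoke_test(fake_fetch, pages)
assert failures == [("/search", "missing expected text 'Results'")]
```

Hooked into a continuous build, a non-empty failure list would fail the build and appear in its notification, giving the quick feedback described above.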

Quick and easy-to-read notification of whether tests passed or failed was important to us. Ideally, our build would include these results in an email. In the event of a failure, we wanted to be able to quickly drill down to see the cause.

Platform is an obvious consideration. Our build runs on Linux, and our application was running on Linux, Solaris and Windows at the time. Any test tools that, for example, only ran on Windows did not have much appeal.

Based on all these needs, we started searching for tools. Our whole team takes responsibility for quality and testing, so we all needed to agree on our automation approach and tools. Having programmers, testers, database specialists and system administrators collaborate on test automation leverages a variety of skills to help get the best solutions. I highly recommend taking a ‘whole team’ approach to deciding on a test automation strategy, and to choosing and implementing tools.

An Agile Tool Selection Strategy

The whole team approach means asking ourselves, "What skills do we have on our team?" Do any team members have extensive experience with particular test tools or types of test tools? What programming and scripting language competencies exist on the team? How much technical expertise do the testers have? How about the business people who might be reviewing or even helping to write tests? What types of tests are you automating? Unit, integration, functional, security, or do you need to do performance or load testing? How robust do your test scripts need to be, and how much can you spend on maintenance? Are you planning to do data-driven or action keyword type tests, where the tests accept a variety of input parameters and have a lot of flexibility? Or are you looking for straightforward, low-maintenance tests? Can you test at a layer below the user interface, or do you have an architecture that makes that difficult? These are all considerations when shopping for a test tool.
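To make the data-driven style concrete, here is a minimal sketch: a single generic check fed by rows of input/expected pairs, so extending coverage means adding data rather than code. The username-validation rules and names are invented for illustration:

```python
# Data-driven test sketch: one checking routine, many rows of test data.
# (Hypothetical validation rule: 3-12 characters, letters and digits only.)

def validate_username(name):
    return 3 <= len(name) <= 12 and name.isalnum()

# Each row is (input, expected result); testers can add rows without coding.
cases = [
    ("abc", True),        # shortest legal name
    ("ab", False),        # too short
    ("a" * 13, False),    # too long
    ("user_1", False),    # underscore is not alphanumeric
    ("user1", True),
]

for value, expected in cases:
    assert validate_username(value) == expected, value
```

Keyword-driven tests take the same idea a step further, adding an action column so each row says what to do as well as what data to use; both styles trade higher up-front tooling cost for cheap, flexible coverage later.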

With a variety of test needs, consider that you may need a variety of tools. We tried to keep an open mind on what might solve a particular automation problem, and we were willing to experiment. We’d pick a tool to try for a few iterations and see how we liked it. Getting up to speed on tools to the point where you can effectively evaluate them takes time, so be sure to budget plenty of time in your planning.
