
This article was originally published in the Summer 2007 issue of Methods & Tools


An Agile Tool Selection Strategy for Web Testing Tools - Part 2

Lisa Crispin, http://lisa.crispin.home.att.net/

Our Tool Selection Process

If your team has specific needs that may not easily be met by a generic test tool, and you have the skill set to accomplish it, consider "growing your own". This way, you get a tool that is customized to your needs and integrates well with your application. Many teams have chosen this route, which is why there are so many excellent open source tools available today. If you ‘home brew’, you still need to consider your test tool requirements. For example, if non-programmers need to specify tests, you’ll have to develop an interface to allow that.

In our case, our financial web-based application didn’t seem all that unique, and we were a tiny team without a lot of bandwidth for tool development. Our management included funds for test tools in the budget, so we looked for an outside solution.

To illustrate how your tool selection process might work, come back in time with me to late 2003 / early 2004 when we started our search. If we started from scratch in 2007, there would be even more options to consider! A couple of great places to start your search for web testing tools are www.softwareqatest.com/qatweb1.html and www.testingfaqs.org, which list both commercial and open source tools, and www.opensourcetesting.org, which provides information about all kinds of open source software testing tools. Other good resources are the members of your local testing/QA user group, and testing-related mailing lists such as http://groups.yahoo.com/group/agile-testing.

We had to address our various needs with tools. Unit testing was a no-brainer. With a Java application, JUnit is the obvious choice. The team got started writing unit tests right away, as they would form the foundation of our regression tests. Next on the priority list was GUI testing.
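As a sketch of what such a unit test looks like, here is a minimal example in the JUnit 3 style that was current at the time. The InterestCalculator class and its behavior are invented for illustration (they are not from the team's actual code), and the test assumes junit.jar is on the classpath:

```java
import junit.framework.TestCase;

// Hypothetical class under test, invented for this example.
class InterestCalculator {
    public double simpleInterest(double principal, double rate, int years) {
        return principal * rate * years;
    }
}

// A minimal JUnit 3-style test: the runner executes each public
// testXxx() method, and a failed assertion marks the test red.
public class InterestCalculatorTest extends TestCase {
    public void testSimpleInterestForOneYear() {
        InterestCalculator calc = new InterestCalculator();
        // 1000.00 at 5% for one year should yield 50.00 in interest
        assertEquals(50.00, calc.simpleInterest(1000.00, 0.05, 1), 0.001);
    }
}
```

Tests like this run in seconds, which is what makes them suitable as the foundation of a regression suite run on every build.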

Vendor Tools

We desperately needed to get some automated smoke tests going, and thought maybe we could buy a tool that we could run with. As a tester, I have extensive experience with the capture/playback/scripting type of automation tools. Vendor tools are often a safe choice. They’re generally designed for non-technical users, who can get started fairly easily. They come with user manuals and installation instructions. Training and technical support are available for a fee.

Commercial tools are often integrated with a suite of complementary tools, which can be an advantage. Many vendors offer functional, performance and load test tools. They offer impressively robust features. However, they tend to be targeted towards test organizations, and aren’t very ‘programmer-friendly’. They can be difficult to integrate into a continuous build process. They often have proprietary scripting languages, or are limited to one scripting language such as JavaScript.

I had previously used Mercury test tools, so QuickTest Pro was one option to consider among many commercial tools. Other vendor tools we could have considered, or could consider if we were looking today, are TestPartner, Rational Functional Tester, SilkTest, BadBoy, TestMaker and (one I have always wanted to try) LISA. We also tried out Seapine’s QA Wizard, since we used their defect tracking tool TestTrack. At that time, both of those capture/playback tools used proprietary scripting languages. We didn’t want to be limited to only capture/replay, as those scripts can be more work to maintain. The programmers on my team didn’t want to have to learn a new tool or new scripting language. That ruled out all the vendor tools we looked at.

Open Source Tools

We turned to open source tools. Since these were generally written by programmers to satisfy their own requirements, they’re likely to be programmer-friendly. But there are issues with open source tools as well. Support, for example. If you have a question about an open source tool, whom do you ask? Most of the open source tools we considered had mailing lists where users and developers shared information and helped each other. I checked each tool’s mailing list to see if there was lots of activity on a daily or weekly basis, hoping to see a large and active user base. I also tried to find out if users’ issues were addressed by the tool’s developer community, and whether new versions were released frequently with enhancements and fixes. Of course, with open source tools you’re free to add your own fixes and enhancements, but we knew we wouldn’t have the bandwidth to do this at first.

Open source tools have a wide range of learning curves. Some assume programming proficiency. This is fine if everyone using the tool is able to achieve that level of competence. Others are geared to less technical users, such as testers and analysts. Some have user documentation on a par with (or even better than) that of good vendor tools, and others leave the learning more up to the user. Some even have bug tracking systems, and the developers actually fix bugs!

Think about the level of support and documentation you will need, and find a tool that provides it. We were looking for a tool that came with a lot of help.

Our GUI Tool Search

One example of a tool we researched, but didn’t try out, was JWebUnit. Since this tool lets you create scripts with Java, it appealed to the programmers on the team. At the time (2003), it didn’t seem to have as many users or as much mailing list activity as other open source test tools. We considered other Java-based tools, such as HtmlUnit, which is widely used. I had used a similar tool, HTTPUnit, before, and had liked it well enough. However, all these Java-based tools were a problem for my severely limited Java coding skills. While we anticipated that the programmers would do a large percentage of automating the customer-facing tests, I needed to write most of the GUI test scripts. I wanted to get a smoke test suite to cover the critical functionality of the legacy system, while the programmers got traction on the unit test side. We needed a programmer-friendly tool, but also a Lisa-friendly tool.

We considered scripting tools such as WATIR and scripting languages such as Ruby. I’d used TCL to write test scripts on a prior team, and I like the flexibility of scripting languages. We didn’t have any Ruby experts on the team, and although I was eager to learn it, the time it would take was an obstacle.

We looked for something that required less OO programming proficiency. I’d heard good things about Selenium, but at the time it had some limiting factors, such as being difficult to integrate into our build process. Another tool that would have been a strong contender, but either it wasn’t available yet or I just didn’t know about it, is Jameleon.

After much research, we decided to try Canoo WebTest. This tool, based on HtmlUnit, uses XML to specify tests, and the scripts run via Ant. Since we use CruiseControl for our builds, it was simple to integrate the WebTest scripts with our continuous build. Being used to the features found in commercial tools, WebTest at first seemed a bit simplistic to me. At the time, it didn’t support things like if logic, except by including scripts written in Groovy or other scripting languages. It didn’t look easy to do data-driven tests with WebTest. However, we liked the idea that the tests would be so simple and straightforward, we wouldn’t have to test our test scripts. WebTest seemed a good choice for creating a smoke test suite.
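For flavor, here is roughly what a WebTest script of that era looked like: an Ant target wrapping a webtest element, with each step expressed as an XML tag. The host, page names and field names below are invented for illustration, and exact step and attribute names varied between WebTest releases, so treat this as an approximate sketch rather than a copy-and-paste recipe:

```xml
<target name="smoke-test">
  <webtest name="loginSmokeTest">
    <!-- Where the application under test lives (hypothetical host) -->
    <config host="myapp.example.com" port="80" protocol="http"
            basepath="app" />
    <steps>
      <invoke stepid="open the login page" url="login.jsp" />
      <verifytitle stepid="check we arrived" text="Login" />
      <setinputfield stepid="enter user name" name="username" value="testuser" />
      <setinputfield stepid="enter password" name="password" value="secret" />
      <clickbutton stepid="submit the form" label="Log In" />
      <verifytext stepid="confirm login worked" text="Welcome" />
    </steps>
  </webtest>
</target>
```

Because each step is a plain XML element with a readable stepid, a failed run points directly at the step that broke, and running the suite is just another Ant target for CruiseControl to invoke.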

The programmers were comfortable with WebTest, since they were all familiar with XML. If a test failed, they’d be able to understand the script well enough to debug the problem. They could easily update or write new tests. We decided to try it for a few iterations. Since the programmers were busy with learning how to automate unit tests, it was helpful to have a GUI test tool that was easy for me to learn. I implemented it with some help from our system administrator. We soon had two build processes, one running all the unit tests, and the other running the slower WebTest scripts. It took about eight months to complete enough scripts to cover the major functionality of the application. These scripts have caught many regression bugs, and continue to catch them today. The return on our investment has been awesome.

Extending Our Coverage

Automated GUI test scripts are by far the most fragile and expensive to maintain, although WebTest’s features minimize the need for changes. These scripts were fine for smoke tests, to make sure nothing major or obvious in the application was broken, but we didn’t want to do detailed functional testing this way. Also, we still needed a tool to support our plan to drive development with customer-facing, executable tests and examples.

Once we had traction both at the unit test and GUI test level and could take a breath, we looked for a tool that filled the big gap in the middle. We looked at FIT and FitNesse (which is essentially FIT using a wiki for the IDE), since they allow a non-programming user to write test cases in a tabular format and programmers to easily write fixtures to automate them. These tools basically replace the UI. They allow you to send test inputs to the code, operate on them, return actual results and compare them automatically with expected results. The results turn red, green or yellow, and we love color coding. Both tools have a large user base and active mailing lists. We liked FitNesse’s wiki component, so we decided to try it.
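As an illustrative sketch (the fixture name, columns and values here are invented, not taken from our actual test suite), a FitNesse test table looks like this: the first row names the fixture class, input columns map to public fields on it, and a column header ending in ? names a method whose return value is compared with the expected value in each row:

```
|SimpleInterestFixture|
|principal|rate|years|interest?|
|1000.00|0.05|1|50.00|
|2000.00|0.04|2|160.00|
```

A programmer then automates the table with a small class extending FIT's ColumnFixture, along these lines (assuming fit.jar on the classpath):

```java
// Hypothetical FIT fixture, invented for illustration.
public class SimpleInterestFixture extends fit.ColumnFixture {
    public double principal;   // bound to the "principal" column
    public double rate;        // bound to the "rate" column
    public int years;          // bound to the "years" column

    // Called for the "interest?" column; the return value is compared
    // with the table's expected value, turning the cell green or red.
    public double interest() {
        return principal * rate * years;
    }
}
```

The tester owns the table of examples; the programmer owns the few lines of glue code. That division of labor is exactly what produced the collaboration benefits described below.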

FitNesse turned out to suit our needs for documenting not only the features we developed, but other information about maintaining the application. We found that it was easy to learn how to define test cases and write the fixtures to automate them. Integrating the test suites into our build process took longer, but was doable. It was easy to write both high level tests to give the big picture, and executable tests to capture the detailed requirements.

Sometimes you get unexpected benefits from tools. We understood that creating FitNesse tests would require work from both a tester, to specify test cases, and a programmer, to automate them. We found that this enforced collaboration enhanced communication within the team. The resulting communication flushed out misunderstandings and wrong assumptions early in the development process. Test tools aren’t just for automation!
