Methods & Tools Software Development Magazine


Test Automation Strategy Out of the Garage

Mark Rossmiller, SWQA Software Design Engineer, Hewlett-Packard

Abstract:

As a software engineer, I have been involved in software quality and testing of various forms of commercially developed software for over seven years. These applications have ranged from database tier servers, LAN|Web communication, and HR applications to installers and printer device drivers. For all of them, I developed various forms and levels of automation to increase the efficiency of the testing effort, with an equally wide variety of success. In my experience, the success of automated test tools has depended less on the tools themselves than on the vision of leadership to make a paradigm shift away from old methods of testing software, so that the value of those tools can be realized. Without an environment that is designed, accepted, and developed through all stages of the test planning process, automation efforts can be plagued by delays, feature changes, and resistance to change. The following testing strategy was developed to help my HP colleagues and partners recognize the need to reduce the time and cost of testing as it is currently practiced. I have borrowed the HP Rules of the Garage as a template for defining the paradigm shift needed to facilitate better, faster, and smarter use of software testing resources.

Believe you can change the world.

Past success can be an invigorating reminder of how good a job we have done, suggesting there is no need to alter how we test our products or to investigate alternatives to the way we have tested past products. In this sense, past success can be a detriment to changing testing habits. Believing that one can change the world requires that we be willing to change ourselves. I think Carly Fiorina stated this best: "We must be willing to eat our own dog food...". This means you cannot simply advocate the benefits of automation. One must actually implement and use the tools within a process improvement cycle - plan|do|check|act (the Shewhart cycle) (1 Grady). Developing and testing the automation must be treated no differently than the development practices applied to the software being tested.

Work quickly, keep the tools unlocked, and work whenever.

Although software testing is often constrained to follow the product development release schedule, it should nevertheless make every effort to optimize planning and arrange resources around its own well-defined set of checkpoints. Setup must be reduced to a minimum or completed before the next code release drops, so that testing can begin as soon as the drop occurs. In most cases a consistent test environment should be established, with static test machines in place at all times. In addition, a table of objectives and backup procedures should be prepared in advance to accommodate dynamic testing where a variety of SW|HW|OS configurations are necessary to complete the scope of planned testing. We cannot move quickly enough if we choose to wait until the next drop occurs.

Testing tools are only valuable if they get used. Particularly with complex automation, tool users must become familiar with the tool, expand its use, and, most importantly, provide usage feedback that further enhances its value in testing. Only then can it be determined whether continued support of the tool has merit or whether a suitable alternative is available. Some practices that can inherently devalue tool and automation investment are:

  1. Lack of a sponsorship authority that champions and continuously promotes use of the tool(s).
  2. Tools implemented for testing that is ineffective or not as originally intended.
  3. Tool designs that do not incorporate the support and involvement of partners.
  4. Significant, and sometimes uncommunicated, feature changes to the product being tested.
  5. Unclear ownership, expected outcomes, and training in the use of the tool.
  6. Lack of motivation to use the tools, so usage improvements cannot be determined, rendering the tool obsolete or "locked".

When testing resources become scarce, further improvements in efficiency can still be realized through automation that runs unattended or after hours. "Lights out" testing has been developed and available at HP for several years. VCD has several automation tools built with Visual Test on site and leveraged from other sites. These tools have great value, have a history of proven use, and should continue to evolve to meet changing product testing demands. (See the Resources section of this document for tool details.)

Know when to work alone and when to work together.

"Lasting productivity improvement must come from within and cannot be imposed from without." (5 Bumbarger).

If you want to work alone in an organization, propose radical change. Obviously this is not the time to work alone, since progressive change requires a receptive and collaborative effort. Yet the act of proposing change raises perceptual, habitual, cultural, emotional, and political obstacles among the very people expected to participate and support it. These environmental factors must be addressed first; otherwise it is futile to introduce the tools of change. Real, lasting productivity improvement requires change, and change requires creativity and innovation. The irony of productivity improvement is that it is most unlikely to succeed when imposed from without - yet an organization is often forced to change when it is ineffective from within.

Share tools, ideas. Trust your colleagues.

Results of using automation in software driver testing showed a reduction in testing time by half. There is also the inherent benefit of consistent execution - repeatability. Thousands of keystrokes or settings can be repeated exactly, without the variation introduced when done by hand. Automation reduces the thrashing caused by defects that are poorly characterized or hard to duplicate, a problem that otherwise leaves thousands of reported defects open over time without clear resolution. Automation further benefits the tester by saving time when multiple SW|FW|HW|OS configurations are required: several tests can be run concurrently, multiplying the effectiveness of the tester's time.
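The concurrency benefit described above can be sketched with a thread pool that drives the same suite against several configurations at once. `run_suite` is a stand-in for the real per-configuration suite, and the configuration names are made up for the example.

```python
from concurrent.futures import ThreadPoolExecutor

def run_suite(config):
    """Stand-in for executing the full test suite against one SW|FW|HW|OS configuration."""
    # A real implementation would drive the product build on this configuration.
    return (config, "PASS")

def run_all(configs, max_workers=4):
    """Run the suite against several configurations concurrently and collect results."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(pool.map(run_suite, configs))
```

With four workers, four configurations are exercised in roughly the time one would take by hand, which is where the multiplied effectiveness comes from.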

No politics. No bureaucracy.

Organizations, by the very nature of their existence, are political, and as they grow they become bureaucratic. Our organization is no exception. But this does not mean that visionary thinking cannot be used to reduce or eliminate obstacles within the organization and effect positive change.

The customer defines a job well done.

From the standpoint of software quality, the customer - as well as the next customer - represents Argus project data, R&D partners, beta testing, outsource testing, and final media, to name a few. Automation has indirectly improved service to the next customer via automated defect handling tools that report defects more efficiently and earlier in the development cycle. Automated defect submittals have, for the past two years, contributed the largest share of reported defects at a savings of more than 771 hours in submittal costs. We should continue to support these systems and provide the technical resources and tools to maintain the reduced cost of defect handling.
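A minimal sketch of an automated defect submittal, assuming a simple record format - the field names and the `build_defect_report` helper are illustrative, not the actual Argus schema:

```python
import datetime

def build_defect_report(test_name, config, failure_text):
    """Assemble a defect record from an automated test failure for submittal."""
    return {
        "summary": f"[auto] {test_name} failed on {config}",
        "configuration": config,
        "details": failure_text,
        "reported": datetime.date.today().isoformat(),
        "source": "automated-harness",  # marks the report as machine-submitted
    }
```

Because the record is assembled at the moment of failure, the defect arrives with its configuration and failure text already captured, which is where the submittal-cost savings come from.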

Go to part 2   



This article was originally published in the Summer 2001 issue of Methods & Tools
