Methods & Tools Software Development Magazine



This article was originally published in the Summer 2007 issue of Methods & Tools


Mocking the Embedded World:
Test-Driven Development, Continuous Integration, and Design Patterns

Michael Karlesky, Greg Williams, William Bereza, Matt Fletcher
Atomic Object, http://atomicobject.com

Despite a prevalent industry perception to the contrary, the agile practices of Test-Driven Development and Continuous Integration can be successfully applied to embedded software. We present here a holistic set of practices, platform-independent tools, and a new design pattern (Model Conductor Hardware - MCH) that together produce: good design from tests programmed first, logic decoupled from hardware, and systems testable under automation. Ultimately, this approach yields an order of magnitude or more reduction in software flaws, predictable progress, and measurable velocity for data-driven project management. We use the approach discussed herein for real-world production systems and have included a full C-based sample project (using an Atmel AT91SAM7X ARM7) to illustrate it (see Appendix). This example demonstrates transforming requirements into test code; system, integration, and unit tests driving development; daily "micro design" fleshing out a system’s architecture; the use of MCH itself; and the introduction of mock functions to automated unit tests.

Introduction

Heads around the table nodded, and our small audience listened to us thoughtfully. Atomic Object had been invited to lunch with a potential client. We delivered our standard introduction to Agile [1] software development methods. While it was clear there was not complete buy-in of the presented ideas, most in the room saw value, agreed with the basic premises, or wanted to experiment with the practices. And then we heard an objection we had never heard before: "Boy, guys, it sounds great, but you can’t do this with firmware code because it’s so close to the hardware." Conversation and questions were replaced with folded arms and furrowed brows. It was the first time we had had an in-depth conversation with hardcore embedded software developers.

In standard Atomic Object fashion, we took our experience as a dare to accomplish the seemingly impossible. We went on to apply to embedded software what we knew from experience to be very effective and complementary techniques – in particular Test-Driven Development (TDD) and Continuous Integration (CI). Inspired by an existing design pattern, we discovered and refined an approach to enable automated system tests, integration tests, and unit tests in embedded software and created a small C-based test framework [2]. As we tackled additional embedded projects, we further refined these methods, created a scriptable hardware-based system test fixture, developed the means to auto-generate mock functions for our integration tests, wrote scripts to generate code skeletons for our production code, and tied it all together with an automated build system.

The driving motivation for our approach is eliminating bugs as early as possible and providing predictable development for risk management. We know from experience that the methods we discuss here reduce software flaws by an order of magnitude or more over the average [4]. They also allow a development team to react quickly and effectively to changes in the underlying hardware or system requirements. Further, because technical debt [3] is virtually eliminated, progress can be measured and used in project management. A recent case study of a real-world, three-year-long, Agile embedded project found the team outperformed 95th-percentile, "best in class" development teams [4]. Practices such as Test-Driven Development (TDD) and Continuous Integration (CI) are to thank for such results. Within embedded software circles, the practices of TDD and CI are either unknown or have been dismissed with skepticism. The direct interaction of programming and hardware, as well as limited resources for running test frameworks, seems to set a hurdle too high to clear. Our approach has been successfully applied in systems as small as 8-bit microcontrollers with 256 bytes of RAM and scales up easily to benefit complex, heavy-duty systems.

Application of these principles and techniques does not incur extra cost. Rather, this approach drastically reduces final debugging and verification that often breaks project timelines and budgets. Bugs found early are less costly to correct than those found later. Technical debt is prevented along the way, shifting the time usually necessary for final integration and debugging mysteries to developing well-tested code prior to product release. Because code is well-tested and steadily and predictably added to the system, developers and managers can make informed adjustments to priorities, budgets, timelines, and features well before final release. In avoiding recalls due to defects and producing source code that is easy to maintain and extend (by virtue of test suites), the total software lifecycle is less costly than most if not all projects developed without these practices.

A Note on "Mocking" and This Article’s Title

Mocking in software development is a specific practice that complements unit testing (in particular, interaction-based testing). The majority of a system’s code consists of code making calls to other parts of the codebase. A mock is a specialized substitution for any part of the system with which the code under test interacts.

The mock not only mimics the function call interface of the system code outside the code under test; it also provides the means to capture the parameters of function calls made upon it, record the order of calls made, and provide any function return value a programmer requires for testing scenarios. With mocks we can thoroughly test all of the logic within a function and verify that this code makes calls to the rest of the system as expected. Mocking is covered in more depth later.

Automated unit testing is far more prevalent in high-level software systems than in embedded systems, though certainly even here it is not widespread. To our knowledge, automatically generating and unit testing with mocks in embedded software (particularly in small systems and those using C), such as we have done, is a new development in the embedded space. This article’s title is a play on the uniqueness of the mocking concept to embedded software and a reaction to those in the industry who may say practices such as TDD are impossible to implement or have no value in embedded software development.

The Value of TDD and CI

Test-Driven Development and Continuous Integration are complementary practices. Code produced test-first tends to be well designed and relatively easy to integrate with other code. Incrementally adding small pieces of a system to a central source code control system ensures the whole system compiles without extensive integration work. Running tests allows developers to find integration problems early as new code is added to the system. An automated build system complemented by regression test suites ensures a system grows responsibly in features and size and exists in a near ready-to-release fashion at all times.

Test-Driven Development Overview

Traditional testing strategies rarely impact the design of production code, are onerous for developers and testers, and often leave testing to the end of a project where budget and time constraints threaten thorough testing. Test-Driven Development systematically inverts these patterns. In TDD, development is not writing all the functional code and then later testing it, nor is it verifying code by stepping through it with a debugger. Instead, testing drives development. A developer looks for ways to make the system testable, does a small amount of design, writes test programming for the piece of the system currently under development, and then writes functional code to meet the requirements of the test-spawned design. Designing for testability in TDD is a higher calling than designing "good" code because testable code is good code.

At the highest levels (e.g. integration and system testing) fully automated testing is unusual. However, at the lowest level, automated unit testing is quite possible. In automated unit testing, a developer first writes a unit test (a test that validates correct operation of a single module of source code – for instance, a function or method) and then implements the complementary functional code. With each system feature tackled, unit test code is added to an automated test suite. Full regression tests can take place all the time. Further high-level integration or system testing will complement these unit tests and ideally will include some measure of automation.

System Test-Driven Development follows these steps:

  1. Pick a system feature.
  2. Program a system test to verify that feature.
  3. Compile; run the system test with the system itself and see it fail.
  4. Identify a piece of functionality within the feature (a single function or method).
  5. Program integration and unit tests to verify that functionality.
  6. Stub out the functional code under test (to allow the test code to compile).
  7. Compile; run the integration and unit tests and see them fail (to verify expectations).
  8. Flesh out the functional, production code.
  9. Compile; run the integration and unit tests.
  10. Refactor the production code.
  11. Repeat 9-10 until the integration and unit tests pass and the functional code is cleanly implemented.
  12. Compile; run the system test.
  13. Repeat 4-12 until the system test passes.
  14. Repeat 1-13 until all features of the system are implemented.

TDD provides several clear benefits:

  • Code is always tested.
  • Testing drives the design of the code. As a side effect, the code is well designed because of the decoupling necessary to create testable code.
  • The system grows organically as more knowledge of the system is gained.
  • The knowledge of the system is captured in tests; the tests are "living" documentation.
  • Developers can add new features or alter existing code with confidence that automated regression testing will reveal failures and unexpected results and interactions.
  • Tests catch the majority of bugs and leave for a human mind difficult testing issues like timing collisions or unexpected sub-system interactions.

Continuous Integration Overview

The technique of continuous integration regularly brings together a system’s code (possibly from multiple developers) and ensures via regression tests that new programming has not broken existing programming. Automated build systems allow source code and tests to be compiled and run automatically. These ideas and tools are important complements to effective TDD. When TDD and CI are used together, the system’s code base is always thoroughly tested and has few, if any, integration problems among subsystems or sections of code. Integration problems are discovered early, when it is cheapest to correct them. Further, any such problem will be discovered close to where and when the problem was created; here, understanding is greatest and good design choices are most likely.
