
Requirements for Outsourcing

Tom Gilb, https://www.gilb.com/

Abstract

Outsourcing differs from other development because there is bound to be a contractual relationship, probably a geographic distance, a different sense of loyalty, language misunderstandings, cultural differences, reluctance to speak up to the client - and many other associated problems. Good requirements are always a problem, but outsourcing increases the problems and makes even greater demands on the requirements specification. The payoff for doing good requirements is greater, and the penalty for not doing them is more threatening.

I am going to argue that we need to make use of far more explicit background specification - a page or more of specification for each requirement. I will argue that this is a necessary investment, because failure to do so will probably cost far more, sooner or later. I will argue that failure to be more detailed than normal will be counted in the client's disfavor in any legal proceedings trying to determine responsibility for failure of the project.

Outsourcing Requirements Principles

Here is a set of principles for Requirements for Outsourcing:

  1. If anything can be misunderstood, it probably will be.
  2. Writers Are Responsible for Readers' Wrong Renditions
  3. Assume Nothing, Specify Everything
  4. Too Much is Safer than Too Little
  5. If They ask a question, document and integrate the answer
  6. Quality Control before sending
  7. Evolve Requirement Delivery
  8. Quantify Quality
  9. Constrain explicitly
  10. Connect relationships

Let me explain them in more detail:

If anything can be misunderstood, it probably will be.

Every person has a strong tendency to interpret words slightly to largely differently from everybody else. When we ask 20-30 people to write down their interpretation of a short requirement statement, we always get completely different, never identical, answers from individuals working on the same project. We call this the 'Ambiguity Test' - and it really gets the point across to the whole group about how careful we must be when writing specifications that must be understood correctly.

Explicit Definition

One simple tactic we recommend in 'Planguage' or The Planning Language [CE] is to take the trouble to explicitly define any term that could possibly cause misunderstanding, and then 'Capitalize' the term to signal that the reader must interpret it with the official definition.

For example:

Intuitiveness:

Scale: Probability that a defined [User] can intuitively figure out how to do a defined [Task] correctly without any Errors Needing correction.

User: defined as: {Novice, Experienced, Expert}.

Task: defined as: {All Tasks: {Data Entry, Screen Interpretation, Answering Call, <others undefined now>}}.

Errors: defined as keyboard or mouse or touch screen inputs that are unacceptable, unintended, or incorrect.

Needing: defined as: objectively incorrect data, or data violating Corporate Data Standards.

Example: Capitalized Terms indicate formal definitions somewhere - locally, elsewhere in the project documentation, or 'later' in the definition process. Source: CE, page 160 (lines 1-3).
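
To make such a quantified definition operational you also need a 'Meter': a way of actually obtaining measurements on the Scale. The small Python sketch below is not from CE; it is a hypothetical illustration of one way the Intuitiveness probability could be estimated from usability trial data, with all trial figures invented.

# Hypothetical sketch: estimating the 'Intuitiveness' Scale level from trial data.
# The Scale above is "probability that a defined [User] can intuitively do a defined
# [Task] correctly without any Errors Needing correction" -- here we simply estimate
# that probability as the fraction of error-free trials per (User, Task) pair.

trials = [
    # Each trial records the user type, the task, and the count of errors needing
    # correction. The data below is invented for illustration only.
    {"user": "Novice",      "task": "Data Entry", "errors_needing_correction": 0},
    {"user": "Novice",      "task": "Data Entry", "errors_needing_correction": 2},
    {"user": "Experienced", "task": "Data Entry", "errors_needing_correction": 0},
    {"user": "Experienced", "task": "Data Entry", "errors_needing_correction": 0},
]

def intuitiveness(trials, user, task):
    """Estimated probability of error-free task completion for one qualifier pair."""
    relevant = [t for t in trials if t["user"] == user and t["task"] == task]
    if not relevant:
        return None  # no measurement available for this [User, Task] combination
    error_free = sum(1 for t in relevant if t["errors_needing_correction"] == 0)
    return error_free / len(relevant)

print(intuitiveness(trials, "Novice", "Data Entry"))       # 0.5
print(intuitiveness(trials, "Experienced", "Data Entry"))  # 1.0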

Fuzzy Brackets

Another simple Planguage tactic is to mark all 'dubious' terms and phrases with <fuzzy brackets>.

This means that we are not certain the term will be interpreted as intended. The early spec reader should be very careful in interpreting the term: the spec writer has declared it potentially defective until rewritten or defined. People happily use this fuzzy brackets facility several times per sentence. It allows them to rapidly continue their flow of work, without getting into premature detailed discussions. But they have obliged themselves to get back later and fix it up - and failing that, they have at least warned the spec reader to be careful.

Performance Requirements:

Ambition: <competitive> <breakthrough level of quality>.

Reduce Product Cost.

Improve Productivity [Engineering].

Improve timeliness of <engineering drawings>.

Improve <drawing quality>.

Reduce <drawing errors>.

Others.

Reduce <Engineering Process> timescales ('time to market').

Improve <Efficiency> [Manufacturing, Procurement].

Achieve <Growth>.

Others

Example: the use of fuzzy brackets to mark all terms needing definition work later. The text is taken from a Board-level proposal for $60 million, which we restructured and began to define. The Board was not willing to grant budget for this set of badly defined promises. Source: CE, page 73.
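
Because fuzzy-bracketed terms carry an obligation to come back and define them, it can help to list them mechanically. The short Python sketch below is a hypothetical helper, not part of Planguage itself; the regular expression and the sample text are assumptions for illustration.

# Hypothetical helper: list every <fuzzy bracketed> term in a Planguage draft,
# so the 'get back later and fix it up' obligation can be tracked.

import re

spec = """
Ambition: <competitive> <breakthrough level of quality>.
Improve timeliness of <engineering drawings>.
Reduce <drawing errors>.
"""

FUZZY = re.compile(r"<([^<>]+)>")

def fuzzy_terms(text):
    """Return the distinct fuzzy-bracketed terms, in order of first appearance."""
    seen = []
    for match in FUZZY.finditer(text):
        term = match.group(1).strip()
        if term not in seen:
            seen.append(term)
    return seen

for term in fuzzy_terms(spec):
    print(f"needs definition: <{term}>")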

Case Study Example of the Ambiguity Test: dozens of different interpretations emerged during an ambiguity test for a multinational client in London, when we asked them to interpret the 'Soldier Friendliness' of a military mobile phone.

Writers Are Responsible for Readers' Wrong Renditions

If the reader - the outsourcer - misunderstands requirements, I hope we can agree that the fault lies entirely with the spec writer.

As with all communication, the communicator should know their audience and communicate so they can understand. Anything less is both arrogant and impractical.

We have a rule in Planguage that says:

R9: Clear: Specifications should be 'clear enough to test' and 'unambiguous to their intended readers.' <- CE, p 17

One client of ours took the full consequences of this rule by introducing another rule that the intended (types of) readers were to be listed on the front page of a requirements set.

We frequently ask people to count the number of serious-consequence violations of this rule in a sample of their own, often 'approved', requirements. The extrapolated estimate of the number of violations averages about 100 per 300 words. 'Extrapolated' means that since the quality control process we use is provably about 1/3 effective, and a team finds about 33 'defects' (rule violations) in a page, we reckon (and can prove by further QC) that there are about 100. People are quite shocked by this result - especially when they realize it is serious. We normally release requirements with a huge and unacceptable degree of ambiguity and lack of clarity. We don't bother to conduct a quality control process to measure this, so we are unaware of it. But it should come as no surprise. We know that even dictionaries happily give us many different definitions of the same word, and the definition I need in a narrow technical area like systems engineering (for example, for 'architecture') is usually not one of them.

architecture
noun
1. the art or practice of designing and constructing buildings.
* the style in which a building is designed or constructed, esp. with regard to a specific period, place, or culture: Victorian architecture.
2. the complex or carefully designed structure of something : the chemical architecture of the human brain.
* the conceptual structure and logical organization of a computer or computer-based system : a client/server architecture.

Sample definition, not the one I need for systems engineering (of which there are many!)

Assume Nothing, Specify Everything.

You need to make all assumptions explicit. You need to do this in detail, in the context of each individual requirement.

R14:

The 'Assumption' Planguage parameter can be used for this purpose. But there are also a number of alternative ways, such as {Risk, Source, Impacts, Depends On, Comment, Authority, [Qualifiers], If }. In fact, any reasonable device, suitable for the purpose, will do.

The box above cites one Planguage rule regarding assumptions.

Here is a template one could use to remind people to state assumptions:

Requirement Specification Template (A Summary Template)

Tag: <Tag name for the system>.

Type: System.

=========================== Basic Information ==========================

Version: <Date or other version number>.

Status: <{Draft, SQC Exited, Approved, Rejected}>.

Quality Level: <Maximum remaining major defects/page, sample size, date>.

Owner: <Role/e-mail/name of the person responsible for changes and updates>.

Stakeholders: <Name any stakeholders (other than the Owner) with an interest in the system>.

Gist: <A brief description of the system>.

Description: <A full description of the system>.

Vision: <The overall aims and direction for the system>.

============================= Relationships ============================

Consists Of: Sub-System: <Tags for the immediate hierarchical sub-systems, if any, comprising this system>.

Linked To: <Other systems or programs that this system interfaces with>.

========================= Function Requirements ========================

Mission: <Mission statement or tag of the mission statement>.

Function Requirement:

<{Function Target, Function Constraint}>: <State tags of the function requirements>.

Note: 1. See Function Specification Template. 2. By default, 'Function Requirement' means 'Function Target'.

======================= Performance Requirements ======================

Performance Requirement:

<{Quality, Resource Saving, Workload Capacity}>: <State tags of the performance requirements>.

Note: See Scalar Requirement Template.

========================= Resource Requirements ========================

Resource Requirement:

<{Financial Resource, Time Resource, Headcount Resource, others}>: <State tags of the resource requirements>.

Note: See Scalar Requirement Template.

=========================== Design Constraints ==========================

Design Constraint: <State tags of any relevant design constraints>.

Note: See Design Specification Template.

========================= Condition Constraints =========================

Condition Constraint: <State tags of any relevant condition constraints or specify a list of condition constraints>.

====================== Priority and Risk Management =====================

Rationale: <What are the reasons supporting these requirements? >.

Value: <State the overall stakeholder value associated with these requirements>.

Assumptions: <Any assumptions that have been made>.

Dependencies: <Using text or tags, name any major system dependencies>.

Risks: <List or refer to tags of any major risks that could cause delay or negative impacts on achieving the requirements>.

Priority: <Are there any known overall priority requirements? >.

Issues: <Unresolved concerns or problems in the specification or the system>.

================== Evolutionary Project Management Plan ==================

Evo Plan: <State the tag of the Evo Plan>.

========================= Potential Design Ideas ========================

Design Ideas: <State tags of any suggested design ideas for this system, which are not in the Evo Plan>.

Example Template: in addition to the explicit parameter 'Assumptions', a number of other parameters deal with some class of assumption, such as 'Rationale', 'Risks', 'Priority', 'Design Constraints' and many more. Source: CE, page 77.

Too Much is Safer than Too Little

When we teach and consult with companies, we quickly show that what was a short one-liner of a requirement easily becomes a full page once all the background information, such as assumptions and relationships, is added to the requirement. We would not recommend this if we did not believe, and if our clients did not believe, that it pays off.

One director of a telecom company defended this in front of his CEO and fellow directors (and this author) by pointing out that the traditional one-line marketing requirement cost on average $400,000 to implement. If building a full page of related data for that requirement was the price the company needed to pay to protect that investment, then it was a small price indeed. This company has since won the battle with the marketing people who were reluctant to specify requirements well: it included firing two reluctant marketing directors, a founding director patiently working with marketing people to improve the quality of requirements, and test and quality managers patiently working with engineers, over several years, to increase the scope of requirement definitions.

Emergency Stop:

Type: Function.

Description: <Requirement detail>.

Module Name: GEX.F124.

Users: {Machine Operator, Run Planner}.

Assumptions: The User Handbook describes this in detail for all <User Types>.

User Handbook: Section 1.3.5 [Version 1.0].

Planned Implemented: Early Next Year, Before Release 1.0.

Latest Implementation: Version 2.1. ''Bug Correction: Bug XYZ.''

Test: FT.Emergency Stop.

Test [System]: {FS.Normal Start, FS.Emergency Stop}.

Hardware Components: {Emergency Stop Button, Others}.

Owner: Carla.

Real example: some additional parameters describing a requirement beyond the 'Description'. Source: CE, page 91.

If they ask a question, document and integrate the answer

I have a fanatic habit. When I am listening to clients discuss requirements, I grab the answers to questions and document them in the requirement specification. I am fearful that:

  • nobody heard the answer correctly
  • even if they did, the answer may be unintelligible or incorrect, but nobody can be bothered to challenge it now
  • people who were not in the room will have different information, and they will never know what was said
  • conditions will change, and the answer will no longer be true or relevant in the future

So, I figure it is good practice to document these things, and hope this is more useful than not documenting them.

I also have a stringent practice of documenting exactly who gave the answer, quite publicly - right then and there.

Assumption: the CEO fully backs this requirement. <- John Jones.

It has the interesting effect of getting the person cited as the source to take a good look at what he is being quoted as saying (was it even captured accurately?), and to wonder whether he really wants to be stuck with having said it (is it really accurate, and will it make me seem foolish later?). Just healthy, I figure.

Quality Control before sending

Can we agree that we should be responsible for NOT sending 100 potential misunderstandings per page to an outsourcing supplier? You need to set an exit level for your requirements process, for example "no more than 1.0 Major defects per page".

If you do set such a requirements process exit level seriously, then our client experience (the earliest serious large-scale experience was at Douglas Aircraft, for engineering drawings, in 1989) is that you will strongly motivate engineers to learn to reduce their defect injection - to write clearly enough to succeed and to 'exit' from the requirements process. (If you want to learn more about this process, see my paper 'Agile Specification Quality Control: Shifting emphasis from cleanup to sampling defects' (INCOSE 2005), available at www.gilb.com.) It takes several learning cycles, but individual engineers actually reduce their 'defect injection' in requirements by an order of magnitude within a few months, and by more in the longer term.

Before you waste both your outsourcer's time and your own, make sure you quality control requirements against a reasonable set of specification rules. Have a reasonable requirements process exit level - not the roughly 100 major defects per page you get by default, when you don't measure the level.
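
The extrapolation arithmetic above is simple enough to automate. The sketch below is a minimal illustration, assuming (as stated above) that one QC pass finds roughly one third of the Major defects present, and assuming an exit level of 1.0 estimated remaining Majors per page; the numbers are examples only.

# Minimal sketch of the defect extrapolation and exit decision described above.
# Assumption from the text: the QC process finds roughly 1/3 of the majors present.

QC_EFFECTIVENESS = 1 / 3          # fraction of major defects a single QC pass finds
EXIT_LEVEL = 1.0                  # maximum estimated remaining majors per page

def estimated_majors_per_page(found_per_page, effectiveness=QC_EFFECTIVENESS):
    """Extrapolate from defects found to defects estimated to be present."""
    return found_per_page / effectiveness

def may_exit(found_per_page):
    """Decide whether the requirements may exit to the outsourcing supplier."""
    return estimated_majors_per_page(found_per_page) <= EXIT_LEVEL

print(estimated_majors_per_page(33))   # ~99 -- roughly the 100 quoted above
print(may_exit(33))                    # False: far above a 1.0 majors/page exit level
print(may_exit(0.3))                   # True: about 0.9 estimated majors/page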

Illustration: the personal learning curve of one aircraft engineer, when subjected to specification quality measurement and a numeric exit condition. Hundreds of engineers went through this defect injection reduction process. A multinational bank reported a reduction from 80.4 Major defects per page to 11.2 when comparing the requirements for several IT projects.

Evolve Requirement Delivery

One way to find out if you have done a good enough job on requirements is to check the real thing. The evolutionary project management (Evo) method consciously divides up a project into about 50 increments. Each increment will attempt to deliver some of the requirements to some of the stakeholders.

If you start getting high-value requirements back from the supplier on a regular basis, then you must be communicating requirements well. If not, you can at least analyse your problems early and correct your process. It is not possible to have a large-scale problem without getting early and frequent warning signals.

You can even make an interesting 'no cure, no pay' contractual relationship with your supplier: pay them for requirements delivered successfully - not for work carried out.
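
As a rough illustration of the 'no cure, no pay' idea, the sketch below computes a per-increment payment based only on requirements delivered and accepted in each Evo step. The step contents and prices are invented for illustration; a real contract would define acceptance far more carefully.

# Hedged sketch: the supplier invoices per Evo step only for requirements actually
# delivered and accepted, not for effort spent. All figures are invented.

evo_steps = [
    {"step": 1, "delivered_and_accepted": ["Emergency Stop"], "price_each": 20_000},
    {"step": 2, "delivered_and_accepted": [], "price_each": 20_000},
    {"step": 3, "delivered_and_accepted": ["Reduce Product Cost", "Improve Productivity"],
     "price_each": 20_000},
]

def payment_for(step):
    """Pay only for value delivered and accepted in this increment."""
    return len(step["delivered_and_accepted"]) * step["price_each"]

for step in evo_steps:
    print(f"step {step['step']}: pay {payment_for(step):,}")
# step 1: pay 20,000 / step 2: pay 0 / step 3: pay 40,000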

Figure: the gradual delivery of requirements at each Evo step (simplified). Source: CE, page 158.

Quantify Quality

Quality requirements are a dominant reason for many projects. We want to replace an old system with one that has higher qualities.

Quality levels are a powerful driver of system costs. The nearer you get to perfect quality, the nearer you get to infinite costs.

For any serious outsourcing communication, you have to quantify the qualities you want.

Most people make the mistake of simply declaring the quality to be critical: "highest levels of security", "state-of-the-art reliability".

Or worse, they simply replace the quality requirement with a design intended to deliver it: "industry-standard programming languages" and "consistent user interfaces" are examples.

Here is an example of a requirement specification rule:

Rule QQ: All system quality requirements must be defined quantitatively, with a scale of measure and at least one required numeric target or constraint level.

Quantification gives you about as clear and unambiguous a communication of your quality requirement to the outsourcer as you will get. It also forces you to think more deeply about what you really expect from the outsourcer. I observe that people do not have clear ideas about most of their quality requirements at all. They have no policy to quantify qualities. In fact, they feel no responsibility to clarify qualities - in spite of often explicitly declaring them to be the 'primary motivation for this project'. (I saw it again at work today; I see it at least once a week.)

In addition, quantification means you can measure both partial and complete progress towards the quality target levels. This means you can see whether your supplier is really delivering at a rate consistent with meeting your deadlines.

Case Study: real tracking (by our client 'FIRM') of 25 quality levels in product development, in the 9th week of a 12-week cycle before customer delivery. The improvement % column shows the degree of progress towards 100% of the planned target levels. It is clear that this project, with 25 engineers working in 4 parallel teams, is on track to meet the targets on time; they would have to average 75% to be on track in general.
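
The kind of tracking described in this case study amounts to simple arithmetic: express each current measured level as a percentage of the distance from the baseline ('Past') to the planned target ('Goal'), then compare the average against the share of the timebox already used (week 9 of 12 is 75%). The sketch below illustrates this with invented quality names and numbers; it is not the client's actual data.

# Hypothetical progress tracking toward quantified quality targets.
# All quality names and numbers are invented for illustration only.

qualities = {
    # name: (past_level, goal_level, current_measured_level)
    "Usability.Intuitiveness": (0.60, 0.95, 0.90),
    "Reliability.MTBF_hours":  (200,  1000, 750),
    "Performance.Response_ms": (900,  200,  350),   # lower is better; still scalar
}

def progress_percent(past, goal, now):
    """How far we have moved from Past toward Goal, as a percentage."""
    return 100 * (now - past) / (goal - past)

elapsed_percent = 100 * 9 / 12   # 9th week of a 12-week cycle = 75%

results = {name: progress_percent(*levels) for name, levels in qualities.items()}
average = sum(results.values()) / len(results)

for name, pct in results.items():
    print(f"{name}: {pct:.0f}% of planned improvement delivered")
print(f"average {average:.0f}% vs. {elapsed_percent:.0f}% of the timebox used")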

Constrain Explicitly

We have a tendency to be explicit about what we want. But we may fail to be equally explicit about what we don't want.

If we fail to define all relevant constraints together with our target requirements, we leave the outsourcer free to violate them without knowing it.

It is not enough to say how warm you want the room temperature to be. You need to specify the upper and lower limits too.

It is not enough to assume the outsourcer will know what your national laws are, or that your system must respect them.

You need to make a long explicit list of those constraints, and take nothing for granted.

Here is a template for thorough specification of scalar requirements (variables like performance, quality and costs). Source: CE, page 135.

Elementary scalar requirement template <with hints>

Tag: <Tag name of the elementary scalar requirement>.

Type: <{Performance Requirement: {Quality Requirement, Resource Saving Requirement, Workload Capacity Requirement}, Resource Requirement: {Financial Requirement, Time Requirement, Headcount Requirement, others}}>.

============================ Basic Information ===========================

Version: <Date or other version number>.

Status: <{Draft, SQC Exited, Approved, Rejected}>.

Quality Level: <Maximum remaining major defects/page, sample size, date>.

Owner: <Role/e-mail/name of the person responsible for this specification>.

Stakeholders: <Name any stakeholders with an interest in this specification>.

Gist: <Brief description, capturing the essential meaning of the requirement>.

Description: <Optional, full description of the requirement>.

Ambition: <Summarize the ambition level of only the targets below. Give the overall real ambition level in 5-20 words>.

============================ Scale of Measure ===========================

Scale: <Scale of measure for the requirement (states the units of measure for all the targets, constraints and benchmarks) and the scale qualifiers>.

============================= Measurement ============================

Meter: <The method to be used to obtain measurements on the defined Scale>.

============= Benchmarks ============= ''Past Numeric Values'' =============

Past [<when, where, if>]: <Past or current level. State if it is an estimate> <- <Source>.

Record [<when, where, if>]: <State-of-the-art level> <- <Source>.

Trend [<when, where, if>]: <Prediction of rate of change or future state-of-the-art level> <-<Source>.

============== Targets ============== ''Future Numeric Values'' =============

Goal/Budget [<when, where, if>]: <Planned target level> <- <Source>.

Stretch [<when, where, if>]: <Motivating ambition level> <- <Source>.

Wish [<when, where, if>]: <Dream level (unbudgeted)> <- <Source>.

============== Constraints ============= ''Specific Restrictions'' =============

Fail [<when, where, if>]: <Failure level> <- <Source>.

Survival [<when, where, if>]: <Survival level> <- <Source>.

============================= Relationships =============================

Is Part Of: <Refer to the tags of any supra-requirements (complex requirements) that this requirement is part of. A hierarchy of tags (For example, A.B.C) is preferable>.

Is Impacted By: <Refer to the tags of any design ideas that impact this requirement> <-<Source>.

Impacts: <Name any requirements or designs or plans that are impacted significantly by this>.

======================= Priority and Risk Management ======================

Rationale: <Justify why this requirement exists>.

Value: <Name [stakeholder, time, place, event]: Quantify, or express in words, the value claimed as a result of delivering the requirement>.

Assumptions: <State any assumptions made in connection with this requirement> <- <Source>.

Dependencies: <State anything that achieving the planned requirement level is dependent on> <- <Source>.

Risks: <List or refer to tags of anything that could cause delay or negative impact> <- <Source>.

Priority: <List the tags of any system elements that must be implemented before or after this requirement>.

Issues: <State any known issues>.

The constraint-related aspects of this template - notably the Fail and Survival levels - give explicit information about restrictions with respect to this particular requirement.
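
One practical consequence of 'constrain explicitly' is that missing constraint fields can be detected mechanically before a specification is sent out. The sketch below is a hypothetical illustration of holding the scalar template as a data structure and flagging unspecified Fail and Survival levels; the field names follow the template, but the validation policy is an assumption, not part of Planguage.

# Hypothetical data structure for a scalar requirement, with a check that flags
# constraint fields (Fail, Survival) that have not yet been specified explicitly.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ScalarRequirement:
    tag: str
    scale: str                              # units of measure and qualifiers
    meter: Optional[str] = None             # how the Scale will be measured
    past: Optional[float] = None            # benchmark level
    goal: Optional[float] = None            # planned target level
    stretch: Optional[float] = None         # motivating ambition level
    fail: Optional[float] = None            # constraint: failure level
    survival: Optional[float] = None        # constraint: survival level
    assumptions: list[str] = field(default_factory=list)

    def missing_constraints(self) -> list[str]:
        """Names of constraint fields not yet specified explicitly."""
        return [name for name in ("fail", "survival") if getattr(self, name) is None]

req = ScalarRequirement(
    tag="Intuitiveness",
    scale="Probability that a defined [User] can do a defined [Task] without Errors",
    goal=0.95,
)
print(req.missing_constraints())   # ['fail', 'survival'] -- constrain explicitly!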

Connect relationships

You need to help your outsourcers understand relationships between requirements and a long list of other things.

Here are some specific principles about this, from another recent paper called "Requirement Relationships: A Theory, some Principles, and a Practical Approach" (www.gilb.com). A small sketch illustrating the first two principles follows the list.

The 'Requirement Relationship' Principles:

  1. THE CLIENT STAKEHOLDER PRINCIPLE: A requirement specification that has no identified client stakeholder is not a valid requirement, because we cannot ascertain its usefulness or value to a given stakeholder.
  2. THE SERVER STAKEHOLDER PRINCIPLE: A requirement specification that has no specified, or implied, server stakeholder(s) is not yet seriously planned for real implementation. Thus we cannot understand who will deliver it, when, or how efficiently.
  3. THE REQUIREMENT RELATIONSHIPS PRINCIPLE: A single requirement can have any useful number and types of relationships that are worth specifying. The total costs of specification should be less than the expected long-term benefits for the system.
  4. THE EARLY RELATIONSHIP PRINCIPLE: Failure to deal with requirement relationships in the requirement specifications themselves will increase development and maintenance costs, because the relationships will then more likely be sensed, and dealt with, downstream - in design, testing and operation, or even decommissioning.
  5. THE DYNAMIC RELATIONSHIP PRINCIPLE: Requirement relationships are not static, nor are they all determinable initially. Consequently we need to track them as they emerge and change; we need to verify them, and we need to analyze the consequences of any change in requirement relationships.
  6. THE RELATIONSHIPS ARE 'CRITICAL KNOWLEDGE' PRINCIPLE: The requirement relationship knowledge is itself far more valuable and critical than the requirement alone, because it potentially helps us to deliver greater value and scope, earlier and better than we otherwise would. The requirement itself might change, but most relationships remain as useful facts.
  7. THE 'REQUIREMENT REVIEW BASIS' PRINCIPLE: All requirement review processes depend on the quality and quantity of requirement relationship information available. Otherwise we risk approving requirements in ignorance of critical facts.
  8. THE RISK MANAGEMENT PRINCIPLE: The risk management process is continuously dependent on the quality of requirement relationship information. All requirement relationship specifications help us to identify and manage risks.
  9. THE 'BUTTERFLY EFFECT' PRINCIPLE: Even one single fault in a requirement relationship specification can be the root cause of project or system failure. It is impossible to be sure that even a single missing or incorrect requirement relationship specification will not severely or critically damage your engineering effort.
  10. THE DESIGN RELATIONSHIP PRINCIPLE: All architecture and design specifications must follow the same relationship specification principles as their 'near cousins', requirements. This is because all 'solutions, means, designs, architectures, and strategies' are themselves also requirements, as viewed by other stakeholders.
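
As a rough illustration of the first two principles, the sketch below flags any requirement that lacks an identified client stakeholder or server stakeholder. The data model and field names are assumptions for illustration; they are not Planguage syntax.

# Hedged sketch: checking the Client Stakeholder and Server Stakeholder principles
# before requirements are handed to the outsourcing supplier.

from dataclasses import dataclass, field

@dataclass
class Requirement:
    tag: str
    client_stakeholders: list[str] = field(default_factory=list)  # who values it
    server_stakeholders: list[str] = field(default_factory=list)  # who will deliver it
    impacts: list[str] = field(default_factory=list)              # related requirement tags
    depends_on: list[str] = field(default_factory=list)

def relationship_problems(req: Requirement) -> list[str]:
    """Return any violations of the first two relationship principles."""
    problems = []
    if not req.client_stakeholders:
        problems.append(f"{req.tag}: no client stakeholder -- value cannot be ascertained")
    if not req.server_stakeholders:
        problems.append(f"{req.tag}: no server stakeholder -- not yet planned for delivery")
    return problems

reqs = [
    Requirement("Intuitiveness", client_stakeholders=["Machine Operator"],
                server_stakeholders=["UI Team"]),
    Requirement("Reduce Product Cost"),   # no stakeholders identified yet
]
for r in reqs:
    for problem in relationship_problems(r):
        print(problem)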

Illustration: this shows the four main system attribute types - resource, function, performance and design - as well as the processes which implement the functions. Using Planguage, the complex relationships among these four types can be specified. For example, a specific performance level might apply only to a handful of functions rather than the entire system; a function might be implemented by several processes; or different resources can be specifically allocated to different functions. [Source: CE 2005, Figure 3.3]

Summary

Much better requirements quality is a necessity for most of us today. But outsourcing places unusually high demands on the requirements process because of physical and cultural distances. This paper has tried to give some specific and practical advice on how to specify better requirements. It is hard work, but it is a lot less work than dealing with the misunderstandings caused by bad requirements.

References

CE: Gilb, Tom, Competitive Engineering: A Handbook for Systems Engineering, Requirements Engineering, and Software Engineering Using Planguage, Elsevier Butterworth-Heinemann, 2005, ISBN 0750665076.

Copyright © 2007 by Tom Gilb.


This article was originally published in the Winter 2007 issue of Methods & Tools
