How To Analyse Specifications - Part 2

Extracting Testable Conditions

Following on from Part 1, a large part of Specifications Analysis is extracting Testable Conditions, so now we will look at how to do that.

In order to ensure a quality product, every function that the product carries out must be testable. That is to say, if a product is supposed to do x, y, and z, then it must be possible to confirm, by testing, that it really does do x, y, and z.

This is why it is imperative that the Specifications be specific rather than vague.

If the Specification says the product will do "x", then it must do "x". Whatever "x" is, it must be clearly defined. When "x" is clearly defined, it is possible for the Developer to develop the product so that it does "x", and it is possible for the Tester to confirm that the product does "x".

"Does the product do x?" is therefore a Testable Condition. A Testable Condition is a black and white scenario. Either the product does "x", or it does not do "x".
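To make this concrete, here is a minimal sketch of a black and white Testable Condition, assuming a hypothetical specification clause: "orders of 10 or more items receive a 5% discount". The function name and the rule itself are illustrative, not from any real specification.

```python
# Hypothetical rule from a Specification: "orders of 10 or more items
# receive a 5% discount". Because the rule is specific, it resolves
# to a black and white Testable Condition.

def order_total(unit_price, quantity):
    """Return the order total, applying a 5% discount at 10+ items."""
    total = unit_price * quantity
    if quantity >= 10:
        total *= 0.95
    return total

# The Testable Condition: does the product apply the discount at exactly
# 10 items, and withhold it at 9? Either it does, or it does not.
assert order_total(2.0, 10) == 2.0 * 10 * 0.95  # discount applied
assert order_total(2.0, 9) == 2.0 * 9           # no discount below 10
```

Had the clause instead said "large orders receive an appropriate discount", there would be nothing concrete to assert.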

So when you are analysing the Specifications to produce Testable Conditions, you are looking for everything that can be resolved to a black and white scenario.

To the extent that you are able to do this, the Specification is clear and specific.

Conversely, to the extent that you are not able to do this, the Specification is unclear and not specific.

There are different ways of determining whether you have extracted all the conditions. One is to number everything. Another is to use coloured highlighting pens.

Wherever there are gaps (numerical or uncoloured), you will need to get clarification from the BA.

This is a matter of going through the document as many times as it takes to make everything in it testable.

By the time you have finished, you will have a set of Testable Conditions - you now have to fit these Testable Conditions into your Test Cases.
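The numbering approach above can be sketched as a simple coverage check: number every condition, map each Test Case to the condition numbers it exercises, and look for gaps. The condition texts and case names below are hypothetical placeholders.

```python
# Hypothetical sketch: numbered Testable Conditions grouped into Test
# Cases, with a cross-check for conditions no case covers.

conditions = {
    1: "Login page rejects an empty password",
    2: "Login page rejects an unknown username",
    3: "Successful login redirects to the dashboard",
}

test_cases = {
    "TC-LOGIN-01 invalid credentials": [1, 2],
    "TC-LOGIN-02 valid credentials": [3],
}

# Every condition should appear in at least one Test Case; anything
# left over is a gap to raise with the BA.
covered = {c for ids in test_cases.values() for c in ids}
uncovered = set(conditions) - covered
print("Uncovered conditions:", sorted(uncovered))
```

Run with the sample data, the uncovered list is empty; delete condition 3 from its Test Case and the check flags it as a gap.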

How To Analyse Specifications - Part 1

Specifications are the foundation of a project. They tell Developers what to develop, and they tell Testers what to test. If the Specifications are of low quality, then the development work and the testing work will be of low quality - unless the Specifications are improved.

It should be no surprise, then, that the very first job of a Tester is to test the Specifications. This is often called Static Testing. Some Testers have a tendency to leap into creating test scripts, but this can (and often does) become a waste of time. So testing the Specifications against themselves is the Tester's first priority.

But what do you test them against? What are the criteria that determine whether the Specifications are doing their job correctly?

The answer is in the word "specifications" itself, and runs from the first letter to the eighth - it is the word "specific". This is the first test of the Specifications - they should be specific.

They should not be vague - a Business Analyst once told me that he had "deliberately left the specifications vague to cover a range of possibilities". I was unimpressed.

Specifications should not use words that fail to describe specifically what should happen.

Words and phrases such as "handle appropriately", "correct", and "as expected" are very often used in Specifications. I call them "fluffy" words or phrases, because they describe approximately what should happen, but they do not clearly describe what should happen. They are not specific.

The First Test

So this is the first test of the Specifications.

How specific are they? Do they use fluffy words? Do they use fluffy phrases?

These questions need to be raised by the Tester, and clarification sought wherever the Specification fails the test, e.g. "What do you mean when you say that the data output should be handled appropriately? How exactly should it be handled?"
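Here is a sketch of why that clarification matters, using a hypothetical example: "invalid input should be handled appropriately" gives the Tester nothing to assert, but once the BA clarifies it to "invalid ages raise an error with the message 'age must be between 0 and 130'", the condition becomes black and white. The function and rule are invented for illustration.

```python
# Hypothetical clarified rule: "an age outside 0-130 raises ValueError
# with the message 'age must be between 0 and 130'". Unlike "handled
# appropriately", this wording can be tested directly.

def parse_age(text):
    """Parse an age string, enforcing the clarified Specification rule."""
    age = int(text)
    if not 0 <= age <= 130:
        raise ValueError("age must be between 0 and 130")
    return age

# Black and white checks derived from the clarified wording:
assert parse_age("42") == 42

raised = False
try:
    parse_age("200")
except ValueError as e:
    raised = True
    assert str(e) == "age must be between 0 and 130"
assert raised  # the specified error really was raised
```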

The Second Test

The second test of the Specifications is accuracy.

Now evidently, the Tester is not, at least in the beginning, in a position to say that the Specifications are inaccurate. They are based on discussions between the Business Analyst and one or more people in the business, whom we shall refer to as the Product Owner(s). Since the Tester was not part of those discussions, they are not in a position to say whether or not the Specifications accurately reflect what was said.

But as a Tester, you need to look out for functionality being described differently in different parts of the Specifications - you may even find contradictions within the Specifications!

This can often involve getting out pen and paper and rearranging various points of the Specifications, either in written form or in a table. In the process of this rearrangement, you will see new patterns emerging, which will highlight any differences to you.
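That rearrangement can also be sketched programmatically: gather every statement about each piece of functionality, then flag any functionality described in more than one place as a candidate for contradiction. The section numbers and statements below are hypothetical.

```python
# Hypothetical sketch: group Specification statements by the feature
# they describe, then flag features with multiple descriptions as
# candidates for contradictions to raise with the BA.

statements = [
    ("session timeout", "Section 2.1: sessions expire after 15 minutes"),
    ("session timeout", "Section 7.4: sessions expire after 30 minutes"),
    ("password rules", "Section 3.2: passwords must be 8+ characters"),
]

by_feature = {}
for feature, text in statements:
    by_feature.setdefault(feature, []).append(text)

for feature, texts in by_feature.items():
    if len(texts) > 1:  # described in more than one place
        print(f"Check '{feature}' for contradictions:")
        for t in texts:
            print("  -", t)
```

With the sample data, the two conflicting session-timeout clauses are flagged together, which is exactly the "new pattern" the pen-and-paper rearrangement is meant to surface.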

This is testing the documentation, or Static Testing.

The Bedrock of Testing

The purpose of Software Testing is to find defects and faults. We normally call these "issues" - it's a less negative word.

With the information provided by the Tester, these issues can be rectified, or determined to be acceptable.

It is one of the axioms of Software Testing that the earlier an issue is found, the cheaper it is to fix it.

Supposing for example you are working on testing the software that will go into a hardware peripheral device.

And then imagine, as an example, a defect being found in the initial design of this software. By initial, I mean before any code has even been written - the design idea is still on paper. Found at this stage, it is a simple matter of rectifying the design defect. The cost is negligible.

But suppose that same defect is not caught at the design stage. It gets through into development and is coded into the software. It passes through all the hurdles of software testing (we'll discuss how a defect gets past the testing stage in another article), finally makes its way into the finished product ready for shipping, and is ultimately found by a customer. The cost to fix it can now be enormous. We're talking about hardware here, and a defect could mean a product recall, which means refunding all the customers or replacing the product, and re-designing the software to put into the next batch. We haven't even begun to calculate the cost in lost sales while hardware production is halted for the software defect to be fixed. And then there is the cost to the reputation of the producer.

The cheapest point at which to find it is in the initial design, and the most expensive is in the hands of the customer. This defect could have been found anywhere along the path between initial design and the customer. But bear in mind that at each stage it is cheaper to find and fix the defect than at the next stage, and more expensive than at the previous one.

The cost of fixing a defect gets larger the further into the development process it is found.
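The shape of that cost curve can be illustrated with a few made-up multipliers. To be clear, the figures below are hypothetical, chosen only to show costs growing stage by stage; they are not measured industry data.

```python
# Illustrative only: hypothetical relative costs to fix one defect,
# depending on the stage at which it is found. Each stage is assumed
# to be more expensive than the one before it.

stages = ["design", "coding", "testing", "release", "in the field"]
relative_cost = [1, 5, 10, 50, 500]  # assumed, relative to design stage

for stage, cost in zip(stages, relative_cost):
    print(f"Defect found at {stage:12s}: {cost:4d}x the design-stage cost")
```

Whatever the true multipliers are for a given project, the point of the article stands as long as the sequence is strictly increasing.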

If you are working on a server or web-based application, then at least the hardware recall cost is avoided. But it is still more expensive to fix a defect found after release than one found earlier in the development life cycle.

The Software Tester's Mindset

Are you tired of being criticised as a Software Tester? How often do you hear the words "only a tester"? It is often developers (though not always) who say it, and unfortunately, testers themselves don't often help the matter.

Developers see themselves as having a deep, in-depth knowledge of the system that they are developing. This is natural, because they get to know the insides of the process that they are working on.

But testers (especially if they are testing in a black box scenario) often seem to have only a sketchy knowledge of the same system. Why should this be? Software Testers need to know more of the system than developers do, and they need to know it in depth. They need to know its weaknesses and its strengths.

But they often don't.

Now go back to what I said about Developers, and note carefully that I said "they get to know the insides of the process that they are working on" - the important word in this sentence is "process".

The Developer does not know the insides of the system that they are working on - they know the insides of the process that they are working on.

But the Tester who wants to be a good - no, a brilliant - tester will make sure that they know the system inside out. That means that when a Developer tells you that what you just found isn't a defect, you will know whether it really is one.

But to be that brilliant tester who knows the trade inside out, you need to start from the basics. You need to know a range of software testing techniques.

Whether you are a newbie, or you have been around for a few years, there are lessons that you can learn about the art, the craft, of software testing.

The Software Tester's Mindset means that there are always new things to find out. So let's get started ...