Safety language and the preservation of uncertainty

On the third day of CAST2011, Jeff (another tester) and I played the hidden picture exercise with James Bach. We were to uncover a hidden picture, one pixel at a time, revealing as few pixels as possible in a short amount of time. This forced us to think about the balance between coverage and gathering enough valuable information to stop. I won’t tell you our approach, but eventually we felt comfortable stating our conclusion. James challenged us to tell him with absolute certainty what the hidden picture was. I responded with, “It appears to be a picture of…”, which to my delight was followed by praise from the master. He remarked on my usage of safety language.

Two days earlier, Michael Bolton’s heady CAST2011 keynote kept me struggling to keep up. He discussed the studies of scientists and thinkers and related their findings to software testing. To introduce the conference theme, he concluded that what we call a fact is actually context-dependent.

Since CAST2011 focused on the Context-Driven testing school, we heard a lot about testing schools (or ways of thinking about testing). For example, the Factory testing school believes tests should be scripted and that repeatability is important. Some don’t like the label “Factory”, but James Bach pulled a red card and argued that “Factory” can be a good label under the right circumstances (e.g., manufacturing). I never really understood why I should care about testing schools until Bolton (and Bach) explained it this way…

Schools allow us to say “that person is of a different school” rather than “that person is a fool”.

I’ll try to paraphrase a few of Michael’s ideas:

- The world is a complex, messy, variable place. Testers should accept reality and know that some ambiguity and uncertainty will always exist. A tester’s job is to reduce damaging uncertainty. Testers can at least provide partial answers that may be useful.
- If quality is value to some person, then who should test the quality for various people? This is why it’s important for testers to learn to observe people and determine what is important to them. Professor of Software Engineering, Cem Kaner, calls testing a social science for this reason.
- Cultural anthropologist Wade Davis believes people thrive by learning to read the world beyond them.

Per Michael, if the above points are true, testers should use safety language. I really liked this part of the lesson. Instead of saying “it fails under these circumstances”, a tester should say “it appears to fail” or “it might fail under these circumstances”. Instead of “the root cause is…”, a tester should say “a root cause is…”. When dealing with an argument, say “I disagree” instead of “you’re wrong”, and end by saying “you may be right”. This type of safety language helps to preserve uncertainty, and I agree that testers should use it wherever possible.


