The White Glove Heuristic and The “Unless…” Heuristic

Part of the Rapid Software Testing philosophy involves reducing waste wherever possible. For many organizations, documentation is an area where we might want to cut the clutter. It’s not that documentation is valueless, but every minute that we spend on documentation is a minute that we can’t spend on any other activity. Thus the value of the documentation has to be compared not only to its own cost, but to its opportunity cost. More documentation means less testing; that might be okay, and even important, but it might not.

This leads to the White Glove Heuristic: if we have documentation somewhere in our process, such that running a white-gloved finger over it would cause the glove to pick up a bunch of dust, let’s at least consider applying less work to that document, or eliminating it altogether.

In the RST class, there’s often push-back to this idea. That’s understandable; at one point, someone started producing the document in an attempt to solve some problem or mitigate some risk. The question then becomes, “Has the situation changed such that we no longer need that document?”–and the problem I see most often is that the question is begged.

On a recent trip to India, many of the participants in the class pushed back on the very idea of reducing documentation in any way, claiming “our project managers would never accept that.”

I was curious. “Have you asked them?” The answer was, as I suspected, No. “So suppose you’re producing a forty-page test report for a busy executive. What if that executive only ever reads the summary on the first page? Might she approve of a shorter document? If she had important questions about things in that document, could you answer those questions at a lower cost than preparing the big document?” Maybe, came the answer. “So: your project managers would never accept changes to your test documentation, unless they’re not reading the whole thing anyway. Or they’d never accept changes unless they were aware of the amount of testing time lost to preparing the document. Or they’d never accept changes unless they had the confidence that you could give them the information they needed on demand.” The class participants then began to recognize that a session-based test management approach might allow them to make their testing sufficiently accountable while satisfying the executives with more lightweight summary reports.

Later in the class, we were talking about oracles, and how slippery they can be. Oracles are heuristic; that means that they often work, but they can fail, and that we learn something either way. The class presents a list of consistency oracles (the list is now a little longer than in the linked article); for example, a product should behave in a manner consistent with its history – unless there’s a compelling reason for it to be otherwise, like a feature enhancement or a bug fix.

This led me to formulate The “Unless…” Heuristic: Take whatever statement you care to make about your product, your process, or your model, and append “unless…” to it. Then see where the rest of the sentence takes you.

3 replies to “The White Glove Heuristic and The “Unless…” Heuristic”

  1. Michael,

    I have been struggling with this issue. I believe that I am a decent tester without following a step-by-step script, as long as I am learning new things and testing different things. I hate documentation, so I might be biased. But what I have come to realize is that when I leave this project, the next tester may have a large learning curve, and maybe the quality of testing might go down (or that’s wishful thinking, since I won’t be the tester). So I struggle with what amount of documented test scripts should be left behind to help the new guy. Should it be step by step? How about just an outline of a script? Or just hand him the use cases and manuals and say have fun!! Any thoughts?

  2. Hey, Eric…

    Thanks for replying. I think a lot of people struggle with the issue, and they do it in different ways and in different contexts, so there are lots of possible answers.

    If I know who my successor is going to be, that will change my answer. Then my ideal is not to leave a document behind that tells her what to do; I can teach her, answer questions, have a conversation, assess her capabilities, and try to anticipate and shore up weak spots. Documents might be a part of that if they’re the most efficient way for me to impart the information. Sometimes they might be, and sometimes not.

    Here’s the primary thing I’m going to try to help my successor with: I want to help them ask and answer the question “Is there a problem here?”. James has a very nice session that he recorded with Mike Kelly on this subject. So my goal is to give them an idea of what the program is supposed to do (but there are typically several ways for my successor to learn that already; after all, I learned at some point); I want to leave them with a list of risks that we have developed (so far) about the product and the systems with which it works; I want to leave them with some pointers to useful oracles (principles or mechanisms by which we recognize a problem); and I want to help impart a quick notion of the coverage areas in the product. If, for most of the testing effort, I leave them with heuristics, rather than algorithms, I’ll have more confidence that they can handle themselves when the program and system (and therefore the algorithms) change. I want to teach them to fish, not give them a fish.

    I would suggest that scripts would very rarely be helpful–unless those scripts were to be run by a machine, but I don’t think that’s what you’re talking about.

    A step-by-step set of instructions is not likely to empower a new tester. That set of instructions is unlikely to explain my knowledge or my thought process–and any documented explanation of my thought process would be incomplete no matter how much time I spent on writing it down. On the other hand, if the other tester can grasp my thought process, they won’t need a script; I don’t use one. I don’t want to leave them a description of a fish, or of my fishing trips, unless those trips were really interesting and taught me something about fishing. Or about fish.

    Outlines might be better. I do use ideas about testing, and I might write these down if there’s no faster or more reliable way to impart them to the new tester.

    Another way to approach the problem is to time-box it. At the end of each day, spend five to fifteen minutes journalling with an eye to what a new tester might find interesting, useful, or important about your day. If documents or code examples or screen shots or diagrams exist, refer to them, but don’t take more than your time limit on the exercise.

    Instead of training the newbie that you don’t get a chance to meet, train someone else on your team in what you do. Have them train you, too.

    Ask yourself why you hate documentation. I suspect that you like documentation when it’s useful, clear, better than talking to someone, available, reliable, credible, done by people whom you trust; and I’ll bet you don’t like it so much when one or more of those aspects is absent, such that the absence slows you down. But really complete documentation can slow you down, too; it can be badly organized, self-contradictory, full of unimportant detail, full of distracting detail–it can lead you to observe some problems while missing others. Don’t stop at “I hate documentation.” Identify why you think you hate it, why others might like it (or only seem to), and then imagine how you can do less of the things that people don’t value so much while still keeping everyone happy.

    I’ll always try to end, sincerely, with “Have fun!”

  3. Could not find the link to the consistency oracles; it would have been good to see it 🙁

    Michael replies: Thanks for the note. Fixed.
