Several years ago, I wrote an article for Better Software Magazine called Testing Without a Map. The article was about identifying and applying oracles, and it listed several dimensions of consistency by which we might find or describe problems in the product. The original list came from James Bach.
Testers often say that they recognize a problem when the product doesn’t “meet expectations”. But that seems empty to me; a tautology. Testers can be a lot more credible when they can describe where their expectations come from. Perhaps surprisingly, many testers struggle with this, so let’s work through it.
Expectations about a product revolve around desirable consistencies between related things.
- History. We expect the present version of the system to be consistent with past versions of it.
- Image. We expect the system to be consistent with an image that the organization wants to project, with its brand, or with its reputation.
- Comparable Products. We expect the system to be consistent with systems that are in some way comparable. This includes other products in the same product line; competitive products, services, or systems; or products that are not in the same category but which process the same data; or alternative processes or algorithms.
- Claims. We expect the system to be consistent with things important people say about it, whether in writing (reference specifications, design documents, manuals, whiteboard sketches…) or in conversation (meetings, public announcements, lunchroom conversations…).
- Users’ Desires. We believe that the system should be consistent with ideas about what reasonable users might want. (Update, 2014-12-05: We used to call this “user expectations”, but those expectations are typically based on the other oracles listed here, or on quality criteria that are rooted in desires; so, “user desires” it is. More on that here.)
- Product. We expect each element of the system (or product) to be consistent with comparable elements in the same system.
- Purpose. We expect the system to be consistent with the explicit and implicit uses to which people might put it.
- Statutes. We expect a system to be consistent with laws or regulations that are relevant to the product or its use.
I noted that, in general, we recognize a problem when we observe that the product or system is inconsistent with one or more of these principles; we expect consistency from the product, and when we don’t get it, we have reason to suspect a problem.
(If I were writing that article today, I would change expect to desire, for reasons outlined here.)
“In general” is important. Each of these principles is heuristic. Oracle principles are, like all heuristics, fallible and context-dependent; to be applied, not followed. An inconsistency with one of the principles above doesn’t guarantee that there’s a problem; people make the determination of “problem” or “no problem” by applying a variety of oracle principles and notions of value. Our oracles can also mislead us, causing us to see a problem that isn’t there, or to miss a problem that is there.
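To make the idea concrete in code (a sketch of mine, not something from the original article): the History oracle often shows up in practice as a golden-master or approval check, in which we compare the current version’s output against saved output from a previous version. The golden file path and generate_report() below are hypothetical stand-ins for whatever output your product actually produces:

```python
# A minimal golden-master check illustrating the History oracle.
# The golden file path and generate_report() are hypothetical
# stand-ins for whatever output your product actually produces.
from pathlib import Path

def generate_report() -> str:
    """Stand-in for the product behaviour under test."""
    return "total: 42\nstatus: ok\n"

def check_against_history(golden_path: str = "golden/report.txt") -> None:
    golden = Path(golden_path)
    current = generate_report()
    if not golden.exists():
        # First run: record the current behaviour as the baseline.
        golden.parent.mkdir(parents=True, exist_ok=True)
        golden.write_text(current)
        print("Baseline recorded; no history to compare against yet.")
        return
    if current != golden.read_text():
        # Inconsistency with history is a reason to suspect a problem,
        # not proof of one; a human decides whether the change is a
        # bug or an intended improvement.
        print("Inconsistent with the previous version; investigate.")
    else:
        print("Consistent with history.")

if __name__ == "__main__":
    check_against_history()
```

Notice that the check only flags an inconsistency; deciding “problem” or “no problem” remains a human judgment, for exactly the reasons above.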
Since an oracle is a way of recognizing a problem, it’s a wonderful thing to be able to keep a list like this in your head, so that you’re primed to recognize problems. Part of the reason that people have found the article helpful, perhaps, is that the list is memorable: the initial letters of the principles form the word HICCUPPS. History, Image, Claims, Comparable products, User expectations (since then, changed to “user desires”), Product, Purpose, and Statutes.
With a little memorization, practice, and repetition, you can rattle off the list, keep it in your head, and consult it at a moment’s notice. You can use the list to anticipate problems or to frame problems that you perceive.
Another reason to internalize the list is to be able to move quickly from a feeling of a problem to an explicit recognition and description of a problem. You can improve a vague problem report by referring to a specific oracle principle. A tester’s report is more credible when decision-makers (program managers, programmers) can understand clearly why the tester believes an observation points to a problem.
I’ve been delighted with the degree to which the article has been cited, and even happier when people tell me that it’s helped them. However, it’s been a long time since the article was published, and since then, James Bach and I have observed testers using other oracle principles, both to anticipate problems and to describe the problems they’ve found. To my knowledge, this is the first time since 2005 that either one of us has published a consolidated list of our oracle principles outside of our classes, conference presentations, or informal conversations. Our catalog of oracle principles now includes:
- Statutes and Standards. We expect a system to be consistent with relevant statutes, acts, laws, regulations, or standards. Statutes, laws and regulations are mandated mostly by outside authority (though there is a meaning of “statute” that refers to acts of corporations or their founders). Standards might be mandated or voluntary, explicit or implicit, external to the development group or internal to it.
What’s the difference between Standards and Statutes versus Claims? Claims come from inside the project. For Standards and Statutes, the mandate comes from outside the project. When a development group consciously chooses to adhere to a given standard, or when a law or regulation is cited in a requirements document, there’s a claim that would allow us to recognize a problem. We added Standards when we realized that sometimes a tester recognizes a potential problem for which no explicit claim has yet been made.
While testing, a tester familiar with a relevant standard may notice that the product doesn’t conform to published UI conventions, to a particular RFC, or to an informal, internal coding standard that is not controlled by the project itself.
Would any of these things constitute a problem? Each would at least be an issue until those responsible for the product declare whether to follow the standard, to violate some points in it, or to reject it entirely.
A tester familiar with the protocols of an FDA audit might recognize gaps in the evidence that the auditor desires. Similarly, a tester familiar with requirements in the Americans With Disabilities Act might recognize accessibility problems that other testers might miss. Moreover, an expert tester might use her knowledge of the standard to identify extra cost associated with misunderstanding of the standard, excessive documentation, or unnecessary conformance.
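As a small, entirely hypothetical sketch of the Standards heuristic in code: suppose a tester knows that timestamps in this domain are conventionally expressed in the RFC 3339 date-time form. She can probe for conformance even before anyone on the project has made an explicit claim about it:

```python
# Hypothetical example: probing timestamps for conformance to the
# RFC 3339 date-time format, even though no document for this product
# explicitly claims conformance.
import re

RFC3339 = re.compile(
    r"^\d{4}-\d{2}-\d{2}"          # full date
    r"T\d{2}:\d{2}:\d{2}"          # time
    r"(\.\d+)?"                    # optional fractional seconds
    r"(Z|[+-]\d{2}:\d{2})$"        # UTC or a numeric offset
)

observed = [
    "2012-07-23T09:15:00Z",        # conforms
    "23/07/2012 9:15 AM",          # inconsistent with the standard
]

for ts in observed:
    verdict = "conforms" if RFC3339.match(ts) else "possible problem"
    print(f"{ts!r}: {verdict}")
```

Whether a mismatch is a bug, a deliberate deviation, or a reason to reject the standard is, again, for the project to settle.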
- Explainability. We expect a system to be understandable to the degree that we can articulately explain its behaviour to ourselves and others. If, as testers, we don’t understand a system well enough to describe it, or if it exhibits behaviour that we can’t explain, then we have reason to suspect that there might be a problem of one kind or another. On the one hand, there might be a problem in the product that threatens its value. On the other hand, we might not know the product well enough to test it capably. This is, arguably, a bigger problem than the first. Our misunderstanding might waste time by prompting us to report non-problems. Worse, our misunderstandings might prevent us from recognizing a genuine problem when it’s in front of us.
Aleksander Simic, in a private message, suggests that the explainability heuristic extends to more members of the team than testers. If a programmer can’t explain code that she must maintain (or worse, has written), or if a development team has started with something ill-defined and confusion is spreading slowly through the product, then we have reason to suspect, investigate, or report a problem. I agree with Aleksander. Any kind of confusion in the product is an issue, and issues are petri dishes for bugs.
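One way to exercise explainability in code (my extrapolation, not part of the original article) is model-based: write down your explanation of the system as a tiny model, and check whether the product agrees with it. Where the two diverge, either the product or your understanding has a problem, and both are worth knowing about. Everything below, including the discount rule and the seeded boundary disagreement, is hypothetical:

```python
# A tiny model-based check illustrating the Explainability heuristic.
# product_discount() stands in for the real product; model_discount()
# encodes our explanation of how we believe the product behaves.
def product_discount(order_total: float) -> float:
    """Stand-in for the product (note the boundary: strictly > 100)."""
    return order_total * (0.9 if order_total > 100 else 1.0)

def model_discount(order_total: float) -> float:
    """Our explanation: 10% off orders of $100 or more (>= 100)."""
    return order_total * (0.9 if order_total >= 100 else 1.0)

for total in (0, 50, 99.99, 100, 250):
    got, expected = product_discount(total), model_discount(total)
    if got != expected:
        # Either the product is wrong or our explanation is; either
        # way, we've found behaviour we can't yet explain.
        print(f"Unexplained behaviour at {total}: {got} vs {expected}")
```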
- World. We expect the product to be consistent with things that we know about or can observe in the world. Often this kind of inconsistency leads us to recognize that the product is inconsistent with its purpose or with an expectation that we might have had, based on our models and schemas. When we’re testing, we’re not able to realize and articulate all of our expectations in advance of an observation. Sometimes we notice an inconsistency with our knowledge of the world before we apply some other principle. This heuristic can fail when our knowledge of the world is wrong; when we’re misinformed or mis-remembering. It can also fail when the product reveals something that we hadn’t previously known about the world.
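A hypothetical sketch of the World heuristic in code: facts we know about the world (ages have plausible limits, a discount can’t exceed 100%, an order can’t ship before it was placed) can be expressed as sanity checks on the product’s output. The record layout here is invented for illustration:

```python
# Hypothetical world-knowledge checks on an invented output record.
from datetime import date

record = {
    "customer_age": 37,
    "discount_percent": 112,        # suspicious!
    "ordered": date(2012, 7, 23),
    "shipped": date(2012, 7, 20),   # before it was ordered?
}

problems = []
if not 0 <= record["customer_age"] <= 130:
    problems.append("age outside any plausible human lifespan")
if not 0 <= record["discount_percent"] <= 100:
    problems.append("discount above 100%; we'd be paying the customer")
if record["shipped"] < record["ordered"]:
    problems.append("order shipped before it was placed")

for p in problems:
    print("Inconsistent with the world:", p)
```

These checks inherit our fallibility, as noted above: a “shipped before ordered” record might reflect a back-order process we didn’t know about.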
There is one more heuristic that testers commonly apply as they’re seeking problems, especially in an unfamiliar product. Unlike the preceding ones, this one is an inconsistency heuristic:
- Familiarity. We expect the system to be inconsistent with patterns of familiar problems. When we watch testers, we notice that they often start testing a product by seeking problems that they’ve seen before. This gives them some immediate traction; as they start to look for familiar kinds of bugs, they explore and interact with the product, and in doing so, they learn about it. Starting to test by focusing on familiar problems is quick and powerful, but it can mislead us. Problems that are significant in one product (for example, polish in the look of the user interface in a commercial product) may be less significant in another context (say, an application developed for a company’s internal users). A product developed in one context (for example, one in which programmers perform lots of unit testing) might have avoided problems familiar to us from other contexts (for example, ones in which programmers are less diligent).
Focusing on familiar problems might divert our attention away from other consistency principles that are more relevant to the task at hand. Perhaps most importantly, a premature search for bugs might distract us from a crucial task in the early stages of testing: a search for benefits and features that will help us to develop better ideas about value, risk, and coverage, and will inform deeper and more thoughtful testing. Note that any pattern of familiar problems must eventually reduce to one of the consistency heuristics; if it was a problem before, it was because the system was inconsistent with some oracle principle.
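In code, the Familiarity heuristic often looks like a grab-bag of quick probes drawn from bugs we’ve met before: empty input, huge input, unusual characters, boundary values. A sketch, with parse_name() as a hypothetical function under test:

```python
# Quick probes based on familiar problem patterns. parse_name() is a
# hypothetical function under test; the inputs are classics that have
# exposed bugs in many other products.
def parse_name(raw: str) -> str:
    """Stand-in for the product: trims and title-cases a name."""
    return raw.strip().title()

familiar_probes = [
    "",                                   # empty input
    " " * 10_000,                         # huge, whitespace-only input
    "O'Brien",                            # apostrophes trip up naive handling
    "José\u200bNúñez",                    # accents and a zero-width space
    "Robert'); DROP TABLE students;--",   # the classic injection string
]

for probe in familiar_probes:
    try:
        print(repr(parse_name(probe)))
    except Exception as exc:  # a crash on familiar input is a quick win
        print(f"Probe {probe!r} raised {exc!r}")
```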
Standards was the first of the new heuristics that we noticed; then Familiarity. The latter threatened our mnemonic! For a while, I folded Standards in with Statutes, suggesting that people memorize HICCUPPS(F), with that inconsistent F coming at the end. But since we’ve added Explainability and World, we can now put F at the beginning, emphasizing the reality that testers often start looking for problems by looking for familiar problems. So, the new mnemonic: (F)EW HICCUPPS. When we’re testing, actively seeking problems in a product, it’s because we desire… FEW HICCUPPS.
This isn’t an exhaustive list. Even if we were silly enough to think that we had an exhaustive list of consistency principles, we wouldn’t be able to prove it exhaustive. For that reason, we encourage testers to develop their own models of testing, including the models of consistency that inform our oracles.
This article was first published 2012-07-23. I made a few minor edits on 2016-12-18, and a few more on 2017-01-26.
The solution to the “familiarity” issue is to hire “fresh” testers: testers who haven’t tested a similar system before.
Michael replies: You seem to be suggesting that seeking patterns of familiar problems is a Bad Thing. That isn’t necessarily so; in fact, the “familiar problems” heuristic can be very powerful. The Bad Thing comes when the familiar problems heuristic turns from a medium into a bias that overwhelms other approaches.
I remember that for a certain computer game (I can’t remember the name), the test team consisted of office managers who only played solitaire. (They also addressed usability issues.)
And how did that work out?
It’s good to have some people test with limited preconceptions (consider the “Fresh Eyes Find Failure” lesson from Lessons Learned in Software Testing). It’s also pragmatic to spend some of your testing time seeking problems you’ve seen before.
Thanks MB.
Wondering, do you have a real-world example of the use of the World Heuristic? I’ve been struggling with this one since JB mentioned it to us in our RST class. When thinking about it I always come back to Comparable Products first.
I recall the very brief example being a door. If it doesn’t open we could recognise a potential problem due to our ‘world’ knowledge of doors. But once again I find myself at Comparable Products first.
Cheers,
DG
Thanks for an updated list; I used HICCUPPS last weekend at the RTI. Will this updated list make it into the latest RST slides?
(BTW your link to your article Testing Without a Map is broken.)