The following passage is drawn from Robert Pirsig’s Zen and the Art of Motorcycle Maintenance. It is a wonderful book, sort of to design and systems thinking what The Goal is to process and stochastics (making it accessible in lay manager terms). At any rate, the title of this post indicates the discussion I’d like to prompt, about assumptions.

I love the entirety of the following passage, but have trimmed it down a bit to give you enough of the flavor to get the point:

… I remember Chris and I were on a trip to Canada a few years ago, got about 130 miles and were caught in a warm front of which we had plenty of warning but which we didn’t understand. The whole experience was kind of dumb and sad.

We were on a little six-and-one-half-horsepower cycle, way overloaded with luggage and way underloaded with common sense. [...] By ten o’clock the sky was so dark all the cars had their headlights on. And then it really came down.

We were wearing the ponchos which had served as a tent the night before. Now they spread out like sails and slowed our speed to thirty miles an hour wide open. The water on the road became two inches deep. [...]

The cycle slowed down to twenty-five, then twenty. Then it started missing, coughing and popping and sputtering until, barely moving at five or six miles an hour, we found an old run-down filling station by some cutover timberland and pulled in.

At the time, like John, I hadn’t bothered to learn much about motorcycle maintenance. I remember holding my poncho over my head to keep the rain from the tank and rocking the cycle between my legs. Gas seemed to be sloshing around inside. I looked at the plugs, and looked at the points, and looked at the carburetor, and pumped the kick starter until I was exhausted.

We went into the filling station, which was also a combination beer joint and restaurant, and had a meal of burned-up steak. Then I went back out and tried it again. Chris kept asking questions that started to anger me because he didn’t see how serious it was. Finally I saw it was no use, gave it up, and my anger at him disappeared. I explained to him as carefully as I could that it was all over. We weren’t going anywhere by cycle on this vacation. Chris suggested things to do like check the gas, which I had done, and find a mechanic. But there weren’t any mechanics. Just cutover pine trees and brush and rain.

I sat in the grass with him at the shoulder of the road, defeated, staring into the trees and underbrush. I answered all of Chris’s questions patiently and in time they became fewer and fewer. And then Chris finally understood that our cycle trip was really over and began to cry. He was eight then, I think.

We hitchhiked back to our own city and rented a trailer and put it on our car and came up and got the cycle, and hauled it back to our own city and then started out all over again by car. But it wasn’t the same. And we didn’t really enjoy ourselves much.

Two weeks after the vacation was over, one evening after work, I removed the carburetor to see what was wrong but still couldn’t find anything. To clean off the grease before replacing it, I turned the stopcock on the tank for a little gas. Nothing came out. The tank was out of gas. I couldn’t believe it. I can still hardly believe it.

I have kicked myself mentally a hundred times for that stupidity and don’t think I’ll ever really, finally get over it. Evidently what I saw sloshing around was gas in the reserve tank which I had never turned on. I didn’t check it carefully because I assumed the rain had caused the engine failure. I didn’t understand then how foolish quick assumptions like that are. Now we are on a twenty-eight-horse machine and I take the maintenance of it very seriously.

Pirsig, Robert M. (2009-04-10). Zen and the Art of Motorcycle Maintenance (pp. 20-21). HarperTorch. Kindle Edition.

Discussion: What do we learn (in particular about assumptions) from this story? And what other stories help illustrate how assumptions shape what we do, including our system design decisions and outcomes?

10 thoughts on “Assumptions”

  1. It’s not what we don’t know, it’s what we don’t know we don’t know.

    I’ve always thought it odd when companies look for IT staff with experience in the company’s domain. I’ve seen more problems coming from assumptions based on outdated experience than not. I’ve always felt that it’s better to be deliberately ignorant and ask good questions than to go in thinking you know what someone else is talking about.

  2. Assumptions are a bit of a two edged sword.
    I had a manager once who, if one said “I assume you mean…” would retort “never assume”. Made a deep impression on me. But actually there’s nothing wrong with assumptions, as long as one documents them as such.

    Quite often, in my experience, it’s very difficult to get someone to explain to you exactly what they mean – especially if they think it’s obvious, which is an assumption in and of itself. In such a case it can be useful to write down a set of “assumptions” and ask the other person to confirm or correct those. It’s more positive than just asking “what do you mean?” and better than saying “so what you mean is”, because the latter is telling someone what they mean (which they don’t always respond well to – I don’t), whereas the former is saying what you think you’ve understood, which makes clear that you know you might be wrong.

    In particular I have found it useful when trying to tie down the scope of something or simply to clarify that what you think you have understood to be required is actually what the other person wants.
    As architects we do this all the time – no one more, I think, than a building architect, who needs to translate a purely functional requirement from a lay person (someone like me) into something that can be built but which the layman can’t properly describe.

    So getting back to the story, perhaps if the narrator had written down all his assumptions explicitly, he might have been able to discover which ones needed to be validated.

  3. The first thing that crossed my mind while I was reading this story was the phrase “polder blindness”. My English is not that good, so I borrow the following explanation of this phrase from a “Paper presented to the Joint Meeting of the ECMT’s Road Safety Committee and Committee for Road Traffic, Signs and Signals, The Hague, The Netherlands, 15 March 1989”:

    “A typical Dutch type of accident may illustrate this. The type of accident is called “polder blindness”. Two cars driving in daylight on two intersecting roads in our new polders collide not infrequently at an intersection. These roads have low traffic volumes and are straight, easy or even boring to drive and comfortable, while intersections are very conveniently arranged without visual obstructions. No doubt the crossing cars are very well visible from a long distance in the wide horizon of the flat landscape. Still they collide.”

    This is a link to the report:

    After reading this report I now see that assumptions are “the result of the selection of elements for foreseeing”.

  4. The “Who’s on first” skit (the updated version gets to it more quickly) is a comedic amplification of the trouble we get into when we don’t surface and validate our assumptions. ;-)

    The story I quoted from Pirsig, and the observation about Ford that follows, demonstrate how assumptions become determining:

    “Henry Ford … spent roughly 7 years developing mass production and then another 10 perfecting it. In the process, Ford built such a tightly-coupled factory that, when GM and others demonstrated the market for variability in car makes and models, Ford could not respond. Changing the design of even a single bumper created ripples all up and down the line. 5 years later, to finally change, all Ford manufacturing operations were shut down for six months—laying off 75,000 men—while Ford engineers worked on a new production line. The Ford Motor Company never regained its dominance in the market.” —
    Andrew Hargadon, Creativity versus Efficiency part 2, September 2007

    Assumptions about what the market wanted were cemented into the production line, which optimized for efficiency over variability.

  5. I’m not sure whether Ford’s problem was really driven by assumptions but it is certainly a classic example to explain what requisite variety is about.
    I doubt that Ford simply assumed the market would not change. He may have underestimated the rate at which it would change. My feeling is that the model proved to be less flexible than imagined. It was a lot more difficult to adapt the production line than Ford expected. That could of course also reveal an assumption he made about the model, which was never tested until it became a serious problem.

    So we could draw two (non-mutually exclusive) conclusions. One is the theme of this discussion – that we shouldn’t simply assume anything – or rather that all assumptions should be tested as the negative hypothesis. The other is that assuming that science and determinism are corollaries of each other is unscientific (as I’ve contended elsewhere). Ford thought he had found a deterministic solution to efficient industrial production. If that’s your approach to science, you’re unlikely to admit the possibility of systems that can’t be made simple.

    As an aside and allowing for the possibility that it’s just a semantic distinction, I think all good experimentation starts from an assumption. You need something to test. The error is in the failure to test.

    Who’s on second base? No, What’s on second base. Next.

    • :-) Ford apparently assumed that sameness at lowest cost was not only preferred (by his target market), but going to be preferred for a while, over some choice. Clearly that assumption enabled something important. But also set in place constraints. Now I mentioned Ford to illustrate that the assumptions we make become deeply embedded in the systems we create. They may not be easy to back out of. And they rule out paths we might have taken if we just stepped back, surfaced the assumption, and questioned it. We don’t have a crystal ball, but hidden assumptions tend to close paths we didn’t know we were closing.
      Getting multiple perspectives shared, to surface differences in assumptions, is one approach to take.

      • The fact that choices both open and close paths is what sets my teeth on edge every time I see or hear “YAGNI”. There’s a world of difference between “You don’t need it now” and “You will never need it”. Failing to understand that and failing to take change into account will cost you in the future. Nothing’s more expensive than cleaning up after unwarranted certainty.

  6. I have noted elsewhere that one of the most frequent unspoken assumptions we make is that we all share the same assumptions. It is also probably one of the most dangerous.

    For more on this, see my talk “Why We Can’t Agree on What ‘Enterprise Architecture’ Means, and Why That’s OK, At Least for Now”, from The Open Group Conference Cannes, April 2012.

    (Open Group membership probably required).

    I’m giving a major update of this talk at the BPM & EAC conference in London this coming June.

