Why Johnny Can’t Be Agile


—Andy Hunt

01/01/2011
Published in PragPub Magazine

Ten years ago in February, at Snowbird in Salt Lake City, Utah, I was hanging out with 16 notable folks who cared deeply about software development. We talked about many things, all around the general theme of what to that point had been called “light-weight processes.”

But “light-weight,” while perhaps technically correct, carried connotations of insufficiency. So after some discussion, we all agreed on coining the term agile. And thus the Agile Manifesto and the eponymous movement were born.

But agile techniques were a hard sell. Even simple, non-controversial practices such as version control weren’t yet universally adopted. I used to ask audiences at talks how many people did not use any version control on their project. Typically somewhere between 10% and 30% of the audience raised their hands. It wasn’t that folks were dead set against using version control on religious grounds; they either didn’t know any better or just didn’t bother.

So surely this was just a matter of education, of spreading the word. Agile methods made sense. Agile ideas are grounded in reality, and even in some actual science. Surely the world would see the logic in it? You’d expect some amount of resistance to any new way of developing software—especially one where continuing to Embrace Change was part and parcel of the new method. Folks don’t like change in general, and here we were advocating riding the wave of constant change. But somehow it seemed that resistance to agile ways went deeper than that.

We are not rational or logical creatures. We’d like to think we are, but the biological, psychological truth of the matter is that we’re predictably irrational. How irrational? Take a look at Wikipedia’s list of common cognitive biases for starters. This dauntingly long list begins to describe the many ways that we humans aren’t as logical or reliable as you might think. While you might encounter many of these in your daily life or work, a few stand out as significant barriers to agile adoption. These cognitive biases are why Johnny Can’t Be Agile.

Need for Closure

I think the most significant cognitive bias that affects agile methods in particular is our Need for Closure. In general, people are not comfortable with doubt and uncertainty—so much so that we’ll go to great lengths to resolve open issues, remove uncertainty, and reach closure. In other words, our default wiring says it’s better to make the wrong decision now than to wait and make a better decision later.

That’s partly why classic Big Design Up Front seemed so appealing. It demanded a heavy initial investment in design and architecture, but promised closure. With the big formal design done, there was no doubt and no uncertainty. There would be problems later on, of course, because pesky low-level details or volatile changes often invalidated the initial design. The promised closure and removal of lingering uncertainty never actually materialized, but everyone felt better with the illusion of closure.

Uncertainty is a good thing: it leaves your choices open. Forcing premature closure, as in Big Design Up Front, cuts off your options and leaves you vulnerable to errors and unforeseen changes. Artificially making and declaring a decision, such as the end date of a project, doesn’t remove the inherent uncertainty; it just masks it. So agile requires that we grit our teeth and not only put up with uncertainty and doubt, but hold on to them. Live in them. Revel in uncertainty, and postpone closure for as long as possible. After all, you know the most about a project after it’s delivered, and you know the least when you first start.

Logically, then, you should postpone major decisions until near the end of the project. But you won’t want to, and some folks simply can’t stand it. They’ll demand an answer, a decision, now. The agile way to respond to that is to offer an answer as a current estimate that will be revised periodically.

The end date of a project is a popular bit of closure that lots of folks would really like to know. But we don’t know it; it’s uncertain based on quite a few variable factors, not all of which we control. So we make a guess. An iteration or two go by, and we’ve got more data to work with. So we make a better guess. Time goes on, and we slowly converge on what will ultimately be a mostly-right answer. For someone who is uncomfortable with uncertainty, this might seem like slow torture.
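To make that convergence concrete, here’s a minimal sketch in Python. The backlog size and velocities are made-up numbers, and the model is just a plain average of observed velocity rather than anything prescribed by a particular agile method; it only illustrates how each iteration’s data revises the guess.

    # Sketch: revising a "how much longer?" estimate after each iteration.
    # The backlog size and velocities are hypothetical; the model is a
    # plain average of observed velocity, nothing fancier.

    remaining_points = 200      # story points left in the backlog
    velocities = []             # points actually completed, per iteration

    def iterations_left(remaining, observed):
        """Current best guess: remaining work over average velocity."""
        if not observed:
            return None         # no data yet; the honest answer is "unknown"
        return remaining / (sum(observed) / len(observed))

    # Each iteration yields more data, so each guess is a little better.
    for completed in [18, 25, 22, 24]:
        remaining_points -= completed
        velocities.append(completed)
        guess = iterations_left(remaining_points, velocities)
        print(f"after iteration {len(velocities)}: roughly {guess:.1f} to go")

Early guesses swing around; later ones settle down. That’s the slow convergence toward a mostly-right answer described above, and it’s exactly what the closure-hungry find so uncomfortable.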

The Devil You Know

There’s an old expression, “better the devil you know than the devil you don’t.” A number of cognitive biases support this attitude: we’d rather stick with a familiar situation, even if it’s sheer torture, than face the unknown, which might just be even worse.

This is summed up nicely in the Exposure Effect. We tend to prefer things just because they are familiar. This includes tools, techniques, or methods that aren’t working well anymore or that are even actively causing harm. So we’ll generally stick to a crappy way of working, or a crappy programming language, or a crappy IDE just because it’s familiar.

On a related note, we’re wired to be very much Loss Averse: we’d rather forgo a potential gain than risk losing something we already have. Remember all those certifications that you worked so hard for? Isn’t that language dying out? You might think it’s better to stick with the dying language because of your investment in it.

Hmm, this smacks of another issue, Post-purchase Rationalization: the tendency to persuade oneself through rational argument that a purchase was a good value. That tendency keeps a lot of expensive tool vendors in business. You’ve got a lot of incentive to use that crappy IDE or database because your team spent so much money on it. Even if it’s killing your project.

You’ll even promote those few choice bits of fact to support your decision, whilst conveniently ignoring the vast majority of facts that show you’re in deep trouble. Ah, that would be the Confirmation Bias at work. Everyone has a tendency to find just the facts that fit their own preconceptions and pet theories, and dismiss the rest.

With all of this baggage propping up the status quo, it’s even harder for a newcomer, such as an agile method, to gain acceptance.

The Programmer’s Bias

Okay, I’ll admit it: there actually isn’t a cognitive bias named The Programmer’s Bias, but there should be. First, take the Planning Fallacy: the tendency to underestimate task-completion times. Add to that some Optimism Bias: the tendency to be over-optimistic about the outcome of planned actions. Sound uncomfortably familiar?

Left to our own devices, we’ll underestimate the task at hand and overestimate the quality of the result. Now in agile methods, we’ve got techniques to help counter that, including iterative and incremental development, feedback from unit and other automated tests, heavy user involvement to validate our efforts, and so on. But you actually have to use these techniques; there are no shortcuts. And folks in your organization who aren’t part of the agile experience may still expect it to be perfect and available yesterday.
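As one small, concrete example of that feedback, here’s a minimal, hypothetical unit test sketch using Python’s standard unittest module; the function under test, split_full_name, is invented purely for illustration.

    # Hypothetical example of feedback from a unit test, using Python's
    # built-in unittest module. split_full_name is invented for illustration.
    import unittest

    def split_full_name(full_name):
        """Split a "First Last" string into a (first, last) tuple."""
        first, _, last = full_name.partition(" ")
        return first, last

    class SplitFullNameTest(unittest.TestCase):
        def test_simple_name(self):
            self.assertEqual(split_full_name("Ada Lovelace"),
                             ("Ada", "Lovelace"))

        def test_single_word_name(self):
            # The test records what the code actually does, not what
            # our optimism says it does.
            self.assertEqual(split_full_name("Madonna"), ("Madonna", ""))

    if __name__ == "__main__":
        unittest.main()

A failing test is a small, cheap dose of reality delivered every iteration: a direct counterweight to the Planning Fallacy and Optimism Bias.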

Reflecting Dimly

Agile methods are big on reflection: on conducting formal, team-wide retrospectives as well as informal, personal ones. It’s a great idea, and really the only way you can improve over time. But there are some problems here.

First, there’s the well-known Hawthorne Effect. Researchers have noticed that people have a tendency to change their behaviors when they know they are being studied. You’ll see this when you introduce a new practice or a new tool on a team. At first, while everyone is watching—and everyone knows they are being watched—results look great. Discipline is high, and the excitement of something new fuels the effort. But then the novelty wears off, the spotlight moves away, and everyone slides inexorably back to previous behaviors.

That’s enough to condemn agile in the eyes of many organizations. They’ll say, “We tried it once, and it didn’t work.” Perhaps a bit of Confirmation Bias has snuck in there as well: the normal lull after the initial novelty wears off gets mistaken for failure of the method itself.

Finally, and perhaps most damaging of all, there’s the little-known Dunning–Kruger effect. This is the source of my favorite story about the “Lemon Juice Man,” the fellow who robbed a bank in broad daylight, convinced that lemon juice made him invisible to the security cameras. It didn’t. But it never occurred to him to doubt his own belief in the power of lemon juice.

This lack of “metacognitive ability” is a failure to think about how we think, to critically evaluate what we know and how we think we know it. As a result, less skilled people will grossly overrate their own capabilities, while highly skilled people will underrate theirs, because they have come to understand just how complex their field can be.

So we often don’t know what we don’t know, and it’s hard to convince us otherwise. But knowing that there are problems in how we think is the first step in avoiding them.

Something to think about.


