Have a Hypothesis

The comparison between debugging and the scientific method has been made before. Originally, I understood this to be a cyclical process where you proceed based on observations of your program using things like interactive debuggers, lots of print statements, and a short iteration cycle. A recent experience with some buggy code has convinced me that this is the wrong way to look at the scientific method as it applies to debugging--at least for the beginner.

As part of my thesis research I'm working on tracing sources of emissions of carbonaceous aerosols using a general circulation model. This involves having an optimization routine modify emissions to bring them into agreement with observations. My results were obviously wrong, but they had a pattern that matched the spatial distribution of the biogenic emissions source in the model to the pixel. Clearly this observation implied that the biogenic emissions were to blame. (Spoilers: that wasn't the problem.)

What followed was a downward spiral of frustration and ineffectual iteration in the observe-edit-run cycle. That madness only stopped when I settled down and remembered a part of the scientific process that I'd left out (as the astute reader will have noticed). The essence of the scientific method and the most important insight it offers into debugging is:

Have a hypothesis.

Before you load the debugger, before you go crazy with print statements, before you do anything but carefully review the information about why your program is failing, have a hypothesis. Doing so ensures you spend some time thinking about your code, helps structure your investigation and keep it moving forward, and steers you away from some bad habits. In retrospect it's an obvious oversight, but it's essential to making the process work efficiently.

To have a hypothesis, you have to understand how your program works. Without a solid grasp of its overarching structure, an accurate mental model of how it executes, and at least an inkling of where the problem is, you can't form a hypothesis about your program. You also probably can't debug it effectively. If you find that you can't easily come up with a hypothesis or two, it's probably time to pause and learn about the parts you don't understand. Having a brief space to think without the distraction of furiously interrogating the source code can also be enough to bring about that vivid flash of insight. It's important to have the right ratio of thinking, coding, and refining your ideas, but striking that balance is no mean feat even for an experienced developer. Having a hypothesis is a shortcut, an easy thing to keep in mind to get the mix just about right.

One of the motivations for applying the scientific method to debugging is to simplify an otherwise overwhelming task. Trivial examples aside, bugs arise in obscure and unexpected ways, so it's never obvious how to tackle them. Through a combination of optimism, desperation, and the apparent lack of alternatives, it's easy to escalate into logging, simulations, elaborate visualizations, flushing out the problem through sheer force of print statements, and so on. There are a multitude of activities that can help you understand how your code works, and you can pour hours into them achieving nothing but a frantic, buzzed half-instinct about what's going on. By putting the hypothesis first, you focus on the goals of your analysis and have a clear endpoint: the hypothesis stands, doesn't, or reveals another hole in your thinking. Having a hypothesis isn't as good as a detailed plan or solid habits and instincts, but it lifts you out of the weeds and demands that you proceed in light of the bigger picture.
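As a concrete sketch of what putting the hypothesis first can look like in code, consider encoding the hypothesis as a single targeted check with a clear pass/fail outcome, rather than scattering print statements. Everything here is hypothetical (the names and data are invented for illustration, not taken from the model described above):

```python
# Hypothesis-as-a-check: state the hypothesis, derive a testable
# prediction, and test exactly that prediction.
# All names and numbers below are hypothetical examples.

def total_emissions(sources):
    """Sum emissions over all source categories (hypothetical data layout)."""
    return sum(sources.values())

# Hypothetical per-category emissions.
sources = {"biogenic": 12.0, "fossil": 30.0, "biomass_burning": 8.0}

# Hypothesis: one source category is being double-counted upstream.
# Prediction: if nothing is double-counted, the total equals the sum
# of the individual categories. A failing assertion supports the
# hypothesis; a passing one rules it out and sends you back to think.
expected = 12.0 + 30.0 + 8.0
assert abs(total_emissions(sources) - expected) < 1e-9, \
    "totals disagree: double-counting hypothesis survives"
```

The point isn't the arithmetic; it's that the check has a definite endpoint, so the investigation either moves forward or forces a new hypothesis.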

I've noticed a number of transparently bad habits in my debugging: poring over source code unrelated to my problem, belabouring preliminary points, relentless logging and printing, and (in times of truly dire frustration) the `change-a-line-and-pray-it-compiles` method of code validation. Each of these behaviours has its own trigger, its own destructive cycle, its own threats to watch for. Each is also inconsistent with having a hypothesis. Perhaps the most beneficial effect of having a hypothesis is that it's a tight, simple defense against seductive bad habits.

In my carbonaceous aerosols case, it turns out there was a bug in my optimization algorithm; the extra source was a red herring. I arrived at this conclusion after a large amount of ineffectual flailing, followed by a walk, a few deep breaths, and a focused effort to keep my hypothesis in mind as I tracked down the problem. It's a useful brain-hack; coming to it on my own (at least in this instance) has made it particularly sticky. Now if only I'd had it two months ago ...

Nat Knight — 2014-08-25