This is the latest in an occasional series of posts by Henry Quinn, whose insight is the reason we invite him to blog for us--not the fact that he's our executive director's husband.
Here's a thing that no one should ever say, but that people sometimes do:
"I'd like to make a forecast about how some effort is going to impact donations, but there's something important that I'd just be guessing at. Oh well—I'll just guess."
The reason you shouldn't say this is a tool that those of us in the business call a "sensitivity analysis." It's a method for making maximum use of the things you do know while still allowing for one or more areas of uncertainty, all without forcing you to commit to an unfounded guess about any of them. Want a simple, single-unknown example?
You're planning on-site changes to increase the visibility of your membership renewal page. You know the following:
- In 2011, you had 1M visitors.
- 4 percent of those saw the membership renewal page.
- Of those who saw the old page, 46 percent renewed successfully.
- The average renewal membership is worth $45.
- Finally, you're going to be implementing these changes in July, and 78 percent of your visits come in the second half of the year.
Notice that all of these things are pretty easy to measure or look up for 2011, and that you actually know a LOT. In fact, there's only one thing, a small, isolated thing, that you don't know. This is key to a sensitivity analysis: you need to isolate the thing that you don't know, and define it very, very precisely. If you throw up your hands at "Well, these efforts are going to do SOMETHING," you can't use this method. You're also not trying very hard, and quitters never win. The changes in this example aren't going to do SOMETHING, they're going to do something quite SPECIFIC: increase the rate at which visitors to your site visit the renewal page. That is the one and only figure you don't know.
And with that in mind, I can tell you EXACTLY how the changes will impact your revenues in 2012*: the revenue impact will equal the number of visitors, times the percentage that see the renewal page, times the percentage of those who renew, times the value of a renewal, times the percentage of potential renewals that will be affected by the changes given when they'll be implemented, times some factor representing the increase in the rate at which visitors view the renewal page. To put a finer point on it, that's 1M × .04 × .46 × $45 × .78 × a factor, or $645,840 times that factor.
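The arithmetic above can be sketched in a few lines of Python; the inputs come straight from the bulleted list, and everything known collapses into a single coefficient that the unknown lift then multiplies.

```python
# Sketch of the forecast arithmetic; all numbers come from the post itself.
visitors = 1_000_000        # 2011 site visitors
view_rate = 0.04            # share of visitors who saw the renewal page
renewal_rate = 0.46         # share of page viewers who renewed
renewal_value = 45          # average renewal, in dollars
second_half_share = 0.78    # share of visits arriving after the July launch

# Everything you know multiplies into one sensitivity coefficient:
# revenue impact = coefficient * (lift in renewal-page view rate)
coefficient = visitors * view_rate * renewal_rate * renewal_value * second_half_share
print(round(coefficient))         # 645840

# For example, a 5 percent lift in view rate yields:
print(round(coefficient * 0.05))  # 32292
```

The point of structuring it this way is that the one unknown stays factored out until the very last step.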
What we've done is box in your uncertainty, and determine precisely what impact the unknown is going to have on your eventual estimate—you've determined the SENSITIVITY of your forecast to a single, understandable unknown. The only thing we have to do now is figure out what a reasonable range of guesses for that factor is, and then apply the $646,000 figure to get a range of reasonable impacts. Rather than making a single guess for a large and complex unknown, you can describe a relationship, within a limited range, based on a very specific unknown. That analysis looks like this:
| If the increase in renewal form view rate is this... | ...then the increased revenue will be this: |
| --- | --- |
| 2 percent | $12,917 |
| 5 percent | $32,292 |
| 10 percent | $64,584 |
| 20 percent | $129,168 |
Now THAT'S a useful table. How much is it going to cost you to make these changes? If the answer is $10K, I'd go ahead and make them—if they provide even a 2 percent lift in renewal form view rate, they'll pay for themselves in the first year. If the answer is $75K, I might skip them—you'd need a view rate increase between 10 percent and 20 percent, which feels pretty aggressive, to make them worthwhile.
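The cost comparison above can be flipped around into a break-even check: divide the cost of the changes by the $645,840 coefficient and you get the smallest view-rate lift at which they pay for themselves. A minimal sketch (the `break_even_lift` helper is just illustrative, not part of any described tooling):

```python
# Back-of-the-envelope break-even check for a proposed change.
coefficient = 645_840  # dollars of extra revenue per unit of view-rate lift

def break_even_lift(cost):
    """Smallest lift in renewal-page view rate at which a change pays for itself."""
    return cost / coefficient

print(f"{break_even_lift(10_000):.1%}")  # a $10K project needs roughly a 1.5% lift
print(f"{break_even_lift(75_000):.1%}")  # a $75K project needs roughly an 11.6% lift
```

Those two results match the gut calls in the text: 1.5% is an easy bar to clear, while 11.6% sits in that aggressive 10-to-20 percent range.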
At a higher level, I'd make two other points about how this table can help you make better decisions. First, it's easy to communicate, upwards and outwards. It describes a very straightforward relationship, with exactly one moving part, and so discussions about it can be very focused.
"Ah," says your Executive Director. "So you're recommending we go forward with these changes because they'll bring in an extra $32,292? Well, 5 percent sounds like a reasonable level of lift. Let's do it." It's not that $32K feels high or low; there are too many ingredients in that stew for it to be a simple call. It's the 5 percent lift that's up for discussion, and that's a much simpler consideration. Everything else follows automatically from that.
Second, when your forecast eventually breaks, this table makes it easy to diagnose where it broke. Did you see a 5 percent lift in view rate, but miss that $32K? Then something about your assumptions was wrong, but the changes themselves had the desired impact, which was to increase a very specific metric. Did the relationship hold, but the lift in view rate not come in? Then your prediction about the impact of the changes was off on the high side. This is so much more instructive than simply saying, well, we thought it would come in at $32K, and it didn't.
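The diagnosis step above can also be sketched as code. This is a hypothetical post-mortem helper, not something the post prescribes: given the lift you actually observed and the revenue you actually booked, it checks whether the dollars-per-lift relationship held (within a rough tolerance).

```python
# Hypothetical post-mortem sketch: did the lift arrive, and did the
# forecast relationship (revenue = coefficient * lift) actually hold?
coefficient = 645_840  # known-factors product from the original forecast

def diagnose(observed_lift, observed_revenue, tolerance=0.10):
    """Return the predicted revenue and whether actuals landed within tolerance of it."""
    predicted = coefficient * observed_lift
    relationship_held = abs(observed_revenue - predicted) <= tolerance * predicted
    return predicted, relationship_held

# A 5% lift predicts about $32,292; booking $30K is within 10% of that,
# so the relationship held even though the headline number was missed.
predicted, held = diagnose(0.05, 30_000)
print(round(predicted), held)  # 32292 True
```

Either way the check comes out, you learn something specific: a miss with the lift intact points at your assumptions, while a miss in the lift itself points at the changes.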
(Which really illustrates a third sense in which this is good practice: it establishes, precisely, how you're going to measure the impact of the change, months in advance of its implementation. Think about that—knowing, a year before you'll be asked, what you're going to do when someone asks you, “Did that work?”)
So, uncertainty exists. If you believe it's lurking all around you, spread across everything equally, it becomes really hard to predict anything. But if you believe that you know most things pretty well, and that the things that you don't know can be narrowly defined, a sensitivity analysis can be a really helpful method for wrangling it.
* Yes, yes: you're expecting other changes in 2012, related to other changes in what you do, how you do it, or external circumstances. Those factors aren't true knowns, but you've probably had to think about them, and budget for them, as part of some larger planning process. You've already decided what they're going to be, and so they can be applied here as though they were knowns. We're also (secretly) assuming that the rates above are constant throughout the year. If you wanted to, you could eliminate this assumption by actually measuring those rates from July to December, or by measuring them for the first half of 2011 and extrapolating from current measurements in the first half of 2012.