On his company's blog, one of the Cooper employees discussed a difference of opinion about what makes for a successful startup development shop. Eric Ries, an agile practitioner, had suggested that the formula for a successful startup is:
Build -> Measure -> Learn.
Cooper suggested that Ries got the elements right but failed when it came to their order. Cooper's version:
Learn -> Build -> Measure.
I couldn't agree more. How many times have we walked into a session with a stakeholder who opened the conversation by saying, "What I think we need to do is..." without ever once explaining what the problem is?
A recent project of mine started with a drop-in visit from one of our field operations representatives. He asked me about a particular piece of functionality in our system that had been developed a couple of years back but, because of limitations in surrounding systems, had never been turned on. After asking a few questions about its capabilities, he let me know that what he wanted was different and that this function wouldn't meet his need the way he had thought it would.
Knowing this stakeholder as well as I do, I knew he wouldn't be spending this much time on a question if it were a casual inquiry. So I decided to probe further and try to get to the root of the problem. As I asked questions, it came out that what he really wanted was a way to achieve the same outcome one of our competitors achieves, but within the unique confines of our particular business model.
It was one of those situations where, no matter how well you might have planned previously, what he wanted to do was completely contrary to how the company had done business for decades. It wasn't a bad situation to be in: what he wanted was a way to respond to a vastly different business environment that had sprung up over the prior year. It just didn't fit within any of our existing business rules. A couple of options we tried got close, but nothing ever really got 'there.'
So what if I, instead of seeking first to learn, had simply piled a group of developers into a room and cracked the whip to get them coding? Sure, I might have had a solution sooner, and that solution would have let me measure whether the system changes produced a desirable result. The learning could have come last. But if I had gone with my initial idea of how to solve the problem, without first taking the time to really dig into it, I would have had a problem.
Namely, my first shot at a solution, while much easier to implement than our final one, would have utterly failed to address the root problem my stakeholder was trying to solve. The problem seemed extremely simple on the surface; in fact, most of our stakeholders were astounded when we quoted lengthy development efforts for what was a simple act when done with a piece of paper and a pencil. In the end, it was the third of four potential solutions that was selected as the final answer, and it took months just to get there.
So how did we end up with the 'right' solution? There were a couple of things we did that, looking back, made a lot of sense:
First, don't rush it. The original timeline we were given was two weeks to implement from the time of that first conversation. It took months just to get buy-in, because the seemingly simple requirement, so simple a stakeholder summed it up in a single sentence, ended up having massive implications throughout our entire enterprise. Don't scrimp on the analysis, and don't assume your stakeholder's assessment of the problem's complexity is correct.
Second, dig in. Ask questions. Ask some more. Ask the first set of questions again, this time phrased differently. Ask someone else the same questions. Keep digging until you either hit rock or you're told to hand over the shovel.
Third, get creative. Once we understood the problem, we had four viable, if radically different, ways to tackle it. They all met the core requirements, but some carried more risk, some were larger development efforts, and some were, frankly, hacks that would have been a nightmare to maintain. If you looked at these four solutions in isolation from one another, I doubt you would ever believe they all solved the same problem. That is a good thing. Each of these potential options was a theoretical 'build' step that allowed us to then 'measure' whether the solution was appropriate. It turns out two of them really were not appropriate for our organization. Had we not followed the designs all the way out to the ends of their implications, we could have implemented one of those inferior solutions and lost much credibility with our stakeholders.
Lastly, over-communicate. About one month after the first conversation, we received sign-off on the solution direction and were beginning to plan the development effort. Then a seemingly innocent conversation between a BA and a stakeholder turned the entire project upside down. It turns out that, despite our having said multiple times in multiple meetings what the implications of our initial solution would be, namely that one of the requirements would not be met by the 'quick' solution, our stakeholders hadn't grasped what exactly that meant. Once they did, we started the analysis process over, walking them through the entire set of options once again.
So what do you think of the Learn, Build, Measure process?