30 March 2011

Smoke Test on a Smoke Break


A few weeks ago, I made one of those blunders all leaders new to a team make... I made an assumption, a bad one, that everyone was on the same page regarding our testing process. It wasn't that the team didn't know how to perform a smoke test or that they didn't understand the necessity of performing one; it's just that I wasn't clear at the outset about what my expectations were for the release.

Two days into the testing, someone finally got to our primary path for testing a release... and found that the entire build was broken. This is something that should have been noticed within the first 10 minutes, not two full days of testing later. It was akin to trying to drive away in a car that had no wheels. The problem was so obvious, no one could have missed it. Yet, we did.

Sometimes it's easy to get so caught up in all the new bling of a release that we forget what it means to hit the fundamentals first. As the leader, the failure was my responsibility. I reiterated to the team why the smoke test was important, we all renewed our commitment to do this testing at the beginning of every build cycle, and then we went and did it.

Building it Better

But this miss on our part made me really start to think about what makes a good smoke test. Before our little hiccup, this had been mostly ad hoc; we each took a look at the product, thought about the things that are vital to our users achieving their goals, hit the high points and then ran through as many things, from highest to lowest priority, as we had time for. We were pretty good at this, too (when we remembered to do it).

This just wasn't good enough, in light of the problems that this miss caused. Thankfully, I have a team of really great testers, and several others were thinking along the same lines as I was. Last week, I finally took a few minutes to start outlining what it was I wanted in our smoke tests and exactly how we should go about doing them. No more than an hour later, with me never having said a word to my team about my personal brainstorming session, two team members walked up and presented their ideas for a smoke test. The smile that flashed across my face likely blinded people three aisles down the hallway.

I reviewed their initial draft and liked what I saw. They had basically nailed it. There were a few tweaks that needed to be addressed, but the list for our environment was mostly all there. With the hard part of creating the scenarios done, I started to think this situation would make a good blog post about why I was so happy about this particular smoke test. What follows are my thoughts on what makes a good smoke test.


It needs to take no longer than the worst cigarette addict's smoke break. In the time that the most fanatical nicotine fiend takes to burn through a few Marlboros, the team should know if the build is good enough for full testing or not. If the smoke test fails, snuff out the build and bum another from the development team.

For our team, we set this time at 20 minutes. The development team has usually taken the build for a spin in the testing environment before we get it, so it's pretty rare when we are set loose that there is a problem that hasn't already been found and addressed. Still, it's our first shot at the build, and fresh eyes will routinely find issues that are overlooked by those who have been staring at it for hours.


What is the one thing that, if you can't do it, makes the product largely useless to your users? If this isn't the first item on your smoke test list, you probably need to rethink your list. Is it a piece of desktop software? If so, I'd start with a clean install (or an upgrade, if you mostly have a static user base). If the build won't even install in the test environment, there isn't a lot else you can do.

For web testing, that's a bit different, especially in larger development shops like the one where I work. We have an Ops team that handles the deployments and a development team that makes the builds. Each build has passed through at least two teams before my team gets it, so install issues are basically found before I ever see the build.

Given that we're an ecommerce site, ordering is the number one priority for our customers. If they can't place orders, we don't make money. We begin our smoke testing here. We look at the many different types of orders and the many different ways they can make it into the ordering path, and then make sure these all work.
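
For illustration, here's a minimal sketch of what that critical-path check might look like as a pytest smoke test. The shop_client module, the endpoints and the order types are hypothetical stand-ins, not our actual system:

    import pytest

    # Hypothetical API client for illustration only; substitute your own.
    from shop_client import ShopClient

    ORDER_TYPES = ["standard", "gift_card", "subscription", "promo_code"]

    @pytest.fixture(scope="session")
    def shop():
        """One logged-in client for the whole smoke run keeps things fast."""
        client = ShopClient(base_url="https://test.example.com")
        client.login("smoke_test_user", "not-a-real-password")
        return client

    @pytest.mark.smoke
    @pytest.mark.parametrize("order_type", ORDER_TYPES)
    def test_order_can_be_placed(shop, order_type):
        """The one thing that must work: customers can place orders."""
        cart = shop.create_cart(order_type=order_type)
        cart.add_item(sku="SMOKE-TEST-SKU", quantity=1)
        order = cart.checkout(payment="test_card")
        assert order.status == "confirmed"

With the smoke marker registered in pytest.ini, something like `pytest -m smoke` gives the whole team one command to run against every new build.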

Once the critical paths have been tested, it's time to start digging into the tasks of lesser importance. Add in scenarios until you fill up your allotted time. Don't choose quantity over quality, but aim for a happy medium that allows you to cover most of your system while hitting the really important parts in (relative) depth.


But even for a website, not everything is about the web. Mobile and app channels are becoming increasingly important in our highly connected society. If our smoke test only concentrated on one ordering channel to the exclusion of all the others, we might be neglecting some of our most important customers' needs.
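
Sticking with the hypothetical sketch above, channel coverage is cheap to add as a second parametrize axis, so a build that breaks (say) the mobile ordering path fails the smoke test just as loudly as one that breaks the website:

    import pytest

    CHANNELS = ["web", "mobile_web", "ios_app", "android_app"]

    @pytest.mark.smoke
    @pytest.mark.parametrize("channel", CHANNELS)
    def test_order_can_be_placed_on_each_channel(shop, channel):
        """Same critical path, exercised once per ordering channel we support."""
        cart = shop.create_cart(channel=channel)
        cart.add_item(sku="SMOKE-TEST-SKU", quantity=1)
        order = cart.checkout(payment="test_card")
        assert order.status == "confirmed"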

For installed software testing, latitude - breadth of coverage - can be just as important. Imagine if you were testing a new video codec and decided to test it with only a single media player front end during your smoke test. Video codecs are important because of their ability to plug in to many different OSes and architectures, enabling people with all different types of environments to view the same content. Covering as many platforms and devices as possible, especially the ones that are common among your users, is vitally important.


At some point, you have to make a call... is this build good enough to continue testing, or not? If it is not, you know now, not two days later when you are behind instead of on target or maybe even ahead of where you believed you would be. If everything tests out, you're not at the end, but you are at the end of the beginning.

29 March 2011

Designing process for other people


I've been involved in process analysis and design for ages.  First in my local workspace, then in my team space, then my department, and then they let me run wild with a whole business unit.  I did things that, on the whole, made sense, but occasionally scared the pants off some of my stakeholders.  Luckily, things have mostly worked out.

I have seen problems in my wake, though.  Finding the balance between back office process gnomes like me and putting control of how work is to be managed in the hands of those who do the work is not a simple black and white exercise.

At the very least there is the change management aspect of telling someone to change the way they think and act when they come to work.  Don't forget that a new process changes the values that underpin the purpose of the work.

Then there is depth of knowledge and unexpected consequences.  When you re-organize the way teams work, you are certain, upon a follow-up visit to see how things went, to discover varying interpretations, partial adoption of the change and occasionally outright bizarre scenarios that just weren't anticipated.

Lastly, there is the insidious undermining of the authority and decision-making rights of the staff who do the work.  When you take away their ownership of how things get done, you take away a good deal of their commitment to outcomes and replace it with commitment to process compliance.

This last scenario has been flogged to death recently in the context of managing project teams, but the truth is it's a very easy thing to see in many more places than project teams.  Just go to any call centre, or back office processing team and there it is.  It wasn't always like that and it doesn't have to be like that today.

When changing business processes, involve the people who do the work and help them feel like they are in control of what is happening.  When designing new processes, ensure that they are defined loosely enough to allow a range of responses to the many and varied customer needs and wants they are going to come across.

25 March 2011

An Elevator Pitch for your Project

I've got a personal elevator pitch. I developed it over the course of a boring, week-long training class that a former employer required me to take. Given the remedial and dull content of the course itself, I had to get something out of my time, so that's where I developed my 'personal pitch.'

When I started as a business analyst, none of my friends had any idea what it was I did, only that I loved my job. (Let's be honest, at that point I really didn't even understand what I did, even if I was almost competent at the tasks!) I decided that I was tired of trying in vain to explain requirements, process flows and solution designs, so I came up with something a bit easier to understand. Here it is:
I'm a business analyst. I study how people do their jobs. I then figure out ways they can be better at their jobs, either by changing the work they do, how they do their work or the systems they use to do their work.
Over the years, I've tweaked that short description numerous times (I made a couple of tweaks as I just typed it out), and it's something I'm quite happy with and proud to say about myself. I've spoken with only a small number of people in my life who were ready to go with their own elevator pitch at a moment's notice. Having this little gem waiting in the wings has been of great use to me over the years, and I highly recommend everyone create one.

Because of my fascination with my personal elevator pitch, I'm always curious to learn more about giving elevator pitches. When this article by Don Dodge, about giving elevator pitches for startups, showed up in my blog reader, I was instantly attracted to it. Don gives some fabulous advice for giving a great company elevator pitch, much of which is useful for crafting a personal elevator pitch... or even an elevator pitch for your project.

If you've ever seen me give a product demo, you know it's one of my best skills. Last week I was at our company's yearly operator's convention and I had the privilege of giving a demo (with a voice so strained I could barely talk) for three hours about one of our newest systems. In the demo booth with me was the hardware vendor whose platform was used as the foundation for the application we had built internally. At one point, my vendor pulled me aside and expressed how impressed she was with my ability to demo a system. High praise from someone who gives an amazing demo herself!

That demo was nothing more than an elevator pitch for our application. Given that I performed the elicitation, analysis, documentation, verification and validation of the requirements for that project, I knew it better than just about anyone else, so I was a natural choice to give the demo. But it's how I gave the demo that made the difference. We made sales all day from people who walked up just to see what was new and left having made the decision to purchase then and there. The system is great, and it nearly sells itself, but having a great pitch can only help. Here's the pitch I give.

First, as Don Dodge says, explain the problem. It can't be just any problem; it has to be a problem the person hearing the demo can relate to. If you can't state the problem in a frame that makes sense to the person listening, you're going to fail. Use their language, their viewpoint and, better yet, use them.

I start out by asking, "Have you ever been in situation X?", knowing that they have, otherwise they wouldn't be there to hear me. This is my second point: focus on the person. When they answer 'Yes' to my question, they almost always try to put a spin on it to let me know that the problem is worse than the scenario I gave. When this happens, I know I've got their attention and it's time to begin using the scenario they present to show them how the product fixes their problem.

Now, at the third step, I finally acknowledge that the solution has been sitting quietly on the table behind me this whole time. If they haven't seen it before, I give a quick overview, usually no more than 30 seconds, then proceed to walk them through the solution using the demo system. Once they know that the product can meet their needs, I move the conversation to a select, high-impact set of features that either show off the best parts of the demo system or align closely with their original need.

During this last part, I try to make the demo as interactive as possible, doing my best to have them pose problems so I can show them solutions. Quality back and forth is key to showing how the output of your project can be central to their needs. Once their questions are answered, or if the line behind them is getting really long, I move into the most important section: the close.

Because I'm a natural influencer and not a seller, I tend to shy away from the hard-sell tactics employed by many car salesmen. To me, a good demo doesn't need arm-twisting, as your project or product will sell itself. Always ask if there are any more questions you can answer, give the person something they can walk away with (even if it's just a cheap printed flier) and, most importantly, give them a call to action. This can be as simple as, "I hope you like what you saw. Make sure you stop by the table by the door to speak with the person taking orders!"

So what about you? How do you pitch your projects or products?

24 March 2011

Too Much Stress?

In these last few months I have been under an unusually high amount of stress.

I have moved cities, taken on an uncertain role in a start-up mode consulting business and am working with too many clients at once on short work packages with no long-term certainty.  That's the money worries.  On top of that I have injured my back and have had chronic pain since October, I haven't been sleeping well, I am not doing enough physical activity, and I am dealing with a displaced toddler (we moved) and a handful of other minor gripes.

Life wasn't meant to be easy, right?

Working on one thing after another I am clearing the stressors and getting more and more in control.  It's been taking longer than expected.

I read the writings of Ellen Weber and Robyn McMaster and their work on the relationship between our brains, behaviors and the world we live in.  They write, among other things, about recognizing when you are in trouble and when you need to change the way you behave and think in order to break out of your rut.

It's hard to synthesize their work into a few dot points.  I just don't understand it well enough.  But I do know enough to recommend you take a look at their websites/blogs, where they share a lot of their knowledge.  If you are under stress (and these days many of us are) you might just find some actionable knowledge there.

23 March 2011

It doesn't just apply to startups...

Ever since I read his book, The Inmates are Running the Asylum, I have been following the sage-like advice of Alan Cooper. The man is sheer genius. If you're a Twitter user, I highly recommend that you follow him.

On his company's blog, one of the Cooper employees discussed the company's difference of opinion on what makes for a successful startup development shop. An agile practitioner, Eric Ries, had suggested that the method for a successful startup is:

Build -> Measure -> Learn.

Cooper suggested that Ries got the elements right, but failed when it came to their order. Cooper said:

Learn -> Build -> Measure.


I couldn't agree more. How many times have we walked into a session with a stakeholder, who started out the conversation by saying, "What I think we need to do is..." without ever once explaining what their problem is?

A recent project of mine started with a drop-in from one of our field operations representatives. He asked me about a particular piece of functionality in our system that had been developed a couple of years back but, because of some limitations in surrounding systems, had never been turned on. After a few questions from him about the capabilities, he let me know that what he wanted was different and this function wouldn't meet his need like he thought it would.

Knowing this stakeholder as well as I do, I knew he wouldn't be spending this much time on a question if it were a casual inquiry. So I decided to probe further and try to get to the root of the problem. As I started to ask questions, it came out that what he was asking for was really a way to achieve the same outcome one of our competitors achieves, but within the unique confines of our particular business model.

It was one of those situations where, no matter how well you could have planned previously, what he wanted to do was completely contradictory to how the company had done business for decades. It wasn't a bad situation to be in, and what he wanted was a way to respond to a vastly different business environment that had sprouted up over the prior year. Yet, it just didn't fit within any of our existing business rules. A couple options we tried got close, but nothing ever really got 'there.'

So what if I, instead of seeking first to learn, had just piled a group of developers into a room and cracked the whip to get them coding? Sure, I might have had a solution sooner, and that solution would have allowed me to measure whether the system changes produced a desirable result, and from that I could have learned. But if I had gone with my first idea on how to solve the problem, without taking the time to really dig into the solution, I would have had a problem.

Namely, my first shot at a solution, while much easier to implement than our final solution, would have utterly failed to get at the root problem my stakeholder was trying to solve. The problem seemed extremely simple on the surface; in fact, most of our stakeholders were astounded when we quoted lengthy development efforts to do what was a simple act with a piece of paper and a pencil. In the end, it was the third of four potential solutions that was selected as the final answer, and it took months just to get to that solution.

So how did we end up with the 'right' solution? There were a couple things we did that made a lot of sense looking back on it:

First, don't rush it. The original timeline we were given was two weeks to implement from the time I had that first conversation. It took months just to get buy-in, because the seemingly simple requirement, so simple it was summed up in a single sentence by a stakeholder, ended up having massive implications throughout our entire enterprise. Don't scrimp on the analysis. Don't assume your stakeholder's assessment of the problem's complexity is correct.

Second, dig in. Ask questions. Ask some more. Ask the first set of questions again, this time phrased differently. Ask someone else the same questions. Keep digging until you either hit rock or you're told to hand over the shovel.

Third, get creative. We had four viable, if radically different, ways to tackle the problem once we understood it. They all met the core requirements, but some had more risk, some were larger development efforts and some were, frankly, hacks that would have been a nightmare to maintain. If you looked at these four solutions in isolation from one another, I doubt you would ever believe they all solved the same problem. That is a good thing. Each of these potential options was a theoretical 'build' step, which allowed us to then 'measure' whether the solution was appropriate or not. It turns out two of them really were not appropriate for our organization. Had we not followed the designs all the way out to the ends of their implications, we could have implemented one of these inferior solutions, costing us much credibility with our stakeholders.

Lastly, over-communicate. About one month after the first conversation, we received sign-off on the solution direction and were beginning to plan the development effort. A seemingly innocent conversation between a BA and a stakeholder ended up turning the entire project upside down. Turns out, despite having said multiple times in multiple meetings what the implications of our initial solution would be, namely that one of the requirements would not be met by the 'quick' solution, our stakeholders didn't grasp what exactly that meant. Once they did, we started the analysis process over again, walking them through the entire set of options once again.

So what do you think of the Learn, Build, Measure process?

22 March 2011

Sample Size for Estimating

When using group-based estimates, how many people do you need for a representative sample?

There are a couple of things to think about when getting into this.

Firstly, you need to understand the underlying principles behind group-based estimates.  I go to Mike Cohn's work for this - for example, Agile Estimating and Planning.  Secondly, your team needs some basic schooling in what estimating is about, and the team needs data so it can continually learn from what has gone before.  Estimates without evidence are guesses.

So, once you have your systems and approach worked out, the question: how many people from the team need to participate?  I used the Sample Size Calculator at Survey System to get a statistically relevant sample size for the numbers above.  The site also has definitions of confidence level and confidence interval, in case (like me) you need to check them.

The numbers presented in the diagram above are for a 20% confidence interval (i.e. results will be within a 20% tolerance) with a 95% confidence level.  Knowing this, you know whether you can get away with estimates from less than everyone on the team, and what doing so does to your confidence in the results.
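
For anyone who wants the arithmetic rather than the website, here is a rough sketch of the same calculation: the standard survey sample-size formula with a finite population correction, which is the kind of maths calculators like the one above are built on. The team sizes below are just examples:

    import math

    def sample_size(population, interval=0.20, z=1.96, p=0.5):
        """How many team members you need for a representative group estimate.

        population -- total number of people on the team
        interval   -- confidence interval (tolerance), e.g. 0.20 for +/-20%
        z          -- z-score; 1.96 corresponds to a 95% confidence level
        p          -- assumed variability; 0.5 is the most conservative choice
        """
        # Sample size for an effectively infinite population...
        n0 = (z ** 2) * p * (1 - p) / (interval ** 2)
        # ...then corrected for the small, finite size of a team.
        return math.ceil(n0 / (1 + (n0 - 1) / population))

    for team in (5, 7, 9, 12, 20):
        print(f"Team of {team}: ask at least {sample_size(team)} people")

For the small populations typical of a delivery team, the answer tends to be "almost everyone", which is worth knowing before you decide to estimate with a subset.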

This isn't the end, and to start with, the 95% confidence level with a 20% interval will only reflect a band of what people think the work will take.

In reality you'll have awful estimates to start with.  Just like anything you do at work, it's a skill you need to develop.  It's not hopeless though.  Observe, learn and improve.

Melbourne Scrum/Agile Meet-ups

Agile Business Analysts in Melbourne
On 29th March we'll be hosting an "Agile BA" meet-up in Melbourne.  It's a new group and it's designed to be a peer-led discussion and learning group.  There haven't been any meetings yet, and this will be the first time that I know of that a group will meet up to discuss the specifics of the BA's part in the agile way of life.  Details here.

Scrum Meet-up
On the 13th of April there is a Scrum Meet-up scheduled.  (Details here.  No location yet.)  Again, it's mostly a self directed learning group, although some additional agendas were discussed last month.  This group looks like it is made up of a collection of coders, testers, analysts and project managers.  Some have extensive backgrounds working with Scrum, and some are new.

This group also wrote up a backlog of things they'd (we'd!) like to discuss.  You can read it here.  Contact the owner if you'd like to get access to add new things.

Other events
There are more, particularly the Lean groups, but I am not involved or planning to attend.  You can look them up yourself if you have a specific flavour you are seeking.

21 March 2011

Snowbird; Agile 2011

On the 12th of February the Agile Manifesto celebrated its 10th birthday (Twitter #10yrsagile) with an event facilitated by Alistair Cockburn.  Leaders from the Agile community were invited to a ski resort to participate in a workshop discussing where 'agile' is and where it's heading.

Here is a list of blog posts by some of the Snowbird participants sharing thoughts and reflections after the event.


Organisational goals > Org needs > Job description > Performance measures (KPIs) > Competency assessment > Performance review > Development plan.

How does your traceability matrix stack up?

18 March 2011

Bad Ideas vs. Good Ideas

The proper response to a lousy idea is not to stop thinking. It is to come up with a better idea. Indeed, we should prefer a bad idea to no ideas at all, because a bad idea can at least be reformed, while not thinking offers no hope.
I'm currently reading What Technology Wants by Kevin Kelly. The above quote struck me as encapsulating what we as project people do every single day. I hope it inspires you all as much as it does me. The book is not for the faint of heart and its arguments are deeply rooted in history and society. I highly recommend it to everyone who reads this blog as a great philosophical tome on what I'll paraphrase as 'right technology.' Get it today!

(ps... I've got a post rolling around in my head right now based off of one of Kevin's ideas. Hopefully I'll get it written up this weekend for next week's reading on this blog!)

16 March 2011

Comparing agile and traditional project management

If I were writing an academic paper comparing traditional and agile software development/project management methods I would start my literature review with some of these.  Any suggestions for where to extend it?

I'd also need to understand that the waterfall model is not really practiced. It’s just a straw man for an argument.

I'd ask myself,
  • What are the common characteristics of real world ‘traditional project management’?
  • What divergences do we have to the agile manifesto content?
  • How are the ‘agile methods’ different?

15 March 2011

Why I'm a Business Analyst

If you've been reading my posts on this blog for long, you know that I tend to focus more on the 'soft' side of business analysis than on the hard skills. As passionate as I am about learning and being able to correctly apply the skills and tasks that make business analysts effective, I am a lot more passionate about the impact good business analysis has on our teams, companies and, if you'll allow me to be a bit dramatic, the world.

I don't know if Apple employs many (or any) business analysts. I do know they employ many amazing designers and developers who may not realize they use excellent BA skills. If you haven't played with an iPad, you really need to. If someone will let you borrow one (provided you can pry it out of their hands long enough) then I suggest you use it for work for a few days. It can change the way you work; I know it changed how I work.

But more so, I see good business analysis making an impact exactly like what you see in this video below. I hope it inspires you as it did me.

14 March 2011

Want better team meetings?

Consider using case studies in team meetings.

Share the job across the team. Take an hour or two to prepare. Longer if appropriate.

Go into your team meeting and give a 15 minute case study, and then put some questions to your team. Test their theory and practical knowledge. Follow up with a discussion of what actually happened and what you might do differently now that you’ve heard the team’s advice.

Use team meetings as an opportunity to share and learn.

10 March 2011

User Story Template

Due to popular demand I have aggregated some information on User Stories and created a simple template.  If you feel this would be useful to someone please send them the link.  If you have feedback, please leave a comment.
This template is provided in three parts.
  • Part 1 is an introduction to the User Story Template and some guidelines.
  • Part 2 is a Story Card layout for you to save/print and use on your project
  • Part 3 is a list of useful references and links that you should read to help maximize your value from this technique.
You can also sign up to one of my online User Story training courses. (Occasionally offered for free here.)

Part 1; About User Stories
User Stories are supporting artifacts for requirements. User stories are not expected to be a full and complete set of requirements. They are an anchor for a conversation. As a person who is creating and delivering requirements to a development team you may have further details written down, models created and rules listed. These are also useful and should be, like User Stories, used as supporting tools in a conversation with your developers.

Three key aspects of a user story are:
  • The “user” of the solution
  • The outcome you envisage from an interaction with the system, and
  • The value this interaction/outcome is trying to yield.

User stories come in different sizes and shapes and are expected to be prioritised in order, based on value. (Value includes mitigating risk, so hard but low-reward stories may be addressed early.) Typically, User Stories are categorized into three types;
  • Epic
  • Theme (sometimes called Feature)
  • Story
Each of these labels represents a different class of granularity. Epics are huge and suited to things off in the distance. Themes are things generally being worked on now or in the near future. Stories are what you take to the sprint. Smaller classes of requirement fit into the larger ones. Think of Russian dolls. You can read more on these three classes of story elsewhere.

Part 2: Template

Front of card
  • Story [Short Name] 
  • As a [role] 
  • I want [something]
  • So that [benefit]
  • Size ____
  • Priority ____
Back of card
  • Acceptance Criteria [Short Name]
  • Given [Context]
  • When [Event 1] 
    • [Event 2] [Etc.]
  • Then [Outcome] 
    • [Outcome 2] [Etc.]
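
If you want to capture the same card digitally - in a spreadsheet, a tracker, or a quick script - the layout maps onto a very small data structure. Below is a minimal sketch in Python; the field names and the example card are invented for illustration, not part of the template itself:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class AcceptanceCriterion:
        """Back of the card: Given / When / Then."""
        given: str
        when: List[str]
        then: List[str]

    @dataclass
    class StoryCard:
        """Front of the card, mirroring the template above."""
        short_name: str
        role: str        # As a [role]
        want: str        # I want [something]
        benefit: str     # So that [benefit]
        size: str = ""
        priority: str = ""
        acceptance: List[AcceptanceCriterion] = field(default_factory=list)

    # A hypothetical, filled-in card for illustration.
    card = StoryCard(
        short_name="Guest checkout",
        role="shopper without an account",
        want="to place an order without registering",
        benefit="I can buy quickly without creating yet another login",
        size="5",
        priority="Must",
        acceptance=[AcceptanceCriterion(
            given="a cart containing at least one item",
            when=["the shopper chooses 'checkout as guest'",
                  "and supplies valid payment and delivery details"],
            then=["an order is created",
                  "and a confirmation email is sent"],
        )],
    )

    print(f"As a {card.role}, I want {card.want}, so that {card.benefit}.")

Remember the card is still just an anchor for a conversation; the structure is there to keep the conversation honest, not to replace it.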

Part 3: Further reading
Below are some excellent web resources to help you learn more;
You might also want to read something a little more in depth.  Try the book "User Stories Applied: For Agile Software Development" by Mike Cohn.

Did you find this template useful? Does it need improving? Let me know.

Agile Business Analysts in Melbourne

I have scheduled a meet-up for the "Agile BA" in Melbourne.  For me this is a convergence of two interests, and coincidentally it seems to be a subject many people are seeking guidance on.

If this is an area of interest to you, come along.
29th March
6.356 Collins St

8 March 2011

Melbourne Scrum Training

Last week at the Melbourne scrum meet-up many people talked about their desire to find good ways to implement scrum into their environments.  It will probably be a recurring theme over the next few months.

I was just on the email thingy with Rowan Bunning, and he's got a CSM training course in Melbourne next week.  I personally found it a good course for helping me with a scrum implementation, so I recommend it.

You need to have an integrated understanding of the basics before you can really make this stuff work for you, and the chances are that a few conversations with people in a structured context is going to do that a lot better than books and blog posts.

I don't get paid or receive any benefit from Rowan for mentioning his course.  I mention it because I genuinely believe it can help you in increasing your performance at work.

As Shim might say: Think about it!

7 March 2011

Who does confidentiality benefit?

It makes sense when you work in a competitive industry to keep secrets. Even if they’re badly kept secrets.

When I worked in the telco industry there was sufficient churn of staff across the different Australian players that it was unlikely any company didn’t know what the others were up to.

And that’s before you get advertising agencies, IT vendors, outsourced sales companies and the like involved.

Of course everyone keeps things confidential in their own context, but once people change companies, they take their personal knowledge with them. So the real competitive advantages (beyond people) lay in a flexible infrastructure that could take new products to market quickly, and in keeping your people engaged enough not to jump across to the competition. Arguably that’s another type of infrastructure.

It’s a matter of course that when you sign on, you sign up to some rules. You agree to conditions around use of the internet and to keep anything that’s marked confidential, secret or ‘in confidence’ hush-hush.

Anyway, most of the work I have been doing in the last few years has been for government.  So competitive secrets aren't really a factor.  And government generally says it wants to be transparent, right?  


Here’s the thing: imagine if government projects were transparent front to back. If I were looking for a cross-platform ticketing system for Melbourne City, for example, I could publish my goals, requirements and so on to the world, and potentially another government somewhere in the world has faced the same challenge. They could share what they know.

Organisations could collaborate, share expertise, develop more diverse career paths for their staff, and save money by buying more pre-built systems.

Suppliers would be more open and accountable for the products they build, and could invest more in maturing system quality and functionality because more customers would use the better end of the product spectrum. Poor products would disappear more quickly.

Wouldn’t that be awesome?

5 March 2011

System Error

This series of public statements by senior IT managers (often .gov.uk) in the UK about why their orgs need to switch to agile approaches is worth watching.

The clip embedded below is very dry, and the sound isn't very good.  However, it's a lead-in to a series of other clips - go through to YouTube and watch them all, or at least a bunch of them.  Who knows, maybe it will help you in your transformation negotiations.

Maybe it's time to switch to Lean.

3 March 2011

Deadlines, People, Planning, Choices

Ben managed a complex project. Hear him talk about some of the choices he made along the way.

The HiPPO: A Project Nightmare

Frankly, hippos are really very aggressive animals. They've got massive jaw pressure and large, fang-like teeth, and because they are extremely territorial, they will attack any human that gets even close to them. In short, they are not likely to end up on the list as a replacement for dogs as man's best friend.

But I'm not really talking about the hippos you find in Africa, I'm talking about HiPPOs. While they can share some of the same traits, especially when it comes to territorial protection, hippos and HiPPOs really are different things.

Hopefully by this point I've piqued your interest just enough to make you wonder what in the world I'm talking about. A HiPPO is the 'Highest Paid Person's Opinion.'

You've all seen it before... you're in a meeting, trying to figure out the solution to a really tough problem. For the last hour, everyone has been tossing out ideas, shooting holes in the ideas of others and generally building to a consensus on what should be done. The group finally comes to a decision, only to recognize that there is a shadow hanging over the room, namely that 'Bob' hasn't been involved.

Bob doesn't necessarily have anything to do with the solution. He's a VP and the entire room has been discussing implementation details of a project that has a smaller budget than what Bob makes in a month. Yet, no one wants to make the decision without Bob's input. The meeting ends with one of Bob's direct reports (or worse yet, someone who reports to one of Bob's direct reports) being assigned to go find out what Bob wants to do as his opinion is really the only one that counts, despite him not having the expertise or situational knowledge to make an informed decision.

I'm not a person who thinks that keeping a Bob in the dark is a good idea, but I'm also not one who advocates taking every decision to Bob, either. Taking every decision to Bob leads to decision paralysis and then project paralysis. If no one but Bob can make the decision and Bob is unavailable, work grinds to a halt. So when should a decision go to Bob?

When I'm deciding if something needs to go to Bob, I ask the following questions:

First, what is the impact? Is it big or small? How many people does it impact? Will it have a negative impact on another team, department or division within the organization? If the impact is small, both in people and reach, the decision doesn't really need to go to Bob.

Second, what does it cost? This one is a bit trickier because costs can be difficult to quantify. Is the cost a direct cash outlay? Is it for a deferrable purchase? What are the spending limits imposed by company policy? What are the savings? Which department is paying for it? If the purchase fits within the guidelines of the people assembled to make the decision and Bob has previously authorized similar purchases, then it's probably safe to say that it doesn't need to go to Bob.

Third, what about timing? It's rare that a decision must be made right now, but it does come up. If Bob isn't available and the decision must be made now, who makes it? Bob's boss? Bob's boss's boss?

Last, what is Bob's personality? Is he a person that expects every decision to go to him? Is there a way to help Bob understand that certain decisions really don't need his input? Is it worth sitting down with Bob to outline exactly what should be brought to him and what shouldn't?
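
Purely as a hypothetical sketch (the thresholds and field names below are invented, not a real policy), the first two questions can even be written down as a tiny escalation check that the team agrees with Bob up front:

    from dataclasses import dataclass

    @dataclass
    class Decision:
        teams_affected: int        # how many teams feel the impact
        crosses_departments: bool  # does it spill outside our own department?
        cost: float                # direct outlay, in dollars

    # Invented thresholds; in practice these come from an up-front agreement with 'Bob'.
    DELEGATED_SPEND_LIMIT = 10_000
    MAX_TEAMS_WITHOUT_ESCALATION = 2

    def needs_bob(d: Decision) -> bool:
        """True if the decision should be escalated to the HiPPO.

        Timing and Bob's personality are deliberately left out: they determine
        who acts when Bob is unreachable, not whether the call is his to make.
        """
        big_impact = d.teams_affected > MAX_TEAMS_WITHOUT_ESCALATION or d.crosses_departments
        big_cost = d.cost > DELEGATED_SPEND_LIMIT
        return big_impact or big_cost

    print(needs_bob(Decision(teams_affected=1, crosses_departments=False, cost=2_500)))  # False

The point isn't the code; it's that once the impact and cost thresholds are explicit and agreed, most decisions never need to wait for Bob at all.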

These questions really are not easy to answer, despite seeming to have easy answers. The same question, asked in the context of a small upgrade project versus a company-wide strategic project, may change whether Bob should be included in the decision-making process.

What questions would you ask before going to 'Bob' for an answer?

2 March 2011

Checklists are good

In recent months I've been recommending that people improve the quality of their work through the use of checklists.  Checklists are useful for a number of reasons.

In the first instance, checklists help us be complete and avoid mistakes of omission.  Our work is complex and we are often working under the pressure of tight deadlines or dealing with many concurrent parcels of work.  In these circumstances the complexity of our situation presents many, many opportunities to make mistakes.  Here, checklists help us make sure we cover the basics and handle them effectively.

Taking this one step further, filling in a checklist on a form - or, even better, verbally running through it with a colleague - elevates the quality of the check, because your peer or reviewer can help you by challenging you and keeping you focused (a little like a deadline).

And thirdly, the act of creating a checklist - or improving an existing one - is great because to create one you need to research and learn from your peers in the industry.  If you as an individual or as a group of team mates construct a checklist, you have researched, formed opinions, evaluated alternative ideas and then made judgments about what is appropriate to your situation.

So, having gone through the act of giving this advice so many times recently, I discovered The Checklist Manifesto: How to Get Things Right by Atul Gawande.

(I haven't read it yet, but it's in the backlog.  Right behind Capers Jones's Applied Software Measurement and Jurgen Appelo's Management 3.0.)