The Brutally Honest Guide to Product Management

"All the responsibility and none of the authority"...This is the muttered mantra of the product manager. I've collected my battle scars from 26+ years of start-ups to Fortune 50 companies. I'm sharing 'em all, semi-edit, to let the next gen avoid some of the hidden traps and find ways to smooth over the rough patches.

Tuesday, January 19, 2010

Product Definition-The Eternal Battle: Data vs. Instinct

Product definition…it is the most basic job a PM has, and yet NOTHING is more potentially dangerous. A perceived weak product definition screws with getting projects funded, with getting the teams working in alignment without passive resistance, and with management and boards coming in and changing direction after every tradeshow or competitor news story. (Notice I said perceived weak, not actually weak. You can have a great product definition and vision, but if it is not accepted by the folks who hold the purse strings, they will keep changing direction on you based on their latest back-from-the-trade-show fire drill. And if it isn't both clear and agreed to by the whole team, people either go off in their own directions, do end-runs, or just get frustrated because they think you don't know what you are doing and end up punching the clock.) On the other hand, a strong product definition that everyone buys into gives you a team that can be creative in meeting that goal, actual alignment between marketing, product and engineering, and a lot of moral strength to keep the team from getting jerked around from above, even at the board level.

The core of the problem is this: None of us know what we are doing!

I’ll let that sink in for a minute, and before you get too defensive about it, let me provide some context.
My wife is a Genetic Counselor. She lives in a world dedicated to discovering the objective truths in situations that often involve people making incredibly difficult decisions for themselves or their current or potential future children. Over the years, she has looked over my shoulder at the various methods I’ve used to arrive at the correct product decision with a sense of amused disbelief. And when I finally looked at it through her eyes, I had to admit she was right…we’re all just making this up. There are no objective facts here, just our best guesses. (And yes, I’m including most of the qualitative and quantitative methods in this. The truth is that we attempt to lay a scientific, quantitative method over what is at its core an incredibly subjective set of feelings from what is, by its nature, a much too small sample of our potential customers.)

Anthony Ulwick, who actually has the method that I like the most and will talk about a little later, tells a story that illustrates this incredibly well. Back in the mid-eighties he was part of the group at IBM that launched the PCjr. IBM was one of the best consumer marketing companies around, and they prided themselves on how well they knew how to do VOC research. What they laid was one of the greatest bombs in PC history (I think only exceeded by the quarter billion Philips blew on CD-i), and it really shook him. How could IBM, one of the greatest customer research companies of all time, miss the mark that badly? And as he looked at the wider use (beyond just the tech world) of the standard VOC methods of figuring out what to make, he saw that even perfectly applied, the success rates were something like 20%.

This is a very disquieting fact for most of us. We want to feel like we did the right research, talked to the right people, and got the right voice of our customers so that we either A) have a successful product, or frankly B) can point to something outside of ourselves to blame if we get this wrong. I’ve seen way too much of the latter.
Instinct vs. Data-Driven Product Definition:
There are two basic schools of thought on how to do product definition: gut vs. statistics. These two camps have very little respect for each other. Here is a bullet from a recent job posting for a PM role at Verisign:
• Gathering and analyzing internal and third-party data to make objective, fact-based recommendations that improve products relative to the defined strategies and objectives.
Can’t you just hear the disdain dripping…clearly whoever wrote this was tired of meetings filled with people chest-bumping each other to show that they knew best. When I was there, Adobe was that kind of company. To get a product approved, you needed to show that there was market demand by heading over to the research department to cull the analyst world (click here for a piece I did on how that sausage is made) to build the obligatory Venn diagrams, and by digging up time and funding with the customer research group to either do, or more likely piggyback on, some other bit of research to try and dope out what people want. (I’ll write a piece later on the various ways different companies and funders need to see projects before they approve them.) This VOC research is the way most big corporations have been driving product definition for years, and it creates a semi-impermeable shield to enforce a direction on the whole team. (“Hey…look, we did the research, and this is what the customers said…”) The challenge with this, and I’ve seen it WAY too much, is that customers can respond to what they think about a given product, but if you ask them to look into their future needs, they frankly don’t know what to say…they have lots of individual opinions, but what gets created ends up acting as a Rorschach test, with each stakeholder in the company using the data to justify their own feelings about what should be done. And also, to be brutal about it, it’s not the customers’ job to tell us what to build…it’s their job to let us know what they want to get done at the end of the day. It’s our job to get clever about how to solve that.

Which brings us to the second school of product definition: instinct. One of my best friends, an ex-V.P. of Engineering at Adobe, pointed out that he thought I was such a good product definition guy because I ended up method-acting my customers. (Despite starting out as a Physics major, I ended up with my BA in Theater, much to my father’s dismay.) He was probably right, but that runs into major problems of its own. There are two parts of the product process where everybody involved thinks they know best: UI and product definition. Any UI guy who’s had the whole world, one after another, stand over their shoulder and tell them to “move that up just a bit” knows the revenge fantasy of going over the coder’s shoulder and telling them to just “change that from a semi-colon to a colon” and see how THAT works. For those of us in charge of product definition, the situation is the same…everybody “knows” what we need to do, and it is that clash of instincts that leads to real problems: ongoing control battles with lots of shifting direction, other people in the project waiting around for any perceived sign of weakness to jump in and push things toward what they think is the right way to go, or frustrated members of the team slipping into passive resistance, more interested in showing why something won’t work than in trying to make it work.

One of the ways many companies deal with this is by hiring someone into the PM role who comes from the field the product is aimed at. The notion is that they have the moral authority to follow their instincts because they are the customer, and everyone else has to defer to their pronouncements. That can work, but it has a couple of significant downsides:
  1. They are very rarely experienced PMs, and if they were that good at the target profession, why are they leaving it?
  2. It is still one person’s instincts, and the minute they prove wrong in any way, their credibility goes and the project falls apart.
  3. Often they will also try to hire marketing folks from the same field, and suddenly there are two experts who can clash…which also shakes confidence in the direction.
Another approach to getting everyone in alignment is sheer force of will. A PM or company founder can project enough force or confidence that people will follow. The downside is unspoken passive resistance, or a complete collapse in confidence when directions need to change. This can lead to an increasing use of the stick, which can work well for death marches, but you’d better have a good flow of new employees, because you are going to need them once you’ve used up the ones you have now. When I was younger, I tended to lead with passion and conviction, and this worked as long as I had the confidence of management and the team, but it has gotten harder and harder over the years, and it runs into big resistance within big companies and with a fairly large segment of the VC and angel community. Founders are about the only ones who are allowed to make the great leaps based on instinct alone, and they have to either self-fund or convince the money to trust them. There was no good logic behind Acrobat, and if Chuck and John hadn’t been the ones to force it through, it never could have passed any of the VOC or market data tests. The same can be said, frankly, for most of the innovative products that the now-cautious companies were founded on.

The worst approach, in my opinion, is through committee and consensus building. This sounds great on paper, but in general it turns out to be a process of finding the safe middle, where there is little chance for market expansion or disruption. Much better to make sure that the goal is clearly focused and let people come together to innovate under that lens. So how do you get that focus if you don’t have one person who is all-knowing, and the voice of your customers is an inconsistent mush with the disconnection from any objective fact usually reserved for economics? (That should really piss off the MBAs, or at least those who aren’t honest with themselves about what the numbers can and can’t tell them. I was selected to do the Apple/Stanford 3-Day MBA while I was at Apple, and the strongest thing I came away with was how much more of a science marketing could be, and how much more creative accounting and projections were.)

For me the answer stems from rephrasing the questions asked of the customers. Stop asking them what they want…they have no good way to tell you, and it isn’t their job to get clever about that, it’s ours. Instead, concentrate on paying very careful attention to what those customers want to have done at the end of the day! This is called a lot of things: outcome focus, jobs focus, etc. I really don’t care. The soul of it is paying attention to what things they need to get done and breaking up the steps they take to get there. This isn’t just the physical stuff they need to get accomplished, but the emotional jobs as well. (I hate the fact that so many people concentrate on “pain points” to the exclusion of “joy points”, an equally powerful motivator.)

There is a lovely book called What Customers Want, by our good friend Anthony Ulwick, whom I talked about earlier. He does a great job of both laying out this problem and actually pointing to a set of methods that bring a bit of balance between instinct and science. I’ll go through them in more detail in a later post. I’ve already gone way too far in this one, but the essence is spending time listening, reconstructing, and slicing and dicing the users’ needs apart in a qualitative, fairly anthropological way. Blend in what the team and all the stakeholders feel is the right direction. Then take those pieces, put them in a fairly consistent form, and do a quantitative pass to end up with a list of the jobs that are both really important to people AND where they are not happy with the methods they have in front of them now. From there you have a set of things you are trying to do for folks that has been validated enough for you and the rest of the team to believe in, and you and they can go hog wild figuring out clever solutions for those very focused problems. I’ll run through a couple of examples in later postings.
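If you want a feel for what that quantitative pass can look like, here is a minimal sketch in Python. The job names, the survey numbers, and the simple importance-plus-unmet-need scoring are my own illustrative assumptions, not Ulwick’s exact method (his book spells out the real thing), but they show the shape of it: rate each job for how important it is and how satisfied people are with today’s options, then let the important-but-underserved jobs float to the top.

# A minimal sketch of the quantitative pass described above.
# Assumptions (mine, not Ulwick's exact method): each job-to-be-done has been
# rated by surveyed customers for importance and for satisfaction with the
# solutions they use today, both averaged on a 1-10 scale.

jobs = [
    # (job-to-be-done, avg importance, avg satisfaction) -- made-up numbers
    ("Share a large design file with a client",        8.7, 3.2),
    ("Keep font rendering identical across machines",  7.9, 4.5),
    ("Archive a project so it reopens in five years",  6.1, 5.8),
    ("Print a quick proof on the office printer",      5.2, 8.0),
]

def opportunity(importance, satisfaction):
    """Score a job so that important AND poorly served jobs rank highest."""
    unmet_need = max(importance - satisfaction, 0)
    return importance + unmet_need

ranked = sorted(jobs, key=lambda j: opportunity(j[1], j[2]), reverse=True)

for name, imp, sat in ranked:
    print(f"{opportunity(imp, sat):5.1f}  {name}")

The point isn’t the particular formula; it’s that the whole team ends up staring at the same short, ranked list of underserved jobs instead of arguing over whose instinct wins.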

Getting late now…see you in a couple of days…
B
Future topics
  • Acting in Product Definition
  • Intention vs. action…
  • Outcome-Driven Innovation
  • Looking at the whole problem-not a slice
  • Ignoring the voice of the customer
  • Innovation Frustration
  • Agile out of control-Agile as capitalism
  • Project Thrash - bungee-boss course changes vs. being agile and responsive.
  • Agile vs. long term planning.
  • Emotional Customer intentions…communicating them to the team
  • Pretty pictures, great facades, bare-bones machines, lots of #’s, or just passionate teams…what gets the $$ in the bank?
  • What screws up a project?
  • Bad news: Pollyanna or riding the wave with them.