Failure: The Secret to Success

Honda has produced a series of short ‘documentaries’ promoting Honda innovation. I’m usually quite critical of this kind of marketing content masquerading as entertainment, but having watched two so far, I can say they are well produced, engaging, and only tangentially about Honda.

This one is about failure. I don’t think about failure much, even though some might consider that I’ve been around it quite a bit. Many companies that I’ve worked for didn’t make it, or had to cut back during hard times, but I never thought of them as failures. With each one, I often had great results with my piece of the company puzzle, met brilliant and talented people, and learned a hell of a lot. For me personally, that’s a success.

Projects come and go. I try and make sure that whatever I spend my time on is going to give me a great experience, draw on my strengths and knowledge, and give me new challenges to learn from. If it doesn’t work out, I’m very disappointed, but I take what I’ve learned to the next opportunity.

Oh, and it needs to pay the mortgage. Can’t forget that one.

From Salon (login or adroll required) via Neelakantan at Interim Thoughts.
Listen, Don’t Obey

As Tim correctly pointed out in his comment on the previous post, waterfall doesn’t exactly have a stellar record of success. In my experience, a major cause for that is shortchanging the upfront research that is necessary to ensure the success of subsequent stages of development. As I said in my comment, often the proper upfront research that would shape the vision of the product based on business need and customer understanding is sacrificed to meet aggressive deadlines and tight budgets. It’s no wonder that the final product is underwhelming, despite the best efforts of everyone on the team.

If these prior adventures in development have failed, why not try something radically different, like agile? If the organization is not willing to put the effort into a proper upfront research and design phase, then build something and hope you get useful feedback on it, right? It sounds good, but then why do we still find some of the same problems?

Tim hit it again. Often customers, or users, can’t tell us what they want, even when we do ask, or what they say they want is not the best way, or even a good way, to solve their problem. This can happen in waterfall or agile. Customer requests come from their perspective, shaped by their domain expertise. They are experts at their tasks, but we are supposed to be experts at designing interactions. The key is to understand the problem and solve it, not just implement what our users request. Maybe the users’ suggestion is spot on. At the very least, these suggestions tell us where to focus our attention. We’ll likely find where pain can be removed or gain can be added. But when it comes to designing the solution, we need to bring our advanced knowledge of computer interactions to bear and conceive of approaches that may be better than anything our users could imagine.

While working on a call center application, I spent many hours sitting with agents, listening in on customer calls, and observing how they worked. Between calls, I asked a lot of questions about what I heard and what I observed. After only a day or two of this, I had a very good understanding of what their work was like and what their challenges were.

When I asked them what they wanted, they typically mentioned incremental improvements to their current tool, an old ‘green screen’ mainframe application. In fact, when asked if they wanted digital catalogs to replace their large rack of physical catalogs, they categorically said ‘no’. They wanted things similar to how they were. As one agent put it, “I don’t like change.”

However, the time I spent with them plus my understanding of the business challenges led to a radical design in a Flex environment that was light years ahead of what they were used to, and it did include digital catalogs. So how would the agents respond, especially given that we did something they expressly said they didn’t want?

We tested a rough prototype with a handful of novice and experienced agents. The feedback was overwhelmingly positive. At the end of the hour usability test, even the woman who didn’t like change wanted to know when the system was going to launch. She felt it was a dramatic improvement over what she had been using, and had mastered, over the past 13 years as an agent.

So do talk to your users. Ask them what they want. But when it comes to defining the solution, remember that you are the expert at building tools. Just be sure to test your design to make sure you got it right.