
Listen, Don’t Obey

As Tim correctly pointed out in his comment on the previous post, waterfall doesn’t exactly have a stellar record of success. In my experience, a major cause is shortchanging the upfront research needed to set the subsequent stages of development up for success. As I said in my comment, the research that would shape the product vision around business needs and customer understanding is often sacrificed to meet aggressive deadlines and tight budgets. It’s no wonder that the final product is underwhelming, despite the best efforts of everyone on the team.

If these prior adventures in development have failed, why not try something radically different, like agile? If the organization isn’t willing to invest in a proper upfront research and design phase, then build something and hope to get useful feedback on it, right? It sounds good, but then why do we still run into some of the same problems?

Tim hit it again. Often customers, or users, can’t tell us what they want, even when we ask, or what they say they want isn’t the best way, or even a good way, to solve their problem. This can happen in waterfall or agile. Customer requests come from their perspective, which is shaped by their domain expertise. They are experts at their tasks, but we are supposed to be experts at designing interactions. The key is to understand the problem and solve it, not just implement what our users request. Maybe the users’ suggestion is spot on. At the very least, these suggestions tell us where to focus our attention. We’ll likely find where pain can be removed or gain can be added. But when it comes to designing the solution, we need to bring our advanced knowledge of computer interactions to bear and conceive of ways to address their needs that may be better than our users could imagine.

While working on a call center application, I spent many hours sitting with agents, listening in on customer calls, and observing how they worked. Between calls, I asked a lot of questions about what I heard and what I observed. After only a day or two of this, I had a very good understanding of what their work was like and what their challenges were.

When I asked them what they wanted, they typically mentioned incremental improvements to their current tool, an old ‘green screen’ mainframe application. In fact, when asked if they wanted digital catalogs to replace their large rack of physical catalogs, they categorically said ‘no’. They wanted things similar to how they were. As one agent put it, “I don’t like change.”

However, the time I spent with them, combined with my understanding of the business challenges, led to a radical design in a Flex environment that was light years ahead of what they were used to, and it did include digital catalogs. So how would the agents respond, especially given that we did something they expressly said they didn’t want?

We tested a rough prototype with a handful of novice and experienced agents. The feedback was overwhelmingly positive. At the end of the hour-long usability test, even the woman who didn’t like change wanted to know when the system was going to launch. She felt it was a dramatic improvement over what she had been using, and had mastered, over the past 13 years as an agent.

So do talk to your users. Ask them what they want. But when it comes to defining the solution, remember that you are the expert when it comes to building tools. Just be sure to test your design to confirm you got it right.
