Over the years, we have worked on many projects that came with robust requirements provided at the outset. And of course, by robust, what I mean is a deeply researched, well-documented requirements deck assembled by a crack team of business analysts. So how is it that these projects are often less successful than one would expect?
It could be that traditional information technology requirements focus on uncovering the wrong things and asking the wrong questions — a focus on process and analysis instead of needs and insights. Everyone who has ever read a 100+ page requirements document and truly felt they understood what a project needed to be and how it should be designed, please raise your hand. Anyone? Ok, I see a couple of hands. Did the ultimate realization of the project mirror your understanding? And, more importantly, was it successful? Our experience says that it rarely, if ever, happens this way.
However, our recent experience has shown us a different way to get to that Holy Grail of a successful outcome. By championing business needs and user needs early and often, we decrease the risk of an unsuccessful outcome by targeting concepts such as value, efficiency and motivation rather than compliance, business cases and stories that are ostensibly about users but are really just reworked use cases or traditional “requirements”. As well, if you have the opportunity to conduct real research with your users, the insights generated are often far more valuable than analysis of existing data or SMEs’ interpretations of someone else’s needs. Insights are always more valuable than assumptions.
Here’s a brief example comparing two projects, both internally focused initiatives for past enterprise clients. Both needed to improve their customer service agents’ efficiency, and both were faced with excessive swivel chairing between applications and saddled with legacy systems.
The first project started off well enough, with exhaustive task analysis with the “people in the trenches” and a good understanding of the scope and the opportunity. We built a robust model of the tasks required by a dozen different roles to serve customers. We identified the high-value tasks that take the most time and are done most often. At this point, however, it devolved into a traditional business analysis wasteland, and the initial phase of implementation was driven more by the selection of a technology vendor than by how the new application could best help the actual users.
The most telling point was that, instead of selecting an entire task based on its value to the users and creating solutions that streamlined it into one new, highly efficient flow, a variety of sub-tasks used across many tasks were aggregated into solution sets. So, instead of creating value (and efficiency) for the user, the approach created efficiency in implementation. If the goal is to decrease “swivel chairing” and time on task, shouldn’t the focus be on optimizing for users’ needs, not your IT team’s timeline and vendor selection?
The outcome of this first phase of a multi-phased project didn’t produce significant short-term efficiency gains; we had simply swapped some of the existing tools for new ones. Even after the learning curve was overcome, the overall value remained greatly reduced until all the diverse applications were replaced.
The second project also started off with deep task analysis and ethnography. We created a similar analysis of the tasks and sub-tasks that agents needed to do as part of their job. We then decided which of these tasks was the most challenging in their current environment and which solution would also provide the most value (and revenue) to the business. We settled on a specific upsell process based on failed transactions. The agents identified these customers based on certain criteria and then contacted them directly to offer a higher-margin product with fewer qualification criteria. The identification and tagging of potential customers and the tracking of success rates was a completely manual process using multiple tools that weren’t created for the task at hand.
The manual nature of the tools meant there were frequent “collisions” where multiple agents were trying to contact the same potential customer at once. So it was frustrating not only for the agents, but at times for the customers as well. We created a system where the customer experience was seamless and the agent was able to execute their job simply and efficiently. The new system algorithmically determined potential customers and assigned them to available agents. The agents were able to manage their own queue and easily change the status of their potential customers.
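The core of the flow described above — each failed-transaction lead is claimed by exactly one agent, eliminating duplicate contact attempts — can be sketched roughly as follows (all names and structures here are hypothetical illustrations, not the actual system):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Lead:
    """A potential customer flagged from a failed transaction (hypothetical model)."""
    customer_id: str
    status: str = "unassigned"   # unassigned -> assigned -> contacted/closed
    agent: Optional[str] = None

class LeadQueue:
    """Central queue so a lead is owned by at most one agent (no 'collisions')."""

    def __init__(self) -> None:
        self._leads: list[Lead] = []

    def add(self, lead: Lead) -> None:
        self._leads.append(lead)

    def claim_next(self, agent: str) -> Optional[Lead]:
        # Hand the oldest unassigned lead to this agent; once claimed,
        # no other agent can receive the same lead.
        for lead in self._leads:
            if lead.status == "unassigned":
                lead.status = "assigned"
                lead.agent = agent
                return lead
        return None

queue = LeadQueue()
queue.add(Lead("C-1001"))
queue.add(Lead("C-1002"))

a = queue.claim_next("agent-A")
b = queue.claim_next("agent-B")
# Two agents pulling from the queue get two different customers.
```

The design choice is the point: because assignment happens in one place instead of in each agent’s head, two agents can never end up dialing the same customer, and management can read status and close rates straight from the queue.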
Management was finally able to automatically get immediate and reliable data on closing rates. They were also able to easily add more outbound campaigns for the sales team with minimal effort. In the first week, the department was able to realize a 1,000% (yes, 10x) improvement in conversions over the previous 3 months. The time required to implement a new outbound campaign went from several hours to minutes. Success tracking (on which bonuses were based) went from a completely manual process with little or no controls to a completely automated one. Employees were able to be much more successful at the outset, and they not only increased revenue for the company but increased their own income from bonuses tied to success rates. Everybody won.
Why was the second project more successful despite seemingly equivalent research? For internally-focused applications, like a call centre, creating end-to-end solutions for specific tasks will always produce dramatic efficiencies for agents. By focusing on task-oriented interfaces that are purpose-built to solve an end-to-end process, the opportunity for efficiency is significantly increased.
Next time you have a custom internal tool to redesign, make sure you understand the tasks your agents are completing, and replace complete task flows (vertically), not sub-task by sub-task. The only way to understand those task flows is through thorough research and user needs analysis. Then make great decisions based on your users’ needs, not on technical or platform requirements.