Build a client-facing MVP that lets clients access, monitor and optimise their Digital Assistant campaign performance.
Digital Assistant reporting: building an MVP from scratch
Project lead and senior product designer
Multidisciplinary team of 12, consisting of technical architects, full-stack developers, product managers and web / product designers
May 2017 - Aug 2017
- Reduce the workload for Ve’s account managers by empowering clients to act upon their own data
- Unify Ve’s various products in a single place by aligning legacy code with our latest technology
- Set a foundation for Ve's first SaaS offering by iteratively increasing functionality to this reporting platform
- Implement a new product design process
- Break down the silos between departments and territories
Ve’s existing reporting tool was a disjointed experience.
Originally designed and built in 2012, the existing reporting tool displayed campaign performance data helping Ve's Account Managers optimise their client's Adtech and Martech campaigns.
Without a product roadmap or vision, features were prioritised over focus, creating an overly complex user journey filled with work-arounds. The existing reporting tool hadn't been updated since its release and struggled in all areas of Stephen Anderson’s UX Hierarchy of Needs: functional, reliable and usable.
Through internal discovery research, it became increasingly unclear who the target users were and what goals they wanted to accomplish with the existing reporting tool. Google Analytics confirmed that usage patterns didn’t correlate with its design: users jumped between pages and often used search to find content buried deep within the information architecture.
Although the existing reporting tool was originally intended as a client-facing product, we found the AMs' lack of trust in it discouraged them from sharing login credentials with their clients.
Rather than strain good commercial relationships by relying on an unfit product, Ve's AMs took on an extra 600 minutes per client (roughly 200 hours across their 20 clients) to manually export, manipulate and send data to their clients via spreadsheets.
Finally putting clients in the front seat, and letting Ve's AMs help us navigate
We realised that surfacing performance data in a simple, accessible manner for a global audience was a huge undertaking. We wanted to empower Ve's clients to act on their data, not passively rely on their AM's opinion.
We needed a solid definition of who was at the core of the product. Although we weren't focusing on AMs' needs, they used the existing reporting tool to achieve a similar goal to clients: optimising campaigns based on performance data. If we could answer client pain-points about synthesising and actioning their data, we'd also answer around 80% of our AMs' needs.
That said, for Platform to be a client success we needed buy-in from our AMs and the wider commercial team. If AMs didn't trust our new product, they wouldn't risk their client relationships by sharing Platform. And without client feedback, we couldn't iterate.
We needed to identify customer needs, goals and frustrations to build up our knowledge of the client
In 2017, Ve shifted from a commercial-heavy strategy to a product-led, web-first offering. This meant building a reliable, client-facing reporting tool that clearly displayed campaign performance data for all three flagship solutions.
Historically, Ve had never spoken to its end users, so the business was very excited to try a new direction. The quicker we could get prototypes in front of users, the quicker we could start understanding who was using our product and what goals they wanted to achieve. We planned a Lean process, aiming to finish design in 10 weeks.
Kicking off phase one with cross-functional workshops to understand the problem space
The damage from the existing reporting tool was too extensive to reverse. We decided to redesign everything: an entirely new UI, a new information architecture, and even a new underlying database capturing and populating our data.
This project was not scoped as a 'new look and feel' for the existing reporting tool; it was a completely new start. We hoped this approach would create deeper relationships with our AMs, encouraging them to collaborate on the project.
Much like Google’s Sprint mentality, we held ‘expert interviews’, leveraging the expertise of our developers, designers, AMs and representatives from other departments. Major risks, opportunities and goals were collected from stakeholders and themed to prioritise project activities.
- Outlining our assumptions about user pain-points gave us initial feature ideas and solutions
- Ve’s commercial team provided a first-hand look into the manual reporting process, helping us prioritise features
- Developers commented on feasibility and provided time estimates
- Identifying major business questions framed our client research and usability testing sessions
- Sharing voices gave Ve’s employees ownership of Platform, increasing adoption rate and trust
- Agreement on a prioritised list of hypothesis statements gave us success criteria to validate or dismiss feature ideas
- Mapping potential features directly to business goals scoped the MVP: unless features directly added value, they were not considered
These experiences formed our base for rebuilding the information architecture and making sure the most important features were always obvious and emphasised.
Ve spans 25 countries, so including people from various territories was vital in identifying territory-specific needs and challenges. To understand the nuances of each territory, commercial stakeholders provided their current reports (examples of what they were sending to clients). Identifying these differences ensured Platform covered the global needs of our customers.
As we began defining the MVP, internal discussions had never been more exciting. We had this chance to confront and properly challenge not only Platform's foundation, but also the foundation of Ve itself. Collaboration on this scale had never taken place, and Ve wanted to use the project as an opportunity to align internal strategy from different departments and territories.
Using prototypes as discussion tools, not end results of a design process
With a process of quick, prototype-driven iterations, we managed to get a feeling for the product. Rapid testing of paper wireframes kept Platform moving forward with confidence.
Sharing prototypes early with various territories ensured our designs were usable for clients around the world, and user flows ensured each persona's journey was consistent.
Working hand-in-hand with our development team and the Ve API, we prototyped with real data and content, deliberately keeping things as rough as we could.
All of a sudden it wasn’t about control and structure, typography, colour or dimensions: it was about communicating a bigger picture, a more direct contact with the content and tools we had at hand, and the core of how a feature worked.
These user sessions helped validate and disprove many of the assumptions detailed in our proto-personas from our kick-off workshops.
We spoke with some of our biggest UK clients, and our key takeaway was simple: they’re busy people who use Ve within a wider suite of marketing tools. This realisation was extremely eye-opening, as upper management had assumed our clients were checking their campaign performance daily.
Findings were documented as jobs, pains and gains, and shared across departments to help align global marketing and commercial strategies.
Simplifying the interface based on customer's goals
Detailed information on the homepage, like traffic sources and device breakdowns, was a good example of over-informing, where the details could sometimes be overwhelming. Statistics around where traffic originated and traffic split by device required an understanding of what each of these numbers and symbols meant, and how they related to the campaign itself.
While traffic and device information created context around campaign engagement, in user testing we saw it confused people, who didn’t understand how each of these numbers was distinctly useful when deciding whether to increase their marketing budget.
Adding a large number of unfamiliar elements that new users had to wrap their heads around gave the impression of a very technical product. Seeing as our users only spent about 10 minutes looking at Ve’s performance information, we needed to simplify.
This resulted in a careful consideration of which metrics, metadata, and context to include when presenting content. We stripped the dashboard down completely, focusing only on key information that clearly served our users' needs.
We identified 'metrics and definition standardisation across territories' as a major Platform blocker: how England and the European markets counted 'clicks' and 'conversions' greatly differed from Japan and our other Asian markets. To combat this, surveys were sent to all major commercial stakeholders across every territory to define these major reporting metrics. Once defined, these definitions and metric calculations were included in our low-fidelity wireframes for testing.
Releasing the MVP
Challenges with Lean projects
We understood the risk of running lean. Clients could now access their Digital Assistant metrics at any time, but still had to rely on their AM for Email and Ads performance. This led to an inconsistent user experience and reduced Platform's appeal to AMs and their clients. Adoption became difficult both internally and externally, because our AMs were sending spreadsheets of email performance information alongside a ‘log-in’ link.
Another challenge we identified in such a fast-paced project environment was the lack of a proper brief from our PM. A majority of the project thinking, planning and identification of constraints occurred mid-project, which slowed everything down. This was something we fixed during V2 and V3.