Daniel Kemery

MVP: Digital Assistant reporting

Platform was divided into three phases, allowing us to release and iterate with speed. Our MVP was an interface to access, monitor and optimise campaign performance for Digital Assistant, Ve's flagship software product.
Project goals: establish a Lean design process involving Ve's customers, bring departments together with cross-functional workshops and set a design system foundation.

Role

Project lead and senior product designer

Team

Multidisciplinary team of 12

Time

May 2017 - Aug 2017

The existing reporting struggled in all areas of UX and needed a complete redesign.

Functional

The underlying database didn't capture or surface the correct information.

Reliable

A lack of trust in the data pushed staff to manually create spreadsheet reports for clients each month (60 minutes per client).

Usable

Usage patterns and research raised questions around the target user and their goals.

Expert interviews across departments surfaced business assumptions.

Workshops refined those assumptions and prioritised stakeholder risks. Mapping potential features directly to business goals scoped the MVP: unless a feature directly added value, it wasn't considered.
Ve spans 25 countries, so including various territories was vital to ensure Platform covered the global needs of our customers. Commercial stakeholders provided current reports (examples of what they were sending to clients) to help us understand the nuances of each territory.
[Image: MVP kick-off workshop]
This project was an opportunity to align commercial and design strategy:
  • Identifying user pain-point assumptions gave us initial feature ideas
  • Understanding the manual reporting process provided context
  • Developers provided time estimates and feasibility
  • Listing questions framed user research and usability testing sessions
  • Sharing voices gave everyone ownership of Platform
  • Agreeing on a prioritised list of hypothesis statements gave us success criteria to validate or dismiss feature ideas
[Image: MVP kick-off workshop, continued]

We believe reducing account managers' workload will be achieved if Ve customers are empowered to act on their own data with live, online reporting.

Rapid testing of paper wireframes kept Platform moving forward with confidence. Sharing early and using prototypes as discussion tools in various territories ensured our designs were usable for clients around the world.
[Images: proto-persona, user flows, paper wireframes]

Early, internal, mid-fidelity testing identified UX issues.

Little understanding of customer priorities led to an overly complicated interface.

A large number of elements gave the impression of a technical product. Since our users only spent about 10 minutes looking at Ve's performance information, we needed to simplify.
Detailed information was important, but it wasn't the emotional driver behind customers checking their campaign performance.
Research confirmed people wanted to know 'everything was okay' and to avoid a surprise at the end of the month.
[Image: early dashboard wireframe]

Data without context leaves people without actionable next steps.

While traffic and device information created context around campaign engagement, people didn’t understand how these numbers were useful when making decisions to optimise campaigns.
This resulted in careful consideration of how metrics and metadata were presented.
We completely stripped down the dashboard, focusing only on key information that clearly aided our users' needs.
[Images: revised dashboard wireframe, whiteboard sketches]

The standardisation of metrics, their definitions and calculations across all territories was a major blocker.

England and the European market were defining 'clicks' and 'conversions' differently from Japan and our other Asian markets.
To combat this, surveys were sent to commercial stakeholders across every territory asking them to define these metrics. Once agreed, the definitions and metric calculations were tested with customers.
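To make the idea concrete, here is a minimal sketch of what a single shared metrics module could look like, so every territory's dashboard computes figures identically. The interface, function names and formulas below are illustrative assumptions, not Ve's actual definitions.

```typescript
// Illustrative sketch: one shared module owns the agreed metric
// definitions, so no territory can calculate them differently.

interface CampaignTotals {
  impressions: number; // times the Digital Assistant was shown
  clicks: number;      // engagements with the assistant
  conversions: number; // purchases attributed to the assistant
}

// Click-through rate: clicks as a share of impressions.
function clickThroughRate(t: CampaignTotals): number {
  return t.impressions === 0 ? 0 : t.clicks / t.impressions;
}

// Conversion rate: conversions as a share of clicks.
function conversionRate(t: CampaignTotals): number {
  return t.clicks === 0 ? 0 : t.conversions / t.clicks;
}
```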

User research prioritised MVP features and design decisions.

Our key takeaway was simple: customers are busy people who use Ve within a wider suite of marketing tools. Customers thought about their Ve campaigns for about 10 minutes a month, which disproved our initial hypothesis that clients were checking their campaign performance daily.
[Image: user research session]
These user sessions helped validate or disprove many of the assumptions detailed in our proto-personas from our kick-off workshops.
Findings were documented as jobs, pains and gains and shared across departments to help align global marketing and commercial strategies.
[Images: personas, user research session]

Key features

Contextual metrics

Standardised metric definitions and calculations across 25 global territories.
[Image: dashboard with contextual metrics]

Time period comparison

Platform allows both standard (for example, month-on-month) and custom time period comparisons.
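As a rough illustration of the comparison logic, the sketch below shifts a date range back one month and computes the change between two periods. The helper names are hypothetical, not Platform's actual API.

```typescript
// Illustrative sketch of period-over-period comparison.

interface DateRange { start: Date; end: Date; }

// Shift a date back by whole months, clamping to the target month's last
// day so e.g. 31 May maps to 30 April rather than overflowing.
function shiftMonths(d: Date, months: number): Date {
  const lastDay = new Date(d.getFullYear(), d.getMonth() + months + 1, 0).getDate();
  return new Date(d.getFullYear(), d.getMonth() + months, Math.min(d.getDate(), lastDay));
}

// A standard month-on-month comparison range.
function previousMonth(range: DateRange): DateRange {
  return { start: shiftMonths(range.start, -1), end: shiftMonths(range.end, -1) };
}

// Percentage change between the current and comparison periods.
function percentChange(current: number, previous: number): number {
  return previous === 0 ? 0 : ((current - previous) / previous) * 100;
}
```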

Fully mobile responsive

Unlike the original 2012 design, which struggled in all areas of UX, Platform is fully responsive across devices.
[Image: responsive dashboard]

View sharing

Multiple people see the exact same information, facilitating productive discussion.
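One common way to implement view sharing like this is to serialise the dashboard's state into the URL, so anyone opening the link lands on the identical view. The sketch below assumes hypothetical field names; it is not Platform's actual scheme.

```typescript
// Illustrative sketch: encode the current view state into a shareable URL.

interface ViewState {
  campaignId: string;
  start: string;      // ISO date, e.g. "2017-05-01"
  end: string;
  compareTo?: string; // optional comparison keyword, e.g. "previous-month"
}

function shareUrl(base: string, view: ViewState): string {
  const params = new URLSearchParams();
  params.set("campaign", view.campaignId);
  params.set("start", view.start);
  params.set("end", view.end);
  if (view.compareTo) params.set("compare", view.compareTo);
  return `${base}?${params.toString()}`;
}
```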

The risk of running Lean is an inconsistent user experience.

Clients could now access their Digital Assistant metrics at any time, but continued to rely on their account manager (AM) for Email and Ads performance. This made adoption difficult both internally and externally, because our AMs were sending spreadsheets of email performance information alongside a 'log-in' link.
Another challenge we identified in such a fast-paced project environment was the lack of a proper brief from our PM. Most of the project thinking, planning and constraint identification happened mid-project, which slowed everything down. We fixed this during V2 and V3.