Daniel Kemery

Email Remarketing reporting

Platform was divided into three phases, allowing us to release and iterate with speed. Our second release centred around Ve's Email Remarketing product.
Project goals: establish a Lean design process involving Ve's customers, bring departments together with cross-functional workshops and set a design system foundation.

Role

Project lead and senior product designer

Team

Multidisciplinary team of 12

Time

Aug 2017 - Oct 2017

How might we include email reporting within the MVP?

Functional

The underlying database didn't capture or display the correct information.

Reliable

A lack of trust in the data pushed staff to manually create spreadsheet reports for clients each month (60 minutes per client).

Usable

Usage patterns and research raised questions around the target user and their goals.

The second phase started with a cross-functional workshop

Integrating these additions into the MVP was our biggest challenge. Ve’s clients often ran Digital Assistant and Email Remarketing together and needed seamless access to both solutions’ performance data.
[Image: platform-mvp-kickoff]
Building on the success of our first few workshops, our team decided to follow a similar structure. Leveraging internal experts, we explored users’ pain points to prioritise the right features in a two-day kick-off.
Day one helped us understand the problem, and day two aligned the team as we rapidly ideated solutions to the identified business and user goals:
  • Increase client retention with a single source of truth
  • Support our position as the leading abandonment solution
  • Increase Platform adoption (remove 600 minutes per month per AM)
  • Be proactive: recommend improvements to clients (increasing our ‘perceived service’)
  • Move to a web-first business with a single login
  • Release Email reporting, a huge commercial and product milestone (instilling ‘design’ trust and value in our AMs)
[Image: platform-mvp-kickoff-two]

We believe account managers’ workload will be reduced if Ve customers are empowered to act on their own data with live, online reporting.

Rapid testing of paper wireframes kept Platform moving forward with confidence. Sharing early and using prototypes as discussion tools in various territories ensured our designs were usable for clients around the world.
[Images: platform-mvp-protopersona, platform-mvp-flows, platform-mvp-wireframes]

Early, internal, mid-fidelity testing identified UX issues.

Little understanding of customer priorities led to an overly complicated interface.

A large number of elements gave the impression of a technical product. Because our users only spent about 10 minutes each month looking at Ve’s performance information, we needed to simplify.
Detailed information was important, but it wasn't the emotional driver behind customers checking their campaign performance.
Research confirmed people wanted to know ‘everything was okay’ and to avoid a surprise at the end of the month.
[Image: platform-mvp-wireframe-one]

Data without context leaves people without actionable next steps

While traffic and device information created context around campaign engagement, people didn’t understand how these numbers were useful when making decisions to optimise campaigns.
This resulted in careful consideration of how metrics and metadata were presented.
We completely stripped down the dashboard, focusing only on key information that clearly supported our users’ needs.
[Images: platform-mvp-wireframe-two, platform-mvp-whiteboard]

The standardisation of metrics, their definitions and calculations across all territories was a major blocker.

England and the European market were defining 'clicks' and 'conversions' differently from Japan and our other Asian markets.
To combat this, surveys were sent to commercial stakeholders in every territory, asking them to define these metrics. Once agreed, the definitions and metric calculations were tested with customers.
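For illustration only, a minimal sketch of what a single, shared set of metric definitions could look like in code (the names and formulas here are hypothetical, not Ve’s production logic):

```ts
// Hypothetical shared metric definitions: one calculation per metric,
// used by every territory instead of locally maintained spreadsheets.
interface CampaignTotals {
  emailsSent: number;
  emailsOpened: number;
  clicks: number;      // unique clicks on a remarketing email
  conversions: number; // orders attributed to a remarketing email
  revenue: number;
}

const metrics = {
  openRate: (t: CampaignTotals) =>
    t.emailsSent === 0 ? 0 : t.emailsOpened / t.emailsSent,
  clickThroughRate: (t: CampaignTotals) =>
    t.emailsOpened === 0 ? 0 : t.clicks / t.emailsOpened,
  conversionRate: (t: CampaignTotals) =>
    t.clicks === 0 ? 0 : t.conversions / t.clicks,
  averageOrderValue: (t: CampaignTotals) =>
    t.conversions === 0 ? 0 : t.revenue / t.conversions,
};
```

The point of the sketch is the principle, not the formulas: every territory reads from the same definitions, so a ‘conversion’ reported in Japan means the same thing as one reported in Europe.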

User research prioritised MVP features and design decisions

Our key takeaway was simple: customers are busy people who use Ve within a wider suite of marketing tools. Customers thought about their Ve campaigns for about 10 minutes a month, which disproved our initial hypothesis that clients were checking their campaign performance daily.
[Image: platform-mvp-usersession-one]
These user sessions helped validate and disprove many of the assumptions detailed in our proto-personas from our kick-off workshops.
Findings were documented as jobs, pains and gains and shared across departments to help align global marketing and commercial strategies.
[Images: platform-mvp-personas, platform-mvp-usersession-two]

Key features

Dashboard

Standardised definitions and metric calculations across 25 global territories.
[Image: platform-mvp-dashboard-two]

Sparklines

Platform allows for standard (for example, month-on-month) and custom time period comparisons.
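As a rough sketch of the comparison behind those sparklines (a hypothetical helper, not the Platform codebase), a month-on-month figure is simply the latest period measured against the one before it:

```ts
// Hypothetical period comparison: given a daily series of values,
// compare the latest period against the preceding period of the same
// length (e.g. month on month, or any custom range).
function periodChange(values: number[], periodLength: number): number | null {
  if (values.length < periodLength * 2) return null; // not enough history

  const current = values.slice(-periodLength);
  const previous = values.slice(-periodLength * 2, -periodLength);

  const sum = (xs: number[]) => xs.reduce((a, b) => a + b, 0);
  const prevTotal = sum(previous);
  if (prevTotal === 0) return null;

  // Positive result = growth versus the previous period, e.g. 0.12 = +12%
  return (sum(current) - prevTotal) / prevTotal;
}
```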

Fully mobile responsive

The original 2012 design struggled in all areas of UX: functional, reliable and usable.
[Image: platform-mvp-dashboard-two]

Performance overview

Multiple people see exactly the same information, facilitating a productive discussion.

Detailed data

Multiple people see exactly the same information, facilitating a productive discussion.
[Image: platform-mvp-dashboard-two]