Daniel Kemery
Ve Global

Case study



About Ve Global

Ve is the complete acquisition, engagement and analytics solution for 5,000 eCommerce brands. Smart solutions seamlessly integrate advertising and marketing technology to connect all the dots in the user journey, drawing on a rich mixture of data to provide you with a complete picture of what an individual is looking for, and why.


Shifting towards 'web-first,' Ve needed an interface for clients and staff to access, monitor and optimise their campaign performance.


Platform provides campaign performance for all flagship solutions, and lays the foundation for Ve's SaaS offering.
Platform was divided into three briefs (one brief per solution) to provide early learnings and value from day one. The idea: deliver reporting on one solution, then iteratively add other solutions inheriting the design system — every addition improving Platform.


Project lead and senior product designer


Two product designers, and development teams located in Spain and Romania


May 2017 - today
  • Digital Assistant reporting: May 2017 — Aug 2017
  • Remarketing reporting: Aug 2017 — Oct 2017
  • Digital advertising reporting: Jan 2018 — Mar 2018


Lean UX


Global stakeholder kick-off

Every Platform brief started with cross-functional workshops and internal discovery research to increase our understanding of the problem space and identify potential challenges. Much like Google’s Sprint mentality, we held ‘expert interviews’ and leveraged the expertise of our Developers, Designers, Account Managers and representatives from other departments. Major risks, opportunities and goals were collected from stakeholders and themed to prioritise project activities.
Kick-off meeting with product managers, developers and commercial stakeholders
Select workshop outputs:
  • Outlining assumptions of user pain-points gave us initial feature ideas and solutions
  • Ve’s commercial team provided a first-hand look into the manual reporting process, helping us prioritise features
  • Developers commented on feasibility and provided time estimates
  • Identifying major business questions framed our client research and usability testing sessions
  • Sharing voices gave Ve’s employees ownership of Platform increasing adoption rate and trust
  • Agreement on a prioritised list of hypothesis statements gave us success criteria to validate or dismiss feature ideas
  • Mapping potential features directly to business goals scoped the MVP: unless features directly added value, they were not considered
Creating and sharing hypothesis statements linking potential features to identified business goals
Ve spans 25 countries, so including people from various territories was vital in identifying territory specific needs and challenges. To understand the nuances of each territory, commercial stakeholders provided current reports (examples of what they were sending to clients). Identifying these differences ensured Platform covered the global needs of our customers.
Select findings:
  • Clients reading right to left vs left to right
  • Clients unable to run particular solutions due to government restrictions and varying internet regulations
  • Finding a typeface to support global languages
  • Number lengths varying with local currencies
  • Different territories reported, defined and classified success in different ways (lack of standardisation)

Visualising workshop outputs

Rapid testing of paper wireframes kept Platform moving forward with confidence. Sharing prototypes early with various territories ensured our designs were usable for clients around the world, and user flows ensured each persona's journey was consistent.
Paper wireframes, proto-personas and user flows helped refine a customer's journey through Platform
We identified 'metrics and definition standardisation across territories' as a major Platform blocker. How England and the wider European market counted 'clicks' and 'conversions' differed greatly from Japan and our other Asian markets. To combat this, surveys were sent to all major commercial stakeholders across every territory to define these major reporting metrics. Once defined, these definitions and metric calculations were included in our low-fidelity wireframes for testing.
Internal testing and sharing ideas early validated the prototype's direction and secured buy-in early from various stakeholders


Ownership through design studios

Similar to kick-off workshops, design studio teams consisted of visual designers, product designers, product managers, sales, account management and development from multiple territories. Design studio output directly influenced low-fidelity sketches, and ultimately Platform's final design.
Select design studio topics:
  • Defining and visualising Platform's information architecture
  • Creating Platform's homepage
  • Visualising multiple data sets
Crazy-eight exercises, idea sharing and refining from various design studio workshops

Going high-fidelity with an ‘Atomic Design’ pattern library

Research confirmed clients use Platform solely for insights and data, so success was defined as limited time spent within the Platform. Repeating common interface elements like graphs, tables and charts created a familiar flow from page to page.
Visual and interaction design audit to identify reusable components and create Ve's first Atomic design system
Establishing a design system as we moved through the Platform project reduced the design fidelity required for developer handoff, as developers could style components and spacing from our standardised set of rules.


Validating workshop assumptions through user sessions

Clients were selected against a specific criteria list (example: ‘x’ amount of daily site traffic) and prioritised by revenue within that specific solution. Because we rely so heavily on client feedback, we needed to ensure the opinions shaping our product came from people using our products daily.
  • 17 total external client sessions (select clients: Etihad, The Times, DigitasLBi, Concertgebouw, MediaCom, Gtech and more)
  • 20 internal tests (over three rounds of testing)
  • 8 different territories (Sao Paulo, Amsterdam, Stockholm, London, Dubai, Moscow, Hong Kong, and Sydney)
Hosting a user session (usability testing and research) with The Times (England) that included commercial stakeholders
Behavioural questions targeted the prototype's assumption areas, the things we had assumed during the project's kick-off workshop. Our user sessions helped us test our assumption-led product with clients from an early stage, increased the usability of Platform, and required clients to pull real-life examples.
Working with our strongest client relationships, we coordinated one-hour client testing and interview sessions. These sessions helped increase the trust Ve's clients had in our organisation, validated market interest for our product and signed up clients for our Beta release.
The clients saw the product team's engagement as a sign of excellent customer service and they were happy that they had the opportunity to provide their input. This in turn improved my relationship with the client and paved the way for further conversation to drive app revenue, so was arguably the best outcome we could have hoped for!
-- George Culff, Account manager from Ve Global


Saying hello

Entering Platform in any of our 25 territories prompts a time-appropriate greeting, in the local language. This helps localise Platform for every user in every market.
Key research takeaway: going above the functional MVP into an MLP (minimum lovable product) helps the product feel more refined, creating a friendlier user experience.
Saying hello translates into 12 different languages based on a client's browser location
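A time-of-day greeting keyed off the browser locale can be sketched as below. This is a hypothetical illustration, not Ve's implementation; the greeting strings, locale keys and hour boundaries are all assumptions for the example.

```typescript
// Hypothetical sketch of a time-appropriate, localised greeting.
// Locale table and hour boundaries are illustrative only.
const GREETINGS: Record<string, { morning: string; afternoon: string; evening: string }> = {
  en: { morning: "Good morning", afternoon: "Good afternoon", evening: "Good evening" },
  es: { morning: "Buenos días", afternoon: "Buenas tardes", evening: "Buenas noches" },
  ja: { morning: "おはようございます", afternoon: "こんにちは", evening: "こんばんは" },
};

function greetingFor(hour: number, locale: string): string {
  // "en-GB" → "en"; fall back to English for untranslated locales.
  const lang = locale.split("-")[0];
  const set = GREETINGS[lang] ?? GREETINGS["en"];
  if (hour < 12) return set.morning;
  if (hour < 18) return set.afternoon;
  return set.evening;
}
```

In a browser, `hour` would come from `new Date().getHours()` and `locale` from `navigator.language`.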

A quick overview

Research uncovered that our clients use Ve within a suite of other marketing solutions, and need to access basic campaign data as easily as possible. Day-to-day, Ve’s clients want to know: 1) what solutions and features are active and 2) a quick snapshot of their performance.
Key research takeaway: clients were unfamiliar with campaign details because they relied on their Ve account manager, meaning Platform needed to ‘handhold’ clients.
A web-first approach should replicate the 'I have your back' approach of working with an Account Manager

Giving the big picture

Platform has a variety of users, each with different objectives. Some of Ve’s clients crave performance data, while others are simply checking campaigns are running as expected. Platform becomes more granular page by page. Piggy-backing off the well-known ‘sales funnel’ pattern Ve’s clients already use, Platform prevents users from feeling overwhelmed.
Key research takeaway: clients spend ten minutes per month looking at campaign data provided by their account manager (meaning they’re more concerned with top-line performance than granular detail)
Platform aims to provide exactly what a customer needs, with an option to see more

Card states

The average Ve client is unaware of the full product offering, which gave us a great opportunity to use Platform as an education tool. Depending on whether a solution is active or inactive, Platform’s cards simply switch from a campaign snapshot to educational information.
Key research takeaway: clients were unfamiliar with Ve’s total suite of solutions, meaning Platform needed to educate clients in our other solutions.
Cards switch between a campaign snapshot and educational content, depending on whether the solution is active
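The active/inactive card switch described above can be sketched as a simple state function. The type, field names and copy here are assumptions for illustration, not Ve's actual data model.

```typescript
// Hypothetical sketch: a solution card either reports on an active
// campaign or educates the client about an inactive solution.
type Solution = { name: string; active: boolean; conversions?: number };

function cardContent(s: Solution): string {
  return s.active
    ? `${s.name}: ${s.conversions ?? 0} conversions this period`
    : `${s.name} is not active yet. Learn how it could work for you.`;
}
```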

Defining metrics and terms

With 25 territories manually reporting on Ve’s solutions, inconsistent jargon was unavoidable. One major challenge we overcame was standardising all product definitions and metric calculations. Every Ve client now reports using the same metrics.
Key research takeaway: clients are unfamiliar with our internal Ve terminology, meaning Platform needed to redefine how we talked about our solutions.
Providing a common language for staff and their clients puts everyone on the same page, regardless of territory

Comparing time periods

Time period comparison is an essential part of any reporting tool and Platform allows for standard (example: month on month) and custom time periods.
Time period comparison helps optimise campaigns, and provides relevant benchmarking for clients
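The "previous period" used for standard comparisons (e.g. month on month) is simply the window of equal length immediately before the selected one. A minimal sketch, assuming UTC timestamps and day precision; this is illustrative, not Ve's implementation:

```typescript
// Hypothetical sketch: derive the comparison window for period-on-period
// reporting. Ranges are inclusive of both start and end days.
function previousPeriod(start: Date, end: Date): { start: Date; end: Date } {
  const dayMs = 24 * 60 * 60 * 1000;
  const lengthMs = end.getTime() - start.getTime() + dayMs; // inclusive length
  return {
    start: new Date(start.getTime() - lengthMs),
    end: new Date(start.getTime() - dayMs),
  };
}
```

For example, selecting 8–14 Jan yields a comparison window of 1–7 Jan.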

Actionable insights

By nature, reporting on campaign performance is passive. Platform aims to be proactive by turning data into insights that direct users to what’s important. Eventually, the ‘insight feed’ will make campaign update suggestions based on past performance.
Key research takeaway: reporting is all about storytelling and actionable steps, meaning Platform needed to be proactive and suggest campaign optimisations.
The insight feed surfaces important, relevant information making every Platform experience productive

Sharing views

Giving clients transparency and autonomy in running their campaigns was the brief's objective, but research proved the ‘human touch’ Ve’s account managers added increased the value of our service. Adding a ‘share link / share view’ feature allows multiple people to see the exact same screen and information, facilitating a productive discussion.
Key research takeaway: the ‘human touch’ increased perceived value in clients’ eyes, and completely automating the process lowered Ve’s value.
Sharing views helps keep the conversation flowing
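One common way to implement a 'share view' feature is to serialise the current view state (solution, metric, date range) into URL query parameters, so anyone opening the link reconstructs the same screen. A hypothetical sketch; the parameter names and base URL are assumptions, not Ve's actual scheme:

```typescript
// Hypothetical sketch: encode a report view into a shareable URL, and
// decode it back on page load. Uses the standard URL / URLSearchParams APIs.
function shareLink(base: string, view: Record<string, string>): string {
  const params = new URLSearchParams(view);
  return `${base}?${params.toString()}`;
}

function parseView(url: string): Record<string, string> {
  return Object.fromEntries(new URL(url).searchParams.entries());
}
```

Because the link carries the full view state, a client and their account manager can discuss exactly the same numbers over a call.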

Designing for mobile

With mobile usage increasing each year, Platform needs to account for every type of media breakpoint. Users aren't 'punished' for using mobile, and specific elements are redesigned to account for smaller screens.
Key research takeaway: clients of all sizes use mobile to ensure their campaigns are running as expected.
Platform has full functionality on all breakpoints, and elements like the 'abandonment funnel' are redesigned to account for these smaller screens