Spotlight | August 21, 2018

What's Holding Retail E-Commerce Teams Back?

Ecommerce teams, you know you’ve got it tough. With limited resources and a barge load of data, you’re expected to maximize on-site conversions and keep your site’s UX tip-top, using nothing but your wits, your experience and a few software tools.

But you know this already. The reason you bother reading blogs like this is that, shock horror, you may actually want to hear tangible solutions to the challenges you face.

Sit tight, we’ve got your back. In this three-part blog series, we’ll cover how your team can navigate some of the most pressing issues holding you back in ecommerce.


Challenge 1: Adopting the right approach to A/B testing



“STOP A/B TESTING, YOU’RE WASTING YOUR TIME!”

Matt Henton, Head of ecommerce, Moss Bros

At our Future:Retail rebel conference a few months ago, washing machine enthusiast Matt Henton provoked a partisan ecommerce crowd with this quote during his presentation. It was, of course, largely tongue-in-cheek, but there was a point to be made. Matt’s argument was that most teams waste a hell of a lot of time testing absolutely everything; instead, brands should better prioritize the tests they run, and simply implement the changes they’re very confident will win.

Matt implored teams to just fix “the broken shit” on their sites. A quick win is to check your 404 logs and see if there are recurring issues. If a particular 404 URL is being visited multiple times, fix it before it becomes a bigger problem and affects your revenue and long-term user experience. Matt extolled the need to really understand what users are actually doing (where they’re getting frustrated and clicking multiple times on a particular ‘chunk’ of content, for example).
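If you want to try that 404 quick win yourself, here’s a minimal sketch of what it might look like. It assumes a standard combined-format web server access log (the filename and the `top_404s` helper are our own illustration, not anything Matt or Moss Bros prescribe):

```python
from collections import Counter

def top_404s(log_path, limit=10):
    """Count 404 responses per URL in a combined-format access log.

    Each log line looks roughly like:
    1.2.3.4 - - [21/Aug/2018:10:00:00 +0000] "GET /gone HTTP/1.1" 404 512
    """
    counts = Counter()
    with open(log_path) as f:
        for line in f:
            parts = line.split('"')
            if len(parts) < 3:
                continue  # malformed line, skip it
            request = parts[1]            # e.g. 'GET /gone HTTP/1.1'
            after_request = parts[2].split()
            if after_request and after_request[0] == "404":
                fields = request.split()
                if len(fields) >= 2:
                    counts[fields[1]] += 1  # the requested URL path
    return counts.most_common(limit)
```

Running `top_404s("access.log")` gives you the most-hit broken URLs, which is usually enough to decide which redirects or fixes to ship first.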

What’s holding many ecommerce teams back is the fact that they struggle to quickly answer basic questions about what their customers are doing on site, and why.

Businesses facing these questions often rely on traditional analytics for answers. But traditional analytics can only tell you what customers are doing on your site, not why and how they’re doing it. Some have used session replay tools to attempt to understand behavior, but the same frustrations arise. Brands should look for tools that display aggregated user journeys visually, enabling them to understand why customers are leaving their site as well as measure the revenue and behavioural contribution of any ‘block’ of content. There’s a huge need to understand your golden (or broken) customer journeys, feed actionable insights into test hypotheses, and recognise why tests are winning or inconclusive.

Mud wrestling with HiPPOs

In every organisation a powerful and dangerous animal lurks. The HiPPO.


Does this HiPPO remind you of a certain someone?

The HiPPO (highest paid person’s opinion) effect can have a detrimental impact on testing ideas, and it raises a common dilemma for ecommerce teams: do you test based on the data, or based on opinion?

How many of you typically test a hypothesis for one or both of the reasons below?

“Because I had an idea and I wanted to see if it worked…”

“Because my manager told me to…”

More often than not, testing roadmaps are centered around the above rather than around data that drives, backs, and explains your testing. Digital teams need to come to a single source of truth. Your testing ideas could come from any one of many sources: your trade team, your marketing team, your CEO, or worse, the “that’s just how we’ve always done it” mantra.

Data vs. opinion

Testing is tough, and people are used to doing things a certain way, so it’s hard to convince teams to change their approach. But as a digital team, the reality is that your job is to state facts, not opinions or biased hypotheses. Every brand will have preconceptions about their customers and what they want. Ecommerce teams across all industries need to develop a culture of replacing preconceptions with data.

Teams should trust the data more, but user behavior is nuanced, and numbers alone won’t uncover usability issues. Blindly rolling out every test that performs well can lead to a cycle of chasing quick-fix features and, ultimately, a disjointed product. Two questions arise:

How do you measure test success?

Is conversion rate always the appropriate metric?
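If conversion rate is the metric you do use, at minimum make sure a “win” is statistically real before shipping it. Here’s an illustrative sketch (our own, not a method the speakers prescribe) of a standard two-proportion z-test on A/B conversion counts:

```python
from math import sqrt, erf

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?

    conv_a, conv_b: number of conversions per variant
    n_a, n_b: number of visitors per variant
    Returns (z_score, two_sided_p_value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

For example, 100 conversions from 1,000 visitors on A versus 130 from 1,000 on B gives a p-value just under 0.05: a borderline result that arguably needs more traffic, not a victory lap.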

More than Conversion Rate Optimization?

Success is about more than just CRO, because you want users to return and purchase on your site time and time again. The right approach is to balance CRO with solving users’ problems, which creates long-term value. In an ideal world, you’d run quantitative and qualitative testing at the same time, but how many brands have the time and resources to do this?

Instead, set metric goals before launching a test and then report on these alongside the traditional ‘win/loss’ view. Make sure you don’t get addicted to small wins, either: they’re important, and the cumulative effect can be significant, but you don’t want them to become a barrier to the innovative product work that can delight your customers.


Louise Vallender (Head of E-Commerce) and Steve Thomson (Head of User Experience) at Dune Group believe that, all too often, the classic mistakes retail e-commerce teams make can be avoided. So much so that they partnered with ContentSquare to highlight how brands across every industry can resolve these challenges quickly.

If you want to hear answers (or at least potential solutions) for the above, join us on August 22nd at 4pm GMT.

[Sign up for the webcast here]

Author
Matthew Robinson

Matthew is the UK Head of Marketing based in ContentSquare’s London office. In this role he leads strategic vertical demand generation campaigns and communications as well as occasionally writing about himself in the third person. In his spare time Matthew enjoys jaunts to pretentious pop-ups, desperately tries to cram in as many trips abroad as possible just for the Instagram likes and tries in vain to understand why anyone would have any other pizza topping than pepperoni.
