How Sky UK are creating positive change through customer journey experimentation


Katie Leask

July 1, 2021 | 5 min read

At our latest CX Circle event, Simon Elsworth from Sky UK and Isabelle Zdatny from Qualtrics XM Institute discussed how using data from Qualtrics and Contentsquare is helping create lasting change for Sky’s customers.

In an informal question and answer session, Simon, Sky UK’s Senior Experimentation Manager, chatted to Isabelle, Qualtrics XM Institute’s XM Catalyst, about his experience of integrating these customer experience tools to optimize Sky’s digital journey.

Experimentation at Sky UK

Over the last six years, Simon has been building and running Sky’s experimentation program.

“At Sky, it’s not just about running A/B tests on our website,” he says. “Our experimentation program is something we believe needs to be front and center in everything we do.” This means that Sky uses experimentation for all its digital launches and product releases.

Their key focus is bringing together the digital and human sides of the customer experience, ensuring it’s as good as it can be.

And thanks to integrations with Contentsquare and Qualtrics, Sky has seen some incredible results in their experimentation program.


Isabelle: How does digital fit into Sky’s broader customer experience strategy?

“We see our digital offer as a fully integrated part of our customer service,” explains Simon.

“At Sky UK, customer experience is everybody’s problem. Wherever you work, whoever you are, making sure that things are working properly for our customers really is everybody’s problem.”

So whether in digital, customer service, or marketing, creating the best customer experience possible is on everyone’s agenda at Sky – Simon doesn’t see digital as a separate entity.

Isabelle: Can you break down your digital customer experience improvement process? How do you identify where there are issues and pain points?

“I could spend all day talking about this,” laughs Simon. “But one thing that’s really important to us is the integration between the tools we use. We use Contentsquare alongside Qualtrics, and that’s where we get the real value of artificial intelligence in uncovering our customers’ frustrations.”

Sky uses Qualtrics to monitor customer sentiment using direct feedback about their online experience combined with user behavior and website analytics data from Contentsquare. 

Integrating these tools allows them to understand exactly which customer journeys are causing their users frustration, both quantitatively and qualitatively, then effectively prioritize their efforts to improve the experience.

“That’s where we can use data to make a real difference. We can go away and fix the most common problems our customers are having and make their entire journey easier to navigate.”

Using direct customer feedback from Qualtrics, Sky can then quantify the size of the problem with Contentsquare. Then, once necessary optimizations have been made, they can track how the changes impact customer journeys.

“We need to ensure the experience we put in front of customers is doing exactly what we think it is,” continues Simon, and tools like Contentsquare’s session replay allow them to see exactly how their customers interact with website updates.

This helps Sky mitigate risk when launching site changes. “With Contentsquare, we know we can prove whether something is working or not and whether the dev effort we’ve put into something adds value,” says Simon.

This gives Sky great control over their experimentation program, letting them prove (or disprove) each user experience hypothesis with the data to back it up.

What’s more, data from Qualtrics gives them the opportunity to run much better tests.

“Instead of using conversion or call metrics in our experimentation and A/B tests, we’re starting to use really cool things like the proportion of customers that are rating an experience either difficult or easy (through Qualtrics feedback tabs).”

“We can now understand the percentage of customers that tell us Sky makes them feel valued as a customer. And to run an A/B test and be able to prove that metric – rather than just looking at sales and calls – that’s where things become interesting.”
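Whatever the metric – conversions, calls, or the share of customers rating an experience “easy” – an A/B test on a proportion comes down to the same statistics. As a minimal sketch (the figures and function are illustrative, not Sky’s actual setup), a two-proportion z-test comparing the “easy” rating share between two variants might look like this:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-test: is variant B's share of 'easy' ratings
    significantly different from variant A's?"""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pooled proportion under the null hypothesis of no difference
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: 540/900 'easy' ratings on variant A vs 630/900 on B
z = two_proportion_z(540, 900, 630, 900)
print(round(z, 2))  # well above 1.96, so significant at the 5% level
```

A z-score beyond ±1.96 corresponds to the conventional 95% significance threshold, which is what lets a team “prove that metric” rather than eyeball it.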

Isabelle agrees; “One of the huge benefits of digital is having so many tools at your disposal to understand exactly what your customer does online. Then, when you combine this [quantitative data] with their feedback and understand why they did that – it helps you become more predictive and personalize the experience and has all these positive knock-on effects.” 


Isabelle: So, how quickly do you try to ‘fail fast’ to test out new ideas?

“That’s a really great question – as fast as we can!” laughs Simon. “But we have a concept in my team of a ‘minimum viable experiment’. I’m sure anybody that’s worked with product or development teams will understand the minimum viable product (MVP). But for us, it’s ‘what is the least-touch thing we can do today to prove this idea?’”

This might be something really simple, such as asking customers directly whether they’d like the option of a live chat feature. “One of the things we could do really easily and quickly is add a CTA to the site that says ‘Would you like to use live chat?’,” says Simon.

“Then we can gauge what the actual interest is in a new product. The product might not even exist yet, but it’s all about getting the idea out there super fast.”
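A fake-door test like this ultimately reduces to counting clicks on the CTA against impressions. As a rough sketch with hypothetical numbers (none of this is from Sky), a Wilson score interval puts a confidence bound on the true level of interest even at modest sample sizes:

```python
import math

def wilson_interval(clicks, impressions, z=1.96):
    """95% Wilson score interval for a click-through rate -- a bound on
    true interest in a feature that may not even exist yet."""
    p = clicks / impressions
    denom = 1 + z**2 / impressions
    centre = (p + z**2 / (2 * impressions)) / denom
    margin = (z / denom) * math.sqrt(
        p * (1 - p) / impressions + z**2 / (4 * impressions**2)
    )
    return centre - margin, centre + margin

# Hypothetical: 120 clicks on the live chat CTA out of 2,000 page views
low, high = wilson_interval(120, 2000)
print(f"interest rate between {low:.1%} and {high:.1%}")
```

The Wilson interval is chosen here over the simpler normal approximation because it behaves better when click-through rates are small, which is typical for a brand-new CTA.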

Interestingly, one of Simon’s main experimentation goals is to lower the success rate of their A/B tests – a take that often sparks confusion, particularly among senior teams at Sky UK. But there’s a method to Simon’s madness.

“[Unsuccessful tests] means we’re testing bigger, scarier things. It means we’re testing things that we don’t know the answer to. If everything we do is winning, then we’re being too safe. We’re not challenging anything.”

According to Simon, the real magic happens when you find out the idea you thought was dangerous is actually what makes the biggest difference to the customer.


Isabelle: How do you deal with delays in your experiments (for example, when the lead time to set up is a week and then you wait months for the results); does this limit your speed?

“We prioritize our experiments using two metrics,” says Simon. “Firstly, how much support does this idea have around the business? Is it something that one team has come up with or does it have a lot of strategic and senior buy-in?” 

This helps the team ensure they’re never waiting around for results on tests with little weight. When they do launch a new experiment, it’s likely to be something with implications that will make a real difference – something worth waiting for.

The second metric Simon uses is how fast they can get insights. “At Sky, we try to run things that we can get quick learnings from.” This means the majority of Sky’s experimentation program is based around quick wins, small changes, and fast experimentation cycles.


And finally, Simon’s favorite experimentation secret…

Isabelle’s last question for Simon was how to get buy-in from senior stakeholders for experimentation projects.

Simon’s insider secret? “Get other people to do it for you!” No, but really. This is actually a tried and tested business plan, and here’s why…

“If I go and talk to the senior team about experimentation, I come in as the guy that’s trying to sell experimentation. Whereas if my head of finance comes into a senior stakeholder meeting and says, ‘The experimentation team has given me all this data, so I’m happy to sign off on their product development over the next 12 months,’ it changes things.”

According to Simon, this is where pushing experimentation gets really easy. “Because you can guarantee the people in that meeting will be on the phone to you saying ‘How do I get this experimentation data, too?’” Simple.