No one likes to wait. Even at a nice restaurant, if the food takes too long to arrive, you’ll start wriggling on your chair and looking around for the waiter. Worst case, you might even leave.
This is nothing new: people are impatient by nature, and your users are no different. If you ask them to wait too long for their requested content, they’ll enjoy the experience less and may eventually drop off.
But what’s less well understood is the true impact of user experience on your business metrics. Without this understanding, you won’t know how much to invest in optimizing loading performance.
The good news is there’s a straightforward way to measure that impact, and we’re going to walk you through the steps.
What is perceived performance?
Perceived performance is a subjective measure of how well your site performs from your users’ point of view.
When we talk about perceived performance, we’re not concerned with the endpoint latency or the amount of server memory used by the backend service. Perceived performance metrics are, first and foremost, user-centric.
Examples of perceived performance metrics
The most popular examples are Web Vitals, proposed by Google. They aim to provide a unified way of measuring perceived performance across different websites.
Let’s look at one specific metric from that set: Largest Contentful Paint (LCP). The measurement starts when the user requests the page, for example by clicking a link, and ends when the largest visual element on the page, such as an image or a block of text, is rendered on screen.
The element that occupies the most space on the screen is considered the most important for visitors, so it serves as a useful proxy metric for a website’s overall loading performance.
It’s a universal approach that allows you to compare completely different web pages, from a newspaper article to an ecommerce checkout page.
![[Visual] examples core web vitals](http://images.ctfassets.net/gwbpo1m641r7/1vFPo3Ldz2XKO1ZL6PIVzn/aac81a0220e6ddba7338a13e587b5b33/examples-cwv.png?w=1920&q=100&fit=fill&fm=avif)
An example of a Core Web Vitals report
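In the browser, LCP can be observed directly with the standard `PerformanceObserver` API, no third-party library required. Below is a minimal sketch: the `/rum` reporting endpoint is a hypothetical name, and in a real deployment you’d likely use a battle-tested library such as Google’s `web-vitals` instead.

```typescript
// Minimal sketch: observe Largest Contentful Paint in the browser.
// The browser may report several successive LCP candidates as larger
// elements render; the last one seen is the final LCP value.

type LcpLike = { startTime: number };

// Pure helper: the final LCP candidate is simply the last entry observed.
function finalLcp(entries: LcpLike[]): number | undefined {
  return entries.length ? entries[entries.length - 1].startTime : undefined;
}

// Browser-only wiring, guarded so the file also loads outside a browser.
if (typeof document !== "undefined" && typeof PerformanceObserver !== "undefined") {
  const seen: LcpLike[] = [];
  const observer = new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) seen.push(entry);
  });
  observer.observe({ type: "largest-contentful-paint", buffered: true });

  // Report once the page is hidden; "/rum" is a hypothetical endpoint.
  addEventListener("visibilitychange", () => {
    if (document.visibilityState === "hidden") {
      const lcp = finalLcp(seen);
      if (lcp !== undefined) {
        navigator.sendBeacon("/rum", JSON.stringify({ lcp }));
      }
    }
  });
}
```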
When Web Vitals don’t work
In the world of SaaS products, we’re usually more focused on specific customer journeys and jobs to be done (JTBD). Often, it’s not the element taking up the most space that provides the greatest value to our customers.
Consider the example below of a SaaS sign-up page: people go there to use the registration form, although most of the screen area is filled with marketing copy and customer logos on the right-hand side.
![[Visual] Sign up page Contentsquare](http://images.ctfassets.net/gwbpo1m641r7/60ecw8FPqL2UmcJqjMbdyL/8bd6e21a1506f50f858c84d5fdc3d7aa/Screenshot_2025-11-05_at_16.31.33.png?w=3840&q=100&fit=fill&fm=avif)
A sign-up page example from Contentsquare.
In this scenario, the Largest Contentful Paint isn’t the best metric to focus on because the most important element for the user isn’t the element taking up the most space.
Instead, you may decide to define your own ‘web vital’ focused on users, such as the time it takes for the ‘Sign up’ buttons to appear on the screen.
Thankfully, existing tech makes it easy to collect this type of data. For example, the Element Timing API, available in Chromium-based browsers, lets you measure when arbitrary HTML elements render. Shims exist for other browsers, and it’s not difficult to build a custom solution that also works with single-page applications written in popular front-end frameworks.
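Here’s a sketch of how that measurement might look with the Element Timing API. With this API, an element opts in to being timed via an `elementtiming` attribute; the identifier `signup-button` and the `/rum` endpoint below are assumptions for illustration.

```typescript
// Sketch: measure when the sign-up button renders, via the Element
// Timing API. The element opts in with an attribute, e.g.:
//   <button elementtiming="signup-button">Sign up</button>

type ElementTimingLike = {
  identifier: string;
  renderTime: number;
  startTime: number;
};

// Pure helper: find the render time for a given identifier, preferring
// renderTime and falling back to startTime when renderTime is zero.
function renderTimeFor(
  entries: ElementTimingLike[],
  id: string
): number | undefined {
  const entry = entries.find((e) => e.identifier === id);
  return entry ? entry.renderTime || entry.startTime : undefined;
}

// Browser-only wiring, guarded so the file also loads outside a browser.
if (typeof document !== "undefined" && typeof PerformanceObserver !== "undefined") {
  const observer = new PerformanceObserver((list) => {
    const entries = list.getEntries() as unknown as ElementTimingLike[];
    const t = renderTimeFor(entries, "signup-button");
    if (t !== undefined) {
      // "/rum" is a hypothetical reporting endpoint.
      navigator.sendBeacon("/rum", JSON.stringify({ signupButtonRender: t }));
    }
  });
  observer.observe({ type: "element", buffered: true });
}
```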
How to connect the dots
Once you’ve decided which page element is most important to your users, you can start collecting and visualizing data from a sample of real user sessions (a practice known as Real User Monitoring, or RUM).
Option 1: slow it down and see what happens
To understand the impact of potential changes, you need to correlate your business KPIs—in our example above that would be the conversion rate of the registration form—with the performance perceived by users.
One way to do this is to run an A/B test, where one group of visitors gets an artificially slowed-down experience while the other is served as usual. By comparing conversions between the two groups, you can see the impact of the slowdown and, by extension, estimate the likely impact of a speedup.
![[Visual] ab-test](http://images.ctfassets.net/gwbpo1m641r7/1Sc3LiWSypZwP9cX09dHUx/d9f53fa69e679732f41ac2faad836bc4/ab-test.png?w=1920&q=100&fit=fill&fm=avif)
Data from a hypothetical A/B test
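A slowdown experiment like this needs each visitor to land in the same group on every page view. One common approach, sketched below, is to hash a stable visitor ID; the 50/50 split and the 400 ms delay are assumptions, not recommendations.

```typescript
// Sketch of deterministic A/B assignment for a slowdown experiment.
// Hashing a stable visitor ID keeps each visitor in the same group
// across page views.

function hashId(visitorId: string): number {
  let h = 0;
  for (const ch of visitorId) {
    h = (h * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit rolling hash
  }
  return h;
}

function assignGroup(visitorId: string): "control" | "slowdown" {
  return hashId(visitorId) % 2 === 0 ? "control" : "slowdown";
}

// In the slowdown group, delay a critical render step by a fixed amount.
async function maybeDelay(visitorId: string, delayMs = 400): Promise<void> {
  if (assignGroup(visitorId) === "slowdown") {
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
}
```

Comparing conversion rates between the two groups then tells you what a given amount of latency is worth.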
Option 2: divide and analyze
If you don’t want to run an A/B test, you can build a distribution histogram from your RUM data to see how user behavior varies with the performance of your site.
![[Visual] Hypothetical histogram](http://images.ctfassets.net/gwbpo1m641r7/2xj3vaJz1W0wK37lXXa0Uv/75447061c705aa3fd2d83c5eb665c8d9/histogram.png?w=1920&q=100&fit=fill&fm=avif)
Data from a hypothetical histogram
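Building that histogram amounts to bucketing sessions by their perceived load time and computing a conversion rate per bucket. A minimal sketch, where the session shape and the 500 ms bucket width are assumptions:

```typescript
// Sketch: bucket RUM sessions by perceived load time and compute the
// conversion rate within each bucket.

type Session = { loadMs: number; converted: boolean };

function conversionByBucket(
  sessions: Session[],
  bucketMs = 500
): Map<number, number> {
  const totals = new Map<number, { n: number; converted: number }>();
  for (const s of sessions) {
    // Key each session by the lower bound of its bucket, e.g. 0, 500, 1000…
    const bucket = Math.floor(s.loadMs / bucketMs) * bucketMs;
    const t = totals.get(bucket) ?? { n: 0, converted: 0 };
    t.n += 1;
    if (s.converted) t.converted += 1;
    totals.set(bucket, t);
  }
  const rates = new Map<number, number>();
  for (const [bucket, t] of totals) {
    rates.set(bucket, t.converted / t.n);
  }
  return rates;
}
```

If conversion drops off sharply past a certain bucket, that threshold is a natural target for your optimization work.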
Business opportunity and optimization
Finally, you’ll want to tie your performance metrics back to conversion rates so you can monitor your improvements.
Tools like Contentsquare’s Experience Monitoring track performance metrics and technical issues and tie them back to conversion rates, so you can prioritize which changes will have the biggest impact on user experience and your bottom line.
![[Visual] impact quantification at Contentsquare](http://images.ctfassets.net/gwbpo1m641r7/5LEALEky77QcrDsm85VgW0/5d17d7815d304382149959dbd2c65a46/impact-quantification.png?w=1920&q=100&fit=fill&fm=avif)
The team at optical retailer Specsavers used Contentsquare’s Experience Monitoring to track speed-related metrics and ensure visitors had a good experience on their site (and converted).
While reviewing the data, the Specsavers team noticed that users with the slowest LCP scores were likely to bounce immediately and were less likely to make a purchase. They also noticed that users who experienced a faster site were more likely to click around and turn into customers.
Knowing this, Specsavers was confident that putting resources towards LCP improvements would help boost perceived performance, and in turn, increase conversions.
The result? Specsavers saw a 24% increase in ecommerce purchase rates and a 25% reduction in bounce rates.
Measure perceived performance to improve your user experience
Measuring the perceived performance of your product and correlating it with business KPIs can help you find opportunities to improve your metrics. The data needed for this investigation may already be in your data lake, or it can be collected quickly.
Once you identify focus areas, you can estimate the return on investment of performance optimization. The key to success is establishing a shared language between engineering and product: improving speed bit by bit delivers value in small, steady iterations.

Dana is a copywriting specialist with deep expertise in creating assets like blog posts and landing pages that position organizations as the obvious first choice in their market. She holds a Bachelor of Business Administration in Marketing and has over 10 years of experience helping leading B2B brands drive traffic and increase conversions. Having taught more than 1,000 entrepreneurs the art of persuasive copywriting, Dana brings unique insight into what resonates with audiences and delivers results.