As streaming services become the norm, users want one thing: a better quality of experience (QoE). For these services and OTT providers, failing to deliver a high-quality experience can be the difference between success and failure. Today, though, service providers need to use their data to make fractional improvements rather than chase massive shifts.

So how do they go about improving the experience? The answer is fairly simple – data. Data is now available in huge volumes, giving service providers insight into what their audience is watching, how many times content is viewed and how it is streamed. It’s all very well having access to this data, but if it isn’t understood, it is effectively useless.

So how can data be used and understood for a better quality of experience?

Investing in the right tools

In video and OTT, we have seen significant growth in tools and platforms promising to analyse the mass of data that an OTT service generates and to provide QoE insights that help the service provider run their business. The vast majority of these tools are client-based: they sit inside a device or application and summarise the data into a dashboard, which gives the service provider a snapshot of device performance.

Testing accuracy

But how do they work? In the main, they make assumptions based on pre-configured parameters: if A happens and you see error code XYZ, then this is the likely outcome. In many cases this will be correct, but it still assumes that many other elements are configured and behaving correctly. The answer these tools provide may often be the right one, but it isn’t always, and as a service provider how can you be sure that what you see on the dashboard is an accurate reflection of your network? And even if the dashboard is accurate, do you know where the real issues are and how to fix them?
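To make the idea concrete, here is a minimal sketch of this kind of rule-based classification. The error codes, thresholds and diagnoses are invented for illustration and are not taken from any particular monitoring product:

```python
# Hypothetical rule-based fault classifier, illustrating how a client-side
# monitoring tool might map pre-configured conditions to a "likely outcome".
# All error codes, thresholds and labels here are assumptions for illustration.

RULES = [
    # (condition, diagnosis shown on the dashboard)
    (lambda e: e["error_code"] == "XYZ" and e["buffering_ratio"] > 0.05,
     "CDN congestion (likely)"),
    (lambda e: e["error_code"] == "TIMEOUT" and e["startup_time_ms"] > 8000,
     "Slow manifest fetch (likely)"),
]

def classify(event: dict) -> str:
    """Return the first matching pre-configured diagnosis, or 'unknown'."""
    for condition, diagnosis in RULES:
        if condition(event):
            return diagnosis
    return "unknown"

# The diagnosis is only as good as the rules: an event that matches no rule,
# or matches a rule for the wrong reason, still surfaces as a tidy answer
# (or silently as 'unknown') on the dashboard.
print(classify({"error_code": "XYZ", "buffering_ratio": 0.08, "startup_time_ms": 2000}))
```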

By gathering all the raw data logs and using a unique identifier such as an IP address or unique session ID (complying with GDPR, of course), the end-to-end customer journey can be mapped. This process identifies issues the monitoring tools miss (they are usually configured to look only for certain fault conditions), validates that the faults being reported are correct – for example, is an exit before video start really an exit before video start? – and enables fault finding. If an error is occurring, where is it occurring? Is it a timeout in the app, or are all the reported errors coming from devices that are not officially supported?
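As a minimal sketch of that journey mapping, the snippet below groups raw log events by session ID and cross-checks two of the questions above. The field names, event types and "unsupported device" list are assumptions made for illustration only:

```python
# Sketch: rebuild the end-to-end journey from raw logs keyed on a session ID,
# then validate what the dashboard reports. Field names and values are assumed.
from collections import defaultdict

raw_logs = [
    {"session_id": "s1", "ts": 1, "event": "play_requested", "device": "model-A"},
    {"session_id": "s1", "ts": 2, "event": "exit",           "device": "model-A"},
    {"session_id": "s2", "ts": 1, "event": "play_requested", "device": "model-B"},
    {"session_id": "s2", "ts": 2, "event": "first_frame",    "device": "model-B"},
    {"session_id": "s2", "ts": 3, "event": "error_timeout",  "device": "model-B"},
]
UNSUPPORTED_DEVICES = {"model-A"}  # hypothetical list of unsupported devices

# 1. Group every raw event by session to rebuild each customer's journey.
journeys = defaultdict(list)
for entry in sorted(raw_logs, key=lambda e: (e["session_id"], e["ts"])):
    journeys[entry["session_id"]].append(entry)

# 2. Validate the reported faults: was the exit really before any frame was
#    rendered, and are the errors coming from officially supported devices?
for sid, events in journeys.items():
    names = [e["event"] for e in events]
    exit_before_start = "exit" in names and "first_frame" not in names
    unsupported = any(e["device"] in UNSUPPORTED_DEVICES for e in events)
    print(sid, {"exit_before_video_start": exit_before_start,
                "unsupported_device": unsupported,
                "errors": [n for n in names if n.startswith("error_")]})
```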

Chasing fractions

Today, a service provider is rarely chasing massive operational improvements. The gains are incremental, and a percentage point here and there can have a significant impact on business performance. If improving the quality of experience reduces churn by just 1%, the bottom line benefits through a reduced cost of acquisition and increased revenue from retained subscribers. Not only that, but staff can focus on developing new features and services instead of feeling like they are constantly firefighting.
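As a rough, back-of-envelope illustration of why a single percentage point matters, consider the calculation below. The subscriber base, ARPU and cost-per-acquisition figures are assumed for illustration, not taken from the article:

```python
# Back-of-envelope impact of a one-point reduction in annual churn.
# Every number here is an assumed, illustrative figure.
subscribers = 1_000_000
monthly_arpu = 10.0          # assumed revenue per subscriber per month
cost_per_acquisition = 40.0  # assumed cost to replace a churned subscriber
churn_reduction = 0.01       # annual churn reduced by one percentage point

retained = subscribers * churn_reduction             # 10,000 subscribers kept per year
saved_acquisition = retained * cost_per_acquisition  # spend avoided replacing them
retained_revenue = retained * monthly_arpu * 12      # revenue kept, assuming they stay the year

print(f"Subscribers retained per year: {retained:,.0f}")
print(f"Acquisition spend avoided: ${saved_acquisition:,.0f}")
print(f"Annual revenue retained: ${retained_revenue:,.0f}")
```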

In summary, monitoring platforms are essential to the business operation. However, they are only as good as the data they receive, and they do not diagnose the exact nature of the issues. Any OTT service provider should consider how they validate their data to guarantee trust in their dashboards, and how they can use that end-to-end picture to diagnose and fix any issues that are damaging their subscribers’ quality of experience.

Our CTO, Chris Wood, spoke on a panel with Ian Munford, Director of Product Marketing at Akamai, about how to better understand your data to improve quality of experience. Watch the full presentation here.
