NEWS

Mobile Data Customer Experience in Nordics & Baltics

Omnitele Benchmark: Nordic and Baltic Operators

The benchmarks in this campaign were conducted in Estonia, Denmark and Latvia in March, April and May 2015, respectively. All leading operators in the target countries were included. Operators have continuously improved their networks since the measurements, so the presented results illustrate a snapshot of the perceived customer experience and network performance within the given time and location context.

[Figure: countries and operators covered]

Capturing the Customer Experience

Traditionally, mobile network benchmarks and drive test campaigns focus heavily on network KPIs such as throughput and coverage. In today's mobile business, however, experienced service quality plays a significant role as a means of competitive differentiation. Most importantly, subscribers want a smooth smartphone application experience.

A significant part of the smartphone customer experience is attributable to the data connection provided by the user’s mobile operator. The performance of these mobile data connections – and the resulting service/application experience – is at the heart of our “be-the-customer” benchmark approach. We have developed the whole process specifically to illustrate typical end-user experience.

The measurements of this customer experience benchmark consisted of country-wide drive test campaigns and stationary hot-spot testing in selected locations.

  • The drive tests focused on assessing data capacity and voice call service quality.
  • The stationary tests covered popular smartphone applications: WWW browsing, Facebook, Twitter, Instagram, YouTube and Dropbox.

Mobile voice call results are omitted from this summary due to the relatively small differences in operators’ performance and the generally high quality levels.

Relatively High Downlink Speeds

Although raw downlink bitrate alone is not sufficient for comparing mobile operators today, it still gives a good indication of network capacity and quality. While FTP file transfer is not a very common smartphone use case, we included the test to gain visibility into baseline network performance. After all, Mbps is a figure many users can relate to their expectations.

The best FTP data download results were achieved in the TDC Denmark network: over 45 Mbps on average across the whole country, with EMT Estonia following close behind at 44 Mbps. The rest of the operators in Latvia, Denmark and Estonia fall in the 20–30 Mbps range, except for Tele2 Latvia, which lags somewhat behind the others at 11 Mbps. The relatively high data rates are mostly attributable to country-wide 4G LTE network deployments in the region.
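As a reminder of what such Mbps figures mean in practice, the sketch below converts a file transfer into an average throughput. The file size and duration are illustrative placeholders, not values from the campaign.

```python
def mbps(bytes_transferred: float, seconds: float) -> float:
    """Average throughput in megabits per second (1 Mbps = 1e6 bit/s)."""
    return bytes_transferred * 8 / seconds / 1e6

# Illustrative example: a 100 MB FTP download completing in 18 seconds
# averages about 44 Mbps -- roughly the level the fastest networks reached.
print(round(mbps(100e6, 18), 1))  # 44.4
```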

[Chart: FTP downlink bitrate]

Notable Differences in Web Browsing Experience

Waiting time, or refresh time (i.e. time-to-content), is a very important factor in the actual customer experience for many popular applications: end-users typically don’t care about bits, bytes and pings, but want the requested content on their mobile screen as fast as possible.

WWW Page Waiting Time represents the time between the user requesting a web page and the content download completing. Tests showed that LMT Latvia obtained the fastest web browsing results, followed closely by TDC Denmark. TDC and LMT performed rather well in the other application tests too, as the following charts indicate.
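The time-to-content idea can be sketched in a few lines: start a timer, perform the full download, and stop the timer when the content has arrived. This is a simplified illustration of the metric, not the campaign's actual tooling; the URL in the comment is a placeholder, and the runnable demonstration uses an offline stub instead of a real network fetch.

```python
import time
import urllib.request


def time_to_content(fetch) -> float:
    """Seconds from request start to content fully downloaded.

    `fetch` is any zero-argument callable that performs the download,
    mirroring the "waiting time" a user perceives.
    """
    start = time.perf_counter()
    fetch()
    return time.perf_counter() - start


# Hypothetical real-world use (URL is illustrative, kept commented out
# so the sketch runs without network access):
# waiting = time_to_content(
#     lambda: urllib.request.urlopen("http://example.com").read())

# Offline demonstration with a stub "download" lasting ~50 ms:
elapsed = time_to_content(lambda: time.sleep(0.05))
print(f"stub download took {elapsed:.3f} s")
```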

[Chart: WWW page waiting time]

Nevertheless, it has to be taken into consideration that the web pages browsed in each country were selected according to country trends, and thus absolute values may not be fully comparable between countries. On the other hand, the figures serve as a good indicator of how long customers typically wait for the most visited web pages in the respective countries.

Mobile Facebook User Experience

Uploading content to social networks is one of the most popular use cases among smartphone users today. In this campaign we included the status update, picture upload and wall refresh use cases of mobile Facebook. The very same tests were executed in all countries. The chart below illustrates operator-specific average times for these test cases.

[Chart: Facebook waiting time]

As expected, the competitive positioning in Facebook response times closely follows the trend seen in the WWW tests. But we find some dissimilarities as well: Bite and Tele2 Latvia have rather long Facebook waiting times despite relatively fast web browsing. TDC Denmark is very strong in the Facebook tests, almost twice as fast as its closest competitor, Telenor.

Whole Experience: Average Application Response Time

In addition to the web browsing and Facebook tests, we also measured the Twitter, Instagram, YouTube and Dropbox services. Averaging the time-to-content results of web browsing, Facebook, Twitter, Instagram and YouTube shows that TDC Denmark offers the fastest response times on average, with LMT Latvia following very close behind.

[Chart: app response time]

Application Response Times – Country Average View

Looking at the results on a per-country basis (that is, averaging over all operators within a given country), the measured mobile markets are performance-wise rather close to each other: the average application response times of all measured countries fall within a half-second margin. Prior to the campaign we anticipated larger performance differences, but the results imply that if such differences exist, they are at least not dramatic.
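The two averaging steps used above (per-operator average over applications, then per-country average over operators) can be sketched as follows. The operator and app names come from the article, but the numbers are invented placeholders for illustration only, not measured results.

```python
# Hypothetical time-to-content figures in seconds, per operator and app.
results = {
    "Denmark": {
        "TDC":     {"www": 2.1, "facebook": 3.0, "youtube": 4.2},
        "Telenor": {"www": 2.4, "facebook": 5.8, "youtube": 4.6},
    },
    "Latvia": {
        "LMT":   {"www": 1.9, "facebook": 4.1, "youtube": 4.8},
        "Tele2": {"www": 2.6, "facebook": 6.2, "youtube": 5.5},
    },
}


def operator_average(apps: dict) -> float:
    """Average application response time for one operator."""
    return sum(apps.values()) / len(apps)


def country_average(operators: dict) -> float:
    """Average over all operators within a given country."""
    return sum(operator_average(a) for a in operators.values()) / len(operators)


for country, operators in results.items():
    print(country, round(country_average(operators), 2))
```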

[Chart: app response time, country average]

In Numbers

For those readers who feel like crunching numbers, all results are summarised in the table below, which you can also download in .xlsx format.

[Table: benchmark data summary]

Independent and Objective

Omnitele conducted the benchmarks independently and at its own cost with an in-house project team. To achieve an independent and objective view, we sourced the SIM cards directly through operator shops – like real subscribers do.

The test routes and indoor/outdoor test locations were selected by Omnitele in a blind-test fashion, preferring locations with a high likelihood of end-user service usage. The test cases were selected and configured according to expected end-user behaviour.

For further information on the methodology and country-specific details, please refer to the respective public summary reports; links are available in the Further Reading section below.

For more information, please contact: