Public Safety LTE MVNO Assessment –
Puzzling with Analytics?
Agile, multi-source data analysis for continuous LTE MVNO quality assessment: demand the necessary operator data to create broadband services that meet demanding public safety user requirements.
Let’s dive into the study of the Five Technology Evolution Steps made by Suomen Virveverkko Oy and awarded in London on 5th of March 2015. It suggests, as a first step, entering a lightweight LTE Mobile Virtual Network Operator (MVNO) partnership. One example is national roaming, which enables public safety users to use every operator’s network in the country. And it works well, or does it? Yes, if you can live with technical insufficiencies: consumer-grade service quality without mission-critical traffic prioritization and pre-emption, no real obligation on any party to perform to high standards, no group calls across different MVNO operators’ radio networks, and ultimately the inconvenience of selecting the dominant operator. It is, for sure, the quickest and easiest shortcut into the world of new broadband service opportunities.
For the second and third steps of tighter LTE MVNO integration, one first needs to understand the competitive position of each public operator in the country. Why not just pick the obvious partner? In my opinion there are two reasons: first, to build a resilient partnership, and second, to form an initial understanding of the delta investments required to fulfill demanding public safety user service requirements. As we know, not everyone is starting from the same line. In reality, we are evaluating a legally binding partnership, not just living with a loose relationship such as national roaming. So what do we want to evaluate? Too many things to list here, but first and foremost the relative quality position of each operator, determined by factors such as geographical coverage, population coverage, and planned versus actual service quality, because these measure a partner’s ability to deliver on its promise. As we all know, this is a snapshot in time; the telecom business changes dynamically and we need to be prepared to ride the wave. What is good enough quality? It is determined by public safety user requirements for the plethora of new LTE-enabled applications, and naturally by a benchmark against the TETRA service quality experience.
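To make the idea of a relative quality position concrete, here is a minimal sketch of ranking candidate host operators by a weighted composite of the factors mentioned above. All operator names, figures and weights are invented for illustration; in a real assessment they would come from the collected data and the public safety requirement set.

```python
# Hypothetical composite scoring of candidate MVNO host operators.
# Weights and metric values below are assumptions, not real data.

FACTORS = {
    "geographic_coverage": 0.4,   # share of land area covered
    "population_coverage": 0.3,   # share of population covered
    "plan_vs_actual": 0.3,        # delivered vs. planned quality (promise-keeping)
}

operators = {
    "Operator A": {"geographic_coverage": 0.95, "population_coverage": 0.99, "plan_vs_actual": 0.90},
    "Operator B": {"geographic_coverage": 0.85, "population_coverage": 0.97, "plan_vs_actual": 0.95},
    "Operator C": {"geographic_coverage": 0.70, "population_coverage": 0.93, "plan_vs_actual": 0.98},
}

def composite_score(metrics: dict) -> float:
    """Weighted sum of normalized (0..1) quality factors."""
    return sum(FACTORS[name] * metrics[name] for name in FACTORS)

# Rank operators by composite score, best first.
ranking = sorted(operators.items(), key=lambda kv: composite_score(kv[1]), reverse=True)
for name, metrics in ranking:
    print(f"{name}: {composite_score(metrics):.3f}")
```

The weights themselves are a policy decision: a public safety operator that prioritizes rural incident response would weight geographic coverage more heavily than a consumer operator ever would.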
Traditionally, service quality is benchmarked with drive testing, whereby measurement teams drive a preplanned test route using applicable test equipment operated by skilled engineers. It is widely used because of its well-accepted methodology: it provides the facts that decision-makers are looking for. In practice, we generate test samples emulating customer perception for each location at a given time, then analyse them to draw conclusions on service quality in a wider area. Now here’s the ugly part. Can we really draw operator service quality conclusions from test samples collected once or twice per annum? It’s a compromise, the best that has existed so far, and at least it is statistically well-defined, the kind of exact science that we engineers love.
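A quick back-of-the-envelope calculation shows what sparse sampling actually buys us. Using the standard normal approximation, a 95% confidence interval on an observed success rate p from n samples has half-width 1.96 * sqrt(p(1-p)/n); the numbers below are illustrative, not from any real campaign.

```python
# Sketch: precision of a drive-test success-rate estimate vs. sample count.
# Uses the normal-approximation confidence interval; inputs are illustrative.
import math

def ci_half_width(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of an approximate 95% confidence interval for rate p over n samples."""
    return z * math.sqrt(p * (1.0 - p) / n)

for n in (50, 500, 5000, 50000):
    hw = ci_half_width(0.95, n)
    print(f"n={n:>6}: success rate known to within +/- {hw:.3%}")
```

With 50 samples in an area, a 95% observed success rate is only pinned down to roughly +/-6 percentage points; continuous data collection, with orders of magnitude more samples, shrinks that uncertainty dramatically. That is the statistical argument for the data-hungry approach below.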
What, then, is the right approach? With my predictive analytics background, I want to repeat a nice saying: ‘The more DATA, the better RESULTS’. In my opinion, we need more data in order to accomplish the mission of comprehensively comparing service quality. Luckily, we are not a lone boat floating in a big wide ocean: the world is full of various strands of data, public or private. However, no single data source provides the appropriate answer right away.
The richest data sources are crowdsourced measurement data (OpenSignal, Ookla Speedtest, etc.), selected operator OSS service quality KPIs, and user-driven network design coverage simulations. What is critical here is the mix and match between so-called customer-perceived and network-perceived data, which in practice depends on whether a data source is externally available or operator in-house. For example, drive test data gives precise samples of individual locations at given times, whereas crowdsourcing delivers much more frequent samples at a generic level. Luckily, they complement each other. One has to form an interpretation using analytics, predictively rather than reactively, to draw the full picture of both history and tomorrow. It would be dangerous and risky business to draw conclusions from single-source sample data, insufficient analysis, or an old-school backwards-leaning approach. For us engineers, analysing such data enters the world of non-exact science based on probabilities, which will not paint the most perfect picture, but the best approximation. Trust me: this is better than a few samples here and there twice per annum.
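The mix-and-match idea can be sketched as a simple fusion step: per map grid cell, combine quality estimates from sources of different trust and freshness into one estimate. The source names, weights and readings below are invented for the sketch; real weights would be tuned against ground truth such as drive-test data.

```python
# Sketch of multi-source quality fusion per map grid cell.
# All observations and per-source confidence weights are assumptions.
from collections import defaultdict

# (grid_cell, source, quality score in 0..1)
observations = [
    ("cell_17", "drive_test",  0.92),  # precise but rare
    ("cell_17", "oss_kpi",     0.88),  # network-perceived view
    ("cell_17", "crowdsource", 0.81),  # noisy but frequent
    ("cell_42", "crowdsource", 0.64),
    ("cell_42", "simulation",  0.75),  # planned coverage, not measured
]

# Confidence weight per source type (assumed; would be tuned from data).
weights = {"drive_test": 3.0, "oss_kpi": 2.0, "crowdsource": 1.0, "simulation": 0.5}

def fuse(obs):
    """Confidence-weighted mean of all sources reporting on each grid cell."""
    acc = defaultdict(lambda: [0.0, 0.0])  # cell -> [weighted score sum, weight sum]
    for cell, source, score in obs:
        w = weights[source]
        acc[cell][0] += w * score
        acc[cell][1] += w
    return {cell: s / w for cell, (s, w) in acc.items()}

print(fuse(observations))
```

Note how cell_42, seen only by crowdsourcing and a planning simulation, ends up with both a lower score and a lower total weight than cell_17; in a fuller model that weight sum doubles as an uncertainty indicator, flagging where targeted drive testing is still worth the money.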
We dream of feeding frequent, near real-time, high-quality analysis cost-efficiently into our decision-making. It is crucial to demand data from operators; without data there is no analysis. So, let’s throw all the data into one bucket and analyse everything. What are we waiting for? Let me share another learning from my experience. Do not attempt a BIG BANG for the most perfect solution at once: the exact need is still unclear and may change over time, which inevitably leads to a multi-million, multi-year exercise. Can you afford it? I would rather encourage an agile approach, whereby we flexibly and quickly start from one angle, learn from it, add more data and analysis, and draw new conclusions. It is a journey into a well-managed MVNO partnership through proper quality assessment and continuous, proactive monitoring, to fulfill the demanding public safety requirements of the LTE era. The world cannot be built in a day.
Good luck with your endeavour!
If you’d like to discuss the topic further, call or email me and let’s sit down together!