TESTING METHODOLOGY

The methodology of the P3 connect Mobile Review is the result of more than 15 years of testing mobile networks. Today, network tests are conducted in more than 80 countries. They were carefully designed to evaluate and objectively compare the performance and service quality of mobile networks from the users’ perspective.

The P3 connect Mobile Review for Singapore includes the results of an extensive data drivetest as well as a sophisticated crowdsourcing approach.


DRIVETESTS

The drivetest was conducted on 12 measurement days from May 2nd to May 16th, 2018. All samples were collected between 8am and 10pm. Two drivetest cars drove a total of 2,100 kilometres, covering 99.5 per cent of Singapore's total population.

Each of P3's drivetest cars was equipped with arrays of Samsung Galaxy S8 smartphones. These "Cat 9" phones are capable of supporting 300 Mbit/s download and 50 Mbit/s upload speeds. All data measurements were done in 4G preferred mode. Each car carried one smartphone per operator.


DATA TESTING

For the web tests, the test smartphones accessed web pages according to the widely recognised Alexa ranking. In addition, the static Kepler test web page as specified by ETSI (European Telecommunications Standards Institute) was used.

In order to test the data service performance, files of 3 MB and 1 MB for download and upload were transferred from or to a test server located on the Internet. In addition, the peak data performance was tested in uplink and downlink directions by assessing the amount of data transferred within a seven-second period.
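The peak figure is essentially the transferred volume divided by the window length. A minimal sketch of that arithmetic (the function name and sample values are ours, not P3's tooling):

```python
# Hypothetical sketch: convert bytes transferred in a fixed test
# window into a throughput figure in Mbit/s. The default window of
# seven seconds matches the test described above.

def peak_throughput_mbits(bytes_transferred: int, window_s: float = 7.0) -> float:
    """Return throughput in Mbit/s for one transfer window."""
    bits = bytes_transferred * 8
    return bits / window_s / 1_000_000

# e.g. 131.25 MB moved within the 7-second window
print(peak_throughput_mbits(131_250_000))  # 150.0
```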

Another discipline was the playback of YouTube videos. It took into account that YouTube dynamically adapts the video resolution to the available bandwidth. So, in addition to success ratios, start times and playouts without interruptions, YouTube measurements also determined average video resolution. All tests were conducted with the best performing mobile plan available from each operator.


RANKING AND GRADING

As this Mobile Review is intended just as an indication of network performance and quality, we abstained from ranking or grading the operators based on the total number of points achieved. However, the published percentages give a good indication of their achievements.


CROWDSOURCING

As an addition to the drivetests, P3 conducted thorough crowd-based analyses of the Singapore networks. These analyses are based on crowd data gathered in two periods of three months each. P3 decided to follow this approach because our crowdsourcing metrics are based on three-month periods. As the drivetests were conducted in May 2018, we wanted to factor in crowdsourcing data from this period as well, and thus included the months from April to June 2018. On the other hand, we wanted to present data as current as possible, which is why we also included the period from July to September 2018. Considering two crowdsourcing periods of three months each also allows us to look at trends and developments between the two compared periods.

For the collection of crowd data, P3 has integrated a background diagnosis process into more than 800 diverse Android apps. If one of these applications is installed on an end-user's phone, data collection takes place on this device 24/7, 365 days a year. Reports are generated every 15 minutes and sent daily to P3's cloud servers. Such reports comprise just a small number of bytes per message and do not include any personal user data. Interested parties can deliberately take part in the data gathering with the dedicated "U get" app (see box below).

Other crowdsourcing solutions have a very technical user base. Thus, their results are typically skewed towards high-end, heavy data users. With the integration into more than 800 diverse apps covering different market segments, P3 has generated data which is a fair and equal representation as opposed to that of classical speed test apps. The unique crowdsourcing technology allows P3 to collect data about real-world customer experience in a truly passive way – wherever and whenever customers use their smartphones.

P3's crowdsourcing data set is the most realistic, since it is the most diverse currently available in the market in terms of locations, geography, times, devices, subscriptions, networks, technologies and smartphone usage patterns. P3 applies advanced big data analytics to distill the essence of information from the bulk data. By analysing data according to predefined metrics, P3 can provide information for the optimisation of networks and also show whether networks live up to the expectations of their customers.


RATING OF NETWORK COVERAGE

For the assessment of network coverage, P3 lays a grid of 2 by 2 kilometres over the whole test area. The so-called evaluation areas generated this way are then sub-divided into 16 smaller tiles. In order to ensure statistically relevant statements, P3 requires a certain number of users and measurement values per operator for each tile and each evaluation area. If one of the considered operators does not meet these thresholds, this part of the map is excluded from the assessment to ensure fair terms.
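The fairness rule above can be sketched as a simple per-tile check. The threshold values below are assumptions for illustration; P3 does not publish its exact parameters:

```python
# Illustrative sketch of the tile-validity rule: a tile only enters
# the assessment if every operator meets minimum user and sample
# thresholds there. MIN_USERS and MIN_SAMPLES are assumed values.

MIN_USERS = 5      # assumed minimum users per operator and tile
MIN_SAMPLES = 50   # assumed minimum measurement values per operator and tile

def tile_is_valid(per_operator_stats: dict) -> bool:
    """per_operator_stats maps operator name -> (users, samples) for one tile."""
    return all(
        users >= MIN_USERS and samples >= MIN_SAMPLES
        for users, samples in per_operator_stats.values()
    )

tile = {"Op A": (12, 340), "Op B": (8, 120), "Op C": (3, 90)}
print(tile_is_valid(tile))  # False: Op C has too few users, so the tile is excluded
```

Excluding the tile for all operators whenever one falls short is what keeps the comparison on equal footing.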

Even more relevant results are accomplished by not only determining mere network coverage but also considering its quality. The parameter Quality of Coverage reveals whether voice and data services actually work in the respective evaluation area. P3 does this because mobile services cannot actually be used in every area that nominally provides network reception. For this reason, the percentage for Quality of Coverage is always a little lower than the corresponding coverage value. We specify these values for the coverage of voice services (2G, 3G and 4G combined), data (3G and 4G combined) and 4G only.


ASSESSMENT OF DATA THROUGHPUTS

Additionally, P3 investigates the data rates that were actually available to each user. For this purpose, we determined the best data rate obtained by each user during the evaluation period and then calculated the average of these values. In addition, we determined the so-called P90 values for the top throughput of each evaluation area as well as for each user's best throughput. P90 values specify the threshold in a statistical distribution below which 90 per cent of the gathered values lie – or above which 10 per cent of the values are situated. These values depict how fast the network is under favourable conditions.
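The P90 statistic can be sketched in a few lines. This uses a simple nearest-rank method; P3's exact interpolation scheme is not published, so treat this as one plausible reading:

```python
# Sketch of the P90 value described above: the threshold below which
# roughly 90 per cent of the gathered throughput samples lie
# (nearest-rank method; values in Mbit/s are made up for illustration).

def p90(values: list) -> float:
    """Return the sample at the 90th-percentile position (nearest rank)."""
    ordered = sorted(values)
    idx = max(0, round(0.9 * len(ordered)) - 1)
    return ordered[idx]

best_rates = [12.0, 35.5, 48.2, 50.1, 61.7, 70.3, 82.4, 95.0, 110.2, 140.6]
print(p90(best_rates))  # 110.2 – what the network delivers under favourable conditions
```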


DATA SERVICE AVAILABILITY

Another performance indicator considered in the crowd results is the Data Service Availability. This parameter indicates the availability of a network and the number of outages or service degradations respectively.

In order to differentiate network glitches from normal variations in network coverage, we apply a precise definition of service degradation: A degradation is an event where data connectivity is impacted by a number of cases which significantly exceeds the expectation level. To judge whether an hour of interest is an hour with degraded service, the algorithm looks at a sliding window of 168 hours (one week) before the hour of interest. This ensures that we only consider actual network service degradations, differentiating them from a simple loss of network coverage of the respective smartphone due to prolonged indoor stays or similar reasons.
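The sliding-window idea can be sketched as follows. The concrete "significantly exceeds" rule used here (mean plus three standard deviations over the preceding week) is our assumption, not P3's published formula:

```python
# Hedged sketch of sliding-window degradation detection: compare the
# failure count in the hour of interest against the distribution over
# the preceding 168 hours. The 3-sigma threshold is an assumption.

from statistics import mean, stdev

WINDOW = 168  # hours, i.e. one week

def is_degraded(hourly_failures: list, hour: int) -> bool:
    """True if the failure count at `hour` far exceeds the prior week's level."""
    history = hourly_failures[hour - WINDOW : hour]
    if len(history) < WINDOW:
        return False  # not enough history to judge this hour
    threshold = mean(history) + 3 * stdev(history)
    return hourly_failures[hour] > threshold

baseline = [2, 3, 2, 4] * 42            # 168 hours of normal behaviour
print(is_degraded(baseline + [40], WINDOW))  # True: 40 failures is far above baseline
```

Because the threshold adapts to each network's own recent history, a phone that merely sits indoors for hours does not register as a network-side event.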

In order to ensure the statistical relevance of this approach, a valid assessment month must fulfil clearly designated prerequisites: A valid assessment hour comprises a predefined number of samples per hour and per operator; the exact number depends on factors like market size and number of operators. A valid assessment month must include at least 90 per cent valid assessment hours (again per month and per operator).
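The validity rule reduces to a simple proportion check. The per-hour sample minimum below is a placeholder, since the text says the real figure varies with market size and operator count:

```python
# Sketch of the valid-assessment-month rule: at least 90 per cent of
# the month's hours must be valid assessment hours. The sample
# threshold is an assumed placeholder, not P3's actual figure.

MIN_SAMPLES_PER_HOUR = 100  # assumed; depends on market size and operators

def month_is_valid(samples_per_hour: list) -> bool:
    """samples_per_hour holds one sample count per hour of the month."""
    valid_hours = sum(1 for n in samples_per_hour if n >= MIN_SAMPLES_PER_HOUR)
    return valid_hours >= 0.9 * len(samples_per_hour)

# 720 hours in a 30-day month; 680 of them meet the sample threshold
month = [150] * 680 + [20] * 40
print(month_is_valid(month))  # True: 680/720 ≈ 94% valid hours
```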

Boxes were mounted into the rear and side windows of each measurement car supporting the smartphones used for the drivetests.

All test phones used in the drivetests were operated and supervised by P3‘s unique control system.



Participate in our crowdsourcing

Anyone interested in becoming part of our global crowdsourcing panel and obtaining insights into the reliability of the mobile network that their smartphone is logged into can most easily participate by installing and using the "U get" app. This app concentrates exclusively on network analysis and is available at uget-app.com.

"U get" checks and visualises the current mobile network performance and contributes the results to our crowdsourcing platform. Join the global community of users who understand their personal wireless performance, while contributing to the world's most comprehensive picture of the mobile customer experience.
