WiFi Testing with Ixia IoT

As previously discussed, RF testing has always been a major unknown to some extent because environmental factors make it extremely difficult to tell exactly what is going on with the system. It probably doesn't need to be said, but previous reviews and any controversy regarding RF quality have always led to a ring of confusion and back-and-forth with no clear-cut answers, at least in the public domain. The Transformer Prime and Pixel C reception issues were both cases where I saw a great deal of confusion over whether a problem really existed in the hardware, the software, or with the end user.

Most people don't really have any understanding of how wireless transmission works, probably because it isn't something you can see; as far as I know, no one is capable of seeing radio waves, even at high frequencies like 60 GHz. The problem is that for quite some time our testing was also not really suited to evaluating the quality of an RF implementation. While iPerf does provide some useful data, free space testing means that we're dealing with channel conditions that inherently cannot be controlled. As a result, the only sensible test we could do with iPerf was to focus on maximum throughput under the best conditions we could provide. In most cases this only highlights the upper bound of WiFi efficiency under the carrier sense multiple access scheme, and it rarely detects a whole class of problems that affect the user experience on WiFi.
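
For those curious how that kind of best-case throughput number is gathered, below is a minimal sketch using iperf3's JSON output; the server address and test duration are placeholders for illustration, not our actual test parameters, and it assumes an iperf3 server is already running on the local network.

```python
import json
import subprocess

# Minimal sketch of a free-space peak-throughput check with iperf3.
# SERVER and DURATION_S are assumed placeholders, not the actual test setup.
SERVER = "192.168.1.2"
DURATION_S = 30

def run_iperf3(server: str, duration: int) -> float:
    """Run a TCP throughput test and return the received rate in Mbps."""
    result = subprocess.run(
        ["iperf3", "-c", server, "-t", str(duration), "-J"],
        capture_output=True, text=True, check=True,
    )
    report = json.loads(result.stdout)
    # "end.sum_received.bits_per_second" is the receiver-side average for TCP runs.
    return report["end"]["sum_received"]["bits_per_second"] / 1e6

if __name__ == "__main__":
    print(f"Best-case throughput: {run_iperf3(SERVER, DURATION_S):.1f} Mbps")
```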

In order to test these things we've moved to a proper testing system that is actually used by at least a few OEMs today, namely Ixia IoT. While we have discussed the possibilities for testing, at this time the RF isolation chamber in use limits us to AP simulation only, so we can't properly simulate clients in the channel without restricting both the AP and the client to a single spatial stream. A test set up in this manner wouldn't be very useful, as most of the devices we're testing today support two spatial streams, and many routers now have three or even four.

The first set of results worth talking about is rate vs range. This is a fairly simple test at a conceptual level: it tries to see how well a device can maintain its performance in the face of a falling signal to noise ratio for a given modulation and coding scheme, which makes it a good high-level check of how well a device can hold a connection as reception degrades. In this test the HTC 10 had an initial RSSI of -28 dBm, while the GS7 was at -21 dBm and the iPhone 6s at -22 dBm, which allows us to calculate the path loss and determine the RSSI as a function of the transmit power.
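
To make that arithmetic concrete, the sketch below works through the path-loss bookkeeping using the initial RSSIs quoted above. The chamber's configured transmit power is not stated in the text, so the 0 dBm figure is purely an assumed placeholder.

```python
# Path-loss bookkeeping behind a rate-vs-range sweep: once the loss between
# the simulated AP and the device is known, the expected RSSI at any transmit
# power step follows directly. The initial RSSIs are the figures quoted above;
# the chamber's transmit power is NOT given in the article, so the 0 dBm value
# is an assumed placeholder for illustration.
ASSUMED_TX_POWER_DBM = 0.0

initial_rssi_dbm = {"HTC 10": -28.0, "Galaxy S7": -21.0, "iPhone 6s": -22.0}

# Path loss (dB) = transmit power (dBm) - received signal strength (dBm)
path_loss_db = {dev: ASSUMED_TX_POWER_DBM - rssi for dev, rssi in initial_rssi_dbm.items()}

def rssi_at(device: str, tx_power_dbm: float) -> float:
    """Expected RSSI at a given transmit power, assuming the path loss stays fixed."""
    return tx_power_dbm - path_loss_db[device]

for tx in (0, -10, -20, -30):  # example attenuation steps, not the actual test plan
    print(tx, {dev: rssi_at(dev, tx) for dev in path_loss_db})
```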

The results of this test are interesting to say the least. Right off the bat, every device measured a different RSSI, which means each saw a different level of path loss. The HTC 10 seemed to have the most path loss, while the Galaxy S7 and iPhone 6s were functionally identical. However, RSSI looks to be an insufficient metric here, because while the iPhone 6s was able to reach maximum throughput using NSS 2 MCS 8, the HTC 10 and Galaxy S7 did their best at NSS 2 MCS 4 or 5. I suspect this may simply come down to placement, as device positioning strongly affects MIMO; receive-side spatial correlation reduces the gains that MIMO can provide. Regardless, the HTC 10 somehow manages to beat the Galaxy S7 through much of the curve, but for some reason suffers from a reduction in throughput at higher transmit power. It's worth mentioning that this test doesn't allow for measuring antenna gain or similar characteristics. Given various levels of futzing about with the device positioning in the test chamber, I'm fairly confident that the Galaxy S7 is consistently better with regard to path loss, so even if it doesn't perform as well at a given RSSI, it tends to see an RSSI about 5 dB higher than the HTC 10, which is fairly significant.
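
For context on what those MCS indices imply, here is a rough PHY-rate calculation. It assumes 802.11ac at 80 MHz with a short guard interval; the channel width and guard interval are assumptions on my part, since they aren't stated above.

```python
# Rough PHY-rate arithmetic to put the MCS indices above in context, assuming
# 802.11ac (VHT) at 80 MHz with a short guard interval (an assumption, not a
# figure from the article). Rate = data subcarriers * bits per subcarrier *
# coding rate * spatial streams / symbol duration.
DATA_SUBCARRIERS_80MHZ = 234
SYMBOL_DURATION_US = 3.6  # short guard interval

# (modulation bits per subcarrier, coding rate) for the VHT MCS indices mentioned above
MCS_PARAMS = {4: (4, 3 / 4), 5: (6, 2 / 3), 8: (8, 3 / 4)}

def phy_rate_mbps(mcs: int, nss: int) -> float:
    """Peak PHY rate in Mbps for the given MCS and spatial stream count."""
    bits, coding = MCS_PARAMS[mcs]
    return DATA_SUBCARRIERS_80MHZ * bits * coding * nss / SYMBOL_DURATION_US

for mcs in (4, 5, 8):
    print(f"NSS 2 MCS {mcs}: ~{phy_rate_mbps(mcs, nss=2):.0f} Mbps PHY rate")
```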

Finally, the other test that we can run at this time is the roaming latency test, which measures how well a device can hop from one access point to another as the received signal power rises and falls. If you ever rely on WiFi to work as you walk around any building larger than a single apartment unit, you're going to feel the effects of high roaming latency: VoIP calls or any other real-time network application will either be interrupted or drop altogether if roaming is not implemented properly.

WiFi Roam Latency

In the case of the Galaxy S7, roaming latency is honestly rather wanting. In the best case the Galaxy S7's roaming latency appears to be acceptable, but it's still significantly worse than the best we've seen so far. It seems that Samsung's algorithms have issues with edge cases, as despite consistent positioning and an identical test setup I've seen multiple instances where the device just can't handle roaming consistently. Even with the simple case of a ramp from 10 dBm to -45 dBm at 3 dB per second, I've encountered weirdness where the device drops from the network altogether claiming that the password given was incorrect (it wasn't), or makes a few successful handovers before getting stuck on a single access point or dropping from the network entirely. Even in the best set of trials I still saw 3 of 64 fail to roam correctly. The performance is certainly far better than something like the Google Pixel C's, but Samsung really should be focusing on improving here.
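
To picture the shape of that sweep, the sketch below walks through the same 10 dBm to -45 dBm ramp at 3 dB per second. The roam-trigger level is a hypothetical client-side threshold of my own choosing, and none of this reflects the actual Ixia IoT API; it only illustrates how little time a slow roaming algorithm has to act.

```python
# Illustrative sketch of the roaming sweep described above: the serving AP's
# transmit power ramps from +10 dBm down to -45 dBm at 3 dB per second.
# ASSUMED_TRIGGER_DBM is a hypothetical level (not from the article or Ixia)
# used only to show how little time remains to complete a handover before the
# bottom of the ramp; real clients key off measured RSSI, not AP TX power.
START_DBM, STOP_DBM, STEP_DB_PER_S = 10, -45, 3
ASSUMED_TRIGGER_DBM = -30  # hypothetical roam-decision point, for illustration

def ramp():
    """Yield (time in seconds, AP transmit power in dBm) for one sweep."""
    power, t = START_DBM, 0
    while power >= STOP_DBM:
        yield t, power
        power -= STEP_DB_PER_S
        t += 1

sweep = list(ramp())
total_s = sweep[-1][0]
trigger_s = next(t for t, p in sweep if p <= ASSUMED_TRIGGER_DBM)
print(f"Full sweep takes ~{total_s} s; the assumed trigger is crossed at t = {trigger_s} s,")
print(f"leaving roughly {total_s - trigger_s} s to complete the handover.")
```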

266 Comments

  • realbabilu - Thursday, July 7, 2016 - link

    If we're buying for performance, the difference in lag or app speed is quite small compared to a phone from one or two years ago (like the OnePlus One with its Snapdragon 801); you can't feel a big difference outside of benchmarking apps.
    The photos can tell a different story; there you can tell whether it's better or worse than last year's phone, along with other features like OIS.
  • ntp - Thursday, July 7, 2016 - link

    That's a very thoughtful reply, Impulses, thanks. But with the small sensors of smartphones I think 3/4ths of a stop is a significant advantage, more so than in the case of large sensor cameras, since we'd care about the F number in low light scenarios, where the ISO will be high. I'm just saying it should be better emphasized so people understand the real world advantages it gives.
  • Impulses - Thursday, July 7, 2016 - link

    It's a valid point, I'm just saying you can't look at that in a vacuum, especially since you're not looking at an ILC anyway... If you can't swap any parts, then the end result is all that really matters, and that includes sensor efficiency, post processing (unless you're shooting RAW, a rarity among phone users), the presence of OIS, the effectiveness of the latter, and even things like how smart the auto mode is...

    That last bit is probably beyond AT's more data-driven evaluation, but a phone that relies too heavily on OIS (or HDR), for instance, might take more blurry shots under real life conditions... That actually does favor a faster aperture, but the point is that emphasizing specs in a vacuum is pointless.
  • mavsaurabh - Tuesday, July 12, 2016 - link

    I just wish to reply regarding aperture. You rightly said that a low f-number causes cost to skyrocket in the case of SLR lenses, all else being equal; the catchphrase is "all else being equal"! The constraints of mobile photography, lens stack size and weight limitations, heat produced, etc., lead to various compromises like plastic lenses, which coupled with bigger apertures lead to more corner aberration, diffraction, and so on. In the end, as Impulses wrote, what matters is the end result, which is a fine-tuned balance of the various compromises made!
    I am a pixel peeper and a street and landscape photographer by hobby, with 25 years of film and digital shooting through SLRs, mobiles, compacts, and the Fuji X100.
    My observation is that Samsung uses hard sharpening and oversaturated colors, which "creates" pleasing photos on a phone screen, but if you display them on a decent monitor and zoom to even 50% you will see various artifacts and no latitude for post processing. Most casual photographers will like the larger-than-life portrayal or the smearing of face pimples through clever use of face detection, but anyone who loves photography will differ!
    I completely agree about the advantages of fast focusing, but honestly I have yet to use a mobile camera with a lens fast enough to freeze pet/child movement in indoor light and take advantage of that fast focusing.
    The only phone that was able to do that (with the use of a proper flash, though) is the Nokia 808 PureView, and kudos to its amazing manual controls plus superb post processing, which bettered Apple even in natural-looking post processing!
  • beggerking@yahoo.com - Wednesday, July 6, 2016 - link

    Please just stop with those BS Apple-biased benchmarks at 10x lower resolution... just take them off the chart! It's not even a good comparison and serves no use.
  • realbabilu - Thursday, July 7, 2016 - link

    What you see on your mobile screen is what you get. Offscreen just measures what the GPU can do; it's basically useless for the user because you can't see it.
  • lilmoe - Wednesday, July 6, 2016 - link

    "The one notable shortfall here is that Samsung only allows 800 ISO max in manual ISO mode when the true maximum is 1250"

    I had that number in mind when I read it last night, and was too lazy to test. I've tested it now and my unit can go up to 1600 ISO. Is that also a variable difference in Samsung's sensor (mine is Samsung made), or is the extra third of a stop on mine an extension?
  • Chris_m1296 - Wednesday, July 6, 2016 - link

    Joshua Ho, how did the Exynos 8890 manage this score on Sling Shot ES 3.1 Unlimited? Mine only got 2223, and even 3DMark themselves list the Exynos version at 2223.
  • UtilityMax - Wednesday, July 6, 2016 - link

    Some people complain that the review is too harsh. But my personal view is that if this is a $650+ smartphone that _also_ happens to be carrier locked, it had better be not just good, but _excellent_ in every respect. Otherwise, it's not clear what exactly justifies the price premium over a phone like the OnePlus 3, or why a typical shopper should choose this over an Apple product.
  • Impulses - Thursday, July 7, 2016 - link

    I kinda agree... I still feel some areas could've been better tested given how long the review got dragged out, but there was still content here that's pretty unique to AT. I think the market, overall, is definitely giving the high-end OEMs too much of a pass given the prices phones are now commanding.

    A $1,000+ laptop with performance-sapping bloat that the user can't remove (and that isn't part of the core OS) would get ripped to shreds. It's time $700+ phones were held to the same standards.
