WiFi Testing with Ixia IoT

As previously discussed, RF testing has always been something of an unknown because environmental factors make it extremely difficult to tell exactly what is going on with the system. Previous reviews and any controversy over RF quality have always led to confusion and back-and-forth with no clear-cut answers, at least in the public domain. The Transformer Prime and Pixel C reception issues were both cases where I saw a great deal of confusion over whether a problem really existed in the hardware, the software, or with the end user.

Most people don't really understand how wireless transmission works, probably because it isn't something you can see; no one can see radio waves, even at high frequencies like 60 GHz. The problem is that for quite some time our testing was also poorly suited to judging the quality of an RF implementation. While iPerf does provide some useful data, free-space testing means that we're dealing with channel conditions that inherently cannot be controlled. As a result, the only sensible test we could do with iPerf was to focus on maximum throughput under the best conditions we could provide. In most cases that only highlights the upper bound of WiFi efficiency under the carrier sense multiple access scheme, and it rarely detects a whole class of problems that affect the user experience on WiFi.

In order to test these things we've moved to a proper testing system that is actually used by at least a few OEMs today, namely Ixia IoT. While we have discussed the possibilities for testing, at this time the RF isolation chamber we use limits us to AP simulation only; we can't properly simulate clients in the channel without restricting both the AP and client to a single spatial stream. A test set up that way wouldn't be very useful, as most of the devices we're testing support two spatial streams, and many routers now have three or even four.

The first set of results worth discussing is rate vs. range. This is conceptually a simple test: it measures how well a device can maintain throughput in the face of a decreasing signal-to-noise ratio for a given modulation and coding scheme, which makes it a good high-level test of how well a device holds a connection as reception degrades. In this test the HTC 10 had an initial RSSI of -28 dBm while the GS7 was at -21 dBm and the iPhone 6s at -22 dBm, which allows us to calculate each device's path loss and determine its RSSI as a function of the transmit power.
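Because the chamber provides a controlled link, the arithmetic here is simple: path loss is the difference between the AP's transmit power and the measured RSSI, and RSSI then tracks transmit power linearly. A minimal sketch using the RSSI figures from the text (the initial transmit power of 0 dBm is my assumption, not stated in the article):

```python
def path_loss_db(tx_power_dbm, rssi_dbm):
    """In a free-space chamber, path loss is simply transmit power minus received power."""
    return tx_power_dbm - rssi_dbm

def rssi_at(tx_power_dbm, loss_db):
    """Predicted RSSI at any transmit power once the per-device path loss is known."""
    return tx_power_dbm - loss_db

# Assuming the chamber AP starts at 0 dBm transmit power (an assumption;
# the article does not state the initial Tx power):
initial_tx = 0
loss_htc10 = path_loss_db(initial_tx, -28)   # 28 dB
loss_gs7   = path_loss_db(initial_tx, -21)   # 21 dB

# As transmit power steps down, RSSI tracks it linearly:
print(rssi_at(-20, loss_gs7))  # -41 (dBm)
```

This is why measuring initial RSSI at a known transmit power is enough to characterize each device's position in the chamber for the rest of the sweep.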

The results of this test are interesting to say the least. Right off the bat, every device measured a different RSSI, meaning each had a different level of path loss. The HTC 10 showed the most path loss, while the Galaxy S7 and iPhone 6s were functionally identical. However, RSSI alone is an insufficient metric here: while the iPhone 6s was able to reach maximum throughput at NSS 2 MCS 8, the HTC 10 and Galaxy S7 did their best at NSS 2 MCS 4 or 5. I suspect this may simply be due to placement, as device positioning strongly affects MIMO: receive-side spatial correlation reduces the gains that MIMO can provide. Regardless, the HTC 10 somehow manages to beat the Galaxy S7 through much of the curve, but for some reason suffers a reduction in throughput at higher transmit power. It's worth noting that this setup doesn't allow us to test antenna gain or similar characteristics. Given various levels of futzing about with device positioning in the test chamber, I'm fairly confident that the Galaxy S7 is consistently better with regard to path loss; even if it doesn't perform as well at a given RSSI, it tends to have an RSSI about 5 dB higher than the HTC 10, which is fairly significant.
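For reference, the PHY rates behind those MCS indices follow directly from the 802.11ac rate formula: data subcarriers times bits per subcarrier times coding rate times spatial streams, divided by the symbol duration. The sketch below assumes an 80 MHz channel and short guard interval, which the article does not state:

```python
# 802.11ac (VHT) PHY rate from MCS index, spatial streams, and channel width.
BITS_PER_SYMBOL = {0: 1, 1: 2, 2: 2, 3: 4, 4: 4, 5: 6, 6: 6, 7: 6, 8: 8, 9: 8}
CODING_RATE = {0: 1/2, 1: 1/2, 2: 3/4, 3: 1/2, 4: 3/4,
               5: 2/3, 6: 3/4, 7: 5/6, 8: 3/4, 9: 5/6}
DATA_SUBCARRIERS = {20: 52, 40: 108, 80: 234, 160: 468}

def vht_rate_mbps(mcs, nss, width_mhz=80, short_gi=True):
    """Data rate in Mbps: bits carried per OFDM symbol divided by symbol time (us)."""
    t_sym_us = 3.6 if short_gi else 4.0
    bits = DATA_SUBCARRIERS[width_mhz] * BITS_PER_SYMBOL[mcs] * CODING_RATE[mcs] * nss
    return bits / t_sym_us

print(vht_rate_mbps(8, 2))  # roughly 780 Mbps, the iPhone 6s's peak in this test
print(vht_rate_mbps(4, 2))  # roughly 390 Mbps, about where the HTC 10 and GS7 topped out
```

The gap between MCS 4/5 and MCS 8 at two spatial streams is therefore roughly a factor of two in PHY rate, which is why the placement-driven MCS difference matters so much.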

Finally, the other test that we can run at this time is the roaming latency test, which measures how well a device can hop from one access point to another as received signal strength rises and falls. If you ever rely on WiFi as you walk around any building larger than a single apartment unit, you're going to feel the effects of high roaming latency: VoIP calls and other real-time network applications will either be interrupted or drop altogether if roaming is not implemented properly.

WiFi Roam Latency

In the case of the Galaxy S7, roaming latency is honestly rather wanting. In the best case the Galaxy S7's roaming latency appears to be acceptable, but it's still significantly worse than the best we've seen so far. Samsung's algorithms seem to have issues with edge cases: despite consistent positioning and an identical test setup, I've seen multiple instances where the device simply can't roam consistently. Even with the simple case of ramping from 10 dBm down to -45 dBm at 3 dB per second, I've encountered weirdness where the device drops from the network altogether claiming that the password given was incorrect (it wasn't), or makes a few successful handovers and then gets stuck on a single access point or drops from the network entirely. Even in the best set of trials I still saw 3 of 64 trials fail to roam correctly. The performance is certainly far better than something like the Google Pixel C, but Samsung should really be focusing on improving here.
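To make the trial parameters concrete, the ramp described above can be enumerated along with the observed failure rate. This is a descriptive sketch of the test parameters only, not the Ixia IoT API:

```python
def ramp_steps(start_dbm=10, end_dbm=-45, step_db=3):
    """Transmit power levels swept during one roam trial, per the test description:
    start at 10 dBm and drop 3 dB each second until reaching -45 dBm."""
    levels = []
    power = start_dbm
    while power >= end_dbm:
        levels.append(power)
        power -= step_db
    return levels

levels = ramp_steps()
print(len(levels))   # 19 one-second steps per ramp (10, 7, 4, ..., -44)

# Observed reliability in the best set of trials:
failures, trials = 3, 64
print(f"{failures / trials:.1%} of roams failed")
```

Even a ~5% failure rate is enough to be noticeable in practice: over a workday of walking between access points, a handful of VoIP calls or streams would stall or drop.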

266 Comments


  • realbabilu - Thursday, July 7, 2016 - link

    I wish AT could check the touchscreen latency between these phones. Sometimes you get a powerful chip but some lag is still noticeable.
    I'd also like to see which real apps, excluding benchmark apps, can utilize all cores, and whether the CPU management is smart about which apps need full power and which don't. Something as simple as a 2D game like Air Attack 2 can raise the temperature very high; smart CPU management could cut the power those apps don't need.

    Good review. And it's nice to review everything from high-end phones to cheap Chinese ones, so we can see where the extra money pays off.
  • Magicpork - Thursday, July 7, 2016 - link

    So it took them 4 months to write up an Apple-biased review... no wonder AnandTech's reputation has fallen so much recently..
  • KoolAidMan1 - Sunday, July 10, 2016 - link

    This comment section taught me that reality is biased.

    Instead of being mad at Anandtech that a GS7 got BTFO by an iPhone SE, maybe you should demand more from Google and OEMs to improve their hardware and operating system.
  • Ihabo - Thursday, July 7, 2016 - link

    Galaxy S7 Edge wins in:

    +IP68 water & dust resistance
    +Quad HD Super AMOLED screen (brighter in sunlight) and immersive
    +Dual Pixel camera technology, very fast autofocus & good low light
    +3,600 mAh battery easily lasts a day of heavy usage
    +Exynos 8890 beats the Snapdragon 820 in battery efficiency
    +Wireless charging
    +No heat while fast charging

    Phone of the year, no doubt
  • beggerking@yahoo.com - Monday, July 11, 2016 - link

    Now if only they'd bring the removable battery back... then I'd be all over it.

    Still keeping my S5 for the time being.
  • AJP - Thursday, July 7, 2016 - link

    Regarding the Galaxy S7 and S7 Edge review: when comparing benchmarks, please take into account that the Apple iPhone's screen resolution is much lower. That will bias on-screen results in its favour and should be considered before making any comments.
  • blackcrayon - Thursday, July 7, 2016 - link

    Did you even read the review? For as long as I can remember, they've been showing both onscreen and offscreen GPU benchmarks for this very reason. And they specifically mention it in the review that the iPhone GPU keeps up on-screen because of the lower resolution.
    As for the CPU benchmarks, it comes down to Apple's really high single core performance and optimized browser engine. One advantage of designing both the hardware and all of the software in tandem.
  • JoeDuarte - Thursday, July 7, 2016 - link

    Does anyone know what the author means by Google's optimizations - or lack thereof - for Chrome on Android? What optimizations? Does Google normally do something special? I don't understand what he's referring to.

    Also, what does he mean by Samsung's lack of optimization of the UI? Is there a standard set of optimizations that OEMs do on Android phones? Is he talking about low level C code, or ARM assembly or something?
  • Impulses - Thursday, July 7, 2016 - link

    In the case of the browser, there are optimizations other browsers can make, and have made, for specific SoCs. It used to be a lot more common before Chrome became the stock Android browser, though it's still prevalent... I'm guessing that for whatever reason Google has never implemented such hardware-specific optimizations.

    In the case of the UI, there are a lot of Samsung elements added atop the base OS that do drag performance down. Other OEMs have scaled back their OS customizations or fine-tuned them over time (namely Moto and, to an extent, HTC)... Samsung's approach is still pretty heavy-handed.
  • UltraWide - Thursday, July 7, 2016 - link

    "Samsung is better than anybody else at learning from its competitors. "A market reader is sort of the classic fast follower," explains Barry Jaruzelski, senior partner at Booz&Co and the co-author of the Global Innovation 1000. "It doesn't mean they ignore their customers, but they're very attuned to what competitors are doing and what other people are bringing to market first and observing what seems to be gaining traction, then very rapidly coming up with their own version of that innovation."

    http://www.businessinsider.com/samsung-corporate-s...

    That's always been Samsung's strength, it will take time to change the whole corporation's mantra.
