It’s been quite some time since I first started writing for AnandTech, and without question there have been a lot of changes to our testing methodologies in the past few years. One of the questions I’ve kept coming back to while working through reviews is how we could improve our methodology in a meaningful way beyond simply updating benchmarks to stay current. Internally we’ve been investigating this for quite some time now, and the resulting changes have included the addition of SoC power efficiency comparisons and display power efficiency measurements. There are plenty of other changes here and there that I still want to make, but one of the major unexplored areas has been wireless radio performance.

Wireless performance testing is probably one of the hardest things we could test, and for a time I had almost given up hope of deploying such testing at AnandTech. However, life has a way of playing out differently than expected, and in the past few months we’ve been working with Ixia to make Wi-Fi testing a reality. Ixia, for those of our readers who aren't familiar with the company, is a traditional player in the networking test space. They are perhaps best known for their Ethernet test products, and more recently have been expanding into wireless and security testing with the acquisition of companies like VeriWave and BreakingPoint Systems.

We have done Wi-Fi testing before, but in the past we mainly focused on a relatively simple and arguably not particularly interesting test case: maximum throughput in ideal conditions. It was clear that Wi-Fi in many devices is still not perfect, as subjective differences in reception and reliability are easy to feel. However, without data or a way to replicate our findings, it was hard to prove that what we felt about wireless performance was actually the case.

A few months ago, Ixia brought me into their offices to evaluate their WaveDevice system, a Wi-Fi testing platform uniquely suited to solving our testing needs. This system effectively integrates a number of tools into a single chassis, including: Wi-Fi access points, traffic generators, programmable attenuators to set path loss, channel emulators to simulate a particular RF environment in terms of interference and bandwidth, packet sniffers and analyzers, and signal/spectrum analyzers. These tools are implemented such that each layer of the Wi-Fi protocol stack can be analyzed, from the physical layer of raw bits encoded at the carrier frequency of 2.4 or 5 GHz, to the application layer where everything is neat bitstreams transmitted to or received from specified addresses.
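To give a sense of what "setting path loss" with a programmable attenuator means in practice, the simplest reference model is free-space path loss: the signal attenuation grows with both distance and carrier frequency. The following is a minimal sketch of the standard FSPL formula (not anything from Ixia's software; the function name and values are purely illustrative):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: FSPL = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 299_792_458.0  # speed of light in m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# At the same distance, the 5 GHz band loses several dB more than 2.4 GHz,
# which is one reason 5 GHz range is shorter in practice.
print(f"10 m @ 2.4 GHz: {fspl_db(10, 2.4e9):.1f} dB")  # roughly 60 dB
print(f"10 m @ 5.0 GHz: {fspl_db(10, 5.0e9):.1f} dB")  # roughly 66 dB
```

An attenuator lets the test chassis dial this loss in directly and repeatably, instead of physically moving the device around a room.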

Of course, hardware is just one half of the equation. To provide a full solution, Ixia also supplies a suite of software that makes it possible to run common tests without spending excessive amounts of time developing custom ones. While WaveDevice supports custom tests through its API, out of the box it offers a simple throughput test, which is effectively iPerf with more advanced statistics. There's also a general data plane test to evaluate throughput while varying the traffic direction, traffic type, frame size, and frame rate. Beyond these basics, WaveDevice also has tests for rate vs. range, roaming latency, and traffic mix. In the case of rate vs. range, it's possible to run an automated sequence of throughput tests while varying frame rate, transmit power, and physical link rates. Meanwhile, in the particularly interesting case of roaming latency, we can test how long it takes a Wi-Fi device to hop from one access point to another as the signal of one access point fades out and the signal of another fades in. Finally, the traffic mix test measures throughput in the face of competing traffic transmitting at the same time.
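The core of an iPerf-style throughput test is simple: push a known number of bytes over a connection, time it, and divide. As a rough illustration of the measurement (this is a loopback sketch, not Ixia's API; all names here are made up for the example):

```python
import socket
import threading
import time

def run_sink(server_sock: socket.socket) -> None:
    """Accept one connection and drain everything the client sends."""
    conn, _ = server_sock.accept()
    with conn:
        while conn.recv(65536):
            pass

def measure_throughput_mbps(total_bytes: int = 8 * 1024 * 1024) -> float:
    """Send total_bytes over a local TCP connection and report goodput in Mbps."""
    server = socket.socket()
    server.bind(("127.0.0.1", 0))  # let the OS pick an ephemeral port
    server.listen(1)
    port = server.getsockname()[1]
    threading.Thread(target=run_sink, args=(server,), daemon=True).start()

    payload = b"\x00" * 65536
    sent = 0
    with socket.create_connection(("127.0.0.1", port)) as client:
        start = time.perf_counter()
        while sent < total_bytes:
            client.sendall(payload)
            sent += len(payload)
        elapsed = time.perf_counter() - start
    server.close()
    return (sent * 8) / (elapsed * 1e6)  # bits sent / seconds -> Mbps

print(f"{measure_throughput_mbps():.0f} Mbps over loopback")
```

What a dedicated tester adds on top of this basic division is the per-frame detail: retry rates, physical link rate over time, latency distributions, and so on, which is where the "more advanced statistics" matter.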

Ixia has also made sure to support a range of OSes so that almost any device can be tested against WaveDevice. In addition to maintained iOS and Android applications, WaveAgent is at its core a simple C application that can readily be ported to embedded systems like wearables and Wi-Fi cameras, so the "device" in WaveDevice really refers to any client with a Wi-Fi chipset.

While at first glance these tests might seem pretty simple, it turns out there's a huge amount of nuance to them. To understand that nuance, it's best to start with a basic primer on how Wi-Fi works.

Wi-Fi Basics: L1/PHY Layer
Comments

  • JoshHo - Tuesday, March 29, 2016 - link

    The important part with all half-duplex technologies is to understand that while maximum throughput is a nice figure to have, it's more a statement of spectral efficiency. Your connection past the LAN may only be 20 Mbps, but traffic within the network can exceed 20 Mbps, and higher throughput means that the occupied spectrum is available more often for other users.
  • will1956 - Tuesday, March 29, 2016 - link

    i like to think i know some things about electronics etc, then I read an article like this or a comment like the above and realise i really don't know much about the technicalities of computers etc
  • evancox10 - Friday, March 25, 2016 - link

    I have no doubt that Apple is running these types of tests, looking at how they affect the user experience, and then improving the areas that are weak. Whereas the rest of the consumer electronics industry thinks it's sufficient to throw in the latest chipset from the vendor, run a synthetic benchmark showing it's faster, and then slap a large number on the specs sheet. Without ever just trying out the stupid things to see how they work.

    I often wonder if Samsung, Motorola, etc. ever give their devices to their executives before releasing them to just *use* for a week or two, and have them report any major issues. Judging by the large number of problems I discover in so-called flagship devices, I seriously doubt it. Or, if they do discover these issues, maybe the engineers are just clueless as to how to fix it.

    Rumblings/rumors from people in the industry suggest that smartphone design at Apple is driven by measurable, controlled user experience tests (e.g. mimicking a finger swipe and objectively measuring the response), whereas at other companies it's driven by synthetic server benchmarks (SPECmark) from the 1980s.

    In other words, not surprised at the excellent performance here by the iPad, especially in the handoff test. The difference is astounding.

    And this comes from someone who doesn't currently own or use ANY Apple products, the exact opposite of an Apple fanboy.
  • Daniel Egger - Friday, March 25, 2016 - link

    > I have no doubt that Apple is running these types of tests, looking at how they affect the user experience, and then improving the areas that are weak. Whereas the rest of the consumer electronics industry thinks it's sufficient to throw in the latest chipset from the vendor, run a synthetic benchmark showing it's faster, and then slap a large number on the specs sheet. Without ever just trying out the stupid things to see how they work.

    Absolutely correct. Where pretty much all other companies try to impress with raw performance data, and every now and then throw in an oddball like a partially perfect design or remarkable attention to detail here or there, Apple always tries to cover as many bases as possible at once. The desperation and despair this causes among the competition can easily be seen in out-of-proportion "scandals" like the death grip, bend-gate and others...

    The truth is: If you don't even aim for perfection you're definitely not going to reach it. Apple is one of the few companies I know who at least try -- VERY hard.
  • zodiacfml - Saturday, March 26, 2016 - link

    Impressive, it is. Yet I feel Samsung does as well, as their flagship devices often show good performance, at least in terms of Wi-Fi.
  • skavi - Sunday, March 27, 2016 - link

    My S6 has abysmal roaming performance, but good WiFi speed. Sometimes I feel like Samsung only cares about the things it can put on a spec sheet (octa core, four gigs, quad hd, etc.). This is what's driving me away from Samsung towards Apple, who seem to pay an obscene level of attention to every detail of the experience.
  • will1956 - Tuesday, March 29, 2016 - link

    yeah that's what i always think of Apple. Attention to the small details.
    Details may be small, but a lot of small problems result in big problems.
    Another thing i like about Apple is that they prefer quality over quantity, something the Android manufacturers are finally realising, e.g. Qualcomm with its quad-core 820, which is about twice as good as the octa-core 810, and better than the deca-core MediaTeks.
  • DarkXale - Tuesday, March 29, 2016 - link

    The iOS division perhaps - the Mac division sure as hell isn't though, considering the subpar performance and stability of Macs on a lot of networks. (Networks which iPads don't struggle with)
  • will1956 - Tuesday, March 29, 2016 - link

    that's interesting to know. Thanks
  • Oubadah - Friday, March 25, 2016 - link

    The Pixel C's Wi-Fi is broken: https://productforums.google.com/forum/#!topic/nex...
