Wi-Fi Basics: L2/MAC Layer

The next layer up in the OSI model is the data link layer, which encapsulates raw bits into something more manageable. The key to understanding this subject (and many other technology concepts) is seeing everything in layers of abstraction; otherwise it would be incredibly difficult to talk about and analyze aspects of electronics without getting lost in an excess of mostly irrelevant information.

In the case of Wi-Fi, the MAC layer, which sits within the OSI data link layer, is where much of the intelligence resides on the device (this in particular sets it apart from LTE). In its most basic form, the MAC layer hides the reality of the physical layer behind an abstracted network. To the parts of the networking stack operating above the MAC layer, the network appears to be composed solely of endpoints. Furthermore, this abstract network appears to be full duplex, which means that data can be received and transmitted simultaneously.

Obviously, infrastructure-mode Wi-Fi isn’t solely composed of endpoints, and neither is Wi-Fi full duplex. Rather, due to the nature of how radios work, Wi-Fi is half duplex (a radio can only transmit or receive at any one time), as trying to transmit in full-duplex mode results in self-interference, with the receiver picking up the signal from its own transmitter. There’s actually a significant amount of research going into solving the duplex problem to improve data rates and spectral efficiency, but from a commercial perspective most consumer devices don’t have this kind of technology yet.

In order to enable this kind of abstraction, there are a lot of systems in place in the Wi-Fi standard. While it might be relatively simple to abstract away the fact that the network actually has an access point routing communications from one endpoint to another, emulating full-duplex communication over a half-duplex medium is surprisingly complex. In general, only one device sharing an access point’s channel can transmit at any given time; if this rule is broken you end up with significant interference, and whatever data was being transmitted is pretty much as good as lost. It should be noted that there is an exception here with the latest generation of devices supporting MU-MIMO, where multiple antennas are used to create areas of constructive and destructive interference so that “beams” are formed, allowing multiple simultaneous transmissions.

As a result of these limitations, the MAC layer requires that both devices and access points cooperate in sharing the spectrum using a time division scheme known as Carrier Sense Multiple Access with Collision Avoidance, or CSMA/CA. A device connected to the AP first listens to the channel to verify that it is clear, then can begin a transmission by sending a request to send (RTS) packet and waiting for a clear to send (CTS) packet. Once the clear to send is received, the device can transmit its data. Because there’s no guarantee that some other device didn’t also transmit at the same time, the device has to listen for an acknowledgement from the access point that the data was received. If an ACK is not received within a certain period of time, the device has to assume that someone else was also attempting to transmit, and it responds by backing off for a specified period of time before transmitting again.
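To make the sequence above concrete, here is a minimal sketch of CSMA/CA-style transmit logic with binary exponential backoff. The carrier-sense and acknowledgement functions are simulated stand-ins rather than a real Wi-Fi driver API, and the constants are only illustrative; actual hardware performs all of this in silicon and firmware at microsecond timescales.

```python
import random
import time

# Minimal sketch of CSMA/CA-style transmit logic with binary exponential
# backoff. The carrier-sense and ACK functions below are simulated stand-ins,
# not a real Wi-Fi driver API.

SLOT_TIME_US = 9           # 802.11 OFDM slot time in microseconds
CW_MIN, CW_MAX = 15, 1023  # contention window bounds, in slots

def channel_is_clear():
    # Stand-in for carrier sensing; pretend the medium is busy ~30% of the time.
    return random.random() > 0.3

def send_and_wait_for_ack(frame):
    # Stand-in for transmitting and listening for the ACK; pretend ~20% of
    # transmissions collide with another station and go unacknowledged.
    return random.random() > 0.2

def transmit(frame, max_retries=7):
    cw = CW_MIN
    for attempt in range(max_retries):
        # Defer while the medium is busy, then back off a random number of
        # slots drawn from the current contention window before transmitting.
        while not channel_is_clear():
            time.sleep(SLOT_TIME_US / 1e6)
        time.sleep(random.randint(0, cw) * SLOT_TIME_US / 1e6)

        if send_and_wait_for_ack(frame):
            return True                   # ACK received: success
        cw = min(2 * cw + 1, CW_MAX)      # no ACK: assume a collision, widen the window
    return False                          # give up after max_retries attempts

if __name__ == "__main__":
    print("delivered:", transmit(b"example frame"))
```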

WaveDevice in turn is able to specifically test this part of the MAC layer with its ecosystem performance test, in which competing clients are simulated by WaveDevice and the device under test is observed to see whether its CSMA/CA algorithms maintain appropriate throughput in the face of competing traffic. It turns out that some devices can be too aggressive and collide with other traffic, or too passive and spend too much time backing off. Drifting too far from ideal in either direction seriously affects throughput, so from a validation standpoint this test is most relevant to environments like a convention center or a press conference, where there may be hundreds, if not thousands, of other devices in the vicinity all sharing the same few channels.

Another part of the MAC layer that is important to understand for the purposes of Wi-Fi testing is rate adaptation. While WaveDevice allows for manual control of the Modulation and Coding Scheme (MCS) used by the device, in addition to the number of spatial streams for MIMO and other settings like guard interval and channel bandwidth, it’s important that a device selects all of these things automatically and correctly. This is necessary to ensure that packet loss and retransmission isn’t happening at excessive rates higher up the networking stack and that throughput at higher layers is maximized. Importantly, unlike the cellular world, Wi-Fi lacks channel quality indicators that would allow the device and access point to directly determine the ideal modulation and coding scheme. This means that rate adaptation has to happen based upon factors like packet/frame loss rates.
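As a rough illustration of loss-driven rate adaptation, the sketch below steps an 802.11n (single-stream, 20 MHz) MCS index down when frame loss over the last measurement window is high, and probes one step up when loss is low. The thresholds and windowing are assumptions made for the sake of the example; production algorithms such as Minstrel are considerably more sophisticated.

```python
# 802.11n MCS 0-7 PHY rates for one spatial stream, 20 MHz channel, long guard
# interval (Mbps). Higher indices need a cleaner channel to decode reliably.
MCS_PHY_RATES_MBPS = [6.5, 13, 19.5, 26, 39, 52, 58.5, 65]

def adapt_mcs(current_mcs, frames_sent, frames_acked,
              down_threshold=0.25, up_threshold=0.10):
    """Step the MCS index down when frame loss over the last window is high,
    and probe one step up when loss is low. Thresholds are illustrative."""
    if frames_sent == 0:
        return current_mcs
    loss = 1.0 - frames_acked / frames_sent
    if loss > down_threshold and current_mcs > 0:
        return current_mcs - 1
    if loss < up_threshold and current_mcs < len(MCS_PHY_RATES_MBPS) - 1:
        return current_mcs + 1
    return current_mcs

# Example: 40% frame loss at MCS 4 drops the rate to MCS 3 for the next window.
print(adapt_mcs(4, frames_sent=100, frames_acked=60))  # -> 3
```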

Meanwhile it’s also important for the device to avoid transmitting at excessive power levels, as power consumed by the power amplifier directly affects battery life. Given that power amplifiers in modern mobile devices generally have a power-added efficiency of around 40%, it’s not surprising for a power amplifier alone to consume somewhere around 1W, even before considering other parts of the RF chain or the rest of the device. Using a real world example, our web browsing battery life test is long enough that even an average difference of 200 mW can cause a runtime difference measured in hours, so proper control of transmit power is definitely important. It’s also important for the Wi-Fi chain to drop into appropriate sleep states in order to save power. When implemented improperly, there can be some serious knock-on effects on idle battery life, because unnecessary wake-ups can end up waking the main CPU, which is relatively enormous in terms of power consumption on a mobile device.
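For a rough sense of scale, the back-of-the-envelope calculation below shows how a 200 mW delta in average draw translates into runtime. The battery capacity and baseline platform power are assumed round numbers chosen for illustration, not figures from our testing.

```python
# Back-of-the-envelope runtime impact of a 200 mW difference in average draw.
# Battery capacity and baseline platform power are assumed round numbers for
# illustration, not measured values.
battery_wh = 10.0   # assumed ~10 Wh battery (large phone / small tablet class)
baseline_w = 1.0    # assumed average platform draw during web browsing
delta_w = 0.2       # 200 mW of extra average transmit-chain power

runtime_base = battery_wh / baseline_w               # 10.0 hours
runtime_extra = battery_wh / (baseline_w + delta_w)  # ~8.3 hours
print(f"{runtime_base:.1f} h -> {runtime_extra:.1f} h "
      f"({runtime_base - runtime_extra:.1f} h lost)")
```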

From a testing perspective, these aspects can also be examined on WaveDevice by measuring throughput while steadily decreasing the transmit power of the access point. This rate vs range test also exercises the RF front end/physical layer, though it requires that the test chamber is set up properly to ensure the device sees a constant transmit power and multipath environment regardless of angle, in order to avoid issues with anisotropy (in the real world, devices vary in their transmit and receive capabilities depending on their angle and orientation). The test also allows for direct measurement of the ability of a device’s Wi-Fi chipset to demodulate and decode the signal in the face of decreasing SNR and received power, in addition to its ability to select the ideal MCS rate to maximize throughput and reduce packet loss.
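In outline, a rate vs range sweep looks something like the sketch below: step the path loss between the access point and the device and record throughput at each step. The attenuator and throughput functions here are hypothetical placeholders; a real run would use the test chamber's own control and traffic tooling, which isn't shown.

```python
import random

# Outline of a rate-vs-range style sweep: step the path loss between the AP
# and the device under test, then record throughput at each level. Both
# functions below are simulated stand-ins; a real setup would drive the
# chamber's programmable attenuator and traffic generator instead.

def set_path_loss_db(db):
    pass  # placeholder: would command the programmable attenuator

def measure_throughput_mbps(path_loss_db):
    # placeholder: crude model where throughput falls off toward zero as the
    # link budget shrinks, plus a little measurement noise
    return max(0.0, 400.0 * min(1.0, (85 - path_loss_db) / 30) + random.uniform(-5, 5))

for path_loss_db in range(40, 91, 5):
    set_path_loss_db(path_loss_db)
    tput = measure_throughput_mbps(path_loss_db)
    print(f"{path_loss_db:3d} dB path loss -> {tput:6.1f} Mbps")
```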

Comments

  • profquatermass - Tuesday, March 29, 2016

    So, no setting up a typical office or home space to test in?
  • LostWander - Wednesday, March 30, 2016

    The tool allows for a more consistent simulation of what happens in those environments; if a device does well during testing with this tool, that's actually a more reliable indicator of performance across multiple environments than testing in a "typical" office setup. Consistent, reproducible results. Science!
  • renjithis - Wednesday, March 30, 2016

    Last page 3rd last paragraph expansion of CSMA/CA should be Collision Avoidance instead of Carrier Avoidance?
    Sorry it took so long.

    Please delete this comment after correction or if I was wrong
  • Rollo Thomasi - Monday, April 4, 2016

    When testing range/throughput you only talk about varying the transmission power. Could it not be worth it to test varying the latency as well, to simulate distance to the access point?

    Also, introducing a frequency shift to simulate the Doppler effect of a receiver moving relative to the transmitter could be interesting.

    These are just thoughts of mine. Maybe the effects of latency and frequency shifts are so small that it doesn't matter.
