WiFi Performance

Apple hasn’t skimped on upgrading WLAN connectivity in the 4S, though the improvement isn’t quite as dramatic as I was hoping for. The 4S uses BCM4330, Broadcom’s newest WLAN, Bluetooth, and FM combo chip (though the latter still isn’t used). We’ve seen this particular combo chip in the Samsung Galaxy S2, and no doubt BCM4330 will start popping up in many of the places its predecessor, BCM4329, was used: everything from the 3GS to the 4 and virtually innumerable Android devices. BCM4330 brings Bluetooth 4.0 support (BCM4329 was Bluetooth 2.1) and includes the same 802.11b/g/n (2.4 GHz, single spatial stream) connectivity as its predecessor, still tuning only 20 MHz channels (HT20). After seeing the SGS2 include 5 GHz support, I was hoping the 4S would follow; however, the 4S remains 2.4 GHz only.

Encircled in red: The iPhone 4S' 2.4 GHz WiFi+BT Antenna

In addition, the 4S locates the WiFi antenna in the same place as the CDMA iPhone 4. If you missed that change back then, and have read the previous cellular connectivity section, you’re probably wondering where the WiFi and Bluetooth antennas went, given the absence of a dedicated stainless steel band for them. The answer is that they’re inside, printed on a flex board, just like virtually everyone else’s cellular antennas. It’s noted on the FCC-submitted schematic, but I also opened up the 4S I purchased and grabbed a picture.


Left: iPhone 4S with WiFi RSSI circled, Right: iPhone 4

Given the small size of this antenna, you might be led (deceptively) to think it has worse sensitivity or isotropy. Interestingly, that isn’t the case. Side by side, I measured slightly better received signal strength on the 4S than on a 4, and upon checking the FCC documents learned that the 4S’ WLAN antenna has a peak gain of –1.5 dBi, compared to –1.89 dBi on the 4, an improvement over the previous model. That said, when you actually work the math out, the two devices have approximately the same EIRP (Equivalent Isotropically Radiated Power) on transmit.
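Working that math out is simple addition in the log domain: EIRP is conducted transmit power plus antenna gain. Here's a minimal sketch; only the antenna gains come from the FCC filings, while the conducted power figures are hypothetical placeholders chosen to show how a roughly 0.4 dB difference in drive power can offset the antenna gain delta:

```python
def eirp_dbm(tx_power_dbm: float, antenna_gain_dbi: float) -> float:
    """EIRP in dBm: conducted transmit power plus antenna gain."""
    return tx_power_dbm + antenna_gain_dbi

# Antenna gains from the FCC filings; conducted powers are hypothetical.
eirp_4s = eirp_dbm(17.00, -1.50)   # ~15.5 dBm
eirp_4  = eirp_dbm(17.39, -1.89)   # ~15.5 dBm

print(f"4S: {eirp_4s:.2f} dBm, 4: {eirp_4:.2f} dBm")
```

A slightly worse antenna driven slightly harder radiates the same effective power, which is consistent with the two phones measuring so similarly in practice.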

WiFi Throughput

Moving to a newer WLAN combo chip speeds up WiFi throughput considerably in our test, though I’m starting to think the bigger boost is actually thanks in part to a faster SoC. As a reminder, this test consists of loading a locally hosted 100MB PDF over 802.11n (Airport Extreme Gen.5), with throughput measured on the server. MobileSafari loads the PDF in its entirety before rendering it, so we really are seeing WiFi throughput.
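Converting that server-side measurement into a throughput figure is straightforward; a quick sketch (the 16-second transfer time below is a made-up illustration, not a measured result):

```python
def throughput_mbps(bytes_transferred: int, seconds: float) -> float:
    """Average throughput in megabits per second (decimal megabits)."""
    return bytes_transferred * 8 / (seconds * 1_000_000)

# A 100 MB PDF served in a hypothetical 16 seconds:
print(throughput_mbps(100 * 1_000_000, 16.0))  # 50.0 Mbps
```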

GPS

The iPhone 4 previously used a BCM4750 single chip GPS receiver and shared the 2.4 GHz WiFi antenna, as shown many times in diagrams. We reported with the CDMA iPhone 4 that Qualcomm’s GPS inside MDM6600 was being used in place of a discrete solution, and showed a video demonstrating its improved GPS fix. I suspected at the time that the CDMA iPhone 4 might be using GLONASS from MDM6600 (in fact, the MDM6600 amss actually flashed onto the CDMA iPhone 4 includes many GLONASS references), but was never able to concretely confirm it was actually being used.

MDM6610 inside the 4S inherits the same Qualcomm GNSS (Global Navigation Satellite System) Gen8 support, namely GPS and its Russian equivalent, GLONASS. The two can be used in conjunction to deliver a more reliable 3D fix onboard MDM6610, which is what the 4S does indeed appear to be doing. GPS and GLONASS are functionally very similar, and combined support for both at the same time is something most modern receivers now offer. There are even receivers which support the EU’s equivalent, Galileo, though that constellation isn’t complete yet. This time around, Apple is being direct about its inclusion of GLONASS. The GNSS inside MDM6610 fully supports standalone mode, as well as assisted modes from UMTS, GSM, OMA, and gpsOneXTRA.

Just like with the CDMA iPhone 4, I drove around and recorded a video to illustrate GPS performance, since iDevices unfortunately still don’t report raw GPS NMEA data. The 4S maintains a very consistent error radius circle in the Maps application and shows little deviation while traveling, whereas the 4 sometimes wanders and changes its reported horizontal accuracy and velocity. In addition, the 4S reports the present position in the proper lane the whole time, while the 4 is slightly shifted. I don’t think many people complained about GPS performance on the 4, but both time to fix and overall precision are without a doubt improved over the GSM/UMTS 4. Subjectively, indoor performance seems much improved, and I’ve noticed that the iPhone 4S will report slightly better horizontal accuracy than the 4 indoors (using MotionX-GPS on iOS). Unfortunately, we can’t perform much deeper analysis since, again, real NMEA data isn’t exposed on iOS; location is instead abstracted away behind Apple’s location services APIs.

Noise Cancelation

The iPhone 4 included a discrete Audience noise processor and a second microphone for advanced common mode noise rejection. This reduced the amount of background noise audible to other parties when calling from a noisy environment, a feature virtually all of this latest generation of smartphones includes. The 4S still includes that second microphone (up top, right next to the headset jack), though the discrete Audience IC is gone. It’s possible that Audience’s logic has been integrated into the A5 SoC itself or elsewhere, or that the 4S is using Qualcomm’s Fluence noise cancelation. I spent considerable time digging around and couldn’t find anything conclusive to indicate one possibility over the other.

We recently started measuring noise rejection by placing a call between a phone under test and another phone connected to line-in on an audio card, then ramping volume up and talking into the handset. The 4S doesn’t get spared this treatment, and I’ve also included the 4 and 3GS (which has no such common mode noise rejection) for comparison.

iPhone 4S Noise Rejection Demonstration - GSM/UMTS - AT&T by AnandTech
iPhone 4 Noise Rejection Demonstration - GSM/UMTS - AT&T by AnandTech
iPhone 3GS Noise Rejection Demonstration - GSM/UMTS - AT&T by AnandTech

Subjectively, the 4S has further improved ambient noise rejection over the 4. I ran this test twice to make sure it wasn’t a fluke, and indeed the 4S subjectively has less noticeable ambient noise than the 4 even at absurd volume levels.

We’ve also placed the usual test calls to the local ASOS weather station and recorded the output. I can’t detect any difference in line-out quality of the voice call for better or worse, at least on GSM/UMTS. I’d expect the 4S to offer exactly the same quality on CDMA as the CDMA iPhone 4.

Apple iPhone 4S (GSM/UMTS) - ASOS Test Call by AnandTech

One thing I should note is that there does seem to be a bit more perceptible line noise from the 4S’ earpiece on phone calls. It isn’t a huge difference, but there is definitely a bit more background noise from the 4S earpiece than the 4’s in calls. The original 4S that Anand purchased had a noticeable and distracting amount of background noise, though swapping that unit out seems to have somewhat mitigated the problem (he still complains of audible crackling through the earpiece during calls). I’ve tested enough iPhone 4 handsets (and been through several) to know there is huge variance in earpiece quality; I even went through one with an earpiece that sounded saturated and overmodulated at every volume setting. I wager this might be what was going on here.


199 Comments


  • metafor - Tuesday, November 1, 2011 - link

    When you say power efficiency, don't you mean perf/W?

    I agree that perf/W varies depending on the workload, exactly as you explained in the article. However, the perf/W is what makes the difference in terms of total energy used.

    It has nothing to do with race-to-sleep.

    That is to say, if CPU B takes longer to go to sleep but has better perf/W, it will still use less energy. In fact, I think this is what you demonstrated with your second example :)

    The total energy consumption is directly related to how power-efficient a CPU is. Whether it's a slow processor that runs for a long time or a fast processor that runs for a short amount of time; whichever one can process more instructions per second vs joules per second wins.

    Or, when you take seconds out of the equations, whichever can process more instructions/joule wins.

    Now, I assume you got this idea from one of Intel's people. The thing their marketing team usually forgets to mention is that when they say race-to-sleep is more power efficient, they're not talking about the processor, they're talking about the *system*.

    Take the example of a high-performance server. The DRAM array and storage can easily make up 40-50% of the total system power consumption.
    Let's then say we had two hypothetical CPUs with different efficiencies: CPU A being faster but less power efficient, and CPU B being slower but more power efficient.

    The total power draw of DRAM and the rest of the system remains the same. And on top of that, the DRAM and storage can be shut down once the CPU is done with its processing job but must remain active (DRAM refreshed, storage controllers powered) while the CPU is active.

    In this scenario, even if CPU A draws more power processing the job compared to CPU B, the system with CPU B has to keep the DRAM and storage systems powered for longer. Thus, under the right circumstances, the system containing CPU A actually uses less overall power because it keeps those power-hungry subsystems active for a shorter amount of time.

    However, how well this scenario translates into a smartphone system, I can't say. I suspect not as well.
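    [Ed: the server scenario described above can be made concrete with a short sketch. All the numbers are hypothetical, chosen only to illustrate the shape of the argument: CPU A burns more joules per job than CPU B, yet the system built around it wins because DRAM and storage stay powered for half as long.]

```python
def system_energy_j(cpu_w: float, busy_s: float, platform_w: float) -> float:
    """Energy to finish one job: CPU plus DRAM/storage, which must
    stay powered for as long as the CPU is busy."""
    return (cpu_w + platform_w) * busy_s

PLATFORM_W = 60.0  # hypothetical DRAM + storage draw while active

# CPU A: fast but less efficient (600 J of CPU energy per job).
# CPU B: slow but more efficient (500 J of CPU energy per job).
a_total = system_energy_j(cpu_w=60.0, busy_s=10.0, platform_w=PLATFORM_W)  # 1200 J
b_total = system_energy_j(cpu_w=25.0, busy_s=20.0, platform_w=PLATFORM_W)  # 1700 J

print(a_total, b_total)
```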
  • Anand Lal Shimpi - Tuesday, November 1, 2011 - link

    I believe we're talking about the same thing here :)

    The basic premise is that you're able to guarantee similar battery life, even if you double core count and move to a power hungry OoO architecture without a die shrink. If your performance gains allow your CPU/SoC to remain in an ultra low power idle state for longer during those workloads, the theoretically more power hungry architecture can come out equal or ahead in some cases.

    You are also right about platform power consumption as a whole coming into play. Although with the shift from LPDDR1 to LPDDR2, an increase in effective bandwidth and a number of other changes it's difficult to deal with them independently.

    Take care,
    Anand
  • metafor - Tuesday, November 1, 2011 - link

    "If your performance gains allow your CPU/SoC to remain in an ultra low power idle state for longer during those workloads, the theoretically more power hungry architecture can come out equal or ahead in some cases."

    Not exactly :) The OoOE architecture has to perform more tasks per joule. That is, it has to have better perf/W. If it has worse perf/W, it doesn't matter how much longer it remains idle compared to the slower processor; it will still use more net energy.

    It's total platform power that may see savings, despite a less power-efficient and more power-hungry CPU. That's why I suspect that this "race to sleep" situation won't translate to the smartphone system.

    The entire crux relies on the fact that although the CPU itself uses more power per task, it saves power by allowing the rest of the system to go to sleep faster.

    But smartphone subsystems aren't that power hungry, and CPU power consumption generally increases with the *square* of performance. (Generally; this wasn't the case from A8 to A9, but you can bet it will be from A9 to A15.)

    If the increase in CPU power per task is greater than the savings of having the rest of the system active for shorter amounts of time, it will still be a net loss in power efficiency.

    Put it another way. A9 may be a general power gain over A8, but don't expect A15 to be so compared to A9, no matter how fast it finishes a task :)
  • doobydoo - Tuesday, November 1, 2011 - link

    You are both correct, and you are also both wrong.

    Metafor is correct because any chip, given a set number of tasks to do over a fixed number of seconds, regardless of how much faster it can perform, will consume more energy than an equally power efficient but slower chip. In other words, being able to go to sleep quicker never means a chip becomes more power efficient than it was before. It actually becomes less.

    This is easily logically provable by splitting the energy into two sections. If two chips are both equally power efficient (as in they can both perform the same number of 'tasks' per joule), and one is twice as fast, it will consume twice the power while active but complete in half the time, so that element will ALWAYS be equal for both chips. However, the chip which finished sooner will then be idle for LONGER, so the idle expense of energy will always be higher for the faster chip. This assumes, as I said, that the idle power draw of both chips is equal.

    Anand is correct, because if you DO have a more power efficient chip with a higher maximum power draw, race-to-sleep is OFTEN (assuming reasonable idle times) the reason it can actually use less energy. Consider two chips: one consumes 1.3 W (max) and can carry out two tasks per second; the second consumes 1 W (max) and can carry out one task per second (so it is less power efficient). Now consider a world without race-to-sleep. To carry out ten tasks over a 10 second period, chip one would take 5 seconds but would remain at full power for the full 10 seconds, thereby using 13 joules. Chip two would take 10 seconds and would use 10 joules over that period. Thus, the more power efficient chip actually proved less efficient.

    Now if we factor in race-to-sleep, the first chip can draw 1.3 W for the first 5 seconds, then drop to 0.05 W for the last 5, consuming 6.75 joules. The second chip would still consume the same 10 joules.

    Conclusion:

    If the chip is not more power efficient, it can never consume less energy, with or without race-to-sleep. If the chip IS more power efficient but doesn't have the sleep facility, it may not use less energy in all scenarios.

    In other words, for a higher powered chip to reduce energy in ALL situations, it needs to a) be fundamentally more power efficient, and b) be able to sleep (race-to-sleep).
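    [Ed: the arithmetic in the comment above checks out, and is easy to reproduce. Same hypothetical chips: 1.3 W at two tasks per second versus 1 W at one task per second, with the 0.05 W idle state assumed for the faster chip.]

```python
def energy_j(active_w: float, active_s: float,
             idle_w: float = 0.0, idle_s: float = 0.0) -> float:
    """Total energy over a window split into active and idle phases."""
    return active_w * active_s + idle_w * idle_s

WINDOW_S = 10.0  # ten tasks to finish within ten seconds

# Without race-to-sleep, both chips sit at max power the whole window.
chip1_awake = energy_j(1.3, WINDOW_S)        # ~13.0 J
chip2_awake = energy_j(1.0, WINDOW_S)        # 10.0 J

# With race-to-sleep, chip one finishes in 5 s and idles at 0.05 W.
chip1_sleep = energy_j(1.3, 5.0, 0.05, 5.0)  # ~6.75 J

print(chip1_awake, chip2_awake, chip1_sleep)
```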
  • djboxbaba - Monday, October 31, 2011 - link

    Well done on the review, Brian and Anand; excellent job as always. I was resisting the urge to tweet you about the ETA of the review, and of course I end up doing it the same day you release it :).
  • Mitch89 - Monday, October 31, 2011 - link

    "This same confidence continues with the 4S, which is in practice completely usable without a case, unlike the GSM/UMTS iPhone 4. "

    Every time I read something like this, I can't help but compare it to my experience with iPhone 4 reception, which was never a problem. I'm on a very good network here in Australia (Telstra), and never did I have any issues with reception when using the phone naked. Calls in lifts? No problem. Way outside the suburbs and cities? Signal all the way.

    I never found the iPhone 4 to be any worse than other phones when I used it on a crappy network either.

    Worth noting, battery life is noticeably better on a strong network too...
  • wonderfield - Tuesday, November 1, 2011 - link

    Same here. It's certainly possible to "death grip" the GSM iPhone 4 to the point where it's rendered unusable, but this certainly isn't the typical use case. For Brian to make the (sideways) claim that the 4 is unusable without a case is fairly disingenuous. Certainly handedness has an impact here, but considering 70-90% of the world is right-handed, it's safe to assume that 70-90% of the world's population will have few to no issues with the iPhone 4, given it's being used in an area with ample wireless coverage.
  • doobydoo - Tuesday, November 1, 2011 - link

    I agree with both of these. I am in a major capital city which may make a difference, but no amount or technique of gripping my iPhone 4 ever caused dropped calls or stopped it working.

    Very much an over-stated issue in the press, I think
  • ados_cz - Tuesday, November 1, 2011 - link

    It was not over-stated at all, and the argument that most people are right handed does not hold ground. I live in a small town in Scotland and my usual signal strength is 2-3 bars. If browsing the net on 3G without a case and holding the iPhone 4 naturally with my left hand (using the right hand for touch commands), I lose signal completely.
  • doobydoo - Tuesday, November 1, 2011 - link

    Well the majority of people don't lose signal.

    I have hundreds of friends who have iPhone 4's who've never had any issue with signal loss at all.

    The point is you DON'T have to be 'right handed' for them to work, I have left handed friends who also have no issues.

    You're the exception, rather than the rule - which is why the issue was overstated.

    For what it's worth, I don't believe you anyway.
