It seems that each time an LTE handset comes out, there’s invariably some perceived issue with connectivity and stability. This time, focus is being placed on Verizon’s CDMA/LTE variant of the Galaxy Nexus, and the issue surrounds LTE connectivity robustness compared to the other LTE handsets out there.

I’ve been running battery life tests on our LTE Galaxy Nexus review unit since release day (a process that takes a considerable amount of time and results in our reviews posting a while behind everyone else’s), but I’ve also had time to run other tests and gauge subjective performance. I found that LTE connectivity and performance felt above average, subjectively, and noted as much in a tweet. After complaints started to surface, I spent a good deal of time reading threads on XDA and elsewhere around the web trying to discern what the complaints are about. I’ve seen a couple of big misconceptions that I think really get to the heart of the matter.

First off, some background. The Verizon CDMA/LTE Galaxy Nexus (codename “mysid”) uses a combination of the Samsung CMC221 and Via Telecom CBP 7.1 for LTE and CDMA 1x/EVDO connectivity, respectively. This is virtually identical (unsurprisingly) to the Droid Charge, which used a CMC220 for LTE and the same CBP 7.1. The CMC22x family is UE Category 3, which is currently the highest category among shipping devices and means it can handle up to 100 Mbps downstream with 20 MHz FDD. To date, all of the LTE basebands in Verizon LTE devices have been UE Category 3 with the exception of Motorola’s devices, which are all UE Category 2, but I digress. We’ve reached out to Samsung Semiconductor about what’s changed between the CMC220 and CMC221, but the changes doubtless center on improving connection stability and reliability.
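For reference, here’s a rough sketch of what those UE categories mean in terms of Release 8 peak rates. The figures are rounded from 3GPP TS 36.306 and are purely illustrative, not measured throughput on any of these handsets:

```java
// Approximate 3GPP Release 8 LTE UE category peak rates (TS 36.306), rounded.
// Illustrative only -- real-world throughput on Verizon's network is far lower.
public final class UeCategory {
    public static String peakRates(int category) {
        switch (category) {
            case 1:  return "~10 Mbps down / ~5 Mbps up";
            case 2:  return "~51 Mbps down / ~25 Mbps up";   // Motorola's Verizon LTE devices
            case 3:  return "~102 Mbps down / ~51 Mbps up";  // CMC220/CMC221 (Droid Charge, Galaxy Nexus)
            case 4:  return "~150 Mbps down / ~51 Mbps up";
            case 5:  return "~300 Mbps down / ~75 Mbps up";
            default: return "unknown category";
        }
    }

    public static void main(String[] args) {
        System.out.println("UE Category 3: " + peakRates(3));
    }
}
```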

Speeds thus far have also been excellent. I’ve squeezed in 183 speedtests between battery life testing, and have seen some of the fastest LTE connectivity out of the Galaxy Nexus to date. After testing so many Motorola LTE devices with UE Category 2 modems, it’s refreshing to see this kind of performance out of a UE Category 3 device.

Speedtest results (183 tests): Downstream, Upstream, and Latency

The issue most people talk about centers on signal strength, and this is where a few misconceptions kick in. I’ve gotten a few emails and tweets and read pages of forum posts where people are implicitly comparing CDMA2000 1x/EVDO field strength to LTE field strength. The problem is that on basically all of Verizon’s LTE/CDMA handsets, the field under “Signal Strength” in About->Status refers to EVDO signal strength, not LTE signal strength. The two aren’t comparable at all for a host of reasons - different spectrum (800 MHz and 1900 MHz for 1x/EVDO as opposed to 700 MHz for LTE) and different cells (there’s some correlation, but not every Verizon base station has LTE onboard). The end result is that if you’re comparing 1x/EVDO signal strength to LTE signal strength, you’re making a meaningless apples-to-oranges comparison.


This is not a valid comparison - LTE versus EVDO signal strength
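To put the spectrum point above in perspective, here’s a back-of-the-envelope free-space path loss comparison. It’s a deliberate simplification (no terrain, buildings, or antenna gains), but it shows why the same distance from a tower produces noticeably different received power at 700 MHz versus 1900 MHz:

```java
// Back-of-the-envelope free-space path loss: FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
// This ignores terrain, buildings, and antenna gains -- it's only meant to show that the
// same distance from a tower yields different received power at 700 MHz (LTE) versus
// 1900 MHz (PCS 1x/EVDO), so the raw dBm numbers aren't directly comparable.
public final class PathLoss {
    static double fsplDb(double distanceKm, double frequencyMhz) {
        return 20 * Math.log10(distanceKm) + 20 * Math.log10(frequencyMhz) + 32.44;
    }

    public static void main(String[] args) {
        double d = 2.0; // km, arbitrary example distance
        System.out.printf("FSPL at 700 MHz:  %.1f dB%n", fsplDb(d, 700));   // ~95.4 dB
        System.out.printf("FSPL at 1900 MHz: %.1f dB%n", fsplDb(d, 1900));  // ~104.0 dB
    }
}
```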

The Galaxy Nexus (and really just Android 4.0) now properly accommodates LTE by reporting its signal strength under “About->Status” and visualizing it as bars appropriately. Switch to EVDO on the Galaxy Nexus and the signal strength readout changes to reflect an entirely different air interface. It’s nice to see people using dBm instead of bars (which are effectively meaningless as a comparison metric) whenever possible, but now that handsets carry multiple air interfaces, we have to be explicit about which numbers we’re actually comparing.
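For those curious how this looks programmatically, here’s a minimal sketch of reading the per-air-interface numbers through Android’s telephony API. The CDMA/EVDO getters are public; the LTE-specific getters were still hidden in Android 4.0, so this falls back to reflection for them (the method name matches what appears in AOSP, but treat that part as an assumption since hidden APIs can change between builds):

```java
import android.content.Context;
import android.telephony.PhoneStateListener;
import android.telephony.SignalStrength;
import android.telephony.TelephonyManager;
import android.util.Log;
import java.lang.reflect.Method;

// Minimal sketch: log EVDO and (via reflection) LTE signal strength separately,
// since they describe two different air interfaces and shouldn't be compared directly.
public class SignalLogger {
    private static final String TAG = "SignalLogger";

    public static void start(Context context) {
        TelephonyManager tm =
                (TelephonyManager) context.getSystemService(Context.TELEPHONY_SERVICE);
        tm.listen(new PhoneStateListener() {
            @Override
            public void onSignalStrengthsChanged(SignalStrength ss) {
                // Public API: CDMA 1x and EVDO strength in dBm
                Log.d(TAG, "1x: " + ss.getCdmaDbm() + " dBm, EVDO: " + ss.getEvdoDbm() + " dBm");
                // Hidden in Android 4.0: LTE RSRP, reachable via reflection (assumption -- may break)
                try {
                    Method getLteRsrp = SignalStrength.class.getMethod("getLteRsrp");
                    Log.d(TAG, "LTE RSRP: " + getLteRsrp.invoke(ss) + " dBm");
                } catch (Exception e) {
                    Log.d(TAG, "LTE RSRP not available on this build");
                }
            }
        }, PhoneStateListener.LISTEN_SIGNAL_STRENGTHS);
    }
}
```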

This reporting is a problem I’ve talked about at length in more than one LTE handset review, and to date I only know of ways to show LTE signal strength and channel quality on a few handsets. Samsung’s Droid Charge (courtesy Samsung’s excellent ServiceMode application viewed through *#0011# after some unlock trickery) and the Bionic (through logcat and grepping for the radio signal status daemon) report LTE field strength, but only if you dig for them.

Comparing LTE Signal Strength the Right Way

So how does the LTE Galaxy Nexus compare to the Droid Charge and Bionic, the two handsets we can actually view LTE signal strength in dBm on? Very closely, as a matter of fact.

I have a Bionic kicking around that has to go back very soon, so I fired up logcat and put the Galaxy Nexus next to it. The Bionic reports signal strength nearly continuously, whereas in Android 4.0 the number has some hysteresis, but the numbers are pretty darn close, with the Bionic hovering between -91 and -95 dBm and the Galaxy Nexus reporting an average of -92 dBm.


Left: Motorola Droid Bionic (logcat showing LTE signal strength), Right: Galaxy Nexus  

Since the Droid Charge is the only other handset I know how to show LTE signal strength on, I tracked down a friend with one at a local cafe and fired up service mode. Again, what’s shown under “About->Status” on the Droid Charge is actually EVDO signal strength. Here the Galaxy Nexus shows -107 dBm and the Droid Charge shows -108 dBm.


Left: Samsung Droid Charge (ServiceMode), Right: Galaxy Nexus

The Droid Charge is another example of why you can’t compare bars at all, as the Charge shows a positively laughable 4 out of 5 bars in an area with very low LTE signal strength, whereas the Galaxy Nexus (or rather, Android 4.0) has a very conservative and realistic strength-to-bars mapping. Carriers love to make things out to be better than they really are, however, and the result is a visualization that portrays LTE signal as being much better than it really is if you stare at bars all day.
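To illustrate how much latitude there is in that mapping, here’s a hypothetical sketch of two different dBm-to-bars tables. The thresholds are invented for illustration and aren’t pulled from any shipping firmware; the point is simply that the same reading can look very different depending on the table:

```java
// Hypothetical dBm-to-bars mappings, purely for illustration -- the thresholds below are
// made up, not taken from the Droid Charge or the Galaxy Nexus. The same -107 dBm LTE
// reading lands on 4 bars under a generous table and 1 bar under a conservative one,
// which is why bars are meaningless as a cross-device comparison.
public final class BarsMapping {
    // A "generous" carrier-style mapping
    static int generousBars(int dbm) {
        if (dbm >= -98)  return 5;
        if (dbm >= -108) return 4;
        if (dbm >= -115) return 3;
        if (dbm >= -120) return 2;
        return 1;
    }

    // A conservative mapping closer in spirit to stock Android 4.0
    static int conservativeBars(int dbm) {
        if (dbm >= -85)  return 4;
        if (dbm >= -95)  return 3;
        if (dbm >= -105) return 2;
        if (dbm >= -115) return 1;
        return 0;
    }

    public static void main(String[] args) {
        int rsrp = -107; // the reading from the cafe comparison above
        System.out.println("Generous mapping:     " + generousBars(rsrp) + " bars");
        System.out.println("Conservative mapping: " + conservativeBars(rsrp) + " bars");
    }
}
```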

Verizon confirming through a tweet that there’s some sort of signal issue affecting the Galaxy Nexus confuses me, since from my perspective there isn’t any issue at all. The only real issue is that the Galaxy Nexus (and really just the stock Android 4.0 signal-strength-to-bars mapping) doesn’t line up with what Verizon has shipped on other devices, leading people to make apples-to-oranges comparisons and imagine a problem. I wager that some of this confusion is also compounded by the number of Verizon customers who are just now getting their first LTE handset with the Galaxy Nexus. It might be surprising to discover that LTE coverage right now isn't nearly as good as 1x/EVDO, but that will improve as the carrier's LTE rollout continues. The other big disclaimer is that I haven't fully investigated 1x/EVDO performance on the Galaxy Nexus, though I expect it will end up virtually identical to the Droid Charge.

There’s a CDMA and LTE baseband update coming with the LTE Galaxy Nexus’ 4.0.3 update as shown above, but it will likely do more to address connection stability than change the way anything is reported. Given how much attention this has gotten, however, I would not be surprised to see Google change its signal-strength-to-bars mapping for LTE and placebo away an issue that never really existed to begin with. That would be an unfortunate change, since from my perspective the Galaxy Nexus is one of the first handsets that doesn't have an unrealistic mapping. In the meantime, we're still working on our Galaxy Nexus review, where we'll take a complete look at the LTE/CDMA and GSM/UMTS Galaxy Nexii.

Update:

As predicted, Verizon has made a statement to The Verge and Computerworld saying that there's nothing wrong with the RF performance or baseband firmware of the LTE/CDMA Galaxy Nexus. Instead, it will upstream changes to Android so that the device's bars visualization falls in line with the rest of its 4G LTE hardware portfolio.

"[Verizon] will adjust the signal strength indicator to more closely match other Verizon Wireless devices.

Comments

  • slowhand - Tuesday, December 20, 2011 - link

    in a weak 3g signal area at work in a metal building... placing my galaxy nexus near a moto razr - the razr had 3g and 4 bars, my galaxy nexus 1x and no bars... every day...

    something is different, eh?

    Sitting on it now waiting to see if there is a fix other than a placebo... I cannot download where the razr does so effortlessly....

    just dont like the locked batt compartment of the razr or I would get one.
  • wpwoodjr - Tuesday, December 20, 2011 - link

    Brian's results for the Nexus on Speedtest.net aren't that impressive... I can regularly get 24 mbs on my Bionic if I use the Washington, DC based server.
  • name99 - Tuesday, December 20, 2011 - link

    "The Galaxy Nexus (and really just Android 4.0) now correctly reports and accommodates LTE by reporting its signal strength under “About->Status” and visualizing that as bars appropriately. Switch to EVDO on the Galaxy Nexus and signal strength appropriately changes to reflect an entirely different air interface’s signal strength. It’s nice to see people using dBm instead of bars when possible (which are effectively meaningless as a comparison metric), but now that there are multiple air interfaces on handsets, we have to be explicit about what numbers we’re actually comparing."

    I think it's a mistake that you are giving so much weight to "signal strength". Rather than educating the population, you're simply spreading the misinformation that this is a useful metric.

    Let's review:
    * THE significant fact about wireless digital communication is that the signal strength varies DRAMATICALLY (ie by factors of tens of dB) over a very small spatial range (about the size of the phone) and time-scale (10s of ms).
    * This in turn means that the MEAN of the signal distribution tells us little about how we can usefully use the signal.
    * Furthermore, the primary "noise" source for most cell systems nowadays is not thermal noise (ie a largely fixed number), it is noise from other devices. This matters because signal strength, by itself, is meaningless --- what matters in terms of how easy it is to figure what bits have been transmitted is the ratio of the signal to the noise PLUS INTERFERENCE.

    * So, given all this, what do you expect "signal strength" to display? The mean strength of an electrical signal? Or (CORRECT) this strength weighted in some fashion that reflects how much data the signal can carry? Should it display the strongest signal of what the two antennas measure or the weakest or an average? Should it be modified if the receiving tower has four smart reception antennas that can do a better job of receiving a weak signal? Should it be decreased if there is a lot of interference in the cell? Should it be increased if I'm using a new chip with a smarter FEC algorithm that can do a better job of extracting signal from noise? etc etc etc

    * The actual engineering of the system is not blind to this. A variety of modulation schemes, a variety of FEC schemes, and now a variety of ways to utilize multiple antennas (both at the base station and at the phone) attempt to work around and exploit these statistical characteristics. But these further add to the irrelevance of "signal strength".

    * But what this all boils down to is that "signal strength" MEANS FSCKALL. Yes, if your "signal strength" is zero bars, then you're going to get no throughput. But beyond that, all bets are off. The number anyone cares about is throughput. That's what should be reported in reviews (and that's what should, one way or another, be displayed as "bars").
    I guess this obsession with "signal strength" comes from the fact that it is *something* that the phone can show you in the absence of any actual transmissions. But the fact that it's easy to measure doesn't change the fact that it's now 2011, not 1988, and it is a MEANINGLESS indicator.

    Now I don't expect the phone companies or the carriers to do anything about this soon. They've all experienced (and created) too much stupidity (from Antennagate to "more bars in more places") to have any incentive to improve what they are displaying to the user.
    But AnandTech is not under the same constraints. There is nothing stopping you from writing, in big bold letters: "LISTEN UP MORONS. STFU TALKING ABOUT SIGNAL STRENGTH BECAUSE IT SIMPLY REVEALS YOUR CLUELESSNESS."
  • ssddaydream - Wednesday, December 21, 2011 - link

    I mostly agree that signal strength isn't a good indicator of overall signal quality.
    Signal strength is certainly more useful than bars. Bars, of course, are nearly useless.
    Does anybody know if there is a way to measure noise floor with the phone?
    Signal to noise ratio, in my experience, is a decent indicator of signal quality. I think there are exceptions, but I've had very good luck with all the wireless links I've used that have a SNR greater than 40db.

    For all practical purposes (assuming that the bar levels are well-adjusted as found by testing), what more useful metric do you have than bars?
    Face it, dbm or db is confusing.
    You complain about bars but what simple and useful method do you suggest?
    If I want a better idea about signal quality, I usually ping a website as I run a speedtest.
  • name99 - Wednesday, December 21, 2011 - link

    I said what the correct metric is --- the data rate. In simplified terms, what should be displayed is the cell phone equivalent of the WiFi MCS index.
    I also said what the problem is --- you don't have an exact handle on this number until you start transmitting data. However I suspect you can figure out some reasonable approximation based on what you do know, and that approximation will be a whole lot more useful than a dBm number.
  • Korey_Nicholson - Wednesday, December 21, 2011 - link

    Geometry 101, independent variable on the x-axis (number of tries or time), dependent variable on y-axis (results or measure)
  • Korey_Nicholson - Wednesday, December 21, 2011 - link

    And that means don't use a bar graph for this data...use a scatter plot lmao...c'mon!
  • AnnoyedGrunt - Wednesday, December 21, 2011 - link

    I believe the graphs are a histogram showing the number of occasions each speed was achieved. IMO they make sense.
  • crankerchick - Wednesday, December 21, 2011 - link

    "Where did Brian ever say that there aren't 4G to 3G handoff issues (threshold at which it switches, hanging on the switchover etc) or that there wasn't a 3G signal strength issue?

    Everyone's accusing him of this on all the blogs now since the article caught on but that isn't at all what he said."

    I think people are making assumptions from Brian's article and putting words in his mouth. My take-away from the article is simply him reporting how things are reported on the Nexus compared to other phones.

    However, his comments in this comment section do seem to indicate that he feels there is no signal issue and that it is a perception problem due to differences in the reporting from one phone to the next. This I disagree with. There is a problem. When I'm on EVDO, my signal readings are consistently lower than on my Droid and the Rezound. I plan to compare to my husband's Thunderbolt when I get a chance. And it has already been stated multiple times that the Galaxy Nexus isn't handling 3G/4G handoff as gracefully as some other phones or holding the 4G signal as well as some other phones. Although this wasn't the focus of Brian's article, I do hope it will be part of the Nexus review when it comes out. I really do believe there is something not playing well in the Nexus--it's hard not to when there is report after report, along with my own experience, showing plain as day that the Nexus isn't keeping 4G or 3G signal as well as other phones, regardless of what signal strength reading is reported by any of the phones.

    I always look forward to AnandTech reviews for being more substantial, informative, and technical and really delving into performance as opposed to just reporting on specifications like most reviewers do. I hope the forthcoming review will focus on the RF performance of this device, because something is wrong--either by design or by a fault.
  • Tabs - Wednesday, December 21, 2011 - link

    I don't see him saying anything at all about EVDO - he said as much in replies to me after the article was posted too.

    I see the same thing as you do on EVDO - lower dB than my OG Droid at the same location in low signal areas. I think that is likely a real issue.

    A bunch of the commenters at other blogs though (Droid Life, Engadget etc) are jumping all over this article as if he claimed there are no 3G issues or that there's not an LTE/3G handoff threshold issue, when that's not at all what he claimed.
