Cellular

I stated before that AT&T’s HTC One X is really a One XL, and that the L denotes LTE inside. The reason is that the phone is based around Qualcomm’s MSM8960, which includes the company’s latest and greatest baseband. It’s the same baseband block as the one in the MDM9615 (which we await with bated breath), and again gets 28nm goodness. MSM8960 supports virtually every air interface: CDMA2000 1x/EVDO up to Rev. B (multicarrier), GSM/EDGE, WCDMA (up to DC-HSPA+ Cat. 24), TD-SCDMA for China, and of course LTE up to UE Category 3 with 3GPP Release 9. A couple of specification pages erroneously list the AT&T One X as supporting AWS for WCDMA; however, the device does not work on T-Mobile WCDMA, and the One XL page lists the correct air interface support.

HTC One X (AT&T) - Network Support
GSM/EDGE Support: 850 / 900 / 1800 / 1900 MHz
WCDMA Support: 850 / 1900 / 2100 MHz
LTE Support: 700 MHz (Band 17), AWS (Band 4) - UE Category 3
Baseband Hardware: MSM8960
HSPA Speeds: HSDPA 21.1 Mbps (Cat. 14) / HSUPA 5.76 Mbps (Cat. 6)

In the case of the AT&T One X, we’re talking about 5 or 10 MHz FDD-LTE and HSPA+ up to 21.1 Mbps. Even though the baseband can do multicarrier HSPA+ with ease, AT&T is still only running 16QAM (HSDPA 14.4) in most markets, and 64QAM (HSDPA 21.1) in some, though I’ve never seen it. The device is limited by parameters set in build.prop, as I’ve seen before:

ro.ril.hsdpa.category=14
ro.ril.hsupa.category=6
ro.ril.hsxpa=4

HSDPA Category 14 corresponds to 21.1 Mbps, and Category 6 on the uplink, at 5.76 Mbps, is the maximum everyone is running right now. The FCC filing for the AT&T One X notes that this is indeed LTE UE Category 3, with the expected Band 4 and Band 17 compliance on both 5 and 10 MHz channels.
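For reference, here’s a quick sketch (not from HTC, just the commonly quoted 3GPP category figures) of how those build.prop category numbers map to peak rates:

```python
# Peak rates (Mbps) for the HSPA categories referenced above. Values are
# the commonly quoted 3GPP figures; only the relevant categories are listed.
HSDPA_PEAK_MBPS = {
    10: 14.4,  # 15 codes, 16QAM - what AT&T runs in most markets
    14: 21.1,  # 15 codes, 64QAM - what the One X is provisioned for
    24: 42.2,  # dual carrier, 64QAM - what MSM8960 itself could do
}
HSUPA_PEAK_MBPS = {
    6: 5.76,   # 2 ms TTI - the current practical uplink maximum
}

def peak_rates(hsdpa_cat, hsupa_cat):
    """Return (downlink, uplink) peak rates in Mbps for a category pair."""
    return HSDPA_PEAK_MBPS[hsdpa_cat], HSUPA_PEAK_MBPS[hsupa_cat]

# The One X ships with ro.ril.hsdpa.category=14 and ro.ril.hsupa.category=6:
print(peak_rates(14, 6))  # (21.1, 5.76)
```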

 

The AT&T One X uses circuit-switched fallback (CSFB) to deliver WCDMA 3G voice when in an LTE market; the switch to IMS voice will come later. For now, know that there’s no simultaneous voice and LTE: you hard handover to WCDMA, make the call, then hand back up.


HTC One X AT&T Antenna Locations

One of the first things I usually do on any smartphone that's handed to me is look for Field Test, and on WCDMA/UMTS HTC phones that's usually found by dialing *#*#7262626#*#*. I know that at least one prototype HTC One X (AT&T) was verified to have a field test menu that launched with that well known dialer code. Unfortunately, none of the shipping HTC One X variants have any such field test/engineering menus; I've searched using all the tricks I know and found nothing. That said, you can still get LTE RSRP and RSCP under About -> Networks, or from alogcat on the One X (AT&T). I would still prefer a proper Field Test with RRC state information; it's unfortunate to see HTC sanitizing release images prior to launch, and I'm not sure what the motivation could possibly be for removing this, even on the international variants.

 

To test AT&T LTE on the HTC One X, I drove a total of over 350 miles up to and around Phoenix, AZ (an AT&T 10 MHz FDD market) and ran over 180 tests using the speedtest.net app, which I then exported and made some pretty histograms from. The results are pretty positive, with a few spikes over 60 Mbps; as a reminder, the theoretical maximum for 10 MHz FDD-LTE on a UE Category 3 device is 73 Mbps.
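That 73 Mbps figure checks out with some back-of-envelope math. Here’s a rough sketch; the real peak rate comes from the 3GPP transport block size tables, so treat the overhead fraction below as an assumption for illustration:

```python
# Rough peak PHY rate for 10 MHz FDD-LTE with 2x2 MIMO and 64QAM.
rb = 50              # resource blocks in a 10 MHz carrier
subcarriers = 12     # subcarriers per resource block
symbols = 14         # OFDM symbols per 1 ms subframe (normal cyclic prefix)
bits_per_symbol = 6  # 64QAM
layers = 2           # 2x2 MIMO spatial multiplexing

raw_mbps = rb * subcarriers * symbols * 1000 * bits_per_symbol * layers / 1e6
print(raw_mbps)  # 100.8 - raw channel rate before any overhead

# Reference signals, control channels, and coding eat roughly a quarter
# of that, which lands near the ~73 Mbps UE Category 3 maximum.
overhead = 0.27  # assumed fraction, for illustration only
print(round(raw_mbps * (1 - overhead), 1))  # ~73.6
```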

AT&T LTE - Downstream / Upstream / Latency histograms

AT&T LTE is quite fast, although it is admittedly still nascent and thus not loaded with as many devices as Verizon’s network. That said, 60+ Mbps tests are always good fun to see without much effort at all. Driving around Phoenix, AZ, I spotted a number of LTE base stations with remote radio heads, which means vastly reduced cable losses.


One of my best AT&T LTE tests

On HSPA+ in my home market, I was able to hit impressive speeds thanks to the combination of the default LTE “pta” APN for AT&T data and Rx diversity on the One X.

AT&T HSPA+ - Downstream / Upstream / Latency histograms

On HSPA+ I’m able to get right up near the 14.4 Mbps theoretical maximum for single carrier WCDMA with 16QAM in my market. This is very impressive considering other devices I have routinely top out at 10–11 Mbps in the same conditions.

GNSS

Like many other Qualcomm based devices, the HTC One X uses the gpsOneGen 8A GNSS system with GLONASS support for location. It locks almost instantaneously, even indoors, and performs great.

The GLONASS behavior of this system is like other QCT implementations I’ve seen, wherein the receiver only searches for the GLONASS constellation when GPS SNR is low. You can see the GLONASS satellites in GPS Test, numbered 65–88.

WiFi

WLAN and BT 4.0 on the HTC One X come courtesy of “wcnss_wlan”, which I take to mean the WLAN baseband onboard MSM8960 paired with some external RF (WCN3660). What’s interesting to me is that this is the first device I’ve seen that will tune 40 MHz channels on 5 GHz. The full breakdown is as follows: single spatial stream 802.11a/b/g/n with 20 MHz channels on 2.4 GHz and 40 MHz channels on 5 GHz, with all modes supporting the short guard interval rates. That means up to a 72 Mbps PHY rate on 2.4 GHz, and 150 Mbps on 5 GHz. Note that the internal WiFi information screens erroneously report 65 Mbps in all conditions, even when the MCS negotiated with an AP is higher.
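The jump from the reported 65 Mbps to the rates the device actually negotiates is just the short guard interval at work. A quick sketch of the arithmetic:

```python
# 802.11n single-stream MCS 7 (64QAM, rate 5/6) PHY rates. The short
# guard interval shrinks the OFDM symbol from 4.0 us (800 ns GI) to
# 3.6 us (400 ns GI), boosting the data rate by a factor of 4.0/3.6.
def short_gi_rate(long_gi_mbps):
    """Convert a long-GI 802.11n PHY rate to its short-GI equivalent."""
    return long_gi_mbps * 4.0 / 3.6

print(round(short_gi_rate(65), 1))   # 20 MHz: 65 -> 72.2 Mbps
print(round(short_gi_rate(135), 1))  # 40 MHz: 135 -> 150.0 Mbps
```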

WiFi Performance

HTC includes the proper band preference tab inside the advanced settings for WiFi, alongside an interesting high performance / higher power checkbox.

I’m not entirely sure what the checkbox does, but the band preference tab (which mirrors that of the Galaxy Nexus) works properly. Unsurprisingly, the One X posts the highest WLAN throughput I’ve seen from a smartphone to date.

NFC

The One X includes NFC tag and beaming support courtesy of the ubiquitous NXP PN544. It works as expected and is exposed the right way as far as I can tell. I tested both beaming with a Galaxy Nexus and reading my trusty NFC tag from the Nexus S review.

 

Calls and Speakerphone

Noise suppression on the HTC One X is courtesy of an Audience A1028 voice processor, a part we’ve seen a lot of in recent years. The One X again locates a primary microphone at the very bottom and a secondary microphone at the very top of the device. With these two, the Audience chip can do some DSP and isolate out noise very effectively. I’ve recorded a demonstration the way we normally do, just to illustrate, and unsurprisingly it works very well.

HTC One X AT&T - Noise Rejection by AnandTech

Next up, I tested speakerphone volume the same way we always do, using an Extech digital sound data logger placed 3 inches from the device while calling the same ASOS weather station.

Speakerphone Volume - 3 inches Away

The speakerphone on the One X is on the backside and unfortunately sits nearly flush with the surface. Speakerphone volume on the HTC One X is loud enough to be good, but not chart topping.

137 Comments

  • MrMilli - Tuesday, May 1, 2012 - link

    "On the GPU side, there's likely an NVIDIA advantage there as well."

    How do you get to this conclusion?
    Qualcomm scores a little bit higher in Egypt as well as in the Pro test of GLBenchmark. I don't know why you would put any importance on the off-screen tests for these two devices, since they both run the same resolution (which is even 720p), which brings me to my next point. Actual games will be v-synced, so how does the Tegra suddenly become faster than the Adreno even though they're both still rendering at the same resolution as on-screen, just with v-sync off? I've always had a hard time accepting the off-screen results of GLBenchmark because there's no way to verify whether a device is actually rendering correctly (or maybe even cheating). Can you imagine testing a new video card in the same fashion?
  • metafor - Tuesday, May 1, 2012 - link

    Results can vary with v-sync because Tegra could be bursting to higher fps values. The offscreen test isn't perfect either, but it gives you an idea of what a heavier game that didn't approach the 60fps limit would look like.

    Of course, those games likely won't have the same workloads as GLBenchmark, so it really wouldn't matter all that much.
  • ChronoReverse - Tuesday, May 1, 2012 - link

    The offscreen test is worthless really.

    If at 720p, the same benchmark, except it puts an image on the screen, shows that the S4 GPU is faster than the Tegra3 GPU, then how useless is the offscreen test showing the opposite?

    Furthermore, neither the S4 nor Tegra 3 comes close to 59-60FPS; both top out at around 50FPS.

    It's pretty clear that by skipping the rendering, the offscreen test is extremely unrealistic.
  • metafor - Wednesday, May 2, 2012 - link

    It doesn't need to come close. It just needs to burst higher than 60fps. Say Tegra reaches 80fps 30% of the time and sits at 40fps the other 70%, while the S4 only peaks at 65fps 30% of the time but holds 45fps the other 70%. Then the S4's average would be higher with v-sync (49.5 vs. 46fps), while Tegra's would be higher without it (52 vs. 51fps).

    The point of the benchmark isn't how well the phone renders the benchmark -- after all, nobody's going to play GLBenchmark :)

    The point is to show relative rendering speed such that when heavier games that don't get anywhere close to 60fps are being played, you won't notice stutters.

    Of course, as I mentioned, heavier games may have a different mix of shaders. As Basemark shows, Adreno is very very good at complex shaders due to its disproportional ALU strength.

    Its compiler unfortunately is unable to translate this into simple shader performance.
  • ChronoReverse - Wednesday, May 2, 2012 - link

    That's still wrong. If you spike a lot, then your experience is worse for 3D games. It's not like we don't know that minimum framerate is just as important.

    As you mentioned stutters, a device that dips to 40FPS would be more stuttery than one that dips only to 45FPS.
  • metafor - Thursday, May 3, 2012 - link

    I'm not disagreeing. I'm just saying that v-sync'ed results will vary even if it's not close to 60fps. Because some scenes will require very little rendering (say, a panning shot of the sky) and some scenes will require a lot of heavy rendering (say, multiple characters sword fighting, like in Egypt).

    The average fps may be well below 60fps. But peak fps may be a lot higher. In such cases, the GPU that peaks higher (or more often) will seem worse than it is.

    Now, an argument can be made that a GPU that also has very low minimum framerates is worse. But we don't know the distribution here.
  • Chloiber - Monday, May 7, 2012 - link

    Well, the benchmark doesn't measure your experience in 3D games, but the fps.
  • snoozemode - Tuesday, May 1, 2012 - link

    The ATRIX has an LCD PenTile RGBW display, as does the HTC One S, so LCD is definitely not a guarantee of RGB. Maybe you should correct the article with that.
  • snoozemode - Tuesday, May 1, 2012 - link

    Sorry, the One S is obviously AMOLED.
  • ImSpartacus - Tuesday, May 1, 2012 - link

    It's also the passable RGBG PenTile, not the reviled RGBW PenTile.
