A Farewell to the Dell XPS 14

Sandy Bridge isn’t the only game in town, of course, so we’ve got a few other items to cover. Following the discussion of Sandy Bridge on the previous page, we want to take this chance to talk about what will likely be our last Arrandale laptops. First up is the Dell XPS 14 L401x, the “little brother” of the XPS trio announced last October. In terms of specs, the L401x is virtually identical to the L501x, only in a smaller chassis.

Our test unit came with the same i5-460M CPU, GT 420M graphics, and 56Wh battery. Visually, the three XPS laptops are all the same: rounded corners with a silver matte finish, and aluminum palm rests. The design works well enough, though we tend to prefer cleaner lines. However, some of the extra features offered in the XPS 15 were enough to garner our Gold Editors’ Choice award; specifically, we liked the combination of a high quality 1080p LCD upgrade, NVIDIA Optimus graphics switching, and excellent audio. We mentioned in December that the 1080p upgrade had disappeared from the Dell website, but we’re happy to report that the display upgrade is back in stock. (You can find it under the “Colors” configuration area. It’s now priced at $195 instead of $130, but the base price has dropped from $899 to $799, so the final cost with the 1080p display comes to $994, keeping the total under $1000.)

We mention the things we liked about the XPS 15 as a jumping-off point for the XPS 14, because unfortunately the smaller model loses most of the best features. The L401x still comes (rather, came) with JBL speakers and Waves MaxxAudio, but without the subwoofer it lacks the bass punch of the larger models. There’s no LCD upgrade available either, and the standard 1366x768 display is just as poor as the other LCDs we’ve lambasted over the previous couple of years. Finally, the smaller chassis apparently doesn’t have enough space for USB 3.0 support, so that feature goes missing as well. Smaller isn’t always better: putting the same components into a more cramped space also resulted in generally higher temperatures and noise levels, and we’d certainly be concerned about upgrading the L401x to a quad-core Clarksfield processor!

There were some good aspects to the design, though. For instance, idle battery life improved 16% and Internet battery life went up 35%. (We noted in the L501x review that our Internet test did quite poorly, and it appears the L401x doesn’t suffer the same fate.) H.264 playback remains nearly the same at just under three hours. For gaming, the 768p resolution is also a better fit for the GT 420M GPU; without a faster GPU, 1080p is simply too much for anything but low detail settings in most games. The final benefit, of course, is the size and weight. Personally, I find 14” laptops one of my favorite form factors, combining a smaller size with a comfortable keyboard, a reasonable display, and good battery life and portability. If the XPS 14 had offered a quality 1440x900 LCD upgrade, it would have been nearly “perfect” for mainstream use.

Ultimately, I’m not surprised to see the XPS 14 disappear. It wasn’t a bad laptop design, but there was very little on tap that you couldn’t already find in the Inspiron 14R. The main change is that you got NVIDIA Optimus in place of either the Intel IGP or ATI’s HD 550v (a lower clocked, renamed HD 4650); the XPS keyboard also offered backlighting, and the palm rest was aluminum rather than plastic. If you configure similar performance, the XPS price premium is fairly reasonable: $899 for the XPS 14 we have compared to $809 for the Inspiron 14R with an i5-460M, 4GB RAM, HD 550v, and 500GB HDD. That’s worth $90, certainly, but it still feels like it waters down the XPS (“Xtreme Performance System” back in the day) brand.

Anyway, we’ve added results from the XPS L401x to Mobile Bench as well. Combined with the updated NVIDIA 266.58 drivers, graphics performance is actually up relative to the L501x in most games, and you can see how the two models compare. Like the notebook on the previous page, the L401x came with a Western Digital Scorpio Black HDD instead of the ubiquitous Seagate Momentus 7200.4 that was in the L501x we tested, so PCMark scores are up as well. However, CPU performance was down slightly in Cinebench and x264 encoding, and temperatures were up. It looks as though the smaller chassis couldn’t cope with the heat as well, with the result that Intel’s Turbo Boost isn’t able to run quite as fast in CPU intensive benchmarks.

We expect Dell to come out with Sandy Bridge XPS laptops sometime in the next two or three months, but we’ve been told not to expect an XPS 14 update. That’s a shame, as we still think it could be a great form factor. Imagine a high quality 14” LCD with better performance and thermals: sort of a Dell XPS equivalent of the MacBook Pro 13. That’s what we’d really like to see Dell do with the XPS brand, and Apple’s MacBook line makes a good template. The MacBook has basic features and performance at a reduced price; move up to the MacBook Pro 13 and you get essentially the same performance, but you add the unibody chassis and a much nicer LCD. In fact, every laptop in the MacBook Pro lineup has a good contrast LCD with reasonable color accuracy and a nearly ideal (for sRGB work) color gamut. So ditch the rounded corners, improve the build quality even more, and make every XPS laptop come with a quality LCD; then we’d have a brand that we could point to and say, “Sure, it costs more, but at least you get quality for your dollar.” Reusing the base Inspiron chassis just doesn’t seem like a great idea for a “performance” brand.

Which brings up another laptop: HP’s Envy 14. Long heralded as a great combination of price, performance, build quality, and features, the Envy 14 was a laptop readers asked us to review for a long time. HP never did manage to get us one, but one of our readers was kind enough to let us borrow his Envy 14 (complete with the 1600x900 Radiance display upgrade!) to run it through our tests. At this stage, it’s too late in the game for a full review of the Envy 14, and like the XPS 14, the Radiance LCD is no longer available. However, with this mobile update already in the works, we have a perfect chance for a rundown of the Envy 14. I’ll turn this over to Vivek, since he was the one to actually lay hands on the fabled laptop.

Comments

  • vikingrinn - Tuesday, February 8, 2011 - link

    @BWMerlin You might be right, but a 17.3" display in a 15.6"-size chassis is not entirely implausible (although I'm not sure if they slimmed down the chassis of the G73 for the G73SW release), as the M17x R3 had been slimmed to almost the same size chassis as the M15x and also had 900p as a display option.
  • JarredWalton - Tuesday, February 8, 2011 - link

    Note that I updated the article. MSI said I could pass along the fact that the testing was done with their GT680R. It's certainly fast enough for gaming, though there are some areas that could be improved (unless you like glossy plastic). Now we wait for PM67 version 1.01....
  • vikingrinn - Tuesday, February 8, 2011 - link

    @JarredWalton Thanks for the update - looking forward to a review of both the M17x R3 and G73SW soon then! ;)
  • stmok - Monday, February 7, 2011 - link

    "What we know of Llano is that it will combine a K10.5 type CPU architecture with a midrange DX11 GPU (something like the HD 5650), integrated into a single chip."

    Firstly, AMD's Llano will be marketed as its "A-series" APU line. (Where G-series, E-series and C-series belong to their Bobcat-based lines.)

    Llano is a modified version of the Athlon II series with Radeon HD 5550 GPU as its IGP. The APU will feature Turbo Core 2.0 Technology (power gating, etc). It will use DDR3-1600 memory.

    Llano's x86 cores are codenamed "Husky".

    The IGP in Llano has two versions:
    One is codenamed "Winterpark" => only in dual-core versions of the APU.
    One is codenamed "Beavercreek" => only in triple- and quad-core versions of the APU.

    For TDP spec, there will be two distinct lines for the desktop version of Llano.
    => 65W (dual-cores and low power quad-cores) and 100W (triple and quad-cores).

    The solution will also allow for Hybrid-CrossFire configurations.
    => Llano IGP + Radeon HD 6570 or HD 6670 video cards.

    Performance-wise... (according to an AMD presentation I saw):

    Dual-core Llano
    => Overall, it lags slightly behind the Athlon II X2 250 (3.0GHz) and Pentium E6500 (2.93GHz)

    Quad-core Llano
    => It's slightly slower than a current Athlon II X4 630 with a Radeon HD 5550 discrete video card.

    So in the end...

    Sandy Bridge => Far better CPU side. Not as good with IGP.
    Llano => Far better IGP. Not as good on CPU side.

    If you want an APU that will be revolutionary, it's best to wait for "Trinity" in 2012.
  • Taft12 - Monday, February 7, 2011 - link

    This is great detail, more than I have ever seen about Llano before now (and thanks a bunch for it!)

    Is this from publicly available AMD documentation? You said this was from a presentation you saw...
  • Kiijibari - Monday, February 7, 2011 - link

    First, you wrote APU even though there is no Bulldozer APU yet. Zambezi and Interlagos/Valencia are normal CPUs. You correctly mentioned Trinity later, which is an APU, but that is already Bulldozer v2.0, and it is far away, due in 2012.

    Second, you stated that cache sizes are unknown - they are not:
    See AMD's blog, link removed due to SPAM detection bot.

    Third, you speculate about a launch similar to the K8's in 2003; however, it is already known that desktop parts will launch *prior* to server parts in Q2:
    <Link removed due to SPAM detection, just read the analyst day slides again>
  • JarredWalton - Monday, February 7, 2011 - link

    I've corrected some of the text to clarify the meaning. Orochi is the eight-core design, with "Zambezi" for desktops and "Valencia" destined for servers. AFAICT, it's the same chip with different packages depending on the market (and I'd guess AMD is using the extra time between desktop and servers to do extra validation). Zambezi is also apparently a name for the desktop platform in general, unless the four-core and six-core Zambezi parts end up getting a separate name.

    Given the purported size of the Orochi core, I can see four-core and six-core being harvested die, but they're still going to be huge. Right now, it appears the eight-core will have 16MB total L2 cache (2MB per core!) and an additional 8MB L3 cache. Long-term, the four-core and six-core should get separate designs so they don't have to be quite so large. Those are the chips that I expect won't be out for desktops until Q3/Q4.
  • Cow86 - Tuesday, February 8, 2011 - link

    Sorry there Jarred, first-time poster, long-time reader, but I hád to correct you on this :P Two things are wrong in what you say:

    1) The 8-core, 4-module Bulldozer chip will have 8MB of L2 cache (2MB shared per MODULE, not per core) and 8MB of L3 cache. This has been confirmed by Fruehe in discussions plenty of times, and you'll find it all over the web.

    2) Whilst you can indeed expect the 6-core to be harvested (as it will also keep the 8MB of L3 cache), it is rather clear the 4-core will be separate, like the dual-core Athlon II is now as well. The clue is the fact that the 4-core chip will only have 4MB of L3 cache.

    http://www.techpowerup.com/134739/AMD-Zambezi-Bull...

    Look at the roadmap :)
  • JarredWalton - Wednesday, February 9, 2011 - link

    Oh, I guess I read the "2MB per module" wrong -- I thought they had said 2MB per core. Somewhere else said 16MB cache, which made sense at 2MB per core, but 16MB of total cache (8MB L2 plus 8MB L3) also works. Anyway, long-term it would be potentially useful to have separate die for 3-module and 2-module designs as well as the standard 4-module, because even the 6-core is still going to have 2MB of cache and 2 cores disabled. However, the time to do such a redesign might make it too costly, so maybe not. There's nothing to prevent AMD from disabling part of the L3 cache as well as the cores for a 4-core version, though -- we've already seen Athlon X2 chips that were harvested Phenom X4 die, for instance. That's definitely not something you want to do a lot if you can avoid it, obviously.
  • DanNeely - Monday, February 7, 2011 - link

    "There’s actually a lot more work involved in moving a Redwood GPU architecture to 32nm, as most of the Intellectual Property (IP) related to GPUs targets the so-called half-nodes (55nm, 40m, and in the future 28nm). It’s one reason we expect AMD to eventually move all of their CPU and GPU production to such nodes, but that's a ways off and Llano will use the same process size as Intel’s current CPUs."

    What's actually different between the two? I assumed it was just a case of what they picked as the next scaling point. There've been a number of GPUs in the past that have dropped from half to full to half node again as each one became widely available. I'd've assumed the main engineering challenge would be optimizing for the quirks in GF's processes instead of TSMC's.
