HP’s Envy 14: An LCD That Was Too Good to Last?

I’d like to give a huge, huge shout-out to Khoa Tran (theguynextdoor on the AT forums), who sent us his personal Envy 14 for a couple of weeks just so that we could review it. HP never managed to get one to us for review, so Khoa coming through for us was an awesome move. It just goes to show how great our readership is—seriously, we love you guys.

So, on to the Envy. The Envy 14 is part of the second generation of Envy notebooks. It slots in nicely between the first-generation 13” and 15” Envys, replacing both in one go and creating space for the range-topping Envy 17 we reviewed recently. And as we’ve mentioned previously, it’s a decently powerful notebook. Inside, we find Intel’s first-generation Core i5 and i7 processors, ATI’s Mobility Radeon HD 5650 graphics, a minimum of 4GB of RAM, and the best screen of any notebook on the market. Unfortunately, that screen isn’t available anymore, but we’ll get to that in a bit.

So here's the part we all know—the styling is pretty derivative of Apple’s MacBook Pro line. But while the lines are similar from afar, up close the Envy isn’t actually as close to the ever-popular Apple portable as it first seems. The textured aluminum on the lid has an interesting, swirled pattern, and the slightly convex palm rest is rendered in the same material. Overall, the industrial design is quite good, and the build quality is just as good as one could expect from an aluminum-bodied notebook. It’s not quite on the level that Apple has reached with the MacBook Pro line, but it’s getting there.

That’s a pretty common theme with the Envy 14—it’s like HP’s take on the Apple formula. The backlit chiclet-style keyboard looks and feels nearly identical to the MacBook Pro’s, and if you’ve used a unibody MacBook Pro, you know that’s a good thing. Unfortunately, HP still hasn’t figured out how to make a buttonless trackpad work. Far too often, you move the cursor exactly where you want it, and then the act of clicking in the designated part of the touchpad sends the cursor to the other side of the screen. You get used to the touchpad in time, but it can be aggravating. The best part of the entire package, though, is the screen.

HP’s 1600x900 Radiance display is a revelation, no way around it. The max brightness is 331 nits, with a black level of 0.325 nits. That works out to a contrast ratio of 1018:1. The first time I saw that, I ran the numbers through the calculator a second time to make sure that was actually right. If I’m not mistaken, it’s the highest contrast ratio we’ve seen on any notebook screen to date. [Ed: ASUS' G73J series managed about the same, but with a max brightness of 184 nits the Radiance display definitely gets the win.] Take that, Apple. Unfortunately, that screen is no longer available.
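
For anyone who wants to sanity-check the math: contrast ratio is simply the maximum white luminance divided by the black level, conventionally quoted as N:1. A quick illustrative sketch in Python, using the measured figures above:

    # Contrast ratio = maximum white luminance / black level, quoted as N:1.
    # The 331 and 0.325 nit figures are the measurements quoted above.
    white_nits = 331.0    # maximum brightness, in candelas per square meter (nits)
    black_nits = 0.325    # black level at maximum brightness

    contrast = white_nits / black_nits
    print(f"Contrast ratio: {contrast:.0f}:1")    # prints "Contrast ratio: 1018:1"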

Back in September, we heard whispers that the screen was sold out, and HP removed the option from the Envy 14 ordering system. The option came back briefly in January, and we were then told that the Radiance display was sold out for good. So, if you want an Envy 14, you’re going to be stuck with the standard, mediocre 1366x768 panel.

Performance-wise, it’s around the same as other notebooks with the Arrandale/HD 5650 combination under the hood. It’s not an uncommon pairing, but for a 14” system it’s definitely on the higher side of the performance scale. It’s about on par with the XPS 14, as you can see in Mobile Bench. Battery life is basically identical across the board, but the comparison really shows what you give up with the standard LCD versus the upgraded panel.

I’d assume that HP is updating the Envy 14 to Sandy Bridge in the near future (the 17 has already been upgraded), on roughly the same 2-3 month time frame as the Dell XPS line. Unlike the XPS 14, the Envy 14 is a lock to be around for a long time to come—it’s been a huge seller for HP, and it’s easy to see why. I enjoyed my time with it as much as I enjoyed the MacBook Pro last year, though a large part of that was due to the amazing display. But as much as I loved the Radiance display, I have to acknowledge that the rest of the notebook is quite good too. The industrial design and build quality are among the best for mainstream PC portables, and the price-to-performance ratio is strong as well. For a starting price of just under a grand, you get a lot of hardware—a 2.66GHz Core i5, the HD 5650, 4GB of memory, and a 750GB hard drive. If only the Radiance display were still an option.

Comments

  • vikingrinn - Tuesday, February 8, 2011 - link

    @BWMerlin You might be right, but a 17.3" display in a 15.6"-size chassis isn't entirely implausible (although I'm not sure if they slimmed down the chassis of the G73 for the G73SW release?), as the M17x R3 had been slimmed down to almost the same size chassis as the M15x and also offered 900p as a display option.
  • JarredWalton - Tuesday, February 8, 2011 - link

    Note that I updated the article. MSI said I could pass along the fact that the testing was done with their GT680R. It's certainly fast enough for gaming, though there are some areas that could be improved (unless you like glossy plastic). Now we wait for PM67 version 1.01....
  • vikingrinn - Tuesday, February 8, 2011 - link

    @JarredWalton Thanks for the update - looking forward to a review of both the M17x R3 and G73SW soon then! ;)
  • stmok - Monday, February 7, 2011 - link

    "What we know of Llano is that it will combine a K10.5 type CPU architecture with a midrange DX11 GPU (something like the HD 5650), integrated into a single chip."

    Firstly, AMD's Llano will be marketed as its "A-series" APU line. (Where G-series, E-series and C-series belong to their Bobcat-based lines.)

    Llano is a modified version of the Athlon II series with a Radeon HD 5550 GPU as its IGP. The APU will feature Turbo Core 2.0 Technology (power gating, etc.). It will use DDR3-1600 memory.

    Llano's x86 cores are codenamed "Husky".

    The IGP in Llano has two versions:
    One is codenamed "Winterpark" => Only in dual-core versions of the APU.
    One is codenamed "Beavercreek" => Only in triple- and quad-core versions of the APU.

    For TDP spec, there will be two distinct lines for the desktop version of Llano.
    => 65W (dual-cores and low power quad-cores) and 100W (triple and quad-cores).

    The solution will also allow for Hybrid CrossFire configurations.
    => Llano IGP + Radeon HD 6570 or HD 6670 video cards.

    Performance-wise... (according to an AMD presentation I saw):

    Dual-core Llano
    => Overall, lags slightly behind the Athlon II X2 250 (3.0GHz) and Pentium E6500 (2.93GHz)

    Quad-core Llano
    => It's slightly slower than a current Athlon II X4 630 with a Radeon HD 5550 discrete video card.

    So in the end...

    Sandy Bridge => Far better CPU side. Not as good with IGP.
    Llano => Far better IGP. Not as good on CPU side.

    If you want an APU that will be revolutionary, it's best to wait for "Trinity" in 2012.
  • Taft12 - Monday, February 7, 2011 - link

    This is great detail, more than I have ever seen about Llano before now (and thanks a bunch for it!)

    Is this from publicly available AMD documentation? You said this was from a presentation you saw...
  • Kiijibari - Monday, February 7, 2011 - link

    First, you wrote APU, even though there is no Bulldozer APU yet. Zambezi and Interlagos/Valencia are normal CPUs. You correctly mentioned Trinity later, which is an APU, but that is already Bulldozer v2.0, and it is far off, due in 2012.

    Second, you stated that cache sizes are unknown - they are not:
    See AMD's blog, link removed due to SPAM detection bot.

    Third, you speculate about a launch similar to the K8's in 2003; however, it is already known that desktop parts will launch *prior* to server parts in Q2:
    <Link removed due to SPAM detection, just read the analyst day slides again>
  • JarredWalton - Monday, February 7, 2011 - link

    I've corrected some of the text to clarify the meaning. Orochi is the eight-core design, with "Zambezi" for desktops and "Valencia" destined for servers. AFAICT, it's the same chip with different packaging depending on the market (and I'd guess AMD is using the extra time between the desktop and server launches to do extra validation). Zambezi is also apparently a name for the desktop platform in general, unless the "four core and six core Zambezi" won't get a separate name.

    Given the purported size of the Orochi core, I can see four-core and six-core being harvested die, but they're still going to be huge. Right now, it appears the eight-core will have 16MB total L2 cache (2MB per core!) and an additional 8MB L3 cache. Long-term, the four-core and six-core should get separate designs so they don't have to be quite so large. Those are the chips that I expect won't be out for desktops until Q3/Q4.
  • Cow86 - Tuesday, February 8, 2011 - link

    Sorry there Jarred, first-time poster, long-time reader, but I had to correct you on this :P Two things are wrong in what you say:

    1) The 8 core, 4 module bulldozer chip will have 8 MB of L2 cache (2 MB shared per MODULE, not core), and 8 MB L3 cache. This has been confirmed by Fruehe in discussions plenty of times, and you'll find it all over the web.

    2) Whilst you can indeed expect the 6-core to be harvested (as it will also keep the 8 MB of L3 cache), it is rather clear the 4-core will be separate, like the dual-core Athlon II is now as well. The clue to this is the fact that the 4-core chip will only have 4 MB of L3 cache.

    http://www.techpowerup.com/134739/AMD-Zambezi-Bull...

    Look at the roadmap :)
  • JarredWalton - Wednesday, February 9, 2011 - link

    Oh, I guess I read the "2MB per module" wrong -- I thought they had said 2MB per core. Somewhere else said 16MB cache, and that then made sense, but if it's 16MB cache total (L2 plus L3) that also works. Anyway, long-term it would potentially be useful to have separate die for 3-module and 2-module designs as well as the standard 4-module, because even the 6-core is still going to have 2MB of L2 cache and two cores disabled. However, the time needed to do such a redesign might make it too costly, so maybe not. There's nothing to prevent AMD from disabling part of the L3 cache as well as the cores for a 4-core version, though -- we've already seen Athlon X2s that were harvested Phenom X4s, for instance. That's definitely not something you want to do a lot if you can avoid it, obviously.
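
    To make the arithmetic explicit, here's a quick illustrative Python sketch of the two readings (treating the per-module figure from above as the correct one):

        # Orochi: 4 Bulldozer modules, 2 integer cores per module (8 "cores" total).
        # Cache figures below are the ones quoted above: 2MB L2 per module, 8MB L3.
        modules = 4
        cores_per_module = 2
        l3_mb = 8

        l2_total_mb = modules * 2                        # per-MODULE reading: 8MB L2
        l2_misread_mb = modules * cores_per_module * 2   # per-CORE misreading: 16MB L2

        print(f"L2 {l2_total_mb}MB + L3 {l3_mb}MB = {l2_total_mb + l3_mb}MB total")
        # -> "L2 8MB + L3 8MB = 16MB total" -- likely where the 16MB figure came from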
  • DanNeely - Monday, February 7, 2011 - link

    "There’s actually a lot more work involved in moving a Redwood GPU architecture to 32nm, as most of the Intellectual Property (IP) related to GPUs targets the so-called half-nodes (55nm, 40m, and in the future 28nm). It’s one reason we expect AMD to eventually move all of their CPU and GPU production to such nodes, but that's a ways off and Llano will use the same process size as Intel’s current CPUs."

    What's actually different between the two? I assumed it was just a case of what they picked as the next scaling point. There've been a number of GPUs in the past that have dropped from half to full to half node again as each one became widely available. I'd've assumed the main engineering challenge would be optimizing for the quirks in GF's processes instead of TSMC's.
