Final Words

Expecting a sequel to be a reincarnation of the original is just setting yourself up for disappointment. A good sequel will be able to stand on its own, independent of whatever may have come before it. Nehalem is Intel's Dark Knight: it lacks the reinvention that made Conroe so incredible, but it continues what was started in 2006.

The Core i7's general purpose performance is solid: you're looking at a 5 - 10% increase in general application performance at the same clock speeds as Penryn. Where Nehalem really succeeds, however, is in anything involving video encoding or 3D rendering; the performance gains there are easily in the 20 - 40% range. Part of the boost is due to Hyper Threading, but the on-die memory controller and architectural tweaks are just as responsible for driving Intel's performance through the roof.

The iTunes results do point to a downside of Nehalem: there are going to be some situations where Intel's new architecture doesn't offer a performance advantage over its predecessor. If you're not doing a lot of 3D rendering or video encoding work and you already have a Core 2 Quad, the upgrade to Nehalem won't be worth it. If you're still stuck on a Pentium 4 or something similarly slow by today's standards, a jump to Nehalem would be warranted.

Gaming performance is actually better than expected for Nehalem; there were enough cases where the new architecture pulled ahead despite its very small L2 cache that I wouldn't mind recommending it for gamers. In most GPU-limited situations, however, you won't see any performance improvement over Penryn, at least with today's GPUs.

While it posts some very impressive performance gains, Nehalem is nearly as much about efficiency. Hyper Threading alone delivers a 0 - 30% increase in performance for a 0 - 15% increase in power consumption; the problem is that Nehalem's efficiency is only as good as its performance, and in those areas where Nehalem can't outperform Penryn, its power efficiency suffers.
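To put those ranges in perspective, a rough performance-per-watt calculation makes the point. The sketch below (plain Python, using the quoted ranges as purely illustrative inputs rather than measured Nehalem or Penryn data) shows how Hyper Threading's best case improves efficiency, while a worst case, where power rises but performance doesn't, makes it worse.

```python
# Rough perf-per-watt arithmetic for the Hyper Threading ranges quoted above.
# The inputs are illustrative assumptions, not measured values.

def perf_per_watt_ratio(perf_gain: float, power_increase: float) -> float:
    """Relative performance per watt versus the same chip without the feature."""
    return (1.0 + perf_gain) / (1.0 + power_increase)

# Best case in the quoted range: +30% performance for +15% power.
print(perf_per_watt_ratio(0.30, 0.15))  # ~1.13, i.e. ~13% better perf/watt

# Worst case: no extra performance, but power still rises ~15%.
print(perf_per_watt_ratio(0.00, 0.15))  # ~0.87, i.e. ~13% worse perf/watt
```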

I can't help but wonder if what we saw with the QX9770 is indicative of a larger Nehalem advantage: whether Penryn's power consumption truly does increase dramatically as clock speed goes up, while Nehalem is able to reel it back in. If that is indeed the case, then Nehalem is even more important for the future of the Core microarchitecture than I originally thought. You could consider it the reverse Prescott in that case, with design choices meant to keep power consumption under control as clock speed ramps up.

It seems odd to debate the usefulness of a processor that can easily offer a 20 - 40% increase in performance; the issue is that the advantages are very specific in nature. While Conroe reset the entire board, Nehalem is very targeted in where it improves performance the most. That is one benefit of the tick-tock model, however: if Intel was too aggressive (or too conservative?) with this design, it only needs to last two years before it's replaced with something else. I'm guessing that once Intel moves to 32nm, L2 cache sizes will increase once more and perhaps bring greater performance to all applications.

Quite possibly the biggest threat to Nehalem is that, even at the low end, $284 is a good amount of money for a microprocessor these days. You can now purchase AMD's entire product line for less than $180, and the cost of entry to a Q9550 is going to be lower, at least at the start, than for a Core i7 system. There's no denying that the Core i7 is the fastest thing to close out 2008, but you may find that it's not the most efficient use of money. The first X58 motherboards aren't going to be cheap, and you're stuck using more expensive DDR3 memory. If you're running applications where Nehalem shines (e.g. video encoding, 3D rendering), then the ticket price is likely worth it; if you're not, the ~10% general performance improvement won't make financial sense.

It also remains to be seen what will happen to the Nehalem market once Intel introduces the LGA-1156 version next year for lower price points. By introducing a $284 part this early Intel appears to be courting the Q6600/Q9450/Q9550 buyers to the LGA-1366 platform, which would mean that the two-channel Nehalems are strictly value parts and perhaps there won't be much fragmentation in the market as a result.

Intel has two thirds of the perfect trifecta here. Nehalem brings the ability to work on more threads at a time, redefining video encoding and 3D rendering performance; Intel's SSDs shook the storage world; that just leaves Larrabee...

Comments

  • Kaleid - Monday, November 3, 2008 - link

    http://www.guru3d.com/news/intel-core-i7-multigpu-...
  • bill3 - Monday, November 3, 2008 - link

    Umm, seems the guru3d gains are probably explained by them using a dual-core Core 2 Duo versus a quad-core i7... quad cores run multi-GPU quite a bit better, I believe.

  • tynopik - Monday, November 3, 2008 - link

    what about those multi-threading tests you used to run with 20 tabs open in firefox while running av scan while compressing some files while converting something else while etc etc?

    this might be more important for daily performance than the standard desktop benchmarks
  • D3SI - Monday, November 3, 2008 - link

    So the low end i7s are OC'able?

    what the hell is Tom's Hardware talking about lol
  • conquerist - Monday, November 3, 2008 - link

    Concerning x264, Nehalem-specific improvements are coming as soon as the developers are free from their NDA.
    See http://x264dev.multimedia.cx/?p=40.
  • Spectator - Monday, November 3, 2008 - link

    Can they do some CUDA optimizations? I'm guessing that video hardware has more processors than a quad-core Intel :P

    If all this i7 is new news and does stuff xx faster with 4 cores, how does 100+ core video hardware compare?

    Yes, I'm messing, but giant Intel wants $1k for the best i7 CPU, when the likes of nVidia make bigger transistor-count silicon on a lesser process, and others manufacture the rest of the video card, for $400-500?

    Where is the value for money in that. Chuckle.
  • gramboh - Monday, November 3, 2008 - link

    The x264 team has specifically said they will not be working on CUDA development as it is too time intensive to basically start over from scratch in a more complex development environment.
  • npp - Monday, November 3, 2008 - link

    CUDA optimizations? I bet you don't completely understand what you're talking about. You can't just optimize a piece of software for CUDA; you MUST write it from scratch for CUDA. That's the reason why you don't see too much software for nVidia GPUs, even though the CUDA concept was introduced at least two years ago. You have the BadaBOOM stuff, but it's far from mature, and the reason is that writing a sensible application for CUDA isn't exactly an easy task. Take your time to look at how it works and you'll understand why.

    You can't compare the 100+ cores of your typical GPU with a quad core directly; they are fundamentally different in nature, with your GPU "cores" being rather limited in functionality. GPGPU gets a lot of hype, but you simply can't offload everything onto a GPU.

    As a side note, top-notch hardware always carries a price premium, and Intel has had this tradition with high-end CPUs for quite a while now. There are plenty of people who need absolutely the fastest hardware around and won't hesitate to pay for it.
  • Spectator - Monday, November 3, 2008 - link

    Some of us want more info.

    A) How does the integrated thermal sensor work with temps of -50°C and below?

    B) Can you circumvent the 130W max load sensor?

    C) What are all those connection points on the top of the processor for?

    lol. Where do I put the 2B pencil to join that sht up so I don't have to worry about multiplier settings or temp sensors or wattage sensors?

    Hey, don't shoot the messenger, but those top side chip contacts seem very curious and obviously must serve a purpose :P

  • Spectator - Monday, November 3, 2008 - link

    Wait, NO. I have thought about it..

    The contacts on the top side could be for programming the chip's default settings.

    You know it makes sense. Perhaps it's adjustable, SRAM style, rather than burning connections.

    Yes, some technical peeps can look at that, but I still want the fame for suggesting it first. lmao.

    Have fun, but it does seem logical to build in some scope for alteration. A lot easier to manufacture one solid item and then mod your stock to suit the market when you feel it's necessary.

    Spectator.
