Visual Studio 2008: Compiler Performance

You guys asked for it, and I finally have something I feel is a good software build test. Using Visual Studio 2008 I'm compiling Chromium. It's a huge project that takes over forty minutes to compile from the command line on the Core i3-2100. But the results are repeatable, and the compile process stresses all 12 threads at 100% for almost the entire time on a 980X, so it works for me.

I don't have a full set of results here yet, but I'm building up the database. The 2600K manages a 12% lead over the previous-generation high-end chips, but it can't touch the 980X. The 2500K does well, but it is limited by its lack of Hyper-Threading; the Phenom II X6 1100T beats it.

Visual Studio 2008: Compile Chromium

Flash Video Creation

Sorenson Squeeze Flash Video Creation

Excel Math Performance

Excel Monte Carlo Simulation

Excel Math Operations

283 Comments

  • karlostomy - Thursday, January 6, 2011 - link

    What the hell is the point of posting gaming scores at resolutions that no one will be playing at?

    If I am not mistaken, the graphics cards in the test are:
    eVGA GeForce GTX 280 (Vista 64)
    ATI Radeon HD 5870 (Windows 7)
    MSI GeForce GTX 580 (Windows 7)

    So then, with a Sandy Bridge processor, these resolutions are irrelevant.
    1080p or above should be the standard resolution for modern setup reviews.

    Why, Anand, have you posted irrelevant resolutions for the hardware tested?
  • dananski - Thursday, January 6, 2011 - link

    Games are usually limited in fps by the level of graphics, so processor speed doesn't make much of a difference unless you turn the graphics detail right down and use an overkill graphics card. As the point of this page was to review the CPU power, it's more representative to use low resolutions so that the CPU is the limiting factor.

    If you did this set of charts for gaming at 2560x1600 with full AA & max quality, all the processors would be stuck at about the same rate because the graphics card is the limiting factor.

    I expect Civ 5 would be an exception to this because it has really counter-intuitive performance.
  • omelet - Tuesday, January 11, 2011 - link

    For almost any game, the resolution will not affect the stress on the CPU. It is no harder for a CPU to play the game at 2560x1600 than it is to play at 1024x768, so to ensure that the benchmark is CPU-limited, low resolutions are chosen.

    For instance, the i5 2500K gets ~65fps in the Starcraft test, which is run at 1024x768. The i5 2500K would also be capable of ~65fps at 2560x1600, but your graphics card might not be capable of it at that resolution.

    Since this is a review for a CPU, not for graphics cards, the lower resolution is used, so we know what the limitation is for just the CPU. If you want to know what resolution you can play at, look at graphics card reviews.
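    The bottleneck argument above can be sketched as a toy model: the frame rate you actually see is capped by whichever of the CPU or GPU is slower, and only the GPU side gets slower as resolution rises. All frame-rate numbers below are hypothetical, chosen purely to illustrate the shape of the effect, not taken from the review's charts.

    ```python
    def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
        """Frame rate the player sees: the slower of the two limits wins."""
        return min(cpu_fps, gpu_fps)

    # Hypothetical GPU throughput, falling as the pixel count rises.
    # CPU-limited rates are roughly constant across resolutions.
    gpu_limit = {"1024x768": 200.0, "1920x1080": 90.0, "2560x1600": 35.0}
    fast_cpu, slow_cpu = 65.0, 40.0  # hypothetical CPU-limited rates

    for res, gpu_fps in gpu_limit.items():
        print(f"{res}: fast CPU {delivered_fps(fast_cpu, gpu_fps):.0f} fps, "
              f"slow CPU {delivered_fps(slow_cpu, gpu_fps):.0f} fps")
    # At 1024x768 the two CPUs separate (65 vs 40 fps), so the benchmark
    # measures the CPU. At 2560x1600 both read 35 fps: the GPU is the limiter
    # and the CPU difference disappears from the chart.
    ```

    This is why a low-resolution chart exaggerates CPU differences and a high-resolution chart hides them; neither is wrong, they just measure different bottlenecks.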
  • Tom - Sunday, January 30, 2011 - link

    Which is why the tests have limited real world value. Skewing the tests to maximize the cpu differences makes new cpus look impressive, but it doesn't show the reality that the new cpu isn't needed in the real world for most games.
  • Oyster - Monday, January 3, 2011 - link

    Maybe I missed this in the review, Anand, but can you please confirm that SB and SB-E will require quad-channel memory? Additionally, will it be possible to run dual-channel memory on these new motherboards? I guess I want to save money because I already have 8GB of dual-channel RAM :).

    Thanks for the great review!
  • CharonPDX - Monday, January 3, 2011 - link

    You can confirm it from the photos: the board is only populated with two DIMMs.
  • JumpingJack - Monday, January 3, 2011 - link

    This has been discussed in great detail. The i7, i3, and i5 2XXX series are dual-channel. The rumor mill abounds with talk of SB-E having quad-channel, but I don't recall seeing anything official from Intel on this point.
  • 8steve8 - Monday, January 3, 2011 - link

    So the K processors have the much better IGP and an unlocked multiplier, but to use the improved IGP you need an H67 chipset, which doesn't support changing the multiplier?
  • ViRGE - Monday, January 3, 2011 - link

    CPU multiplier: correct, H67 cannot change the CPU multiplier.

    GPU multiplier: no, that one isn't restricted; even H67 can change the GPU multiplier.
  • mczak - Monday, January 3, 2011 - link

    I wonder why, though? Is this just an artificial restriction? I can't really see a good technical reason why CPU overclocking would work with P67 but not H67; it is just turbo going up a few more steps, after all. Maybe board manufacturers can find a way around it?
    Or is this not really tied to the chipset, but rather to whether the IGP is enabled (which, after all, is also tied to turbo)?
