As any ‘family source of computer information’ will testify, every so often a family member will want an upgrade.  Over the final few months of 2012, I did this with my brother’s machine, fitting him out with a Sandy Bridge CPU, an SSD and a good GPU to tackle the newly released Borderlands 2, all for free.  The only real problem he had up until that point was a dismal FPS in RuneScape.

The system he had been using for the previous two years was an old hand-me-down I had sold him – a Core2Duo E6400 with 2x2 GB of DDR2-800 and a pair of Radeon HD4670s in Crossfire.  While he loves his new system with double the cores, a better GPU and an SSD, I wondered how much of an upgrade it had really been.

I have gone through many upgrade philosophies over the decade.  My current advice to friends and family who ask about upgrades is that, if they are happy installing new components, they should upgrade each component to one of the best in its class one at a time, as much as budget allows, rather than settling for an overall mediocre setup.  This tends towards outfitting a system with a great SSD, then a GPU, PSU, and finally a motherboard/CPU/memory upgrade with one of those three being great.  Over time the other two of that trio also get upgraded, and the cycle repeats.  Old parts are sold and some cost is recouped in the process, but at least some of the hardware is always on the cutting edge, rather than a middling computer-shop off-the-shelf system that could be full of bloatware and dust.

As a result of upgrading my brother's computer, I ended up with his old CPU/motherboard/memory combo, full of dust, sitting on top of one of my many piles of boxes.  I decided to pick it up and run the system with a top-range GPU and an SSD through my normal benchmarking suite to see how it fared against the likes of the latest FM2 Trinity and Intel offerings, both at stock and with a reasonable overclock.  Certain results piqued my interest, but for normal web browsing and the like it still feels as tight as a drum.

The test setup is as follows:

Core2Duo E6400 – 2 cores, 2.13 GHz stock
2x2 GB OCZ DDR2 PC8500 5-6-6
MSI i975X Platinum PowerUp Edition (supports up to PCIe 1.1)
Windows 7 64-bit
AMD Catalyst 12.3 + NVIDIA 296.10 WHQL (for consistency between older results)

My recent testing procedure in motherboard reviews pairs the motherboard with an SSD and a HD7970/GTX580, and given my upgrading philosophy above, I went with these for comparable results.  The other systems in the results used DDR3 memory in the range of 1600 C9 for the i3-3225 to 2400 C9 for the i7-3770K.

The Core2Duo system was tested at stock (2.13 GHz and DDR2-533 5-5-5) and with a mild overclock (2.8 GHz and DDR2-700 5-5-6).  

Gaming Benchmarks

Games were tested at 2560x1440 (another ‘throw money at a single upgrade at a time’ possibility) with all the eye candy turned up, and results were taken as the average of four runs.

Metro2033

Metro2033 - One 7970

Metro2033 - One 580

While the E6400 puts in an admirable effort, and overclocking helps a little, the newer systems have the edge.  Interestingly, the difference is not that large: at this resolution and these settings, an overclocked E6400 is within 1 FPS of an A10-5800K when using a 580.

Dirt3

Dirt3 - One 7970

Dirt3 - One 580

The bump by the overclock makes Dirt3 more playable, but it still lags behind the newer systems.

Computational Benchmarks

3D Movement Algorithm Test

3D Particle Movement Single Threaded

This is where it starts to get interesting.  At stock the E6400 sits at the bottom but within reach of an FX-8150 at 4.2 GHz, yet with an overclock to 2.8 GHz the E6400 easily beats the Trinity-based A10-5800K at 4.2 GHz.  Part of this can be attributed to the way the Bulldozer/Piledriver CPUs handle floating point calculations, but it is remarkable that a July 2006 processor can beat an October 2012 model.  One could argue that a mild bump on the A10-5800K would put it over the edge, but in our overclocking of that chip anything above 4.5 GHz was quite tough (we perhaps got a bad sample to OC).
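The 3DPM kernel itself is not published alongside this piece, but for illustration, a minimal single-threaded random-walk sketch in Python (the function names and the kernel below are my own invention, not the actual benchmark) captures the flavour of the workload - pure floating-point math on three values per particle, with almost no memory traffic:

```python
import math
import random
import time

def move_particles_single(n_particles, n_steps, seed=42):
    """Random-walk each particle in 3D: each step picks a uniformly random
    direction on the unit sphere and moves one unit along it.  Purely FP
    work (sin/cos/sqrt) on three floats per particle."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_particles):
        x = y = z = 0.0
        for _ in range(n_steps):
            theta = rng.uniform(0.0, 2.0 * math.pi)  # azimuth
            u = rng.uniform(-1.0, 1.0)               # cos of polar angle
            s = math.sqrt(1.0 - u * u)
            x += s * math.cos(theta)
            y += s * math.sin(theta)
            z += u
        total += math.sqrt(x * x + y * y + z * z)
    return total / n_particles  # mean distance from the origin

start = time.perf_counter()
mean_dist = move_particles_single(1000, 100)
elapsed = time.perf_counter() - start
print(f"mean distance: {mean_dist:.2f} ({elapsed:.3f}s)")
```

A real benchmark would fix the step count and report particles moved per second; the point is that the hot loop lives almost entirely in registers, which is why raw FP throughput and clock speed dominate the result rather than memory.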

3D Particle Movement MultiThreaded

Of course the situation changes when we hit the multithreaded benchmark, with the two cores of the E6400 holding it back.  However, if we were using a quad-core Q6600, stock CPU performance would be on par with the A10-5800K in an FP workload, although the Q6600 would have four FP units to calculate with while the A10-5800K has only two (as well as the iGPU).
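A multithreaded variant simply splits the particles across workers, since each particle's walk is independent of the others.  As a sketch (again with hypothetical names, and using processes rather than threads so that CPython's GIL does not serialise the FP work):

```python
import math
import random
from concurrent.futures import ProcessPoolExecutor

def walk_chunk(args):
    """Walk a chunk of particles and return the summed final distances."""
    n_particles, n_steps, seed = args
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_particles):
        x = y = z = 0.0
        for _ in range(n_steps):
            theta = rng.uniform(0.0, 2.0 * math.pi)
            u = rng.uniform(-1.0, 1.0)
            s = math.sqrt(1.0 - u * u)
            x += s * math.cos(theta)
            y += s * math.sin(theta)
            z += u
        total += math.sqrt(x * x + y * y + z * z)
    return total

def move_particles_parallel(n_particles, n_steps, workers=4):
    """Split the particles evenly across worker processes."""
    chunk = n_particles // workers
    jobs = [(chunk, n_steps, seed) for seed in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        totals = pool.map(walk_chunk, jobs)
    return sum(totals) / (chunk * workers)  # mean distance

if __name__ == "__main__":
    print(f"mean distance: {move_particles_parallel(1000, 100):.2f}")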

WinRAR x64 3.93

WinRAR x64 3.93

In a variable threaded workload, the DDR2 equipped E6400 is easily outpaced by any modern processor using DDR3.

FastStone Image Viewer 4.2

FastStone Image Viewer 4.2

Despite FastStone being single threaded, the increased IPC of the later generations usually brings home the bacon - the only exception being the Bulldozer-based FX-8150, which is on par with the E6400.

Xilisoft Video Converter

Xilisoft Video Converter 7

Similarly with XVC, more threads and INT workloads win the day.

x264 HD Benchmark

x264 HD Pass 1

x264 HD Pass 2

Conclusions

When I start a test session like this, my first test is usually 3DPM in single-threaded mode.  When I got that startling result, I clearly had to dig deeper, but the rest of the results point one way.  In terms of actual throughput benchmarks, the E6400 is slow compared to all the modern home-computer processors, limited either by cores or by memory.

This was going to be obvious from the start.

In the sole benchmark that does not rely on memory or thread scheduling and is purely floating-point based, the E6400 springs a surprise, but nothing more.  In our limited gaming tests the E6400 copes well at 2560x1440, with that slight overclock making Dirt3 more playable.

But the end result is that if everything else is upgraded, and the performance boost is cost effective, even a move to an i3-3225 or A10-5800K will yield real world tangible benefits, alongside all the modern advances in motherboard features (USB 3.0, SATA 6 Gbps, mSATA, Thunderbolt, UEFI, PCIe 2.0/3.0, Audio, Network).  There are also significant power savings to be had with modern architectures.

My brother enjoys playing his games at a more reasonable frame rate now, and he says normal usage has sped up a bit, making watching video streams a little smoother if anything.  The only question is where Haswell will fit into all this, and that is a question I look forward to answering.

Comments (136)

  • frumply - Tuesday, January 15, 2013 - link

    What I did w/ a stepfather's system a couple of years back was to get one of those low-cost 45nm C2D Pentiums and add a stick of RAM I wasn't using anymore, replacing an E4300 or equivalent, as well as doubling the RAM. As long as you're not doing video processing or 3D rendering, those late Wolfdales seemed to be more than powerful enough for the job. I've done the same with my system by using a mid-tier Q9x00 CPU. Most of my gaming is console-based these days anyway, and aside from that my only gripe is that I'm stuck with 4GB of RAM till I upgrade to something better.

    I'm looking forward to finally upgrading with Haswell, but much of the drive for that's been due to the power-saving features.
  • nathanddrews - Tuesday, January 15, 2013 - link

    By any chance did you also log the minimum and maximum fps for the games?

    I found that when I upgraded from my Q6600 (OC 3.2GHz) to a stock 3570K, while keeping the same GPU and SSD, I all but eliminated low fps spikes and max fps nearly doubled. Some games are immensely quicker while others simply no longer stutter.
  • IanCutress - Tuesday, January 15, 2013 - link

    Dirt3, 2560x1440, Ultra, 8xMSAA

    E6400 @ 2.13 GHz, HD7970, min FPS: 17.3
    E6400 @ 2.80 GHz, HD7970, min FPS: 32.3

    E6400 @ 2.13 GHz, GTX580, min FPS: 18.6
    E6400 @ 2.80 GHz, GTX580, min FPS: 22.5

    Note that I did these on 12.3 / 296.10 drivers to remain consistent with my chronologically older results, so the newer drivers should probably push it up a little, especially those 12.11b11.
  • Mr Perfect - Wednesday, January 16, 2013 - link

    Interesting. How does that compare to the newer systems? Do you have minimums for those too?

    A year and a half ago, I ran a similar upgrade-as-much-as-you-can Socket 939 Opteron machine with a then-new 6870 in it, and when the card was carried over to a new i7 2600 build, it was the minimum frame rates that really improved the most. When the old machine was CPU limited, it was really CPU limited.
  • Golgatha - Tuesday, January 15, 2013 - link

    Just upgraded my dad from a Pentium D, 4GB DDR2, 9500 GT system to a Core i7 920, 12GB DDR3, GTX 460 768MB hand-me-down system for his Christmas 2012 present. Let's see if he notices the difference?!

    I think the biggest relative upgrade I performed was for my in-laws though. Took them from a Pentium 4 3.0Ghz to a Core 2 Quad q9400. Now that's an upgrade! They're still using that system and it still runs like a champ by the way.

    I treated myself to an i7 3770k, 16GB DDR3, SLI 680 system this past fall.
  • karasaj - Tuesday, January 15, 2013 - link

    I am so utterly jealous of how much money you have xD. I will be rocking a Core i3-3220 + GTX 650 and a decent Ivy Bridge laptop probably until Broadwell. I might pick up a Surface Pro-like machine with a Broadwell chip, and a quad core desktop system by then.

    Although Broadwell might really be more for mobile, so I might end up waiting for the revision after that to get a CPU, and upgrade my GPU first.
  • Golgatha - Wednesday, January 16, 2013 - link

    Well, I actually had the GTX 460 768MB as a dedicated PhysX card alongside my original GTX 680. Swapped out the motherboard, CPU, and RAM that I ended up repurposing for my dad's Christmas present, and also repurposed the GTX 460 for his rig. That left an empty PCIe 16x slot in my current rig, which obviously is a dangerous thing for my pocketbook, as I ended up going SLI. Found a great deal on the FS/FT boards that I couldn't pass up. ;-)

    Darn those Microcenter deals that inevitably cause a cascading cash flow outlay!
  • DanNeely - Tuesday, January 15, 2013 - link

    I gave my parents a similar bump in two stages a few years ago. From an Athlon 900 and 128MB of PC-100, initially to a 2.0GHz A64 and 1GB of DDR2-400, and a year or two later to a 2.4GHz dual core and 3GB of RAM when I retired my S939 box.

    Arguably it's due for another upgrade soon; but they almost never use it anymore. Mom mostly uses my old netbook from her recliner, while dad buys and batters a cheap 15" laptop to death on the road every 18 months or so. The desktop only really gets used if the wifi is down, or by my dad to use the networked printer. He only does this because he's terrified that connecting his laptop to wifi will screw up his AT&T dongle's software (this apparently happened once, a half dozen years and at least two dongles ago). I've gone as far as offering to image his drive first to guarantee I can undo anything if it breaks, but he won't let me touch it.... *sigh*
  • sideshow23bob - Tuesday, January 15, 2013 - link

    Interesting article. I'd be interested to see power consumption differences in the C2D and the latest and greatest as I imagine that may be a more decisive win for the modern hardware than the actual performance difference. Thanks so much for sharing this.
  • jjj - Tuesday, January 15, 2013 - link

    More striking is how little you gain when replacing a 6.5-year-old CPU; 6.5 years used to be an eternity, yet the old chip is still functional.
    If you went back 6.5 more years you would land in the Pentium III era - think how slow that was compared to Core 2 Duo.
    And this gave me an idea: it would be cool to test every generation from Intel and AMD, as far back as you can manage, and plot it on a timeline to see how performance evolved (and slowed down) - ofc you would have to exclude the extreme (pricing) series to get relevant results.
