A Newcomer from Intel: The Core 2 Duo E7200

A couple of weeks before the Phenom X3 launch, Intel sent us a little gem: the Core 2 Duo E7200.

Due out sometime this quarter, the E7200 is supposed to sell for $133. The 45nm dual-core chip runs at 2.53GHz on a 1066MHz FSB and carries a 3MB L2 cache. Given what we reported in our last CPU story, we don't expect Intel to hit the $133 mark with this chip until 45nm dual-core shipments ramp up in late Q2/early Q3; a quick search online reveals the E7200 selling for around $160 today.

Intel also trimmed pricing on some of its CPUs: the Core 2 Quad Q6600 now sells for $224, and the Core 2 Duo E6850 is now priced at $183.

Mainstream Platforms: Intel Has an Issue

In the sub-$200 CPU space, most of these chips will be paired with a motherboard that supports integrated graphics. For AMD that means the new 780G chipset; for Intel, that means G35. From a general performance standpoint, these two chipsets perform very much like their more expensive, enthusiast-class siblings (790FX and P35/X48). You may give up 2-3% in performance, but the motherboards are much cheaper and you get the benefit of integrated graphics, which is more than sufficient if your usage doesn't include heavy 3D gaming.

Unfortunately, Intel is in a not-so-great position right now when it comes to its platforms. It can't turn to ATI anymore for integrated graphics solutions, and with an all-out war on NVIDIA brewing, it's left to provide chipsets for its processors on its own, as NVIDIA's latest IGP solutions are not yet available for Intel CPUs.

While G45 should bring full H.264/VC-1/MPEG-2 decode acceleration to Intel's integrated graphics, it's just not ready yet. ATI and NVIDIA have historically held the integrated graphics performance advantage, and now that advantage is arguably even bigger. Without full HD decode support on its chipsets, Intel isn't just alienating gamers; its platforms are holding back further adoption of Blu-ray on the PC.

So what are the options for OEMs? Either go with an AMD platform, or stick an AMD or NVIDIA graphics card in their Blu-ray-enabled Intel machines. Neither option is something Intel should be happy with right now. Intel's forthcoming G45 chipset does, at least in theory, solve this problem, but it's at least a couple of months away from release.

As far as mainstream platforms go, AMD is definitely the winner here. Its CPU performance leaves much to be desired, but for once we actually have a tangible platform advantage on the desktop. Of course, if you pirate your HD movies then none of this matters, as GPU-accelerated H.264 decode doesn't work on much pirated content, which is often encoded beyond the limits (typically Level 4.1) that fixed-function decoders support.
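
As an aside, it's easy to check whether a given file falls inside those limits by reading its stream metadata. Here's a minimal Python sketch that shells out to ffprobe (assuming ffmpeg is installed; the file name is hypothetical) and flags H.264 streams above Level 4.1:

    import json
    import subprocess

    def h264_profile_level(path):
        """Return (profile, level) of the first video stream, via ffprobe."""
        out = subprocess.check_output([
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",
            "-show_entries", "stream=codec_name,profile,level",
            "-of", "json", path,
        ])
        stream = json.loads(out)["streams"][0]
        # ffprobe reports the H.264 level as an integer, e.g. 41 for Level 4.1
        return stream.get("profile"), stream.get("level")

    profile, level = h264_profile_level("movie.mkv")  # hypothetical file
    print(f"Profile: {profile}, Level: {level}")
    if level and level > 41:
        print("Above Level 4.1 - hardware decode will likely fall back to the CPU.")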

Why Bother with Three Cores? The Test
Comments

  • Locutus465 - Wednesday, April 23, 2008 - link

    I just upgraded my system to the following last night (running Vista Ultimate 64-bit):

    AMD Phenom 9850 BE (at stock speed for now, with the bundled heatsink)
    4GB OCZ DDR2-800 memory (at stock speed)
    ASUS M3A32-MVP Deluxe/WiFi-AP (AMD 790FX)
    DIAMOND Viper Radeon HD 3870
    Sound Blaster X-Fi Fatal1ty

    The rest of my system stayed the same: primary HDD = WD Caviar SATA 7200RPM, secondary = Seagate SATA 7200RPM, page file running off the secondary disk rather than the system disk, etc.

    I can tell you right now that the all-AMD platform is very strong. As my primary display I have a 19" LCD running 1280x1024, and I run all games with graphics options maxed out without ever dropping below 70FPS in any game I know how to pull the FPS for. I'm able to run Crysis very smoothly at the same resolution with medium graphics settings, though I haven't tried cranking things up yet. I've also had to take some work home, which dealt with converting a 3GB pipe-delimited file into several smaller files and then converting those into valid CSV files (using Excel; a scripted alternative is sketched at the end of this comment). On my new system the process was very quick. For your reference, below is a list of every game I've tried on my system; they ALL play silky smooth.

    Doom 3
    Quake 4
    Age of Empires III
    F.E.A.R
    Oblivion
    Half Life 2
    Half Life 2 EP1
    WoW
    *Crysis

    *only game I don't have graphics/audio settings fully maxed on.

    Since upgrading I've also taken to gaming on my 720p Toshiba DLP, using the DVI-to-HDMI adapter packaged with my video card, with audio running through my AVR via multi-channel analog inputs. All I have to say is: damn, is that fun!
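    On the file conversion above: a short script avoids Excel entirely. A minimal Python sketch, assuming a pipe-delimited input with a header row (the file names and chunk size are made up):

        import csv

        CHUNK_ROWS = 500_000  # rows per output file; hypothetical size

        def split_pipe_to_csv(src="data.txt", prefix="part"):
            """Split a large pipe-delimited file into CSV chunks, repeating the header."""
            with open(src, newline="") as f:
                reader = csv.reader(f, delimiter="|")
                header = next(reader)
                out, writer, part = None, None, 0
                for i, row in enumerate(reader):
                    if i % CHUNK_ROWS == 0:  # start a new chunk
                        if out:
                            out.close()
                        part += 1
                        out = open(f"{prefix}{part}.csv", "w", newline="")
                        writer = csv.writer(out)  # default dialect -> valid CSV
                        writer.writerow(header)
                    writer.writerow(row)
                if out:
                    out.close()

        split_pipe_to_csv()  # produces part1.csv, part2.csv, ...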
  • Ensoph42 - Wednesday, April 23, 2008 - link

    I don't understand why every review I've seen of the Phenoms uses DDR2-800. I thought one of the perks was that you were supposed to use DDR2-1066 to get maximum performance. Can someone explain this to me?
  • niva - Wednesday, April 23, 2008 - link

    I have a Phenom 9600 with 8GB of RAM. I had major issues getting the RAM to 1066 and remaining stable, so I simply stayed at 800 for stability's sake. Then again, I don't play games much, so I'm not concerned about squeezing out an extra 1-5% performance at the cost of stability.

    Of course, I didn't play with this too much; maybe I was doing something wrong, but I haven't found a good guide saying exactly what I need to set for the system to remain stable.
  • Ensoph42 - Wednesday, April 23, 2008 - link

    I hear you. But is it that your memory is rated at 800 and won't overclock to 1066, or rated for 1066 but simply not stable at that speed?

    However, the GIGABYTE GA-MA78GM-S2H used in the review has a memory standard of DDR2-1066. One of the selling points of the Phenom, I believed, was that its memory controller supported DDR2-1066.

    I found a link that takes a look at the performance differences. I haven't given it a close read since it's late, and it seems limited, but:

    http://www.digit-life.com/articles3/mainboard/ddr2...

  • perzy - Wednesday, April 23, 2008 - link

    Well, unless the software is as well written as Unreal Engine 3 (Tim Sweeney is a god), in 95% of the programs the average Joe uses (including Windows itself) there is very little advantage even in going from single-core to dual-core!
    Which brings me to my question: what's really going on with the "heat wall" or "frequency wall" or whatever you call it, the one Intel hit so hard around 2004? (Remember the throttling, super-hot 3.8GHz P4s?)
    What all users really need is higher frequency! Why aren't we getting it?
  • Clauzii - Wednesday, April 23, 2008 - link

    The frequency wall can be thought of as cooking the electrons off the die surface, which is not so good. High frequency = high heat (a rough formula is sketched at the end of this comment). Now you're thinking, "But IBM got 6GHz?" That's a different design philosophy: simpler pipeline, higher frequency.

    Until ALL programs and OSes are written multi-threaded, we are bound to single-threaded execution, OR to pseudo-multithreading like Hyper-Threading, which can do SOME multithreading depending on the code and data used.

    If I'd written this on an 8GHz machine, I would still only see the cursor blink once per second.

    What I think we need are (even more!) intelligent CPUs (and GPUs). If a CPU or GPU knew approximately what kind of performance and power a program needs, a more sophisticated power scheme could be possible.

    Until then, all I know is that evolution goes forward. Not always fast, but forward :)
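    On the "high frequency = high heat" point: the standard first-order relation for CMOS switching power makes the wall concrete. A rough sketch in LaTeX (the 20%/10% numbers below are illustrative, not measured):

        % First-order dynamic (switching) power of CMOS logic:
        %   \alpha = activity factor, C = switched capacitance,
        %   V = supply voltage, f = clock frequency
        P_{\mathrm{dyn}} \approx \alpha C V^{2} f
        % Higher f usually requires higher V, so power grows faster than
        % linearly in f: a 20\% clock bump with a 10\% voltage bump costs
        % roughly $1.2 \times 1.1^{2} \approx 1.45$ times the power.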
  • Nehemoth - Wednesday, April 23, 2008 - link

    ...(or give up 200MHz and get a quad-core X3 9550 at the same price)...

    Should be quad-core X4, not X3.
  • MrBlastman - Wednesday, April 23, 2008 - link

    I'd like to see them include the X2 6400 in the benchmarks as well. I see that they might be trying to keep the pricing in line, but as an AMD user looking to upgrade, all I'm really considering right now is a 6400, an X3, or an X4, nothing else.

    The UT3 benchmarks offer some hope for the Phenoms, as they show that with a properly coded game the X4s can remain competitive. I still wish they'd release 3+ GHz Phenoms :(
  • ImmortalZ - Wednesday, April 23, 2008 - link

    Any MPEG-4 AVC video encoded at Profile 4.1 or lower is fully accelerated by today's GPUs. Scene releases from the past few months conform to this, and even then, most of the older releases are still compatible provided you use the proper filters.
  • ImmortalZ - Wednesday, April 23, 2008 - link

    * By Profile I meant Level.
