Battle 3: Celeron D vs. Sempron

AMD recently introduced its new low-end branded CPU, the Sempron, and as we've already seen it does a wonderful job of outperforming Intel's Celeron D. The margin of victory, however, is far smaller than what we're used to seeing, thanks to a much improved Celeron D. How does the Sempron fare under Doom 3? Let's find out:

Remember that there are two flavors of Sempron: a K7 and a K8 version. The K7 version performs just like an Athlon XP, since it's basically a Thoroughbred core with a 256KB L2 cache. The biggest performance limiter for the K7-based Sempron 2800+ is its lack of an on-die memory controller, which drags its performance down considerably.

But the K8-based Sempron 3100+ does some serious damage, outperforming the Celeron D 335 by an incredible 53%. For a budget Doom 3 system, you will want to steer far away from the Celeron D and towards the Sempron. As we've seen before, Doom 3 is very sensitive to cache size on the Pentium 4; even though the Celeron D and the Sempron both have only a 256KB L2 cache, the Sempron's on-die memory controller helps reduce the impact of such a small cache on Doom 3 performance.
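
To make that reasoning concrete, here is a minimal back-of-the-envelope sketch in Python. The latency and hit-rate figures below are purely hypothetical illustrations, not measured values; the point is simply how lower main-memory latency from an on-die controller softens the penalty of a small L2 cache:

    # Average access time = hit_rate * L2 latency + miss_rate * main memory latency.
    # All numbers are made up for illustration; a small 256KB L2 implies a
    # relatively high miss rate, so main memory latency dominates the average.
    def avg_access_time(l2_hit_rate, l2_latency_ns, memory_latency_ns):
        return l2_hit_rate * l2_latency_ns + (1 - l2_hit_rate) * memory_latency_ns

    hit_rate = 0.90
    off_die_mc = avg_access_time(hit_rate, 10, 150)  # Celeron D-style: memory controller in the chipset
    on_die_mc  = avg_access_time(hit_rate, 10, 90)   # K8 Sempron-style: on-die memory controller

    print(f"Off-die controller: {off_die_mc:.0f} ns average")  # 24 ns
    print(f"On-die controller:  {on_die_mc:.0f} ns average")   # 18 ns

When misses are frequent, every nanosecond shaved off main-memory latency pays back on every miss, which is why the 3100+ holds up so well despite its small cache.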

The winner here is Sempron.

Comments

  • PrinceGaz - Wednesday, August 4, 2004 - link

    The amount of system memory (above 512MB) is unlikely to have any impact on framerate in the timedemo, as I doubt it would need to swap anything out after the first run (which is discarded anyway).

    I found my 128MB graphics card (a Ti4200) gave an almost identical framerate at Low, Medium, and High quality settings in the timedemo even when graphics-card limited, provided aniso was disabled in the driver for High quality mode (which would otherwise use 8x aniso and impact performance in other ways). So increasing the video card memory from 128MB to 256MB will have no effect whatsoever on the timedemo, except maybe at Ultra quality, which I didn't bother testing.
  • Steve Guilliot - Wednesday, August 4, 2004 - link

    #27
    That's the OS load balancing between the two procs. Two D3 threads aren't running at once, which is why the combined utilization of both procs won't go over 100%. (A rough sketch of this effect appears after the comments.)
  • Succorso - Wednesday, August 4, 2004 - link

    Is this review using XP or the XP64 beta with the Athlon 64? Are the benefits the same using 32-bit XP as opposed to 64-bit XP?

    Succorso
  • SignalPST - Wednesday, August 4, 2004 - link

    It's interesting how Doom 3 runs best on the NVIDIA/AMD combo, along with the amazing price/performance that they offer over their competitors.

    The Athlon 64 3000+ is on par with Intel's 3.4GHz EE, while the price difference is $840.

    The GeForce 6800 GT is faster than ATI's X800 XT PE, the price difference being $160.

    So in this scenario, the NVIDIA+AMD combo can save you $1000 and still outperform the ATI+Intel combo.

    Bottom line: for Doom 3 and future Doom 3 engine games, ATI+Intel = losers.
  • cKGunslinger - Wednesday, August 4, 2004 - link

    Yes, I would also like to see some numbers benchmarking 256/384/512/768/1024/etc. MB memory configurations. When does the average system have *enough* RAM to run WinXP and play a game?
  • xtf - Wednesday, August 4, 2004 - link

    Would it be possible to add the cache (and other) specs of the K7s to certain charts?
    Sometimes the 2700 and 2800 are slower than the 2500, and it'd be interesting to know why.
  • tdent1138 - Wednesday, August 4, 2004 - link

    Great article, AT! I'm happy to know my 2.53GHz @ 2.717GHz P4 and 9800 Pro will happily run D3 at 8x6 in medium quality. I can now wait until HL2 at least to upgrade to whatever makes sense at the time (an A64 something, I imagine). Thanks again!
  • Philbill - Wednesday, August 4, 2004 - link

    Great article. Do you plan to give an update with the high-end ATI cards?
    Phil
  • dangereuxjeux - Wednesday, August 4, 2004 - link

    Somehow, I feel ashamed that the Sempron 3100+ crushes my ol' P4 2.4C.... please please please stop publishing articles like this that encourage me to spend any more of my money upgrading to a new AMD platform to go along with my 6800.
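
As an aside on Steve Guilliot's point above about Doom 3's single render thread being bounced between two processors: the effect is easy to reproduce. The snippet below is a rough sketch, not tied to Doom 3 in any way, and it assumes the third-party psutil package is installed. One busy thread migrates across CPUs, so each CPU shows partial load, but the combined utilization stays around one CPU's worth.

    import threading
    import psutil

    def busy_loop():
        # Burn cycles on a single thread, like a single-threaded game loop.
        while True:
            pass

    threading.Thread(target=busy_loop, daemon=True).start()

    for _ in range(5):
        # Per-CPU utilization sampled over one second; the OS may move the
        # busy thread between CPUs, but the total stays near 100% of one CPU.
        per_cpu = psutil.cpu_percent(interval=1.0, percpu=True)
        print(per_cpu, "total =", round(sum(per_cpu), 1))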
