While the PS2's Emotion Engine has a lot of potential, developers have repeatedly stated that the platform is too difficult to program for. With both the GameCube and Xbox using widely available, well-understood CPU platforms, the real competition is between the Cube's Gekko and the Xbox's Intel CPU.

In terms of raw performance, the Celeron 733 (4-way set associative L2) will outperform a 500MHz PowerPC 750 in every synthetic benchmark we've seen. We can only assume that a 733MHz CPU with a 133MHz FSB and an 8-way set associative L2 cache would be faster still than the Gekko, giving the Xbox the CPU performance advantage.
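The associativity figures cited here boil down to how many places a given cache line can live. A higher associativity means fewer sets, each holding more candidate lines, which reduces conflict misses. The sketch below is a generic illustration of that geometry using the Celeron's 128KB L2 as an example; the line size and sample address are arbitrary, not console-specific measurements.

```python
# Illustrative sketch of set-associative cache geometry.
# Sizes are generic examples, not exact Celeron/Gekko specifications.

def cache_sets(cache_bytes, line_bytes, ways):
    """Number of sets in a cache of the given size and associativity."""
    return cache_bytes // (line_bytes * ways)

def set_index(addr, line_bytes, num_sets):
    """Which set a given address maps to."""
    return (addr // line_bytes) % num_sets

# A 128KB L2 with 32-byte lines, as 4-way vs 8-way set associative:
for ways in (4, 8):
    sets = cache_sets(128 * 1024, 32, ways)
    print(f"{ways}-way -> {sets} sets; addr 0x4000 maps to set",
          set_index(0x4000, 32, sets))
```

Doubling associativity halves the number of sets, so addresses that would collide in one set gain more ways to coexist without evicting each other.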

Both platforms have good compiler support, and the tip of the hat goes to IBM's Gekko in terms of having a very flexible ISA.

Where the GameCube clearly comes out on top, however, is heat production and die size. The Gekko produces around a third as much heat as the Xbox CPU and measures in at close to half the die size. This leads to tremendous cost savings in the production of the CPU, which translate into the ability to price the GameCube at $199 instead of $299 like the PlayStation 2 and Xbox.


The PS2's Graphics Synthesizer is entirely too dependent on extreme parallelism to fill its 16 pixel pipelines, which could be the cause of many of the slowdowns we've seen in games for the platform. Many of Electronic Arts' titles have been ported to both GameCube and Xbox, and the first thing everyone seems to notice is that the slowdown problems that existed on the PS2 are now gone.

The GameCube wins in terms of GPU efficiency courtesy of the embedded 1T-SRAM from MoSys. However, the use of a fixed function T&L pipeline is a definite drawback of the GPU. Again, this is another situation in which it would have been beneficial to have ATI's input into the design of the product before it was finalized. It is a shame that ATI acquired ArtX only after the design was completed; otherwise we might have seen a programmable T&L pipeline instead.
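To illustrate the distinction being drawn here: a fixed-function T&L unit hard-wires the per-vertex math (a transform followed by a canned lighting equation), while a programmable pipeline lets the developer supply that per-vertex function. The sketch below is purely conceptual, not a model of Flipper or NV2A behavior:

```python
# Conceptual sketch: fixed-function vs programmable transform & lighting.
# Nothing here is hardware-specific; it only illustrates the idea.

def transform(matrix, v):
    """Multiply a 3x3 matrix by a 3-vector (the fixed transform step)."""
    return [sum(matrix[r][c] * v[c] for c in range(3)) for r in range(3)]

def diffuse(normal, light_dir, color):
    """Fixed lighting equation: clamped dot(N, L) scales the vertex color."""
    d = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return [c * d for c in color]

# Fixed-function hardware always runs the two steps above. Programmable
# T&L instead lets the developer hand the hardware an arbitrary per-vertex
# program:
def run_vertex(v, normal, vertex_program):
    return vertex_program(v, normal)

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
out = run_vertex([1, 2, 3], [0, 0, 1],
                 lambda v, n: (transform(identity, v),
                               diffuse(n, [0, 0, 1], [1, 1, 1])))
print(out)  # ([1, 2, 3], [1.0, 1.0, 1.0])
```

With the fixed pipeline, a developer can only tweak the parameters (matrices, lights, materials); with the programmable one, the function itself is replaceable.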

Raw GPU power and feature set do go to the NV2A core in the Xbox. Games such as Dead or Alive 3 are perfect examples of how easy it is for developers to write custom pixel and vertex shader programs, as well as how great the results can be.

Both Flipper and the NV2A support texture compression, which plays a major role in the use of higher-resolution textures in games. In the GameCube launch titles we've seen a number of lower resolution textures compared to the Xbox launch titles. That could just be a sign of early developers not taking advantage of the technology yet, or it could be due to a lack of main memory bandwidth; it's too early to tell.
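For a sense of scale: S3TC-style formats (the family both consoles' compressed texture formats belong to) pack each 4x4 block of texels into 8 bytes. A quick back-of-the-envelope sketch with generic numbers, not measured game data:

```python
# Back-of-the-envelope sketch of why texture compression matters for
# higher-resolution textures. S3TC/DXT1-style formats store each 4x4
# texel block in 8 bytes (0.5 bytes per texel).

def uncompressed_bytes(w, h, bytes_per_texel=4):   # e.g. 32-bit RGBA
    return w * h * bytes_per_texel

def s3tc_bytes(w, h):                              # 8 bytes per 4x4 block
    return (w // 4) * (h // 4) * 8

w, h = 512, 512
raw, packed = uncompressed_bytes(w, h), s3tc_bytes(w, h)
print(f"{w}x{h}: {raw // 1024}KB raw vs {packed // 1024}KB compressed "
      f"({raw // packed}:1)")  # 1024KB vs 128KB, 8:1
```

An 8:1 reduction against 32-bit texels means a compressed 512x512 texture costs about the same memory and bandwidth as an uncompressed 181x181 one, which is why leaving the feature unused shows up as lower-resolution textures.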

Audio & I/O

The clear winner when it comes to audio is the Xbox. While the GameCube's Dolby Pro Logic II support is great, the format isn't supported by most of today's receivers and lacks many of the benefits of Dolby Digital 5.1.

From an I/O standpoint the Xbox comes out ahead as well because of its built-in hard drive and Ethernet adapter. There have been too many failures of console add-on products in the past to expect incredible success from any add-on to either of the competing consoles. What is interesting to note is that in spite of the hard drive and faster DVD drive, Xbox load times are still not dramatically better than GameCube load times.

We have yet to compare a single title on both platforms to determine which one loads faster (in theory the Xbox should), but current GameCube titles exhibit much quicker load times than Xbox titles.

Comments

  • cubeguy2k5 - Monday, December 20, 2004 - link

I feel that AnandTech's article on Xbox vs PS2 vs GameCube didn't go in depth enough, guessed at too many things, and intentionally got others wrong. I'm not sure where to discuss this, but I would like to get a thread going...

    "However details on this processor are sketchy at best but the information we've been able to gather points at a relatively unmodified PowerPC 750CXe microprocessor " - where did they gather this from? Gekko isn't a PPC 750CXe or it would be marked as such.

    "The Flipper graphics core is a fairly simple fixed function GPU aided by some very powerful amounts of memory bandwidth, but first onto the architecture of the graphics core. Flipper always operates on 4 pixels at a time using its 4 pixel pipelines; each of those pipelines is capable of applying one texture per pipeline which immediately tips you off that the ArtX design wasn't influenced by ATI at all. Since the Radeon and GeForce2, both ATI and NVIDIA's cores have been able to process a minimum of two textures per pixel in each of their pipelines which came quite in handy since none of today's games are single textured anymore." - who told them that the GameCube only has one texture unit per pipeline? It wasn't Nintendo; I could just as easily say it has 2, doubling texel bandwidth... Who said it was fixed function?

    "Planet GameCube: In a recent IGNinsider article, Greg Buchner revealed that Flipper can do some unique things because of the ways that the different texture layers can interact. Can you elaborate on this feature? Have you used it? Do you know if the effects it allows are reproducible on other architectures (at decent framerates)?

    Julian Eggebrecht: He was probably referring to the TEV pipeline. Imagine it like an elaborate switchboard that makes the wildest combinations of textures and materials possible. The TEV pipeline combines up to 8 textures in up to 16 stages in one go. Each stage can apply a multitude of functions to the texture - obvious examples of what you do with the TEV stages would be bump-mapping or cel-shading. The TEV pipeline is completely under programmer control, so the more time you spend on writing elaborate shaders for it, the more effects you can achieve. We just used the obvious effects in Rogue Leader with the targeting computer and the volumetric fog variations being the most unusual usage of TEV. In a second generation game we’ll obviously focus on more complicated applications."

    The TEV pipeline is completely under programmer control, so the more time you spend on writing elaborate shaders for it, the more effects you can achieve. COMPLETELY UNDER PROGRAMMER CONTROL MEANS NOT FIXED FUNCTION, and on fixed function GPUs you cannot do advanced shader effects in realtime, can you? Rogue Leader and Rebel Strike use them EXTENSIVELY... AnandTech... where's your explanation?

    I'll provide more examples later...

    "Julian Eggebrecht: Maybe without going into too much detail, we don’t think there is anything visually you could do on X-Box (or PS2) which can’t be done on GameCube. I have read theories on the net about Flipper not being able to do cube-mapped environment maps, fur shading, self-shadowing etc... That’s all plain wrong. Rogue does extensive self-shadowing and both cube-maps and fur shading are not anymore complicated to implement on GameCube than on X-Box. You might be doing it differently, but the results are the same. When I said that X-Box and GameCube are on par power-wise I really meant it. " Looks like a PROVEN DEVELOPER just proved AnandTech WRONG... nice. Factor 5 was involved in the creation of the Cube; they know it better than ANYONE else, including AnandTech...

    Come on AnandTech, I know you see this article... what about this?

    You clearly state that you believe the Xbox is a generation ahead of the GameCube technically, when you could NOT do any of the shader effects, nor the amount of bump-mapping that's in even Rogue Leader, on a pre-GF3 GPU, let alone Rebel Strike... What about the water effects in Rebel Strike, Mario Sunshine, and Wave Race? I believe that in 2001 not one game, even on PC, came even CLOSE to Wave Race in terms of how the water looked and the physics behind it, and in 2002 there wasn't one game close to Mario Sunshine as far as water goes. Wow! What about all the nice fully dynamic lighting in RE4 and Rebel Strike? You couldn't pull that off on a fixed function GPU, could you? Apparently they can't even pull it off on Xbox, when Halo 2 has massive slowdown, mostly static lighting, and an abysmal polygon count, coupled with LOD pop-in and various other problems/faked effects... Nice. What about Ninja Gaiden? Same story: good character models, very bad textures, nonexistent lighting, shadows that seem to react to nonexistent light sources that exist inside walls... Cute...


    Nice textures and lack of lighting... low polycount and invisible light sources that seem to only allow Ryu to cast shadows, not the environment. Wow... What about the faked reflections used in the game? Neat.
  • Cooe - Tuesday, August 18, 2020 - link

    The fanboy delusions are strong with this one...
  • Arkz - Saturday, September 17, 2011 - link

    "the other incorrectly labeled digital AV (it's still an analog signal) for component connections."

    Wrong, it's purely digital. The component cable has a DAC chip in the connector block. Technically they could make a DVI cable for it.
  • Arkz - Saturday, September 17, 2011 - link

    And the GC CPU is 485, not 500.
  • ogamespec - Thursday, August 8, 2013 - link

    Actually, Gekko's speed is 486MHz (162 x 3).

    And the GameCube GPU's (Flipper) TEV is fixed-stage. No custom shaders.
  • techFan1988 - Wednesday, May 4, 2022 - link

    Mmmm, I understand that we now have much better information than back then, but I find this piece of the article a bit skewed towards the Xbox (or against the GC).
    There are a couple of aspects that are factually wrong, for example:
    "However from all of that data that we have seen comparing the PowerPC 750 to even the desktop Intel Celeron processor, it does not seem that the Gekko can compete, performance-wise."

    The original PowerPC 750 didn't even have on-die L2 cache, so saying "it doesn't compete with a Celeron Coppermine processor" is absolutely unfair (it would be like comparing the first versions of the P3 -the ones running at 500MHz- with the Coppermine ones).

    Grabbing the original PPC 750 and comparing it to a Coppermine Celeron 128 (the ones based on the P3 architecture, and the one feeding the Xbox -although with a faster bus, comparable to that of a regular P3) is not a fair comparison.

    At least, since this was a modification of the PPC750 CXe (and not the original PPC750), the author of the article should have compared that CPU to the Celeron and not the original PPC 750.

    I mean, the difference between the first-gen P3 and the Coppermine P3 was even bigger than the difference between the P2 and P3, just because of the integrated L2 cache!
    How could this factor be ignored when comparing GC's and Xbox's CPUs?
