Flipper makes noises

Flipper is also home to a custom Macronix DSP that essentially does the job of NVIDIA's APU in the Xbox. The difference is that the Macronix DSP is not powerful enough to perform real-time Dolby Digital encoding without a significant latency penalty. On the Xbox, the latency induced by the encoding is minimal at worst; in our extensive testing of the nForce APU we could not detect any induced delays.

Nintendo doesn't provide a digital output on the console itself, so there's no way for a developer to perform real-time encoding even if they felt they had the extra power left over to do so, which is exactly what EA did with DTS encoding on the PlayStation 2. We have seen one developer go beyond the regular stereo or Pro Logic audio output: Factor 5 implemented Dolby Pro Logic II in Star Wars Rogue Squadron II.

Dolby's Pro Logic II is an algorithm that extracts 5.1 audio from a stereo signal by comparing the differences and similarities between the two channels. It is known as a matrix surround decoder because it produces more channels than exist in the original signal. Although it's not nearly as good as the discrete 5.1 signals of Dolby Digital or DTS, it's far better than the original Pro Logic, which has its roots in the 1970s. It's also backwards and forwards compatible with the original Pro Logic, meaning that a Pro Logic II encoded signal can be played back on a Pro Logic receiver and vice versa.
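
To make the "differences and similarities" idea concrete, here is a minimal sketch of a sum/difference matrix encoder and decoder. The coefficients and function names are illustrative only; Dolby's actual Pro Logic II processing adds phase shifts, steering logic, and different surround gains.

```python
def matrix_encode(left, right, center, surround):
    """Fold four channels into a stereo pair (Lt/Rt), one sample at a time."""
    lt = left + 0.707 * center + 0.707 * surround
    rt = right + 0.707 * center - 0.707 * surround
    return lt, rt

def matrix_decode(lt, rt):
    """Derive extra channels back out of the stereo pair."""
    center = 0.5 * (lt + rt)    # similarity between the channels
    surround = 0.5 * (lt - rt)  # difference between the channels
    return center, surround
```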

In the case of the GameCube, the Pro Logic II encoded signal can also be played back on a stereo or Pro Logic device for compatibility, although the Pro Logic decoded version will only have a single, band-limited mono surround channel (the Pro Logic decoder applies a 7kHz low-pass filter to its surround output). Further, channel separation is not nearly as good with a standard Pro Logic decoder.
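
That band-limiting can be approximated with a simple first-order low-pass filter like the sketch below; a real Pro Logic decoder uses a much steeper filter, and the 48kHz sample rate here is just an assumption for the example.

```python
import math

def lowpass_7khz(samples, sample_rate=48000.0, cutoff_hz=7000.0):
    """First-order IIR low-pass: attenuates content above ~7kHz."""
    alpha = math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    filtered, y = [], 0.0
    for x in samples:
        y = (1.0 - alpha) * x + alpha * y  # smooth toward the input
        filtered.append(y)
    return filtered
```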

A discrete 5.1 signal, on the other hand, is full bandwidth on the five main channels and by its very nature supports complete channel separation, allowing a specific sound to be targeted at a specific channel.

Pro Logic II is only available on some of the very latest surround sound receivers, while Dolby Digital has been available on most receivers for the past few years. Pro Logic II will eventually be a standard on all receivers, but probably not for another year or two. Any receiver with Pro Logic II will also have Dolby Digital and DTS support. The original Pro Logic has been included on just about every surround sound receiver over the past 15 years.

An interesting aspect of the audio processor is that it is connected to 16MB of DRAM via an 8-bit memory bus running at 81MHz (half of Flipper's 162MHz operating frequency), yielding just 81MB/s of bandwidth to this audio DRAM. Since 16MB is far more memory than audio processing requires, developers are free to use the rest of it as general storage for data that doesn't need much memory bandwidth.
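
The 81MB/s figure falls straight out of the bus width and clock; a quick back-of-the-envelope check (variable names are ours):

```python
bus_width_bits = 8
bus_clock_hz = 81_000_000  # half of Flipper's 162MHz core clock

# One transfer per clock cycle at 8 bits = 1 byte per cycle.
bandwidth_bytes_per_s = (bus_width_bits // 8) * bus_clock_hz
print(bandwidth_bytes_per_s / 1_000_000, "MB/s")  # -> 81.0 MB/s
```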

Comments

  • cubeguy2k5 - Monday, December 20, 2004

    I feel that AnandTech's article on Xbox vs. PS2 vs. GameCube didn't go in depth enough, guessed at too many things, and intentionally got others wrong. I'm not sure where to discuss this, but I'd like to get a thread going...

    "However details on this processor are sketchy at best but the information we've been able to gather points at a relatively unmodified PowerPC 750CXe microprocessor " - where did they gather this from? gekko isnt a PPC 750CXE or it would be marked as such.

    "The Flipper graphics core is a fairly simple fixed function GPU aided by some very powerful amounts of memory bandwidth, but first onto the architecture of the graphics core. Flipper always operates on 4 pixels at a time using its 4 pixel pipelines; each of those pipelines is capable of applying one texture per pipeline which immediately tips you off that the ArtX design wasn't influenced by ATI at all. Since the Radeon and GeForce2, both ATI and NVIDIA's cores have been able to process a minimum of two textures per pixel in each of their pipelines which came quite in handy since none of today's games are single textured anymore." - who told them that gamecube only has one texture unit per pipeline? it wasnt nintendo, i could just as easily say it has 2, doubling texel bandwidth....... who said it was fixed function?

    "Planet GameCube: In a recent IGNinsider article, Greg Buchner revealed that Flipper can do some unique things because of the ways that the different texture layers can interact. Can you elaborate on this feature? Have you used it? Do you know if the effects it allows are reproducible on other architectures (at decent framerates)?

    Julian Eggebrecht: He was probably referring to the TEV pipeline. Imagine it like an elaborate switchboard that makes the wildest combinations of textures and materials possible. The TEV pipeline combines up to 8 textures in up to 16 stages in one go. Each stage can apply a multitude of functions to the texture - obvious examples of what you do with the TEV stages would be bump-mapping or cel-shading. The TEV pipeline is completely under programmer control, so the more time you spend on writing elaborate shaders for it, the more effects you can achieve. We just used the obvious effects in Rogue Leader with the targeting computer and the volumetric fog variations being the most unusual usage of TEV. In a second generation game we’ll obviously focus on more complicated applications."
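
    A minimal sketch of the combiner form each TEV stage implements, roughly out = d + lerp(a, b, c); the toy inputs and two-stage chain below are placeholders, not Flipper's actual register model:

    ```python
    def tev_stage(a, b, c, d):
        """One TEV-style combiner stage: out = d + lerp(a, b, c)."""
        return d + a * (1.0 - c) + b * c

    # Stages chain together: each can read textures, colors, or the
    # previous stage's output. Toy example: modulate a texel by a
    # light value, then add an emissive term in a second stage.
    texel, light, emissive = 0.8, 0.6, 0.1
    prev = tev_stage(a=0.0, b=texel, c=light, d=0.0)    # texel * light
    prev = tev_stage(a=prev, b=0.0, c=0.0, d=emissive)  # + emissive
    ```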

    "The TEV pipeline is completely under programmer control, so the more time you spend on writing elaborate shaders for it, the more effects you can achieve." Completely under programmer control means NOT fixed function, and on fixed-function GPUs you cannot do advanced shader effects in real time, can you? Rogue Leader and Rebel Strike use them extensively... AnandTech, where's your explanation?

    I'll provide more examples later...



    "Julian Eggebrecht: Maybe without going into too much detail, we don’t think there is anything visually you could do on X-Box (or PS2) which can’t be done on GameCube. I have read theories on the net about Flipper not being able to do cube-mapped environment maps, fur shading, self-shadowing etc... That’s all plain wrong. Rogue does extensive self-shadowing and both cube-maps and fur shading are not anymore complicated to implement on GameCube than on X-Box. You might be doing it differently, but the results are the same. When I said that X-Box and GameCube are on par power-wise I really meant it. " looks like a PROVEN DEVELOPER just proved anandtech is WRONG... nice..... factor5 was involved in the creation of cube, they know it better than ANYONE else, including anandtech....


    Come on AnandTech, I know you see this article... what about this?

    You clearly state that you believe the Xbox is a generation ahead of the GameCube technically, when you could not do any of the shader effects, nor the amount of bump mapping that's in even Rogue Leader, on a pre-GF3 GPU, let alone Rebel Strike. What about the water effects in Rebel Strike, Mario Sunshine, and Wave Race? I believe that in 2001 not one game, even on PC, came close to Wave Race in terms of how it looked and the physics behind it, and in 2002 there wasn't one game close to Mario Sunshine as far as water goes. Wow! What about all the nice fully dynamic lighting in RE4 and Rebel Strike? You couldn't pull that off on a fixed-function GPU, could you? Apparently they can't even pull it off on the Xbox, when Halo 2 has massive slowdown, mostly static lighting, an abysmal polygon count, LOD pop-in, and various other problems and faked effects. What about Ninja Gaiden? Same story: good character models, very bad textures, nonexistent lighting, and shadows that seem to react to nonexistent light sources inside of walls... Cute.

    http://www.geocities.com/cube_guy_2k5/ng3.jpg

    Nice textures and lack of lighting... a low polygon count and invisible light sources that seem to let only Ryu cast shadows, not the environment. Wow. What about the faked reflections used in the game? Neat.
  • Cooe - Tuesday, August 18, 2020

    The fanboy delusions are strong with this one...
  • Arkz - Saturday, September 17, 2011

    "the other incorrectly labeled digital AV (it's still an analog signal) for component connections."

    Wrong, it's purely digital. The component cable has a DAC chip in the connector block; technically they could make a DVI cable for it.
  • Arkz - Saturday, September 17, 2011

    And the GC CPU is 485MHz, not 500.
  • ogamespec - Thursday, August 8, 2013

    Actually, Gekko's speed is 486MHz (162 x 3).

    And the GameCube GPU's (Flipper's) TEV is fixed-stage; no custom shaders.
  • techFan1988 - Wednesday, May 4, 2022

    Mmmm, I understand that we now have much better information than back then, but I find this piece of the article a bit skewed towards the Xbox (or against the GC).
    There are a couple of aspects that are factually wrong, for example:
    "However from all of that data that we have seen comparing the PowerPC 750 to even the desktop Intel Celeron processor, it does not seem that the Gekko can compete, performance-wise."

    The original PowerPC 750 didn't even have on-die L2 cache, so saying "it doesn't compete with a Celeron Coppermine processor" is absolutely unfair (it would be like comparing the first versions of the P3, the ones running at 500MHz, with the Coppermine ones).

    Grabbing the original PPC 750 and comparing it to a Coppermine Celeron 128 (the one based on the P3 architecture and the one powering the Xbox, although with a faster bus comparable to that of a regular P3) is not a fair comparison.

    At the very least, since Gekko was a modification of the PPC750CXe (and not the original PPC750), the author of the article should have compared that CPU to the Celeron, not the original PPC 750.

    I mean, the difference between the first-gen P3 and the Coppermine P3 was even bigger than the difference between the P2 and P3, just because of the integrated L2 cache!
    How could this factor be ignored when comparing the GC's and Xbox's CPUs?
