The Xbox 360 GPU: ATI's Xenos

On a purely hardware level, ATI's Xbox 360 GPU (codenamed Xenos) is quite interesting. The part is made up of two physically distinct silicon ICs. One IC is the GPU itself, which houses all the shader hardware and most of the processing power. The second IC (which ATI refers to as the "daughter die") is a 10MB block of embedded DRAM (eDRAM) combined with the hardware necessary for z and stencil operations, color and alpha processing, and anti-aliasing. This daughter die is connected to the GPU proper via a 32GB/sec interconnect. Data sent over this bus is compressed, so usable bandwidth is higher than 32GB/sec. Inside the daughter die, between the processing hardware and the eDRAM itself, bandwidth is 256GB/sec.

At this point in time, much of the bandwidth consumed by graphics hardware goes to moving color and z data to the framebuffer. ATI hopes to eliminate this bottleneck by moving that processing, and the back framebuffer, off the main memory bus. Main memory is 512MB of 700MHz GDDR3 on a 128-bit bus (which works out to just over 22GB/sec of bandwidth). This is less bandwidth than current desktop graphics cards have available, but by offloading the work and bandwidth for color and z to the daughter die, ATI saves itself a good deal of bandwidth. The 22GB/sec is left for textures and the rest of the system (the Xbox 360 implements a single pool of unified memory).
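As a quick sanity check on that figure (our arithmetic, not an ATI spec sheet), the peak bandwidth of a double-data-rate bus is just its clock, times two transfers per clock, times the bus width in bytes:

```python
# Peak bandwidth of a double-data-rate (DDR/GDDR) memory bus:
# clock (Hz) x 2 transfers per clock x bus width (bytes)
def ddr_bandwidth_gb_s(clock_mhz: float, bus_width_bits: int) -> float:
    return clock_mhz * 1e6 * 2 * (bus_width_bits / 8) / 1e9

# Xbox 360 main memory: 700MHz GDDR3 on a 128-bit bus
print(ddr_bandwidth_gb_s(700, 128))  # 22.4 -> the "just over 22GB/sec" above
```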

The GPU essentially acts as the Northbridge for the system and sits in the middle of everything. Between the GPU and the CPU there is 10.8GB/sec of bandwidth in each direction, while the rest of the system is hooked in with 500MB/sec up and down. The high bandwidth to the CPU is quite useful, as the GPU is able to read directly from the CPU's L2 cache. In the console world, the CPU and GPU are quite tightly linked, and the Xbox 360 stands to continue that tradition.

Weighing in at 332M transistors, the Xbox 360 GPU is quite a powerful part, but its architecture differs from that of current desktop graphics hardware. For years, vertex and pixel shader hardware have been implemented separately, but ATI has sought to combine their functionality in a unified shader architecture.

What's A Unified Shader Architecture?

The GPU in the Xbox 360 uses a different architecture than we are used to seeing. To be sure, vertex and pixel shader programs will run on the part, but not on separate segments of the hardware. Vertex and pixel processing differ in purpose, but there is quite a bit of overlap in the type of hardware needed to do both. The unified shader architecture that ATI chose for its Xbox 360 GPU packs more functionality onto fewer transistors, as less hardware needs to be duplicated across different parts of the chip; both vertex and pixel programs run on the same hardware.
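To make the idea concrete, here is a deliberately simplified sketch (ours, not ATI's instruction set or microcode) of a single unit that is indifferent to whether its registers hold a vertex position or a pixel color:

```python
# Hypothetical model of one unified shader unit. Names and structure are
# illustrative only; real units execute compiled shader microcode.
def unified_shader_unit(vec4, scalar, vec_op, scalar_op):
    """Issue one 4-wide vector op and one scalar op, as each unit does per clock."""
    return [vec_op(c) for c in vec4], scalar_op(scalar)

# One clock it can be handed vertex work (e.g. scaling a position)...
position, w = unified_shader_unit([1.0, 2.0, 3.0, 1.0], 1.0,
                                  lambda c: c * 2.0, lambda s: s + 0.5)
# ...and the next clock pixel work (e.g. modulating a color); no vertex-only
# or pixel-only silicon sits idle in either case.
color, alpha = unified_shader_unit([0.2, 0.4, 0.6, 1.0], 1.0,
                                   lambda c: c * 0.5, lambda s: max(s, 0.0))
```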

There are 3 parallel groups of 16 shader units each, and each of the three groups can operate on either vertex or pixel data. Each shader unit is able to perform one 4-wide vector operation and one scalar operation per clock cycle. Current ATI hardware is able to perform two 3-wide vector and two scalar operations per cycle in the pixel pipe alone, while the vertex pipeline of R420 is 6 pipes wide and can do one vec4 and one scalar op per cycle. If we look at straight-up processing power, this gives R420 the ability to crunch 158 components per clock (30 of which are 32-bit, while 128 are limited to 24-bit precision). The Xbox 360 GPU is able to crunch 240 32-bit components in its shader units per clock cycle. While this is a 51% increase in the number of ops that can be done per cycle (as well as a general increase in precision), we can't expect these 48 pipelines to act like 3 sets of R420 pixel pipelines. All things being equal, this increase (when only looking at ops/cycle) would make the part only about as powerful as a 24-pipe R420.
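To make the component math explicit, here is the tally behind the 158 and 240 figures; the per-pipe throughputs are from the comparison above, while the arithmetic below is our own bookkeeping:

```python
# R420: 16 pixel pipes at two 3-wide vector + two scalar ops each (24-bit),
# plus 6 vertex pipes at one 4-wide vector + one scalar op each (32-bit).
r420_pixel = 16 * (2 * 3 + 2)      # 128 components/clock, 24-bit
r420_vertex = 6 * (4 + 1)          # 30 components/clock, 32-bit
r420 = r420_pixel + r420_vertex    # 158 components/clock total

# Xenos: 48 unified units at one 4-wide vector + one scalar op each, all 32-bit.
xenos = 48 * (4 + 1)               # 240 components/clock

print(r420, xenos, round(xenos / r420 - 1, 2))  # 158 240 0.52 (the ~51% cited above)
```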

What will make or break the difference between something like a 24-pipe R420 and the unified shaders of the Xbox 360 GPU is how well applications lend themselves to the adaptive nature of the hardware. Current configurations don't have nearly the same vertex processing power as pixel processing power, which is quite logical when we consider that games display many more pixels than vertices: for each geometry primitive, there are likely a good number of pixels involved. Of course, not all titles need the same ratio of geometry to pixel power. On Xenos, every op per clock could be dedicated to geometry processing in truly polygon-intense scenes; on the flip side (and more likely), any given clock cycle could see all 240 ops used for pixel processing. If game designers realize this and code their shaders accordingly, we could see much more focused processing power dedicated to a single type of problem than on current hardware.
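As a purely conceptual sketch of that adaptive behavior (our illustration of the idea only; ATI has not disclosed the actual arbitration logic), imagine the three 16-unit groups being handed out each cycle in proportion to demand:

```python
# Hypothetical per-cycle scheduler for Xenos' three groups of 16 shader units.
# Purely illustrative; this is not ATI's actual hardware arbiter.
def split_groups(vertex_load: float, pixel_load: float, groups: int = 3):
    """Assign whole shader groups to vertex vs. pixel work by relative load."""
    total = vertex_load + pixel_load
    if total == 0:
        return 0, 0
    vertex_groups = round(groups * vertex_load / total)
    return vertex_groups, groups - vertex_groups

print(split_groups(1.0, 0.0))  # (3, 0): a geometry-only pass claims all 48 units
print(split_groups(0.1, 0.9))  # (0, 3): a pixel-bound frame claims all 48 units
```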

ATI is predicting that developers will use lots of very small triangles in Xbox 360 games. Since engines like Epic's Unreal Engine 3 have shown incredible results using pixel shaders and normal maps to augment low geometric detail, we can't tell whether ATI is providing the chicken or the egg. In other words, will we see many small triangles on the Xbox 360 because console developers are moving in that direction, or because that is what will run well on ATI's hardware?

Regardless of which way the causality runs, it is obvious that the Xbox 360 will be a geometry powerhouse. Not only can all 3 blocks of 16 shaders become vertex shaders, but ATI's GPU is able to handle twice as many z operations when a z-only pass is performed. The same is true of current ATI and NVIDIA hardware, but the fact that a geometry-only pass can now make use of shader hardware to perform 48 vector and 48 scalar operations in any given clock cycle, while doing twice the z operations, is quite intriguing. This could allow for some very geometrically complex scenes.

Comments

  • Doormat - Friday, June 24, 2005

    @#22: Yes, 1080P is an OFFICIAL ATSC spec. There are 18 different video formats in the ATSC specification. 1080/60P is one of them.

    FWIW, even the first 1080P TVs coming out this year will *NOT* accept 1080P over HDMI. Why? I dunno. The TVs will upscale everything to 1080P (from 1080i, 720p, etc.), but they can't accept 1080P input. Some TVs will be able to do it over VGA (the Samsung HLR-xx68/78/88s will), but still, that's not the highest quality input.
  • Pastuch - Friday, June 24, 2005

    RE: 1080P
    "We do think it was a mistake for Microsoft not to support 1080p, even if only supported by a handful of games/developers."

    I couldn't disagree more. At the current rate of HDTV adoption, we'll be lucky if half of Xbox 360 users have 1280x720 displays by 2010. Think about how long it took for us to get past 480i. Average Joe doesn't like to buy new TVs very often. Unless 1080P HDTVs drop to $400 or less, no one will buy them for a console. We, the eager geeks of Anandtech, will obviously have 42-inch widescreen 1080P displays, but we are far from the Average Joe.

    RE: Adult Gamers

    Anyone who thinks games are for kids needs a wakeup call. The largest player base of gamers is around 25 years old right now. By 2010 we will be daddies looking for our next source of interactive porn. I see mature, sexually oriented gaming taking off around that time. I honestly believe that videogames will have the popularity of television in the next 20 years. I know a ton of people who don't have cable TV, but they do have cable internet, a PC, an Xbox, a PS2, and about a million games for each device.
  • Pannenkoek - Friday, June 24, 2005

    #19 fitten: That's the whole point: people pretend that even rotten fruit lying on the ground is "hard" to pick up. It's not simply about restructuring algorithms to accommodate massive parallelism, but also how it will supposedly take ages and how no current game could be patched to run multithreaded on a mere dual core system.

    Taking advantage of parallelism is a hot topic in computer science as far as I can tell, and there are undoubtedly many interesting challenges involved. But that's no excuse for not being able to simply multithread a simple application.

    And before people cry that game engines are comparable to rocket science (pointing to John Carmack's endeavours) and are the bleeding edge of software technology, I'll say that's simply not the reality, and even less of an excuse not to take advantage of parallelism.

    Indeed, game developers are not making that excuse and will come out with multithreaded games once we have enough dual core processors and their new games stop being videocard limited. Only Anandtech thinks that multithreading is a serious technical hurdle.

    This and those bloody obnoxious "sponsored links" all through the text of articles are the only serious objections I have towards Anandtech.
  • jotch - Friday, June 24, 2005

    #26 - Yeah, I know that happens all over, but I was just commenting on the fact that the console's market is mainly teens and adults, not mainly kids.
  • expletive - Friday, June 24, 2005

    "If you’re wondering whether or not there is a tangible image quality difference between 1080p and 720p, think about it this way - 1920 x 1080 looks better on a monitor than 1280 x 720, now imagine that blown up to a 36 - 60” HDTV - the difference will be noticeable."

    This statement should be further qualified. There is only a tangible benefit to 1080p if the display device is native 1080p resolution. Otherwise, the display itself will scale the image down to its native resolution (i.e. 720p for most DLP televisions). If your display is native 720p, then you're better off outputting 720p because all that extra processing is being wasted.

    There are only a handful of TVs that support native 1080p right now, and they are all over $5k.

    These points are really important when discussing the real-world applications of 1080p for a game console. The people using this type of device (a $300 game console) are very different from those who go out and buy 7800GTX cards the first week they are released. Based on my reading in the home theater space, less than 10% of the people who own a PS3 will be able to display 1080p natively during its lifecycle (5 years).

    Also, can someone explain how the Xenos unified shaders were distilled from 48 down to 24 in this article? That didn't quite make sense to me...

    John
  • nserra - Friday, June 24, 2005

    I was at the supermarket, and there was a kid (a 12-year-old girl) buying the game that you mention, with a daddy who knows sh*t about games or about looking for the 18+ logo.

    Maybe if they put a pen*s on the box instead of the cartoon girl, some dads would then know the difference between a game for an 8-year-old and one for an 18-year-old.

    #21 I don’t know about your country, but this is what happens in mine, and not only with games.
  • knitecrow - Friday, June 24, 2005

    Would you be able to tell the difference at standard resolution?

    Instead of drawing more pixels on the screen, the Revolution can use that processing power and/or die space for other functions... e.g. shaders.

    If the Revolution opts for an out-of-order processor, something like the PPC970FX, I don't see why it can't be competitive.

    But seriously, all speculation aside, the small form factor limits the amount of heat the components can put out, and thus the processing power of the system.
  • perseus3d - Friday, June 24, 2005

    --"Sony appears to have the most forward-looking set of outputs on the PlayStation 3, featuring two HDMI video outputs. There is no explicit support for DVI, but creating a HDMI-to-DVI adapter isn’t too hard to do. Microsoft has unfortunately only committed to offering component or VGA outputs for HD resolutions."--

    Does that mean that, as it stands now, the PS3 will require an adapter to be used with an LCD monitor, and the X360 won't be usable with an LCD monitor over DVI?
  • Dukemaster - Friday, June 24, 2005

    At least we know Nintendo's Revolution is the loser when it comes to pure power.
  • freebst - Friday, June 24, 2005

    I just wanted to remind everyone that 1080P at 60 frames isn't even an approved ATSC signal. 1080P at 30 and 24 frames is, but not 60. 1280x720 can run at 60, 30, and 24, that is, unless you are running at 50 or 25 frames/sec in Europe.
