Hardware Changes

We wanted to see the new CPU up close, so we went straight to our original Xbox 360 disassembly instructions, which, surprisingly enough, still work on the new consoles. There are a couple of changes: the only Torx driver we used was a T12, and the Torx screws on the bottom of the console are now a mixture of gold, silver, and black, whereas they used to be just silver and black. Other than those changes, the entire process can be completed just the way we diagrammed it over two years ago.

With the tops off our Xboxes, we can look at some of the changes firsthand:


The original Xbox 360


The new Xbox 360. The additional heatsink you see at the top of the image was originally added in an earlier model of the Xbox 360 and makes another appearance here in the Falcon.

The cooling setup is definitely less beefy than in the original console:


The original Xbox 360; note the heatpipe running through the CPU heatsink on the right. The GPU heatsink is on the left.


The new Xbox 360; note the lack of a heatpipe going through the CPU's heatsink (right) and the additional heatsink for the GPU (bottom right).

It's amazing how little has actually changed with the internal design of the console; obviously some components have changed (e.g., DRAM), but the general layout remains the same after two years.

46 Comments


  • ChristopherO - Sunday, November 18, 2007 - link

    It looked like you were familiar, but I partly mentioned that because other people reading might not have been. You never know when you can win over a convert.

    That's funny you have a Seasonic... We're probably running the same PSU, I have the S12 600, which I got for a steal when Silicon Acoustics went out of business. 2.5" drives are a great way to go, but I'm not willing to give up the performance of a Raptor for one. My Raptor 150 in suspension is near silent. I'm on Vista 64, with 4GB memory, so that helps the seek situation... Vista caches everything it can at boot time, so once you're through the initial power-on phase, everything is pretty sedate.

    Sure, you can cap the DVD reads, but I use dense foam padding in lieu of the Sonata rails in my chassis, and the net effect is that the drive is vastly quieter. It is the noisiest component of the system, but that doesn't bug me because hardly anything uses discs these days. Typically you install, and then you're done. Or you insert a game and run the "piracy check" and then the drive spins down. Inserting a movie, etc., doesn't spin the drive up, so that's not even audible.
  • AssBall - Saturday, November 17, 2007 - link

    I took my DVD ROM out of my case and, quite frankly, it is significantly louder in my hand than it was when being muffled by the case (no rubber washers).
  • saiku - Friday, November 16, 2007 - link

    Is there something that would wrap around the HD DVD drive? You'd still have to leave the vent holes open, but perhaps there's some material out there that damps noise?
  • ChristopherO - Friday, November 16, 2007 - link

    I haven't seen the inside of the HD DVD add-on. Generally speaking, you'd want to remove the drive and dampen that (perhaps building a custom chassis). Otherwise the HD DVD add-on will have metal-to-metal contact and thus generate noise that you can't isolate. Wrapping the whole thing in foam isn't very practical.
  • swaaye - Friday, November 16, 2007 - link

    It's the damn DVDROM. They have that thing cranking at full RPMs almost all of the time. Hell, I've heard the disk come out of its grip once and spin out inside. It's ridiculous, IMO.
  • provoko - Friday, November 16, 2007 - link

    Thanks for opening up a 360 for us and testing it. =) I enjoyed the wattage charts, the same ones you use for CPUs.
  • semo - Friday, November 16, 2007 - link

    What is a half-node?

    and
    quote:

    If we assume that the Xbox 360's GPU is at least as powerful as the PS3's, the cooling requirements should be somewhere similar; given that the PS3 basically had a GeForce 7800 GTX under its hood
    It still has, hasn't it? When will the PS3 get a GPU shrink, btw?
  • ChristopherO - Friday, November 16, 2007 - link

    A half-node is basically a die shrink that doesn't require reworking the component.

    For instance, you can't take a 90nm chip and convert it to 65nm without redesigning the chip. Sure, the end product would be substantially the same, but you would need to rework a considerable portion to fit the new size. 65nm to 45nm is the same: you need to redesign your chip.

    The 80nm shrink lets you use the same design as 90nm, but smaller.

    More than likely the GPU is an 80nm design.

    For example, shrinking from 90 to 80 gives you 88.8% size (which is close to Anand's measured 85%). Shrinking from 90 to 65 is 72.2% the size, however this will be less exact than the half-node since the chip itself is going to be somewhat different (75% is a good enough estimate). Both these sizes will result in a decent heat and power savings.

    No one can say if this will kill off the RROD, but it stands a pretty good chance.

    I'm willing to guess that the failure rates will drop to something normal (a single-digit percentage). No guarantee on that, but they trimmed off 70 watts of power usage; that's a pretty remarkable savings, which will go a long way towards improving the overall heat situation. Not to mention that their GPU heatsinks are substantially beefier, so you're dissipating less heat over a larger area. Even if the GPU were the same, it would have less contention with the CPU for air cooling.

    Too bad they didn't mount the DVD ROM on rubber grommets. It isn't bad when properly isolated, but the metal case is acting like an amplifier.
  • psychobriggsy - Sunday, November 18, 2007 - link

    Um, 90nm to 80nm results in a theoretical die area of 79% of the 90nm one - remember there are two dimensions to the shrink.

    90nm to 65nm results in a die area of 52% of the 90nm one, ideally (a quick check of both figures follows this comment).

    Scaling never achieves that due to factors such as the spacing between transistors not scaling as well, or the shrink not scaling well in one axis compared to the other, or a million other reasons. Hence I believe that the GPU is 80nm, and the CPU is 65nm.

    The shrink should indeed reduce or halt the RROD situation if it was caused by excessive heat leading to warping that broke contacts.
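
For anyone who wants to verify the die-area figures above, here is a minimal Python sketch of the ideal-scaling arithmetic. It assumes perfect two-dimensional scaling, and the helper name is ours for illustration:

    # Ideal die-area scaling: area shrinks with the square of the
    # linear feature-size ratio, since a die shrinks in two dimensions.

    def ideal_area_ratio(old_nm, new_nm):
        """Theoretical new die area as a fraction of the old die area."""
        return (new_nm / old_nm) ** 2

    for old, new in [(90, 80), (90, 65)]:
        print(f"{old}nm -> {new}nm: {ideal_area_ratio(old, new):.0%} of the original area")

    # Prints:
    # 90nm -> 80nm: 79% of the original area
    # 90nm -> 65nm: 52% of the original area

As noted above, real shrinks never hit these ideal figures, which squares with the GPU being a straightforward 80nm half-node shrink and the CPU being a reworked 65nm design.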
  • ChristopherO - Monday, November 19, 2007 - link

    Whoops, you were right. I typed fast and forgot to multiply the reduction by 2.

    In theory the 80nm shrink should scale more linearly since it is fundamentally similar to the 90nm part. The 65nm part has the potential to be significantly different. I guess the numbers bear that out; the GPU is much closer to predicted than the CPU.

    It makes me wonder what else they might have changed. In theory they could have made alterations to the chip that increased media performance, etc., without impacting general computations and expected behavior within a game.

    It's a weird quandary, since they have a baseline level of performance that they aren't trying to diverge from.
