The Red Ring of Death

Earlier this year, Microsoft announced an extension of the Xbox 360's warranty from 1 to 3 years for consoles affected by the infamous Red Ring of Death (RRoD) defect. Microsoft never confirmed what actually caused the RRoD, or how many consoles would ultimately be affected, but the symptoms are very well known. Your console will start to freeze/lock up, eventually followed by three red lights on the front of the system, after which you'll either be able to revive the box for short periods of time or it becomes an expensive piece of modern art.

The present solution to RRoD is pretty simple: you call Microsoft's support hotline, give the representative some information about your Xbox 360 (he/she will then walk you through some diagnostic steps, nothing too painful), and a few days later you'll find an empty box at your doorstep. Toss in your Xbox 360, affix the pre-paid shipping label (Microsoft even provides tape to seal the box), and about a month later you'll get a refurbished or brand new Xbox 360, as well as a 30-day pass for Xbox Live. While you're without your console for as much as a month, at least there's no cost incurred; overall, Microsoft takes care of RRoD victims quite well.

Many have surmised that the RRoD problems are caused by inadequate GPU cooling, which results in fractures in the lead-free solder between the chip and the motherboard. We haven't been able to confirm this suspicion, but we have found evidence that Microsoft ignored many suggestions to improve GPU cooling in the Xbox 360, although we're not sure why.

Simply looking at the Xbox 360's internals, you can see that something is wrong with the cooling setup; the heatsink covering the GPU, albeit wide, is barely large enough to cool a low-end desktop graphics card, much less the higher-powered GPU in the Xbox 360. If we assume that the Xbox 360's GPU is at least as powerful as the PS3's, which basically has a GeForce 7800 GTX under its hood, the cooling requirements should be similar. What would require a two-slot cooling solution in a desktop PC was given a barely adequate heatsink in the Xbox 360 and stuck underneath a DVD drive.

Despite the seemingly inadequate cooling, the Xbox 360 worked just fine - the exception being what seemed to be an inordinate number of RRoD failures, but since Microsoft extended the warranty it wasn't a huge problem, just more of an annoyance.

It's possible that simple tweaks in the manufacturing process could reduce the likelihood of RRoD, assuming that it is heat related. As yield curves improve over the life of manufacturing a particular chip, it becomes possible to produce chips that run at lower voltages and are thus cooler. It could very well be that the consoles that fail due to RRoD are simply using early, lower-yield GPUs that run at higher voltages, and thus produce more heat, which would explain why the problem seems to affect some consoles but not others. If this correlation holds, the chances of RRoD should go down as overall chip yields improve.
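
To put the voltage argument in rough numbers: dynamic power scales with the square of the supply voltage (the classic P ≈ C·V²·f relation), so even a modest voltage reduction on better-yielding silicon means a meaningful drop in heat. Here's a quick back-of-the-envelope sketch in Python; the voltages below are purely illustrative, not measured Xbox 360 values:

    # Classic dynamic power approximation: P ~ C * V^2 * f.
    # All values here are made up for illustration; they are not
    # Xbox 360 specifications.
    def dynamic_power(capacitance, voltage, frequency):
        return capacitance * voltage ** 2 * frequency

    early_bin = dynamic_power(1.0, 1.20, 1.0)  # hypothetical early, high-voltage chip
    later_bin = dynamic_power(1.0, 1.10, 1.0)  # same design binned at a lower voltage
    print(f"dynamic power drops by ~{1 - later_bin / early_bin:.0%}")  # ~16%

Even that hypothetical 0.1V reduction cuts dynamic power by roughly a sixth - the kind of margin that could separate a console that cooks its solder joints from one that doesn't.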

When rumors began cropping up about Microsoft moving to 65nm chips in the Xbox 360, many wondered if this could be the end of the RRoD problems, saving owners the headache of dealing with a potential failure. Assuming that the root cause of RRoD is inadequate cooling, it is plausible that moving to cooler-running chips could alleviate, if not altogether fix, the problem.

While there's no conclusive way of proving whether or not these new Xbox 360s will reduce the chances of the dreaded RRoD, the geek in us couldn't help but go find one of these babies, test it, and take it apart.

Comments

  • ChristopherO - Sunday, November 18, 2007 - link

    It looked like you were familiar, but I partly mentioned that because other people reading might not have been. You never know when you can win over a convert.

    That's funny, you have a Seasonic... We're probably running the same PSU; I have the S12 600, which I got for a steal when Silicon Acoustics went out of business. 2.5" drives are a great way to go, but I'm not willing to give up the performance of a Raptor for one. My Raptor 150 in suspension is near silent. I'm on Vista 64 with 4GB of memory, so that helps the seek situation... Vista caches everything it can at boot time, so once you're through the initial power-on phase, everything is pretty sedate.

    Sure, you can cap the DVD reads, but I use dense foam padding in lieu of the Sonata rails in my chassis, and the net effect is that the drive is vastly quieter. It is the noisiest component of the system, but that doesn't bug me because hardly anything uses discs these days. Typically you install, and then you're done. Or you insert a game and run the "piracy check" and then the drive spins down. Inserting a movie, etc., doesn't spin the drive up, so that's not even audible.
  • AssBall - Saturday, November 17, 2007 - link

    I took my DVD ROM out of my case and, quite frankly, it is significantly louder in my hand than it was when being muffled by the case (no rubber washers).
  • saiku - Friday, November 16, 2007 - link

    Is there something that would wrap around the HD DVD drive? You'd still have to leave the vent holes open, but perhaps there's some material out there that damps noise?
  • ChristopherO - Friday, November 16, 2007 - link

    I haven't seen the inside of the HD DVD add-on. Generally speaking, you'd want to remove the drive and dampen that (perhaps building a custom chassis). Otherwise the HD DVD add-on will have metal-to-metal contact and thus generate noise that you can't isolate. Wrapping the whole thing in foam isn't very practical.
  • swaaye - Friday, November 16, 2007 - link

    It's the damn DVD ROM. They have that thing cranking at full RPMs almost all of the time. Hell, I've heard the disc come out of its grip once and spin out inside. It's ridiculous, IMO.
  • provoko - Friday, November 16, 2007 - link

    Thanks for opening up a 360 for us and testing it. =) I enjoyed the wattage charts, the same ones you use for CPUs.
  • semo - Friday, November 16, 2007 - link

    What is a half-node?

    and
    quote:

    If we assume that the Xbox 360's GPU is at least as powerful as the PS3's, the cooling requirements should be somewhere similar; given that the PS3 basically had a GeForce 7800 GTX under its hood
    It still has one, hasn't it? When will the PS3 get a GPU shrink, btw?
  • ChristopherO - Friday, November 16, 2007 - link

    A half-node is basically a die shrink that doesn't require reworking the component.

    For instance, you can't take a 90nm chip and convert it to 65nm without redesigning the chip. Sure, the end product would be substantially the same, but you will need to rework a considerable portion to fit the new size. 65nm to 45nm is the same: you need to redesign your chip.

    The 80nm shrink lets you use the same design as 90nm, but smaller.

    More than likely the GPU is an 80nm design.

    For example, shrinking from 90 to 80 gives you 88.8% size (which is close to Anand's measured 85%). Shrinking from 90 to 65 is 72.2% the size, however this will be less exact than the half-node since the chip itself is going to be somewhat different (75% is a good enough estimate). Both these sizes will result in a decent heat and power savings.

    No one can say if this will kill off the RROD, but it stands a pretty good chance.

    I'm willing to guess that the failure rates will drop to something normal (a single-digit percentage). No guarantee on that, but they trimmed off 70 watts of power usage, that's a pretty remarkable savings which will go a long way towards improving the overall heat situation. Not to mention that their GPU heat sinks are substantially beefier, so you're dissipating less heat over a larger area. Even if the GPU was the same, it will have less contention with the CPU for air cooling.

    Too bad they didn't mount the DVD ROM on rubber grommets. It isn't bad when properly isolated, but the metal case is acting like an amplifier.
  • psychobriggsy - Sunday, November 18, 2007 - link

    Um, 90nm to 80nm results in a theoretical die area of 79% of the 90nm one - remember there are two dimensions to the shrink.

    90nm to 65nm results in a die area of 52% of the 90nm one, ideally.

    Scaling never achieves that due to factors such as the spacing between transistors not scaling as well, or the shrink not scaling well in one axis compared to the other, or a million other reasons. Hence I believe that the GPU is 80nm, and the CPU is 65nm.
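
    As a quick sanity check, the ideal figures fall out of a couple of lines of Python (just a sketch; as noted, real shrinks land above these numbers):

        # Ideal die-area ratio for a linear process shrink: (new/old)^2.
        def area_ratio(old_nm, new_nm):
            return (new_nm / old_nm) ** 2

        print(f"90nm -> 80nm: {area_ratio(90, 80):.0%} of the original area")  # ~79%
        print(f"90nm -> 65nm: {area_ratio(90, 65):.0%} of the original area")  # ~52%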

    The shrink indeed should reduce or halt the RROD situation if it was caused by excessive heat leading to warping that broke contacts.
  • ChristopherO - Monday, November 19, 2007 - link

    Whoops, you were right. I typed fast and forgot to multiply the reduction by 2.

    In theory the 80nm shrink should scale more linearly since it is fundamentally similar to the 90nm part. The 65nm part has the potential to be significantly different. I guess the numbers bear that out; the GPU is much closer to predicted than the CPU.

    It makes me wonder what else they might have changed. In theory they could have made alterations to the chip that increased media performance, etc, without impacting general computations and expected behavior within a game.

    It's a weird quandary since they have a baseline level of performance which they aren't trying to diverge from.
