When Nintendo announced the Wii U there was a lot of interest in its hardware specs. The new console could very well end up the fastest kid on the block thanks to the sheer age of the Xbox 360 and PlayStation 3. I wasn't all that interested in the Wii U, but I did want to get a look at the silicon inside, so I grabbed a unit and went into teardown mode.

The Wii U is available in two versions: the basic and deluxe kits. The basic kit comes in white, while the deluxe is black; both are glossy. The deluxe console gives you 32GB of on-board storage (eMMC NAND) vs. 8GB for the basic model. The deluxe kit also includes a stand and cradle for the Wii U's tablet GamePad as well as a stand for the console itself. The two kits retail for $299.99 and $349.99, respectively.

The Wii U hardware itself looks a lot like a larger Wii, and getting inside the chassis is also quite similar. The CMOS battery door is the first thing you'll have to remove, followed by 8 more screws (a mixture of Phillips-head and tri-wing). The bulk of these screws are behind console-colored stickers, so be sure to peel them all off. With all 8 (9 including the CMOS battery door) screws removed, you can slide the left side of the Wii U away from the front of the console and off altogether. This reveals the final three tri-wing screws you'll need to remove to get inside the chassis (the gallery of all of this is at the bottom of the article).

With all 12 screws removed, pry the top of the Wii U up and away from the body until it separates from the rest of the frame revealing the console's fairly compact internals:

The Wii U optical drive uses a custom format for game storage, but offers a very Blu-ray-like 25GB capacity per disc. At 22MB/s, max sequential read speed is pretty high compared to the current-generation consoles.
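To put that figure in context, here's a quick back-of-the-envelope sketch (assuming a constant 22MB/s, which real-world seeks and inner/outer track speed differences won't quite match):

```python
# Back-of-the-envelope: time to read a full 25GB Wii U disc at the
# quoted max sequential rate. Assumes a constant 22MB/s throughout.
DISC_CAPACITY_GB = 25
READ_SPEED_MB_PER_S = 22

seconds = (DISC_CAPACITY_GB * 1000) / READ_SPEED_MB_PER_S
print(f"Full-disc read: {seconds:.0f}s (~{seconds / 60:.0f} minutes)")
# Full-disc read: 1136s (~19 minutes)
```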

Two screws hold the front cover in place, followed by four more for the optical drive. Be careful when removing the optical drive, as there's a ribbon cable attached to the motherboard for power/data. The same goes for the front cover, although its ribbon cable is really only on light/switch duty.

With the optical drive removed, next up is removing the shielding on the top and bottom of the motherboard and the shroud on top of the heatsink. Just go around the perimeter of the motherboard removing screws (you'll be able to remove all but two easily). Once you've done this, the motherboard will separate from the Wii U's lower tray.

Removing the shielding itself requires carefully moving the antenna cables out of the way. As these wires are soldered to the Wii U chassis on one end, be very careful not to pull too hard; otherwise, you run the risk of needing to break out the soldering iron.

If you've removed all of the screws and freed the antenna wires from their guides a bit, you should be able to pull back the plastic heatsink shroud, revealing...more shielding:

With no screws left to hold it in place, however, the shielding is easily dealt with (again, pay close attention to the antenna wires). The same is true for the bottom of the PCB.

The two sets of antenna wires go to two independent wireless controllers: one for 802.11b/g/n WiFi, the other a dedicated 802.11n controller that handles Miracast display streaming between the Wii U and the GamePad display. Thanks to Ryan Shrout over at PC Perspective for figuring this one out!

Once you've removed all shielding you're left with a pretty clean looking motherboard:

On the top side of the board you'll see the eMMC/NAND package; in this case it's a dual-die Samsung eMMC solution (there's another Toshiba NAND device on the back of the board, not used for user storage):

There are four 4Gb (512MB) Hynix DDR3-1600 devices surrounding the Wii U's MCM (Multi-Chip Module). Memory is shared between the CPU and GPU, and if I'm decoding the DRAM part numbers correctly, these are 16-bit devices, giving the Wii U a 64-bit interface and a total of 12.8GB/s of peak memory bandwidth. (Corrected from earlier; I originally decoded the Hynix part numbers incorrectly.) Our own Ryan Smith found a great reference for the original Wii so we can compare memory frequencies. It looks like the original Wii had a 32-bit wide GDDR3 memory interface running at a max data rate of 1.4GHz, for a total of 5.6GB/s of bandwidth (excluding eDRAM).
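Both bandwidth figures are just bus width multiplied by data rate; here's a minimal sketch of the math, assuming the part-number decoding above is right:

```python
# Peak DRAM bandwidth = bus width (in bytes) x data rate (transfers/s).
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_mt_s: int) -> float:
    return (bus_width_bits / 8) * data_rate_mt_s / 1000

# Wii U: four 16-bit DDR3-1600 devices -> 64-bit bus at 1600 MT/s
wii_u = peak_bandwidth_gb_s(4 * 16, 1600)  # 12.8 GB/s
# Original Wii: 32-bit GDDR3 at a 1.4GHz max data rate (excluding eDRAM)
wii = peak_bandwidth_gb_s(32, 1400)        # 5.6 GB/s

print(f"Wii U: {wii_u:.1f} GB/s vs. original Wii: {wii:.1f} GB/s")
```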

That doesn't sound like a lot (it's the same amount of memory bandwidth as the Nexus 10 and iPad 3/4 have), but the Wii U is supposed to have a good amount of eDRAM for both the CPU and GPU to use. Also keep in mind that the Nexus 10 and iPad 3/4 have to drive much higher resolutions than the Wii U does.

The Wii U's MCM is unfortunately covered by a heatspreader, but given that I went through all of this to look at the console's silicon, I was going to look at the console's silicon.

Normally, to remove an integrated heat spreader (IHS) you grab a sharp blade and go around cutting the (hopefully) glue around the perimeter of the chip. In the case of the Wii U's MCM, the blades I'd normally use were too thick. A few years ago I decided to give shaving with a double-edge safety razor a try. My attempts failed miserably, but I was left with a stack of unused DE razor blades that were thin enough to get the IHS off.

The trick here is to apply enough pressure to the blade to cut through the glue; simply trying to saw through it will take forever. There are two blobs of glue per side, but if you're trying to remove the IHS, be careful not to cut through the glue and scrape any of the actual exposed silicon...like I did.

With the IHS off, we have the Wii U's MCM in all of its glory:

There are actually three components on this single package, made in at least two different fabs. The multicore PowerPC-based CPU is the smaller of the two larger chips; this die is made on IBM's 45nm SOI process. The RV7xx-derived GPU is the biggest die on the package, and I'm presuming it was made on a 40nm process. I'm assuming the very tiny die in the corner is some off-chip memory. Both the CPU and GPU in the Wii U are supposed to have some eDRAM, although the bulk of it is likely dedicated to the GPU.


Clockwise from the top: CPU, GPU, off-chip memory?

The approximate die sizes for all components on the MCM are in the table below:

Wii U Silicon Analysis
Component            Dimensions         Approximate Die Size
CPU                  5.2mm x 6.3mm      32.76mm²
GPU                  12.3mm x 12.7mm    156.21mm²
3rd die (memory?)    1.79mm x 1.48mm    2.65mm²
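The areas in the table are just width multiplied by height; a small sketch of the math, which also anticipates the GPU:CPU ratio discussed below:

```python
# Approximate die areas from the measurements in the table above.
dies_mm = {
    "CPU": (5.2, 6.3),
    "GPU": (12.3, 12.7),
    "3rd die (memory?)": (1.79, 1.48),
}
areas = {name: w * h for name, (w, h) in dies_mm.items()}
for name, area in areas.items():
    print(f"{name}: {area:.2f} mm^2")

# The "nearly 5x" GPU-to-CPU split discussed below:
print(f"GPU:CPU area ratio: {areas['GPU'] / areas['CPU']:.2f}x")  # ~4.77x
```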

If we assume a 40nm process for the GPU, then we're looking at something a bit larger than the RV740. The Wii U does boast backwards compatibility with games made for the original Wii, made possible by a shared PowerPC ISA.

The size disparity between the CPU and GPU dies shouldn't be too much of a surprise. When building a dedicated gaming machine, it always makes sense to throw more transistors at your GPU. The nearly 5x ratio of GPU to CPU die size here is a bit on the extreme side, though. I suspect many of the current-generation consoles, including the Wii U, suffered from the lack of a powerful yet affordable CPU solution that could be easily implemented.

I also took some power measurements on the Wii U. The system is powered by a 75W external power supply, but total system power consumption doesn't even hit half of that (at least with the games I tried):

Wii U Power Consumption
Scenario                          System Power Consumption
Standby (Power Off)               0.22W
Wii U Menu (No Disc in Drive)     31.2W
Wii U Menu (Disc in Drive)        32.8W
Super Mario U                     33.0W
Netflix Playback                  28.5W

Rendering the Wii U menu actually consumes almost as much power as playing Super Mario U. Watching a movie on Netflix consumes a bit less power; my guess is that a lot of the 3D blocks are power gated, leaving only the CPU cores and video decode hardware active.
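To make the "doesn't even hit half" observation concrete, here's a small sketch relating the table above to the 75W supply rating:

```python
# Measured draw (from the table above) vs. the 75W external PSU rating.
PSU_RATING_W = 75
measurements_w = {
    "Standby (Power Off)": 0.22,
    "Wii U Menu (No Disc in Drive)": 31.2,
    "Wii U Menu (Disc in Drive)": 32.8,
    "Super Mario U": 33.0,
    "Netflix Playback": 28.5,
}

peak = max(measurements_w.values())
print(f"Peak draw: {peak}W, {peak / PSU_RATING_W:.0%} of the PSU rating")
# Peak draw: 33.0W, 44% of the PSU rating

delta = measurements_w["Super Mario U"] - measurements_w["Wii U Menu (Disc in Drive)"]
print(f"Game vs. menu delta: {delta:.1f}W")  # just 0.2W
```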

The Wii U ships with its own WebKit-based web browser; the user agent string for the latest version of the Wii U's software is: Mozilla/5.0 (Nintendo WiiU) AppleWebKit/534.52 (KHTML, like Gecko) NX/2.1.0.8.21 NintendoBrowser/1.0.0.7494.US.
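If you're curious how to pull the engine build out of that string, here's a quick illustrative parse (my own snippet, not any official API):

```python
import re

# Extract the WebKit build from the Wii U's user agent string.
ua = ("Mozilla/5.0 (Nintendo WiiU) AppleWebKit/534.52 (KHTML, like Gecko) "
      "NX/2.1.0.8.21 NintendoBrowser/1.0.0.7494.US")

match = re.search(r"AppleWebKit/(\d+\.\d+)", ua)
if match:
    print(f"WebKit build: {match.group(1)}")  # 534.52, a 2011-era branch
```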

Pages load quickly and compatibility is surprisingly decent (HTML5 test: 258 + 4 bonus points). By default you control and view the browser on the GamePad, but you can also choose to display the content on your TV via the console. Scrolling is very smooth and the overall experience is way better than what you'd normally expect from a web browser on what's primarily a game console. It's not quite as good as using a modern tablet, but still usable. And where there is a browser, we will run SunSpider on it:

SunSpider Javascript Benchmark 0.9.1 - Stock Browser

The browser is using an older version of WebKit, which is likely responsible for some of the less-than-stellar performance here.

Comments

  • Ryan Smith - Monday, November 19, 2012

    Keep in mind that the 4770 was among the very first products produced on a very troubled TSMC 40nm process. Anything produced now (nearly 3.5 years later) is going to have the benefit of the process maturing and lots of design experience to fall back on for optimizing the layout and transistor leakage.
  • tipoo - Monday, November 19, 2012

    Plus the embedded GPUs draw quite a bit less power, e.g. the E6760 mentioned below.
  • Flyingcircus - Monday, November 19, 2012

    Do we even know that the GPU is still being manufactured on the original process node? Wouldn't it be possible that they've worked with AMD on a die shrink?

    AMD has been using the 32nm process in mass production for their IGPs for well over a year, after all.

    You have a point with the refining of the process, but the TDPs of the later cards manufactured on that node (i.e. HD 5 and HD 6) don't really support that, as power consumption seems to have largely stayed level on GPUs with similar transistor counts and similar raw GFLOPS performance.
  • tarv - Monday, November 19, 2012

    Read somewhere there was a leaked email confirming this was the part the Wii U GPU was based on.

    http://www.amd.com/us/products/embedded/graphics-p...
  • tarv - Monday, November 19, 2012

    That gives it 480 shaders and tops out at 576 GFLOPS. It's also 40nm and runs at 35-40 watts.
  • Sniffynose - Monday, November 19, 2012

    The unknown is that many hint at it being DX10.1, or not even a DX instruction set at all (as mentioned by some indie devs). Also, the inclusion of DDR3 with a 64-bit bus width puts it really far back in the 4000-series era, like the 4550 range of the R700 chips.

    The E6760 would be epic, but it uses DDR with a 128-bit bus width, which is why it's the rumored chip for either the PS4 or 720.

    The wattage is definitely massively lower though; the launch PS3 pushed 180 watts, whereas the Wii U is consistently hitting under 40.

    I can't wait until it gets X-rayed so we can see how much or how little is really in there.

    I think what is throwing us off is that the RAM is shared. Is it RAM for general use while the discrete GPU has more of its own (making the GPU you suggested completely viable), or is it shared as in being used on a much older, inferior GPU?

    I personally hope we learn of good news like something along the lines of the E6760, rather than some horrible low-wattage E4xxx-series GPU with shared RAM.
  • Kevin G - Monday, November 19, 2012

    You're forgetting the massive amount of eDRAM on the GPU die with regards to bandwidth. The bus there could easily be 1024 bits wide (or wider). Bandwidth for that 32 MB of eDRAM should not be an issue.

    Case in point: the PS2's Graphics Synthesizer had a 2560-bit wide bus to its 4 MB of on-die eDRAM, and that was over 10 years ago.
  • Flyingcircus - Monday, November 19, 2012

    Do we already know the size of the eDRAM on the GPU die? I wasn't aware.

    If it's big enough, that could be the saving grace of the GPU. 32MB does seem a bit small, however, if you take today's high-res textures into account.
  • Kevin G - Monday, November 19, 2012

    The 32 MB figure has been floating around for a while as one of the few confirmed specs out there (2 GB of system memory was the other hard figure that's been out in public).

    The 32 MB wouldn't be for texture storage in most cases. Rather, it would be used to hold various frame buffers for quick read/write operations. That's enough for four 32-bit 1920 x 1080 buffers.
  • Flyingcircus - Tuesday, November 20, 2012

    Well, that's the first time I've heard about it (I did hear about the 2GB rumors beforehand, though... I also heard numerous other rumors regarding different memory sizes, so one of them was bound to be correct). As far as I'm aware, Nintendo hasn't made any comment about it, so I would be careful with the word "confirmed".
