Sony just announced the PlayStation 4, along with some high-level system specifications. The high-level specs are what we've heard for quite some time:

  • 8-core x86-64 CPU using AMD Jaguar cores (built by AMD)
  • High-end PC GPU (also built by AMD), delivering 1.84 TFLOPS of performance
  • Unified 8GB of GDDR5 memory for use by both the CPU and GPU with 176GB/s of memory bandwidth
  • Large local hard drive

Details of the CPU aren't known at this point (8 cores could imply a Piledriver-derived architecture, or 8 smaller Jaguar cores, with the latter being more likely), but either way this will be a big step forward over the PowerPC-based general purpose cores on Cell from the previous generation. I wouldn't be too put off by the lack of Intel silicon here; it's still a lot faster than what we had before, and at this level price matters more than peak performance. The Intel performance advantage would have to be much larger to dramatically impact console performance. If we're talking about Jaguar cores, then there's a bigger long-term concern from a single-threaded performance standpoint.

Update: I've confirmed that there are 8 Jaguar-based AMD CPU cores inside the PS4's APU, with the CPU and GPU on a single die. Jaguar will still likely outperform the PS3/Xbox 360's PowerPC cores, and it should be faster than anything ARM-based out today, but there's not huge headroom going forward. While I'm happier with Sony's (and MS') CPU selection this time around, I always hoped someone would take CPU performance in a console a bit more seriously. Given the choice between spending transistors on the CPU vs. the GPU, I understand that the GPU wins every time in a console; I'm just always an advocate for wanting more of both. I realize I never wrote up a piece on AMD's Jaguar architecture, so I'll likely be doing that in the not too distant future. Update: I did.

The choice of 8 cores is somewhat unusual. Jaguar's default compute unit is a quad-core machine with a large shared L2 cache, so it's likely that AMD placed two of these units together for the PlayStation 4. The last generation of consoles saw a march towards heavily threaded machines, so it's no surprise that AMD/Sony want to continue the trend here. Clock speed is unknown, but Jaguar was good for a mild increase over its predecessor, Bobcat. Given the large monolithic die, AMD and Sony may not have wanted to push frequency as high as possible in order to keep yields up and power down. While I still expect CPU performance to move forward in this generation of consoles, it's worth remembering that the PowerPC cores in the previous generation ran at very high frequencies. The IPC gains afforded by Jaguar have to be significant in order to make up for what will likely be a lower clock speed.
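
To put some rough numbers on that last point, here's a back-of-envelope sketch. The 1.6GHz Jaguar clock below is purely an assumption; only the previous generation's 3.2GHz figure is known:

```python
# Back-of-envelope per-thread comparison: performance ~ IPC x clock.
ppc_clock_ghz = 3.2     # Cell PPE / Xenon clock, known from last generation
jaguar_clock_ghz = 1.6  # assumed; Sony has not announced a clock speed

# IPC multiplier Jaguar would need just to match the old cores per thread:
required_ipc_ratio = ppc_clock_ghz / jaguar_clock_ghz
print(f"Jaguar needs ~{required_ipc_ratio:.1f}x the IPC of the old PPC cores")
# -> ~2.0x, plausible given the in-order PPE's low real-world IPC, but it
# shows the clock deficit that Jaguar's IPC advantage has to overcome.
```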

We don't know the specifics of the GPU, but with it approaching 2 TFLOPS we're looking at a level of performance somewhere between a Radeon HD 7850 and 7870. Update: Sony has confirmed the actual performance of the PlayStation 4's GPU as 1.84 TFLOPS. Sony claims the GPU features 18 compute units, which, if the design is GCN-based, would mean 1152 SPs and 72 texture units. It's unclear how custom the GPU is, however, so we'll have to wait for additional information to really know for sure. The highest end PC GPUs are already faster than this, but the PS4's GPU is a lot faster than the PS3's RSX, which was derived from NVIDIA's G70 architecture (used in the GeForce 7800 GTX, for example). I'm quite pleased with the promised level of GPU performance of the PS4. There are obvious power and cost constraints that would keep AMD/Sony from going even higher here, but this should be a good leap forward from current gen consoles.
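
For the curious, the SP count and the TFLOPS figure line up with standard GCN arithmetic. A quick sketch; the 800MHz clock here is inferred from the quoted 1.84 TFLOPS rather than anything Sony has confirmed:

```python
# GCN FLOPS math: each CU packs 64 SPs, and each SP can retire one
# fused multiply-add (2 floating point ops) per clock.
compute_units = 18          # Sony's stated figure
sps_per_cu = 64             # standard for GCN
ops_per_sp_per_clock = 2    # FMA = multiply + add
clock_ghz = 0.8             # assumed; back-calculated from 1.84 TFLOPS

sps = compute_units * sps_per_cu
tflops = sps * ops_per_sp_per_clock * clock_ghz / 1000
print(f"{sps} SPs, {tflops:.2f} TFLOPS")  # -> 1152 SPs, 1.84 TFLOPS
```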

Outfitting the PS4 with 8GB of RAM will be great for developers, and using high-speed GDDR5 will help ensure the GPU isn't bandwidth starved. Sony promised around 176GB/s of memory bandwidth for the PS4. The lack of solid state storage isn't surprising; hard drives still offer a dramatic advantage in cost per GB vs. an SSD. If the drive is user replaceable with an SSD, that would be a nice compromise.
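
That 176GB/s figure falls out of standard GDDR5 math. A sketch, assuming a 256-bit memory bus at a 5.5Gbps effective data rate; Sony hasn't confirmed either number, but that's the common configuration matching the stated bandwidth:

```python
# GDDR5 bandwidth: bus width (bits) x effective per-pin data rate,
# divided by 8 to convert bits to bytes.
bus_width_bits = 256     # assumed
data_rate_gbps = 5.5     # assumed effective rate per pin

bandwidth_gbs = bus_width_bits * data_rate_gbps / 8
print(f"{bandwidth_gbs:.0f} GB/s")  # -> 176 GB/s
```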

Leveraging Gaikai's cloud gaming technology, the PS4 will be able to act as a game server and wirelessly stream its video output to a PS Vita. This sounds a lot like what NVIDIA is doing with Project Shield and your NVIDIA powered gaming PC. Sony referenced dedicated video encode/decode hardware that allows you to instantly record and share screenshots/video of gameplay. I suspect this same hardware is used when streaming your game to a PS Vita.

Backwards compatibility with PS3 games isn't guaranteed; instead, Sony will leverage cloud gaming to stream older content to the box. There's some sort of dedicated background processor that handles uploads and downloads, and even handles updates in the background while the system is off. The PS4 also supports instant suspend/resume.

The new box heavily leverages PC hardware, which is something we're expecting from the next Xbox as well. It's interesting that this is effectively how Microsoft entered the console space back in 2001 with the original Xbox, and now both Sony and MS have returned to that philosophy with their next-gen consoles in 2013. The PlayStation 4 will be available this holiday season.

I'm trying to get more details on the CPU and GPU architectures and will update as soon as I have more info.

Source: Ustream

Comments

  • This Guy - Thursday, February 21, 2013

    Still run a 3GHz Pentium 4? Why not?

    I also have a 1GHz C-50 Bobcat that completely smokes my old 2GHz AMD 3000+ (which in turn was faster than the 3GHz P4s of the same era). I also have an i7-2630QM laptop. I only notice the extra power when simulating clocks.

    Modern CPUs are so much faster in part because they include all the other chips that used to make up a motherboard (like most of the north bridge and the memory controller). Think about grabbing the pen next to you compared to sending someone to the shops to buy a new one. Both pens write just as fast; it's just that you're going to have to wait a while for the one from the shops.

    The stuff Xorrax talks about is the other part. A great branch predictor and a good memory hierarchy do great things for IPC.
  • Mugur - Thursday, February 21, 2013

    Sorry, but no. If you think C-50 performance is enough, you're delusional. Just check some CPU benchmarks... Even a newer 1800 Jaguar is way slower than any Sandy Bridge mobile Celeron.

    I'm an AMD fan, so I'm glad to see them in both major consoles, but we need Trinity performance here, not Jaguar. And please don't compare to the PS3 or Xbox 360 CPUs. Compare to the x86 world...
  • Xorrax - Thursday, February 21, 2013

    Pick your comparison: I'm comparing to existing consoles, which these are replacing, not PC gaming.

    Given the size, cost, and power benefits of 8 Jaguar cores over something else, it doesn't seem like a bad decision. As others have mentioned, it will motivate developers to optimize for multi-core more than they have historically.

    Links to benchmarks? Jaguar isn't in the market yet, so I'd be surprised to find anything meaningful out there other than data from engineering samples, which may be suspect.
  • powerarmour - Friday, February 22, 2013

    You can't compare CPU performance on a PC to performance on a console with minimal OS overhead.

    Besides, with more heterogeneous compute capability in the PS4, you could offload a heck of a lot to the main GPU, freeing up the CPUs for other, less demanding tasks.
  • tipoo - Thursday, February 21, 2013

    I think an 8-core 1.6GHz Jaguar is fine. No, it's not the highest end you can get on a PC, but they have costs to consider. Plus it will have some modifications that we don't know of yet, and no Jaguar on the PC side has access to GDDR5 memory. Developers will have to work around its limitations; that's just par for the course for consoles.
  • abrowne1993 - Thursday, February 21, 2013

    I'm not really savvy to this stuff. Can someone do any sort of preliminary cost analysis to guess how much the components would be? Any sort of general idea would be fine. Just wondering how heavy of a loss Sony is expected to take on this.
  • Shadowmaster625 - Thursday, February 21, 2013

    First we have to estimate the transistor count. My estimate is 3.2 billion: 2.8 billion for the GPU and 400 million for the CPU. For a chip that size, Sony is probably paying $80-$100 apiece for the first million, and who knows what after that. I can't find the price of an HD 7950M GPU die, but it can't be much more than $100, so this chip isn't going to be much more than that either. Since the volume is higher, I would expect it to be closer to $70 than $100. I'm sure AMD is eating it compared to the margins they get from discrete GPU sales.
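
    For anyone who wants to play with this kind of estimate, here's a rough sketch using the standard dies-per-wafer approximation. Every input below (die size, wafer cost, yield) is a guess for illustration, not a known figure:

    ```python
    import math

    # All inputs are illustrative guesses, not confirmed numbers.
    die_area_mm2 = 350        # assumed APU die size
    wafer_diameter_mm = 300
    wafer_cost_usd = 6000     # assumed processed 28nm wafer cost
    yield_rate = 0.5          # assumed for a large new die

    # Standard dies-per-wafer approximation (second term accounts
    # for partial dies lost at the wafer edge):
    r = wafer_diameter_mm / 2
    dies = (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))
    good = dies * yield_rate
    print(f"~{good:.0f} good dies, ~${wafer_cost_usd / good:.0f} per die")
    # -> roughly 83 good dies, ~$72 each with these made-up inputs
    ```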
  • babyleg003 - Thursday, February 21, 2013

    Really, AMD...? A company with a $2.70 stock price; that tells you just how capable this company is... not. I guess when you go with the worst, you get what you pay for. Give me an Intel i7 processor anytime; I'll pay the extra for performance...
  • JarredWalton - Thursday, February 21, 2013

    I don't mind them using AMD, but Jaguar? Really? Ouch. It's Cell craziness all over again, only I'm not sure 8x Jaguar will really provide that much more performance than the Cell BE. At 1.6GHz, Jaguar is basically something like 30% faster than Atom at 1.6GHz, which means something like a 1.6GHz Ivy Bridge core or a Piledriver core would be three times faster. I would have much rather seen a 3GHz quad-core Piledriver or similar than an eight-core Jaguar.
  • ShieTar - Thursday, February 21, 2013

    Of course, a CPU like the FX-8350 would often be bored when coupled with an HD 7850 feeding 1080p. So it would also be a good idea to replace the GPU with something in the 7950-or-better range. And if Sony still manages to sell that box for under $500, everybody would be happy.
