We’ve seen the architecture. We’ve seen the teasers. We’ve seen the Frontier. And we’ve seen the specifications. Now the end game for AMD’s Radeon RX Vega release is finally upon us: the actual launch of the hardware. Today is AMD’s moment to shine, as for the first time in over a year, they are back in the high-end video card market. And whether their drip-feed marketing strategy has ultimately succeeded in building up consumer hype or burnt everyone out prematurely, I think it’s safe to say that everyone is eager to see what AMD can do with their best foot forward on the GPU front.

Launching today is the AMD Radeon RX Vega 64, or just Vega 64 for short. Based on a fully enabled Vega 10 GPU, the Vega 64 will come in two physical variants: air cooled and liquid cooled. The air cooled card is your traditional blower-based design, and depending on the specific SKU, is available either with AMD’s traditional RX-style shroud or with a brushed-aluminum shroud for the aptly named Limited Edition.

Meanwhile the Vega 64 Liquid Cooled card is larger, more powerful, and more power hungry, utilizing a Radeon R9 Fury X-style external radiator as part of a closed loop liquid cooling setup in order to maximize cooling performance, and in turn clockspeeds. You actually won’t see AMD playing this card up too much – AMD considers the air cooled Vega 64 to be their baseline – but for gamers who seek the best Vega possible, AMD has put together quite a stunner.

Also having its embargo lifted today, but not launching until August 28th, is the cut-down AMD Radeon RX Vega 56. This card features lower clockspeeds and fewer enabled CUs – 56 out of 64, appropriately enough – however it also features lower power consumption and a lower price to match. Interestingly enough, going into today’s release of the Vega 64, it’s the Vega 56 that AMD has put the bulk of their marketing muscle behind.

AMD Radeon RX Series Specification Comparison

                       RX Vega 64 Liquid   RX Vega 64      RX Vega 56      R9 Fury X
Stream Processors      4096 (64 CUs)       4096 (64 CUs)   3584 (56 CUs)   4096 (64 CUs)
Texture Units          256                 256             224             256
ROPs                   64                  64              64              64
Base Clock             1406MHz             1247MHz         1156MHz         N/A
Boost Clock            1677MHz             1546MHz         1471MHz         1050MHz
Memory Clock           1.89Gbps HBM2       1.89Gbps HBM2   1.6Gbps HBM2    1Gbps HBM
Memory Bus Width       2048-bit            2048-bit        2048-bit        4096-bit
VRAM                   8GB                 8GB             8GB             4GB
Transistor Count       12.5B               12.5B           12.5B           8.9B
Board Power (Typical)  345W                295W            210W            275W
Manufacturing Process  GloFo 14nm          GloFo 14nm      GloFo 14nm      TSMC 28nm
Architecture           Vega (GCN 5)        Vega (GCN 5)    Vega (GCN 5)    GCN 3
GPU                    Vega 10             Vega 10         Vega 10         Fiji
Launch Date            08/14/2017          08/14/2017      08/28/2017      06/24/2015
Launch Price           $699*               $499/599*       $399/499*       $649

Between these SKUs, AMD is looking to take on NVIDIA’s longstanding gaming champions, the GeForce GTX 1080 and the GeForce GTX 1070. In both performance and pricing, AMD expects to fight NVIDIA’s cards to a draw, if not pull out a victory for Team Red. This means we’ll see the $500 Vega 64 set against the GTX 1080, while the $400 Vega 56 goes up against the GTX 1070. At the same time, however, the dark specter of cryptocurrency mining hangs over the gaming video card market, threatening to disrupt pricing, availability, and the best-laid plans of vendors and consumers alike. Suffice it to say, this is a launch like no other in a time like no other.

Overall it has been an interesting past year and a half, to say the least. With a finite capacity to design chips, AMD’s decision to focus on the mid-range market with the Polaris series meant that the company effectively ceded the high-end video card market to NVIDIA once the latter’s GeForce GTX 1080 and GTX 1070 launched. As a result, for the past 15 months NVIDIA has had free rein over the high-end market. AMD’s mid-range focus did pay off in its own right: by releasing Polaris ahead of NVIDIA’s answer, AMD got the jump on its rival in that segment, and its market share has recovered some. However it’s a constant fight against a dominant NVIDIA, and one that’s been made harder by AMD being essentially invisible to the few high-end buyers and the many window shoppers. That is a problem that ends today with the launch of the Vega 64.

I’d like to say that today’s launch is AMD landing a decisive blow in the video card marketplace, but the truth of the matter is that while AMD PR puts on its best face, there are signs that behind the scenes things are more chaotic than anyone would care for. Vega video cards were originally supposed to be out in the first half of this year, and while AMD technically made that window with the launch of the Vega Frontier Edition cards, it’s just that: a technicality. It was certainly not the launch that anyone was expecting at the start of 2017, especially since some of Vega’s new architectural functionality wasn’t even enabled at the time.

More recently, AMD’s handling of product promotion and product sampling has been erratic. We’ve only had the Vega 64 since Thursday, giving us less than 4 days to completely evaluate the thing. Adding to the chaos, on Thursday evening AMD informed us that we’d receive the Vega 56 on Friday, encouraging us to focus on that card instead. The reasoning behind this is complex – I don’t think AMD knew if it could have Vega 56 samples ready, for a start – but ultimately boils down to AMD wanting to put their best foot forward. And right now, the company believes that the Vega 56 will do better against the GTX 1070 than the Vega 64 will do against the GTX 1080.

Regardless, it means that we’ve only had a very limited amount of time to evaluate the performance and architectural aspects of AMD’s new cards, and even less time to write about them. Never mind chasing down interesting odds & ends. So while this is a full review of the Vega 64 and Vega 56, there’s some further investigating left to do once we recover from this blitz of a weekend and get our bearings back.

So without further ado, let’s dig into AMD’s return to the high-end market with their Vega architecture, Vega 10 GPU, and the Vega 64 & Vega 56 video cards.

Vega 10: Fiji of the Stars

213 Comments


  • npz - Monday, August 14, 2017 - link

    My point was that since most modern games have received enhancements for the PS4 Pro and more will follow moving forward -- given it's the engine the devs use -- and that the vast majority are cross-platform, then major PC games will already have a built-in fp16 optimization path to be taken advantage of.

    Also don't forget Scorpio's arrival, which will likely feature the same, so there would be even more incentive for using this on PC.
  • Yojimbo - Tuesday, August 15, 2017 - link

    From what I have heard, Scorpio will not contain double rate fp16.

    And I am not sure about your claim that most modern game engines have been enhanced to take advantage of double rate fp16. I highly doubt that's true. Maybe a few games have cobbled in code to take advantage of low-hanging fp16 fruit.

    As far as AMD's "advantage", don't forget that NVIDIA had double rate FP16 before AMD. They left it out of Pascal to help differentiate their various data center cards (namely the P100 from the P40) in machine learning tasks. But now that the Volta GV100 has tensor cores it's not necessary to restrict double rate FP16 to only the GV100. For all we know double rate FP16 will be in their entire Volta lineup.
  • Yojimbo - Tuesday, August 15, 2017 - link

    edit: I meant to say "They left it out of mainstream Pascal..." (as in GP102, GP104, GP106, GP107, GP108)
  • Santoval - Tuesday, August 15, 2017 - link

    I am almost 100% certain that consumer Volta GPUs will have double rate FP16 disabled, and completely certain that they will have tensor cores disabled. Otherwise Nvidia will kiss the super high margins of their professional GPU cards goodbye, and Nvidia is never going to do that. Tensor cores were largely added so that Nvidia can compete with Google's TPU in the AI / deep learning space. Google still does not sell that chip, but that might change. Unlike Google's TPU, which can be used only for AI inference, Volta's tensor cores will do both inference and training, and that is very important for this market.
  • Yojimbo - Wednesday, August 16, 2017 - link

    Well, my point was that since they have tensor cores they can afford to have double rate FP16, so of course I agree that there will not be tensor cores enabled on consumer Volta cards. If the tensor cores give significantly superior performance to simple double rate FP16 (and NVIDIA's benchmarks show that they do), then why would NVIDIA need to wall off simple double rate FP16 to protect their V100 card? As much as NVIDIA wants to protect their margins, they also need to stave off competition. The tensor cores allow them to do both at once: they push forward the capabilities of the ultra high end (V100) while allowing double rate FP16 to trickle down to cheaper cards to stave off competition. I am not saying that I think they definitely will do it, but I see that the opportunity is there. Frankly, I think the reason they wouldn't do it is if they don't think the cost in power budget or dollars to implement it is worth the gain in gaming performance. Also, perhaps they want to create three tiers: the V100 with tensor cores, the Volta Titan X and/or Tesla V40 with double rate FP16, and everything else.

    As far as Google's TPUs go, their TPU 2 can do both training and inferencing. Their first TPU did only inferencing, on 8-bit quantized (integer) networks. The TPU 2 does training and inferencing on FP16-based networks. The advantage NVIDIA's GPUs have is that they are general purpose parallel processors, not specific to running computations for convolutional neural networks.
  • Santoval - Tuesday, August 15, 2017 - link

    Nope, it was explicitly stated by MS that Scorpio's GPU will ship with Rapid Packed Math disabled. Why? I have no idea.
  • Nintendo Maniac 64 - Tuesday, August 15, 2017 - link

    Codemasters apparently doesn't realize that the Tegra X1 used in the Nintendo Switch also supports fp16, so it's not something unique to the PS4 Pro...
  • OrphanageExplosion - Tuesday, August 15, 2017 - link

    There was also FP16 support in the PlayStation 3's RSX GPU. Generally speaking, the PS3 still lagged behind Xbox 360 in platform comparisons.

    The 30% perf improvement for Mass Effect is referring to the checkerboard resolve shader, not the entire rendering pipeline.

    For a more measured view of what FP16 brings to the table, check out this post: http://www.neogaf.com/forum/showpost.php?p=2223481...
  • Wise lnvestor - Tuesday, August 15, 2017 - link

    Did you even read the gamingbolt article? And look at the picture? When a dev talks about how much they saved in milliseconds, IT IS THE ENTIRE rendering pipeline.
  • romrunning - Monday, August 14, 2017 - link

    6th para - "seceded" should be "ceded" - AMD basically yielded the high-end market to Nvidia, not "withdrew" to Nvidia. :)
