Meet the GTX 480 and GTX 470, Cont

Moving beyond the GF100 GPU itself, we have the boards. With NVIDIA once more forgoing a power-of-two sized memory bus on their leading part, the number of memory chips and the total memory size of the GTX 400 series are once again unusual amounts. Since each GDDR5 chip sits on a 32-bit channel, the GTX 480's 384-bit bus works out to 12 128MB GDDR5 memory chips for a total of 1536MB of VRAM, while the GTX 470's 320-bit bus gives it 10 chips for a total of 1280MB. This marks the first big expansion in memory capacity we've seen out of NVIDIA in quite some time; after introducing the 8800 GTX in 2006 with 768MB of RAM, they haven't done anything besides bump their 256-bit/512-bit memory bus parts up to 1GB over the years. And with 256MB (2Gbit) GDDR5 chips due in volume later this year, we wouldn't be surprised to see NVIDIA push a 3GB part before the year is out.
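
For the curious, here's that arithmetic as a quick Python sketch (a minimal illustration; the function name is ours, though the 32-bit-per-chip channel width is standard for GDDR5):

    # How GDDR5 chip count and total VRAM fall out of the memory bus width.
    def gddr5_config(bus_width_bits, chip_capacity_mb):
        chips = bus_width_bits // 32             # one GDDR5 chip per 32-bit channel
        return chips, chips * chip_capacity_mb   # (chip count, total VRAM in MB)

    print(gddr5_config(384, 128))  # GTX 480: 12 chips, 1536MB
    print(gddr5_config(320, 128))  # GTX 470: 10 chips, 1280MB
    print(gddr5_config(384, 256))  # with 2Gbit chips: 12 chips, 3072MB (3GB)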

Meanwhile, in a stark difference from the GTX 200 series, the GTX 400 series does not share a common design and cooler, which leads to the GTX 480 and GTX 470 being remarkably different cards. The GTX 470 is essentially a slight variation on the GTX 200 series design, utilizing a similar fully shrouded cooler. The GTX 480, on the other hand, is positively alien coming from the GTX 200 series in two distinct ways. The first is the 4 heatpipes protruding from the top of the card (a 5th stays within the card), and the second is the fully exposed heatsink grill on the front of the card. That's exactly what it looks like, folks – that's the top of the heatsink on the GTX 480. At this point it's mostly an intellectual curiosity (we have no idea whether it makes the GTX 480's cooler any more effective), but we did learn the hard way that it's not just cosmetic: it can get very hot.

One new thing that both cards do share in common is that the shroud is no longer a single large piece; on the GTX 480 and GTX 470 the top of the shroud can be snapped on and off, allowing easy access to the heatsink and fan assemblies. We can't imagine that most users will ever want to remove the top of the shroud, but this is one of the cooler design elements we've seen in a video card in recent years. It'll be interesting to see if this proves beneficial for aftermarket coolers, as it should make installation and removal much quicker.

One other common element between the cards is a cut-out PCB for pulling in air from both the front and the back of the card. We've seen this before on the GTX 295, but this is the first time we've seen this design element on a single-GPU card.

Those of you working with cramped cases should find these cards to be a pleasant surprise. The GTX 470 is 9.5", making it the same length as the Radeon 5850 (or nearly 1" shorter than the GTX 200 series). The GTX 480, on the other hand, measures 10.5", which is ever so slightly longer than the GTX 200 series, which we measure at 10.45". We're also happy to report that NVIDIA put the PCIe power plugs on the top of both cards, rather than on the rear of the card as AMD did on the Radeon 5850. Practically speaking, both of these cards should fit into a wider array of cases than AMD's respective cards.

Even though these cards will fit into smaller cases, airflow will be paramount due to their high TDPs. NVIDIA's own reviewer's guide even goes so far as to recommend spacing your cards as far apart as possible for SLI use. This actually isn't a bad idea no matter what cards are involved, since it ensures neither card is restricted by the other; however, given that not every board with a 3rd PCIe x16 slot offers full bandwidth to that slot, it's not a practical suggestion for every setup. If you can't separate your cards, you're going to want great airflow instead, such as a fan placed directly behind the cards.

Up next is the port layout of the GTX 400 series. Unlike AMD, NVIDIA's TDP is too high to go with a half-slot vent, so NVIDIA is limited to what ports they can fit on a single full slot. In this case their reference design is a pair of DVI ports and a mini-HDMI port (these being the first cards with that port in our labs). Bear in mind that GF100 doesn't have the ability to drive 3 displays with a single card, so while there are 3 digital outputs here, you can only drive two at once.

After having seen DisplayPort on virtually every AMD card in our labs, we were caught a bit off guard by the fact that NVIDIA didn't go with something like a mini-DisplayPort here for a 2x DVI + 1x mini-DP configuration, as we've seen on the Radeon 5970. NVIDIA tells us that while they could do such a thing, their market research has shown that even their high-end customers are more likely to purchase a monitor with HDMI than with DisplayPort, hence the decision to go with mini-HDMI. This is somewhat academic since DVI can easily be converted to HDMI, but going with mini-HDMI allows NVIDIA's partners to skip the dongles and makes it easier to pass audio through to monitors with built-in speakers.

Speaking of audio, let's quickly discuss the audio/video capabilities of the GTX 400 series. GF100 has the same audio/video capabilities as the 40nm GT 200 series cards (GT 220/GT 240) launched late last year: NVIDIA's VP4 engine for video decoding (H.264/MPEG-2/VC-1/MPEG-4 ASP) and internal passthrough for audio. Unfortunately the latter means that the GTX 400 series (and other first-generation Fermi derivatives) won't be able to match AMD's Radeon 5000 series in audio capabilities – NVIDIA can do lossy compressed audio (DD/DTS) and 8-channel uncompressed LPCM, but not lossless compressed audio formats such as DTS-HD and Dolby TrueHD. This leaves the HTPC crown safely in AMD's hands for now.

Finally we have the bad news: availability. This is a paper launch; while NVIDIA is announcing the cards today, they won't be available for another two and a half weeks at a minimum. NVIDIA tells us that the cards are expected to reach retailers the week of April 12th, which hopefully means the start of that week and not the end of it. In either case we have to chastise NVIDIA for this; they've managed to pull off hard launches in the past without an issue, so we know they can do better. This is a very bad habit to get into.

Once these cards do go on sale, NVIDIA tells us that the actual launch supply will be in the tens of thousands of units. How many tens of thousands? We have no idea. For the sake of comparison, AMD had around 30,000 units for the 5800 series launch, and those were snapped up in an instant. We don't think NVIDIA's cards will sell quite as quickly due to the pricing and the fact that there's viable competition for this launch, but it's possible to have tens of thousands of units and still sell out in a heartbeat. This is something we'll be watching intently in a couple of weeks.

The availability situation also has us concerned about card prices. NVIDIA is already starting off behind AMD in terms of pricing flexibility; 500mm²+ dies and 1.5GB of RAM do not come cheap. If NVIDIA does manage to sell the GTX 400 series as fast as they can send cards out, then there's a good chance there will be a price hike: AMD is in no rush to lower prices, and NVIDIA's higher costs mean that if they can get a higher price they should go for it. With everything we've seen from NVIDIA and AMD, we're not ready to rule out any kind of price hike, or to count on any kind of price war.

196 Comments

  • Saiko Kila - Sunday, March 28, 2010 - link

    These MSRPs are not entirely, I mean historically, correct... The first MSRP (list price) for the HD 5850 was $259, and that was the price you had to pay when buying on sites like newegg (there were some rebates, and some differences depending on manufacturer, but you still had to have a very potent hunting sense to get a card from any manufacturer; I got lucky twice). Shortly after launch (about one month, it was October) the MSRP (set by AMD) hiked to $279 and problems with supply not only continued but even worsened. Now, since November 2009, it's $299. The HD 5870 followed a generally similar path, though the HD 5850 hiked more, which is no wonder. Note that this is for the reference design only; some manufacturers had higher MSRPs, after all AMD or nvidia sell only chips and not gaming cards.

    If you believe anandtech, here you've got a link, from the day the cards were announced:
    http://www.anandtech.com/video/showdoc.aspx?i=3643

    The whole pricing thing with the HD 5xxx series is quite unusual (though not unexpected), since normally you'd anticipate the street price to be quite a bit lower than MSRP, and then to drop even further, and you would be right. I remember buying an EVGA GTX260 just after its launch and the price was a good $20 lower than the suggested price. That's why we need more competition, and for now the outlook isn't very bright, with nvidia not quite delivering...

    And these European prices - most if not all European countries have a heavy tax (VAT), this tax is always included and you have to pay it, and there are other taxes too. In the US the sales tax is not included in the street price, and usually you can evade it after all (harder for Californians). Europeans usually get higher prices. Comparing US prices is thereby better, particularly in US dollars (most electronics deliveries are calculated in dollars in Europe). So the prices in the rest of the world were also boosted, even in Europe, despite the weak dollar and other factors :)

    One note - HD5xxx cards are really very big and most of them have a very unfriendly location for the power sockets, so you'd expect to pay more for a proper, huge case. Also note that if you have a 600 W PSU or so you'd be smarter to keep it and not upgrade, unless REALLY necessary. A lower load means lower efficiency, especially when plugged into a 115V/60Hz grid. So if you have a bigger PSU you pay more for electricity. And it seems that more gamers are concerned with that bill than at any time before... You couldn't blame them for that, and it's sad in its own way.
  • LuxZg - Tuesday, March 30, 2010 - link

    Well, current MSRP is like I wrote it above. If there is no competition and/or demand is very high, prices always tend to go up. We're just lucky it's not happening often because in IT competition is usually very good.

    As for European prices, what do taxes have to do with it? We've got 23% taxes here, but they're included in all prices, so if nVidia goes up 23% so do AMD cards as well. If I'm looking at prices in the same country (and city, and sometimes the same store), and if nVidia is $300 and ATI is $100 and $500, then I just can't compare them and say "hey, nVidia is faster than this $100 ATI card, I'll buy that"... no, you can't compare like that. The only thing you can do in that case is say something like "OK, so I have $300 and the fastest I can afford is nVidia", or "I want the fastest there is, and I don't mind the cost", and then you'll take the HD5970. Or you can't afford any of those. So again, I don't get why the cards in this review are so rigidly compared to one another as if they had the exact same price (or a +/- $10 difference). And on the one hand they compare a MORE expensive nVidia card to a QUITE CHEAPER AMD card, but won't compare that same nVidia card to a more expensive AMD card.. WHY?

    And AMD cards are no bigger than nVidia ones, and last time I checked a bigger case is way, way cheaper than a new PSU. And I'm running my computer on, get this, a 450W PSU, so I'm not wasting any excessive power on inefficiencies at low loads ;) And since this PSU handles an overclocked HD4890, it should work just fine with a non-overclocked HD5870. While I'm pretty sure that a GTX470 would already mean a new PSU, a new PSU that costs ~$100/€80 .. So I'd pay more in total, and get a slower card.

    Again, I'm not getting why there's such a rigid idea of GTX470=HD5850 & GTX480=HD5870 ..
  • LuxZg - Saturday, March 27, 2010 - link

    Just re-read the conclusion.. something is missing in this sentence:
    "If you need the fastest thing you can get then the choice is clear, .."
    Shouldn't it finish with "... choice is clear, HD5970..."? That's what I'm saying, the HD5970 wasn't mentioned in the entire conclusion. Past are the days of the "single-GPU crown" .. That's just for nVidia to feel better. ATI doesn't want the "single GPU crown", they want the fastest graphics CARD. And they have it.. A serious lack in this article, serious.. And again, there is the exact same amount of money dividing the GTX480 and HD5870 as there is between the GTX480 and HD5970..
  • blindbox - Saturday, March 27, 2010 - link

    I know this is going to take quite a bit of work, but can't you colour up the main cards and their competition in this review? By main cards, I mean the GTX 470, 480 and the 5850 and 5870. It's giving me a hard time making comparisons. I'm sure you guys did this before.. I think.

    It's funny how you guys only coloured the 480.

    PS: I'm sorry for the spam, my comments are not appearing, and I'm sorry for replying to this guy when it is completely off topic, lol.
  • JarredWalton - Saturday, March 27, 2010 - link

    Yes, it did take a bit of work, but I did it for Ryan. The HD 5870/5970 results are in orange and the 5850 is in red. It makes more of a difference on crowded graphs, but it should help pick out the new parts and their competition. I'm guessing Ryan did it to save time, because frankly the graphing engine is a pain in the butt. Thankfully, the new engine should be up and running in the near future. :-)
  • Finally - Saturday, March 27, 2010 - link

    Further improvement idea:
    Also give the dual-chip/SLI cards another colour tone.
  • lemonadesoda - Sunday, March 28, 2010 - link

    No. Keep colouring simple. Just 3 or 4 colours max. More creates noise. If you need to highlight other results, colour the label, or circle or drop shadow or put a red * at the end.

    Just NO rainbow charts!
  • IceDread - Tuesday, March 30, 2010 - link

    The article does not contain the HD 5970 in CF. The article does not mention the HD 5970 at all in the conclusion. This is really weird. It is my belief that anandtech has become pro-nvidia and is no longer an objective site. Objectivity is looking at (performance + functionality) / price. The HD 5970 is a clear winner here. After all, who cares if a card has 1, 2 or 20 GPUs? It's the performance / price that matters.
  • Kegetys - Tuesday, March 30, 2010 - link

    According to a test at legitreviews.com, having two monitors attached to the card causes the idle power use to rise quite a bit. I guess the anand test was done with just one monitor attached? It would be nice to see power consumption numbers for dual monitor use as well; I don't mind high power use during load, but if the card doesn't idle properly (with two monitors) then that is quite a showstopper.
  • Ryan Smith - Wednesday, March 31, 2010 - link

    I have a second monitor (albeit 1680) however I don't use it for anything except 3D Vision reviews. But if dual monitor power usage is going to become an issue, it may be prudent to start including that.
