AMD: The People's GPU Maker

This week AMD came out and codified its new GPU strategy, but in reality it's the same strategy that has been in place since the release of the R600 GPU (Radeon HD 2900 XT). On paper (or LCD), it's the best idea ever: a single GPU that lets you play any game, at any resolution, with any detail settings.

Obviously no such mythical GPU exists, but the idea is that AMD will continue to target the $200 - $300 market segment with its GPU designs.

The buck doesn't stop there, though: AMD will continue to build both more and less expensive GPUs, but they will simply be derived from this one mainstream design. Again, this is nothing new; it's exactly what AMD did with R600 and RV670.

NVIDIA's approach is markedly different, as this week's GT200 launch clearly illustrates. NVIDIA continues to build a very large, monolithic GPU first, then eventually scales the architecture down to lower power and price points. The GT200 is the latest example of the large monolithic die, and subsequent mainstream parts will be based on some version of that GPU.

AMD argues that NVIDIA's approach means too long of a time to market for high-speed mainstream GPUs, and that it keeps power consumption and costs high. There is truth in what AMD is saying, but it's not the whole story.

NVIDIA could just as easily introduce a brand new architecture with a mainstream part; it simply chooses not to, as it's far easier to recoup R&D costs by selling ultra high-end, high-margin GPUs.

The power/cost argument is a valid one, but AMD's approach isn't actually any better from that standpoint:

A pair of RV770s, AMD's new GPU, ends up consuming more power than a single GT200 - despite being built on a smaller 55nm process.

A pair of these RV770s costs just $400, compared to $650 for a single GT200, but I suspect part of that difference comes down to manufacturing process. If NVIDIA hadn't been so risk-averse with the GT200 and had built it on 55nm (not that I'm advocating it, simply posing a hypothetical), the cost gap would be smaller - if not in NVIDIA's favor, since the GT200 sits on a single card.
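
To make the value comparison above concrete, here is a minimal sketch in Python. The $400 (pair of RV770 boards) and $650 (single GT200 board) prices come from the text; the relative-performance numbers are placeholder assumptions for illustration only, not benchmark results.

```python
# Minimal sketch: comparing price and (assumed) performance per dollar.
# Prices are the ones quoted in the article; performance values are
# placeholders - swap in real benchmark numbers before drawing conclusions.

def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance delivered per dollar spent."""
    return relative_perf / price_usd

rv770_pair_price = 400.0   # two RV770 boards (article's figure)
gt200_price = 650.0        # one GT200 board (article's figure)

# Hypothetical relative performance, normalized to GT200 = 1.0 (assumption).
rv770_pair_perf = 1.0      # assumed parity, purely for illustration
gt200_perf = 1.0

print(f"Price ratio (RV770 pair / GT200): {rv770_pair_price / gt200_price:.0%}")
print(f"RV770 pair: {perf_per_dollar(rv770_pair_perf, rv770_pair_price):.5f} perf/$")
print(f"GT200:      {perf_per_dollar(gt200_perf, gt200_price):.5f} perf/$")
```

Even under assumed performance parity, the price ratio alone (about 62%) shows why the mainstream-first strategy is attractive; real benchmark and power numbers would shift the exact figures.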

When the smoke clears, AMD's strategy is simply to build a GPU for the masses and attempt to scale it up and down, while NVIDIA is still building its GPUs the way it always has: starting very large and scaling down.

AMD isn't taking a radically different approach to building and designing GPUs than NVIDIA; it's simply targeting one market segment lower.

Comments

  • Final Destination II - Saturday, June 21, 2008 - link

    Btw, I have a 7600GT. I don't have to upgrade each week, because I have a 1280x1024 display and don't care at all for 2500xquadrizillion resolutions.

    Plus, you constantly keep forgetting that the HD4850 rules in quality settings.

    Anyway. There are 5 things I will never do:
    - burn my money
    - install SLI
    - install Crossfire
    - burn my money while achieving 5% more performance ("yay! that's worth it!" says the enthusiast. "what a moron" say I...)
    - install one of those ridiculous dual-chip power burners - thanks, I already got me a nice heater for the winter.

    If you check the statistics you will see that you are quite alone on your enthusiast throne, looking down on people who "only" get 85% of your performance for 33% of the money.
    In fact, I would call those the "intelligent people"...
  • superkdogg - Monday, June 23, 2008 - link

    I am similar to this guy ^^^.

    I have three display options: 12x10 19" LCD monitor, 37" 720p TV, and a VGA projector. Granted the projector kinda sucks, but there's nothing quite like using the garage door as your gaming screen.

    Since I don't have a super display (nor really want one), having a usable video card gets really, really simple. I actually still have my X800 GTO2 flashed to an X850 XT and overclocked. Laugh if you want, but other than being old and not supporting newer shader models (a big deal for some, not to me), it still puts together playable framerates in many games.

    So, now that I've explained where I come from, I can say that the 4850 has my attention. I'm never going to be the $600 graphics card guy, but being the $200 graphics card guy, turning up all the detail settings in most games, and being 'bottlenecked' at the monitor sounds good to me.
  • Final Destination II - Saturday, June 21, 2008 - link

    "Expect another price cut after all 9800GTX's are gone and every one is a GTX+"

    Great. That's the way to kick your own customers in the ass. Remember the price slide on the iPhone? That's what will happen to Nvidia. Frustrated 9800GTX buyers will realize that the precious $300 card they bought two weeks ago is worth $100 less and looks worse than ever.

    Sorry, guy - but that's no reason to buy another one. That's a reason to stick with it and feel sorry for each HD4850 that passes you.
  • Final Destination II - Saturday, June 21, 2008 - link

    55nm NV280? Could it be that you are confusing the 9800GTX+ with your personal wishes?
  • Straputsky - Saturday, June 21, 2008 - link

    Oh, I already expected that nVidia would use the 55nm process asap. But when I look at the ATI chip (about 250mm^2) and then at the nVidia chip (as you mentioned, about 400mm^2), I still wonder whether it's possible to put two of them on one board. The power consumption might be lower, but I think it's still far too high.
    Have a look at the G92b. It's shrunk, but due to the higher clock it uses more power than the old one. ATI's new X2 consumes a bit more power than one 280GTX, and I think that's the upper end of what might be possible with air cooling. If you want to use two 280GTX chips, you have to reduce power consumption by nearly 50%. I think that's more than you can get from the 55nm process alone. Maybe you could put together two 260GTX chips with lower clock speeds - but at what price tag? >400€? At that price level people tend to just buy two 280GTX cards, because price doesn't matter anymore.
    Maybe you're right, but until I see it, I won't believe it.
  • magnusr - Friday, June 20, 2008 - link

    Does it support 7.1 LPCM at 192kHz/24-bit? Is it good enough to disable onboard sound / audio cards?

    Have you tested it with receivers that support LPCM?

    Are ATI's HDCP support and drivers just as stable as Nvidia's when connecting through a receiver? I had some trouble with the old 3870, while my current 9600GT works fine.
  • madgonad - Friday, June 20, 2008 - link

    Come on Derek and Anand!

    Give us gaming/HTPC lovers some info.

    What kind of audio can you get out of a Blu-ray?
    Will games fully utilize all seven channels, or just default to stereo?
    Are any levels of EAX supported?
    Does it have the equivalent of DD Live or DTS Connect so that games will be played in 5.1/7.1?

    These are important questions that nobody has answered due to the overwhelming infatuation with Crysis scores.

    Also, how well does it upscale DVDs? In theory that teraflop of processing power could do a lot.
  • rgsaunders - Saturday, June 21, 2008 - link

    You are wasting your time asking Anandtech to do balanced reviews with something other than gaming scores. I have tried on numerous occasions over the last several years to get them to broaden their testing, but it seems all the reviewers are adolescent gamers. They still haven't figured out that gamers are a minority of users.
  • DerekWilson - Monday, June 23, 2008 - link

    We're always open to new test applications and investigative directions.

    Specific suggestions will get the best results, especially if you can point us to a reliable, repeatable, fair test for a particular real-world application.
