Our Thoughts: The GPU Side

The AMD/ATI acquisition doesn’t make a whole lot of sense on the discrete graphics side if you believe the evolution of PC graphics will continue to keep the CPU and the GPU separate.  If you look at things from another angle, one that isn’t too far-fetched we might add, the acquisition is extremely important. 

Some game developers have been predicting for quite some time that CPUs and GPUs were on a crash course and would eventually be merged into a single device.  The idea is that GPUs strive, with each generation, to become more general purpose and more programmable; in essence, with each GPU generation ATI and NVIDIA take one more step toward becoming CPU manufacturers.  Obviously the GPU is still geared towards running 3D games rather than Microsoft Word, but the idea is that at some point, the GPU will become general purpose enough that it may start encroaching into the territory of the CPU makers -- or better yet, it may become general purpose enough that AMD and Intel want to make their own.

It’s tough to say if and when this convergence between the CPU and GPU would happen, but if it did and you were in ATI’s position, you’d probably want to be allied with a CPU maker in order to have some hope of staying alive.  The 3D revolution killed off basically all giants in the graphics industry and spawned new ones, two of which we’re talking about today.  What ATI is hoping to gain from this acquisition is protection from being killed off if the CPU and GPU do go through a merger of sorts. 


The NVIDIA GeForce 256 was NVIDIA's first "GPU", offloading T&L from the CPU. Who knows what the term GPU will mean in 5 years -- will it be fully contained within today's CPUs?

ATI and NVIDIA both seem to believe that within the next 2 - 3 years, Intel will release its own GPU, one that goes well beyond its current mediocre integrated graphics.  Since Intel technically has the largest share of the graphics market thanks to its integrated graphics, it wouldn’t be too difficult for Intel to take a large chunk of the rest of the market -- assuming it can produce a good GPU.  Furthermore, if GPUs do become general purpose enough that Intel can actually leverage much of its expertise in designing general purpose processors, then the possibility of Intel producing a good GPU isn’t too far-fetched. 

If you talk to Intel, it's business as usual.  GPU design isn’t really a top priority and on the surface everything appears to be the same.  However, a lot can happen in two years -- two years ago NetBurst was still the design of the future from Intel.  Only time will tell if the doomsday scenario that the GPU makers are talking about will come true. 


  • johnsonx - Thursday, August 3, 2006 - link

    Yep, you two are both old. Older than me. Heath H8? I didn't think selling candy bars would pay for college. You actually had to build candy bars from a kit back then? Wow. ;)

    Mostly the 'kids' comment was directed at your esteemed CEO, and maybe Kubicki too (who I'm well aware is with Dailytech now), and was of course 99.9% joke. Anand may be young, but he's already accomplished a lot more than many of us ever will.
  • PrinceGaz - Wednesday, August 2, 2006 - link

    Well according to ATI's investor relations webby and also Wikipedia, they were founded in 1985 and started by making integrated-graphics chips for the likes of IBM's PCs, and by 1987 had started making discrete graphics cards (the EGA Wonder and VGA Wonder).

    Yes, they quite obviously do predate the 3D revolution by many years. VGA graphics date from 1987 and no doubt the VGA Wonder was one of the first cards supporting it. I imagine the EGA Wonder card they also made in 1987 would have had the 9-pin monitor connection you mention, as that is the EGA standard (I've never used it but that's what the Wiki says).

    All useless information today really, but a bit of history is worth knowing.
  • johnsonx - Wednesday, August 2, 2006 - link

    Yep, I stuck quite a few EGA and VGA wonder cards in 386's and 486's back then. They were great cards because they could work with any monitor. Another minor historical point: Monochrome VGA was common in those days too - better graphics ability than old Hercules Mono, but hundreds of $ less than an actual color monitor.
  • yacoub - Wednesday, August 2, 2006 - link

    Your comment should get rated up b/c you correctly state that ATI has been around for some time. Let us also not forget that NVidia bought 3dfx; 3dfx did not simply disappear. And Matrox, while mostly focused on the graphic design / CAD market with their products, has also survived their forays into the gaming market with products like the G200 and G400. Perhaps something about basing your graphics card company in Canada is the trick? :)
  • johnsonx - Wednesday, August 2, 2006 - link

    Well, 3dfx was dead. NVidia was just picking at the carcass. Matrox survives only because they make niche products for professional applications. Their 3D products (G200/G400/G450, Parhelia) were hotly anticipated at the time, but quickly fell flat (late to market, surpassed by the competition by the time they arrived, or very shortly after).
  • mattsaccount - Wednesday, August 2, 2006 - link

    >>NVIDIA also understands that dining with Intel is much like dining with the devil: the food may be great but you never know what else is cooking in the kitchen.

    The food in Intel's cafeteria is actually quite good :)
  • stevty2889 - Wednesday, August 2, 2006 - link

    Not when you work nights..it really sucks then..
  • dev0lution - Thursday, August 3, 2006 - link

    But the menu changes so often you don't get bored ;)
  • NMDante - Wednesday, August 2, 2006 - link

    Night folks get shafted with cafe times.
    That's probably why there's so many 24 hr. fast food offerings around the RR site. LOL
