New Features, You Say? UVD and DirectX 10.1

As we mentioned, new to RV670 are UVD, PowerPlay, and DX10.1 hardware. We've covered UVD quite a bit before now, and we are happy to learn that UVD is now part of AMD's top-to-bottom product line. To recap, UVD is AMD's video decode engine, which handles decode, deinterlacing, and post-processing for video playback. Its key feature is full hardware decode support for both VC-1 and H.264. MPEG-2 decode is also supported, but the entropy decode step for MPEG-2 video is not performed in hardware. The advantage over NVIDIA hardware is the inclusion of entropy decode support for VC-1 video, but AMD tends to overplay this: VC-1 is lighter weight than H.264, and offloading its entropy decode step doesn't make or break playability even on lower end CPUs.

DirectX 10.1 is essentially an incremental release of DirectX that clarifies some functionality and adds a few features. Both AMD's and NVIDIA's DX10 hardware support some of the DX10.1 requirements, but since neither supports everything, neither can claim DX10.1 as a feature. And because DX10 did away with capability bits, game developers can't rely on any individual DX10.1 feature being implemented in DX10 hardware.

It's good to see AMD embracing DX10.1 so quickly, as it will eventually be the way of the world. The new capabilities DX10.1 enables are enhanced developer control of AA sample patterns and pixel coverage, blend modes that can be unique per render target rather than shared across all of them, doubled vertex shader inputs, required fp32 filtering, and support for cube map arrays, which can help make global illumination algorithms faster. These features might not make it into games very quickly, as we're still waiting for games that really push DX10 as it is now. But AMD is absolutely leading NVIDIA in this area.

Better Power Management

As for PowerPlay, which is usually found in mobile GPUs, AMD has opted to include broader power management support in its desktop GPUs as well. While they aren't able to wholly turn off parts of the chip, clock gating is used, as well as dynamic adjustment of core clock, memory clock, and voltages. The command buffer is monitored to determine when power saving features need to be applied. This means that when applications need the power of the GPU it will run at full speed, but when less is going on (or even when something is CPU limited) we should see power, noise, and heat characteristics improve.

One of the cool side effects of PowerPlay is that clock speeds are no longer determined by application state. On previous hardware, 3D clock speeds were only enabled when a fullscreen 3D application started, which meant GPU computing software (like Folding@home) ran only at 2D clock speeds. Since these programs will no doubt fill the command queue, they will now get full performance from the GPU. This also means that games run in a window will perform better, which should be good news to MMO players everywhere.

But like we said, dropping 55nm parts less than a year after the first 65nm hardware is a fairly aggressive schedule; it is one of the major benefits of the 3800 series and an enabler of the kind of performance this hardware is able to deliver. We asked AMD about their experience with the transition from 65nm to 55nm, and their reply was something along the lines of: "we hate to use the word flawless... but we're running on first silicon." Moving this fast seems to have surprised even AMD, but it's great when things fall in line. This terrific execution has put AMD back on level footing with NVIDIA in terms of release schedule and performance segment. Coming back from the R600 delay to hit the market in time to compete with the 8800 GT is a huge thing, and we can't stress it enough. To spoil the surprise a bit, AMD did not outperform the 8800 GT, but this schedule puts AMD back in the game. Top performance is secondary at this point to solid execution, great pricing, and high availability. Good price/performance and a higher level of competition with NVIDIA than R600 delivered will go a long way toward reestablishing AMD's position in the graphics market.

Keeping in mind that this is an RV GPU, we can expect AMD to have been working on a new R series part in conjunction with it. It remains to be seen what (if anything) this part will actually be, but hopefully we can expect something that will put AMD back in the fight for a high end graphics part.

Right now, all AMD has confirmed is a single slot, dual GPU 3800 series part slated for next year, which makes us a little nervous about the prospects for a solid high end single GPU product. But we'll have to wait and see what's in store for us when we get there.

Comments

  • Iger - Friday, November 16, 2007 - link

    Another interesting question is warranty. The main manufacturers of the 8800 GT (eVGA and XFX) give a lifetime warranty on their products - that's much more impressive than Sapphire's 1 year...
  • Odeen - Thursday, November 15, 2007 - link

    In one word, kinky.
  • Xcom1Cheetah - Thursday, November 15, 2007 - link

    At techreport they show that the 3870's power usage under full load is a full 39W less than the 8800 GT's... that's a huge difference.
    Any idea why there is such a large difference?
  • GlassHouse69 - Thursday, November 15, 2007 - link

    That makes immensely more sense than the results shown here. The 3850/70 isn't a massive leap, and the RAM requires a lot less juice to run as well (according to other articles comparing GDDR3 vs. GDDR4).
  • just4U - Thursday, November 15, 2007 - link

    First off, excellent article.

    As to the sightings... OK, here in Canada we tend to get things a little later than those in the States, BUT the local shop I deal with already has 3850/70s in stock for
    179/239 respectively.

    For some reason, even though our dollar is finally on par (actually higher than the US greenback), we still seem to be paying higher prices. Perhaps price gouging from the retail stores... (whatever). But anyhow, it's in stock here in Calgary and there's not an 8800 GT to be had... which sells at 329(ish), so yeah... I think I'll pick one up based on this article.
  • forPPP - Friday, November 16, 2007 - link


    ... 3850/70s in stock for 179/239 respectively. For some reason, even though our dollar is finally on par (actually higher than the US greenback), we still seem to be paying higher prices.

    You are extremely lucky. In Poland the 3870 is listed for $440!!! OK, that's with VAT ($360 without), but it's the same price as the 8800 GT. Well, who will buy it then??? It's a joke - same price, much slower, and more power hungry! ATI, what happened!?
  • just4U - Thursday, November 15, 2007 - link

    Mmm, OK, so I was wrong. Shops around here have them listed but don't yet have them in stock. They are expected over the next few days.
  • falacy - Thursday, November 15, 2007 - link

    This is something we should all keep in mind, given that nothing has fundamentally changed in PC computing in the last 8 years. There is still a lot of fun to be had from the plethora of older PC games, which even the lowest end hardware can play in full detail (with the exception of Unreal, which really taxes older hardware). And hey, if you're not going to complain about waiting 10 seconds for it to load, Open Office works great on low-end hardware too. Heck, even the lowest end Conroe CPUs trounce the 3.0GHz Pentium 4 line in video transcoding (and it would be interesting to see how the new Celeron 4xx series stacks up against a Pentium 4 with 512K of cache, as they are both single core...).

    I just purchased an EVGA e-GeForce 8600GT Superclocked (567MHz core, 256MB 1.5GHz GDDR3, PCI-E, dual DVI-I, HDTV out) video card for $95 CAD, which sure beats the $110 CAD that the 8500GT was priced at a couple of weeks ago. As far as usefulness goes in the $100 price segment, the 8600GT is a great buy, as it has playable graphics at 1280x1024 and 1024x768 in many games, where the 8500GT just does not.

    Hopefully now the passively cooled 8500GT models, which have smaller heatsinks and price tags than the passively cooled 8600GT, will be the standard for HD player PCs and we can all forget about the 8400 line of cards.

    It would have been nice to get one of the 3850s, but for the extra $80 it's not really worth the performance boost for people like me who are still using a 1024x768 CRT and Windows XP, playing older games, and who have perhaps gotten too old to want to chase the latest gaming craze. I do have the hardware for Vista 64-bit, but it's not worth the hassle of the side-grade when there isn't anything out there I feel compelled to play in DirectX 10. Maybe in a couple of years there will be enough DX10 titles that it will be worth upgrading the OS and monitor, rather than spending money on hardware.

    I'm running an ASUS P5K-VM with a Pentium Dual-Core E2160 and 1GB DDR2-667, which leaves my Pentium 4 531 and 1GB DDR in the dust! Apart from only supporting PCI-E 1.0, this board will stand the test of time so long as games/applications become more quad-core optimized, but for right now it's a super fast, super cheap computer compared to what I paid for the Celeron 300A-based uber-computer I had less than 10 years ago!
  • poeticmoons - Thursday, November 15, 2007 - link

    It seems like you just ran right over the fact that the 3870 is a dual slot card. Now, I know the 8800 GTS and GTX were dual slot, but the 8800 GT isn't, and I feel that is a very important factor. I don't see how you would run four dual slot GPUs in an ATX form factor case. Yes, I know the 3850 is a single slot card, but the high-memory GT isn't competing with that card; it's competing with the 3870. With a die shrink I would have just assumed that a dual slot card would be unnecessary.
  • Spoelie - Thursday, November 15, 2007 - link

    The text hints at the 3870 actually being quieter, while the slide mentions otherwise. Any data to back this up? Also, is the quieter part during idle or load, or both?
