
  • anandtech02148 - Sunday, February 18, 2007 - link

    What is the power consumption of this card at idle and under load?
    Less memory matters, right? A $30 cooler, a $200-300 CPU, and an overheating graphics card mean more money spent on cooling.
  • Maroth - Wednesday, February 14, 2007 - link

    The article is very good, but could you run some additional benchmarks comparing the 8800 GTS 640MB and 320MB (in Quake 4 or BF2):
    1) in different PCI Express modes (x16, x4, x1, disabled)
    2) with different "PCI-Ex Texture Memory" settings (256MB, 128MB, 0MB); was the 256MB default used in the test?
    3) with different in-game "Texture Quality" settings

    What about the "Frames to Render Ahead" driver setting? Was the default value (3) used in testing?
    Do these options impact performance on the 8800 GTS 320MB?
  • Webgod - Tuesday, February 13, 2007 - link

    Isn't it pointlessly taxing the video card with pixels so small?

    It's been a while since I've tried anything at 1600x1200 on my 19" CRT, now I use a 26" LCD TV at 1360x768. Obviously AA is my friend at that rez, but higher on up, why not just crank the anisotropic filtering to 16x or 32x and run with it??

    Isn't SLI becoming seriously less relevant once you've got a great framerate at 1600x1200 and above?

    Who's going to run anything higher than 1920x1200, much less with 4xAA forced?
  • MadAd - Thursday, February 15, 2007 - link

    I took some comparison screenshots in Battlefield 2 the other week, and 4xAA is definitely worthwhile at 1600x1200 and 1920x1200 (full aniso, everything on high). 8x is better, but not at the expense of the big performance hit; hopefully the next card I move to will get me 8x.

    The change from none to 4x has more impact than 4x to 8x, but it's definitely not pointless, and the more the better IMO. Who needs 999 fps when you can spend some of it ratcheting up the AA to look good?
  • Sunrise089 - Tuesday, February 13, 2007 - link

    As one of those in the ideal target audience of 22" widescreen displays (and I would have loved to see some 1680x1050 tests, since you guys say that's the ideal resolution) and on a reasonable budget, this card does seem interesting. (For now, I'll buy Jarred's explanation of driver issues with AA.) One thing I would love to see is numbers comparing overclocked performance between max OC'd 8800GTXs and both GTS parts. Not only am I interested in seeing how close the GTSs can get to the GTX, but I'd also love to see whether or not the difference in memory affects the memory overclock on this new model.
  • VooDooAddict - Tuesday, February 13, 2007 - link

    While I agree that NEW systems will probably be pairing the 8800GTS 320MB with a 22" 1680x1050 display, there is still a large percentage of people very happy with their 19" 1280x1024 displays. There are also people who might save a little on the display and get a 19" WS 1440x900 in favor of more RAM or a better video card.

    The large screen segment is also interested in 1280x1024. There are plenty of people running large 27"-32" LCD TVs as gaming monitors; these have usable 1280x720 or 1280x768 widescreen modes for most games. Also, the 30" LCD crowd likes to see 1280x1024 results to get an idea of how a nicely scaled 1280x800 will run. On a 30" LCD, I want World of Warcraft to run smoothly at 2560x1600 with some AF and light AA, but am happy to get smooth Oblivion at 1280x768.

    The bottom line is there are too many people looking for 1280x1024 to ignore it. ESPECIALLY with any video card products focused on price. Ignoring 1280x1024 in an 8800GTX SLI review ... I don't think anyone would fault you much there.

    I really think you need to go back and run 1280x1024.

    Also, what is up with Quake 4 "Ultra"? From what I remember with Doom 3 and Quake 4, Ultra mode was specifically for cards with 512MB or more of video RAM due to the uncompressed textures. Is there any difference in non-Ultra mode?

    Any video card review dealing with memory size needs a mention of MMOs. I can't tell you how many times I've heard people think they need to spend more $$ on a card with more video RAM for an MMO. Until I ran some benches of my own I had also convinced myself that 512MB video RAM would be better for MMOs.
  • chizow - Monday, February 12, 2007 - link

    Derek, great review as usual. Noticed you said you were going to take a closer look at the 320MB's poor performance at high resolutions, especially considering how the 256MB X1900 parts performed better with AA in some instances.

    Earlier today a Polish review was linked here on AT. What's interesting to note is that they used some utility (someone said RivaTuner) to track memory usage at each resolution. If you go through the benchmarks, they all tell the same story: system memory/page file is getting slammed on the 320MB GTS at higher resolution/AA settings.

    I'm no video card/driver expert but I'm thinking a simple driver optimization could improve the 320 GTS performance dramatically. It looks like the 8-series driver isn't correctly handling the 320 GTS' lower local memory limitations and handling its memory like a 640 or 768 GTS/GTX, so the additional requirements are being dumped into system memory, drastically decreasing performance. Considering the 256MB parts are handling high resolution/AA settings better, maybe an optimization limiting memory to local and faster caching/clearing of the local frame buffer would be the fix?

    Just a thought and maybe something to pass along to nVidia.
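    If the spill-to-system-memory theory is right, the arithmetic alone explains the performance cliff. A back-of-envelope sketch (the bandwidth figures are rough, era-appropriate assumptions, and the 100MB working set is purely hypothetical):

    ```python
    # Rough sketch of why spilling textures into system memory hurts so much.
    # Assumed bandwidths (not measurements):
    #   GDDR3 on the 8800 GTS 320MB: ~64 GB/s local memory bandwidth
    #   PCI Express x16 (1.x):        ~4 GB/s to system memory

    def fetch_time_ms(megabytes, gb_per_s):
        """Time to stream a working set of the given size at the given bandwidth."""
        return megabytes / 1024 / gb_per_s * 1000

    overflow_mb = 100  # hypothetical per-frame texture traffic that no longer fits in VRAM
    print(f"from VRAM: {fetch_time_ms(overflow_mb, 64):.2f} ms")
    print(f"over PCIe: {fetch_time_ms(overflow_mb, 4):.2f} ms")
    # Even a modest spill adds tens of milliseconds per frame when it has to
    # come over PCIe, which is the difference between smooth and a slideshow.
    ```
    
    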
  • kilkennycat - Monday, February 12, 2007 - link

    Again, NVIDIA accomplishes a genuine hard launch. Seems as if AMD/ATI will have a lot to live up to with their DX10 graphics card releases.


    And comparison sampling from the above list:

    The EVGA Superclocked 576/1700 8800GTS/640 is $379.99 ( with Dark Messiah and after a $30 rebate thru 3/31)

    and for comparison the
    EVGA (superclocked) 576/1700 8800GTS/320 is $319.99 with same bundle but no $30 rebate.
  • LoneWolf15 - Monday, February 12, 2007 - link

    At the same time, the inconsistency in G80 driver quality is something to be noted when buying a G80 (note: I own an 8800GTS, and can speak to this).

    This inconsistency is something that seems to have been largely glossed over by hardware review sites, so a number of purchasers have had some disappointments from games with texture corruption or that weren't/aren't well supported. nVidia's slowness in making their cards as "Vista Ready" as they advertised them to be is something to consider too.

    Don't get me wrong, I like my card. I just think that when looking at ATI vs. nVidia, one needs to look objectively. I've owned 7 ATI-GPU cards and 7 nVidia-GPU cards (not including all the other ones) since 1992, and both have had their ups and their downs.
  • kilkennycat - Monday, February 12, 2007 - link

    Remember that the 8800 is a brand-new architecture. Would you like to exchange your 8800 for my 7800GTX? You are suffering the pain of being an early adopter. ATI/AMD still has to go through the same pain with the R600, including having to handle the new driver interfaces in Vista, plus juggle the actual drivers themselves for both DX10 and DX9.
  • LoneWolf15 - Tuesday, February 13, 2007 - link

    I'm not running Vista (other than RC2 on a beta-testing box, my primary is XP), actually, just making a point. (note: I bought the 8800 not because I had to have the latest and greatest, but because the 8800GTS offered reasonable performance at 1920x1200).

    But I will say like I've said to everyone else claiming "early adoption! early adoption!" that nVidia released the G80 last fall. Early November, actually. And when they did, the G80 cards from multiple vendors listed prominently on their boxes "Vista Ready", "DX10 Compatible" "Runs with Vista", and so on. This includes nVidia, who, up until last week (think the backlash had something to do with it?) posted prominently on their website: "nVidia: Essential for the best Windows Vista Experience".

    We know that at this time those slogans are bollocks. SLI doesn't work right in Vista. There are lots of G80 driver bugs that might have been worked out during Vista's long gestation period from Beta 2 through all the release candidates (consider that nVidia claims to have 5 years of development in the G80 GPU, so one could argue they've had the time). The truth of the matter is that nVidia shouldn't have made promises about compatibility unless they already knew they had working drivers. They didn't, and failed to get non-beta drivers out at the time of Windows Vista's release, breaking the promise on every single one of those boxes. The G80 is not currently Vista ready, and there are still some driver bugs even on XP; for a long list, check out 's forums.

    I believe that when you make a promise (and I believe nVidia did with their marketing of the card) you should make good on it. While I don't plan to adopt Vista for some time to come (maybe not even this year), that doesn't mean I'll give nVidia a free pass for failing to deliver on a promise they've made. Your post seems to insinuate I'm whining and pouting about this, and it also indicates why nVidia gets away with not keeping promises: for every one person that complains, there are two more that beat him/her up for being an early adopter, or not having a properly configured system, or for running Windows Vista.
  • maevinj - Monday, February 12, 2007 - link

    I'll trade you my 6800GT, since you're having driver issues with your 8800.
  • SonicIce - Monday, February 12, 2007 - link

    How can the 8800 GTS 320MB be slower in BF2 than a 7900 GTX or 7950 GT when it has more capabilities and memory? Isn't it better in every aspect (including memory), so how could it possibly lose?
  • coldpower27 - Wednesday, February 14, 2007 - link

    It looks like the G80 memory optimizations are currently in their infancy. G80's memory size performance is the strangest I have seen in a long while for Nvidia hardware.
  • lopri - Monday, February 12, 2007 - link

    While I do understand the need to have consistency in the graph (especially with G7X series cards), everyone knows that Oblivion can be played with AA enabled on G80. (It's absolutely beautiful) HDR+AA is one of the main selling points of G80 and it's surprising that this feature hasn't been tested a little more. I'm quite confident to say the frame buffer would make a bigger difference with AA enabled in Oblivion.

    Also curious: why was Company of Heroes not tested? This game is, by far, the most memory-hogging game I've ever seen (to the extent that I suspect a memory leak bug). Company of Heroes at 1920x1200/4xAA/16xAF and up will use up to ~1GB of GRAPHICS MEMORY.
  • Avalon - Monday, February 12, 2007 - link

    I agree with Munky here. I'm completely surprised that the card's horrible AA performance was just simply glossed over. With AA on in most of those games, the card couldn't even outpace a 7900GTX (once or twice the X1950Pro). That's pathetic. I don't know if it's the drivers, or just the inherent nature of the architecture, but it seems like this card can't stand up there with less than 640MB of RAM.

    What gamers are going to pay $300 for a card and NOT use AA? I think there was only one game the card could handle with AA without choking, and if that's all you play... meh. Otherwise, this card doesn't seem worth it.
  • LoneWolf15 - Monday, February 12, 2007 - link

    Your comment is all the more astute when you take in the fact that current 8800GTS 320MB cards are running $299 online, and (admittedly with some instant rebate magic) 640MB cards can be had for around $365.

    If the price drops on the 320MB, I think it'll be a steal of a card, and it's certainly a bonus option for OEMs like Dell. But a $65 difference for double the RAM doesn't seem like a lot extra to pay.
  • JarredWalton - Monday, February 12, 2007 - link

    I truly think there's a lot to be done in the drivers right now. I think NVIDIA focused first on 8800 GTX, with GTS being something of an afterthought. However, GTS was close enough in design that it didn't matter all that much. Now, with only 320MB of RAM, the new GTS cards are probably still running optimizations designed for a 768MB card, so they're not swapping textures when they should or something. Just a hunch, but I really see no reason for the 320MB card to be slower than 256MB cards in several tests, other than poor drivers.

    I don't think most of this would have occurred were it not for the combination of G80 + Vista + 64-bit driver support all coming at around the same time. NVIDIA could have managed any one of those items individually, but with all three something had to give. 64-bit drivers are only a bit different, but still time was taken away to work on that instead of just getting the drivers done. G80 already required a complete driver rewrite, it seems, and couple that with Vista requiring another rewrite and you get at least four driver sets currently being maintained. (G70 and earlier for XP and Vista, G80 for XP and Vista - plus 64-bit versions as well!)

    The real question now is how long it will take for NVIDIA to catch up with the backlog of fixes/optimizations that need to be done. Or more importantly, will they ever truly catch up, since we have lesser G8x chips coming out in the next few months most likely? I wouldn't be surprised to see GTS/320 performance finally get up to snuff when the new midrange cards launch, just because NVIDIA will probably be spending more time optimizing for RAM on those models.
  • code255 - Monday, February 12, 2007 - link

    I wonder how big the difference between the normal GTS and the GTS 320 will be in Enemy Territory: Quake Wars. That game's engine's MegaTexture technology will probably need sh*tloads of GPU memory if one wants to play at Ultra texture quality.

    Btw, in Doom 3 / Quake 4, how big's the difference in memory requirement between High and Ultra quality? I heard that the textures in High mode look basically just as good as in Ultra.
  • OvErHeAtInG - Tuesday, February 13, 2007 - link

    As someone who still plays Quake 4 - This is something that has bothered me in the last dozen AT GPU reviews so I guess I'll gripe here.

    What's with testing Quake 4 in Ultra mode? It's cool for academic purposes, but in the real world there is *NO* difference in visual quality between Ultra quality and High quality mode, whereas the memory footprint is approximately 3 times larger in Ultra mode, severely crippling cards with less than 512MB.

    Not that it's a huge deal. I've used a 512MB card for almost a year now, and run Quake 4 in High quality mode. Why? It's far more efficient! Anyone who plays this game is going to prefer FPS over IQ, especially IQ that is more theoretical than noticeable.

    You even allude to it in this article:
    <q>The visual quality difference between High and Ultra quality in Quake 4 is very small for the performance impact it has in this case, so Quake 4 (or other Doom 3 engine game) fans who purchase the 8800 GTS 320MB may wish to avoid Ultra mode.</q>
    Um, understatement?
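    For a sense of scale, here's a rough sketch of how much uncompressed textures balloon memory use. The DXT block sizes below are the standard ones; the 2048x2048 texture is just an illustrative example, not taken from the game's actual assets:

    ```python
    # Approximate VRAM cost of one texture, uncompressed vs. DXT-compressed.
    # DXT1/DXT5 encode 4x4 pixel blocks in 8 and 16 bytes respectively,
    # giving fixed 8:1 and 4:1 ratios against 32-bit RGBA.

    def texture_bytes(width, height, fmt):
        """Approximate footprint of a single mip level in the given format."""
        if fmt == "RGBA8":   # uncompressed, 4 bytes per pixel
            return width * height * 4
        if fmt == "DXT1":    # 8 bytes per 4x4 block -> 0.5 byte per pixel
            return width * height // 2
        if fmt == "DXT5":    # 16 bytes per 4x4 block -> 1 byte per pixel
            return width * height
        raise ValueError(f"unknown format: {fmt}")

    mb = 1024 * 1024
    for fmt in ("DXT1", "DXT5", "RGBA8"):
        print(f"2048x2048 {fmt}: {texture_bytes(2048, 2048, fmt) / mb:.1f} MB")
    ```

    So each uncompressed texture costs 4-8x what its compressed version does, which is why a mode that skips compression can push a game past what a sub-512MB card holds.
    
    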
  • nicolasb - Monday, February 12, 2007 - link

    The conclusion to this article:


    Based on the games and settings we tested, we feel very confident in recommending the NVIDIA GeForce 8800 GTS 320MB to gamers who run at 1920x1200 or less. With or without AA, at these resolutions games look good and play well on the new part.

    This conclusion does not seem to bear much resemblance to the actual observations. In virtually every case the card performed well without AA, but dismally as soon as 4xAA was switched on. A fair conclusion would be to recommend the card for resolutions up to 1920x1200 without AA, but definitely not with.
  • DerekWilson - Monday, February 12, 2007 - link

    The GTS 320MB still performs well if taken on its own at 19x12 with 4xAA ... But I will modify the comment to better reflect what I mean.
  • nicolasb - Tuesday, February 13, 2007 - link

    The way the conclusion now reads is a big improvement, IMNSHO. :-)
  • munky - Monday, February 12, 2007 - link

    I was expecting better performance with AA enabled, and the article just glossed over the fact that in half the games with AA the card performed on par with or worse than last-gen cards that cost less.
  • Bob Markinson - Monday, February 12, 2007 - link

    For the base Oblivion install, yes, it's not so much of a memory hog. In-game texture use usually doesn't exceed 256 MB with HDR and 4xAA on @ 1152x864. (Also, please test AA perf too with HDR, both ATI and Nvidia do support it on their current gen cards at the same time.)
    Most popular texture mods will bring up the memory usage north of 500 MB. I've seen it hit over 700 MB. Thus, there's a good chance that any 256 MB card would be crippled with texture swapping. I should know, mine is.
  • DerekWilson - Monday, February 12, 2007 - link

    What texture mod would you recommend we test with?
  • Bob Markinson - Monday, February 12, 2007 - link

    Qarl's Texture Pack 2 and 3 are quite popular world texture packs.

    Note that version 3 really does need a lot of texture memory. Also, check out Qarl's 4096 compressed landscape LOD normal map texture pack, it'll add far more depth than the plain, overly filtered Oblivion LOD textures.
  • DerekWilson - Monday, February 12, 2007 - link

    We will take a look at those texture packs and do some testing ...

    Hopefully we can provide a follow up further exploring the impact of memory on the 8800 architecture.
  • blackbrrd - Monday, February 12, 2007 - link

    I looked at the Oblivion scores, and the first thing that hit me was: they are using the standard crappy looking textures!

    No oblivion fan running a 8800gts would run with the standard texture pack. It is, at times, really really bad.

    Running a texture pack like the one above is quite normal. If you have enough video card memory there isn't much of a slowdown, except when the data is loaded into memory, which happens all the time... It does make the game look nicer though!
  • tacoburrito - Monday, February 12, 2007 - link

    With all the eye candy turned on, the 320MB card seems to be only on par with the previous-gen 79xx cards, but costs almost twice as much. I'd much rather cough up the extra $200 and get the full GTS version.
  • DerekWilson - Monday, February 12, 2007 - link

    Actually, the 320MB card blows away the 7 series in our tests. Why would you say that it's only on par? At 16x12, the 8800 GTS 320MB is 60% faster, and the difference in performance only gets larger from there.
  • tacoburrito - Monday, February 12, 2007 - link

    With the exception of Half Life 2, at 4xAA, wouldn't you say that the 8800 GTS 320 is only marginally better than the 7950 GT, but costs twice as much?
  • tacoburrito - Monday, February 12, 2007 - link

    Whoops, I meant to say 7900 GTX.
  • DerekWilson - Monday, February 12, 2007 - link

    From the context of the thread, I assumed you were talking about Oblivion.

    Without AA, the 8800 320MB is much better than the 7900 GTX. With AA, there is an argument to be made, but the price of the 7900 GTX (as Jarred pointed out) is higher.

  • JarredWalton - Monday, February 12, 2007 - link

    I'd be very curious to find out where you're seeing 7900 GTX cards for "half the price". I don't see any in stock when taking a quick look at major resellers, and our Pricing Engine confirms that. I'm pretty sure the 7900 GTX is discontinued now, and prices never got below $400.
  • Wwhat - Monday, February 12, 2007 - link

    It still remains to be seen how DX10 games (or future OpenGL games that use geometry shaders?) run on the various incarnations of the new cards. You should have put that in the conclusion as a caveat; it's not just textures anymore, you know.

    I don't think there's anything at all currently that uses geometry shaders. You wonder why some developer doesn't throw together a quick test utility; billions of people on the planet and nobody can make that little effort? Geez.
    Surely someone at Crytek or id or somewhere can write a small looping thing with a frame counter? Anand should send out some mails, get someone on his feet.

  • DerekWilson - Monday, February 12, 2007 - link

    There are some dx10 sample apps that make use of geometry shaders ... I've been working on testing these, but it is more difficult than it may seem as FRAPS has trouble with DX10 apps.

    You do have a point though -- DX10 performance will be important. The problem is that we can't really make a recommendation based on DX10 performance.

    The 8 series parts do have more value than the 7 series and x1k series parts in that they support DX10. But this is as far as we can take it. Performance in the games we have does matter, and it is much more prudent to make a purchase only based on the information we know.

    Sure, if the cost and performance of an 8 series part is the same or very near some DX9 class hardware, the features and DX10 support are there to recommend it over the competition. But it's hard to really use this information in any other capacity without knowing how good their DX10 support really is.
  • Awax - Monday, February 12, 2007 - link

    The main point for me is the low impact of memory size on modern games.

    On previous generation game, like Quake4, developers had to use a lot of high resolution texture/bump map/lookup map to achieve advanced effect with the limited capacity in raw performances and flexibility of the cards available.

    With DX9 and more in DX10, the new way is to _CALCULATE_ things completely instead of having them interpolated with tricks using intermediary results or already computed lookup tables stored in textures.
  • DerekWilson - Monday, February 12, 2007 - link

    But new ways to calculate things will also benefit from having huge amounts of data to calculate things from.

    It's really hard to speculate on the direction DX10 games will take at this point. Certainly we will see more use of programmable features and a heavier impact on processing power. But memory usage will also increase. We'll just have to wait and see what happens.
  • Marlin1975 - Monday, February 12, 2007 - link

    What's up with all the super high resolutions? Most people are running 19-inch LCDs at 1280x1024. How about some comparisons at that resolution?
  • poohbear - Tuesday, February 13, 2007 - link

    Have to agree here; most people game at 1280x1024, especially people looking at this price segment. Once I saw the graphs started at 1600x1200 and up, I didn't even think the results applied to me. CPU influence isn't really a factor except at 1024x768 and below; I've seen plenty of graphs demonstrating that with Oblivion. A faster CPU didn't show any difference until you tested at 1024x768, and at 1280x1024 there wasn't much difference between an Athlon 64 3500+ and an FX-60 (maybe 5-10 fps). Please try to include at least 1280x1024 for those of us gaming at that resolution. :) Thanks for a good review nonetheless. :)
  • DigitalFreak - Monday, February 12, 2007 - link

    WHY didn't you test at 640x480? Waaahhh
  • DerekWilson - Monday, February 12, 2007 - link

    Performance at 1280x1024 is pretty easily extrapolated in most cases. Not to mention CPU limited in more than one of these games.

    The reason we tested at 1600x1200 and up is because that's where you start to see real differences. Yes, there are games that are taxing on cards at 12x10, but both Oblivion and R6:Vegas show no difference in performance in any of our tests.

    12x10 with huge levels of AA could be interesting in some of these cases, but we've also only had the card since late last week. Even though we'd love to test absolutely everything, if we don't narrow down tests we would never get reviews up.
  • aka1nas - Monday, February 12, 2007 - link

    I completely understand your position, Derek. However, as this is a mid-range card, wouldn't it make sense not to assume that anyone looking at it will be using a monitor capable of 16x12 or higher? Realistically, people who are willing to drop that much on a display would probably be looking at the GTX (or two of them) rather than this card. The lower widescreen resolutions are pretty reasonable to show now, as those are starting to become more common and affordable, but 16x12 or 19x12 capable displays are still pretty expensive.
  • JarredWalton - Monday, February 12, 2007 - link

    1680x1050 displays are about $300, and I think they represent the best fit for a $300 (or less) GPU. 1680x1050 performance is also going to be very close - within 10% - of 1600x1200 results. For some reason, quite a few games run a bit slower in WS modes, so the net result is that 1680x1050 is usually within 3-5% of 1600x1200 performance. Lower than that, and I think people are going to be looking at midrange CPUs costing $200 or so, and at that point the CPU is definitely going to limit performance unless you want to crank up AA.
  • maevinj - Monday, February 12, 2007 - link

    Exactly. I want to know how this card compares to a 6800GT as well. I run 1280x1024 on a 19" monitor with a 6800GT, and need to know if it's worth spending 300 bucks to upgrade or just wait.
  • DerekWilson - Monday, February 12, 2007 - link

    The 8800 GTS is much more powerful than the 6800 GT ... but at 12x10, you'll probably be so CPU limited that you won't get as much benefit out of the card as you would like.

    This is especially true if also running a much slower processor than our X6800. Performance of the card will be very limited.

    If DX10 and all the 8 series features are what you want, you're best off waiting. There aren't any games or apps that take any real advantage of these features yet, and NVIDIA will be coming out with DX10 parts suitable for slower systems or people on more of a budget.
  • aka1nas - Monday, February 12, 2007 - link

    It would be nice to actually have quantifiable proof of that, though. 1280x1024 is going to be the most common gamer resolution for at least another year or two, until the larger panels come down in price a bit more. I for one would like to know if I should bother upgrading to an 8800GTS from my X1900XT 512MB, but it's already getting hard to find direct comparisons.
  • Souka - Monday, February 12, 2007 - link

    It's faster than a 6800GT... what else do you want to know?


  • A5 - Monday, February 12, 2007 - link

    People with a 19" monitor aren't going to drop $300+ on a video card. You can get an X1950 Pro for $175 that can handle 1280x1024 in pretty much every game out today.
  • jsmithy2007 - Monday, February 12, 2007 - link

    Are you high? I know plenty of people with 19" and 21" CRTs who use latest-gen GPUs. These people are typically called "gamers" or "enthusiasts"; perhaps you've heard of these terms. Even at moderate resolutions (1280x1024, 1600x1200), running a game like Oblivion with all the eye candy turned on really does require a higher-end GPU. Hell, I need two 7800GTXs in SLI to just barely play with max settings at 1280x1024 with 2xAA. Granted, my GPUs are getting a little long in the tooth, but the point is still the same.
  • Omega215D - Monday, February 12, 2007 - link

    Yes, but the X1950 Pro doesn't do DirectX 10, and hopefully, with the new unified shader architecture, the 8800GTS won't be too obsolete once the majority of shipping games are DX10.

    I run a widescreen 19" monitor at 1440x900. For some reason, my card could run games fine at 1280x1024, but now games have become a little choppy at this resolution even though the pixel count is lower... any idea why?
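    (For reference, 1440x900 really is fewer pixels than 1280x1024; a quick check of the arithmetic, with nothing card-specific assumed:)

    ```python
    # Compare raw pixel counts of the two resolutions in question.
    for w, h in [(1280, 1024), (1440, 900)]:
        print(f"{w}x{h}: {w * h:,} pixels")
    # 1280x1024 comes to 1,310,720 pixels and 1440x900 to 1,296,000,
    # so the widescreen mode is actually ~1% lighter in raw pixel load.
    # Any choppiness is therefore not the pixel count itself but how the
    # game and driver handle the mode.
    ```
    
    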
  • DerekWilson - Monday, February 12, 2007 - link

    Non-standard resolutions can sometimes have an impact on performance depending on the hardware, game, and driver combination.

    As far as DX10 goes, gamers who run 12x10 are best off waiting to upgrade to new hardware.

    There will be parts that will perform very well at 12x10 while costing much less than $300 and providing DX10 support from both AMD and NVIDIA at some point in the future. At this very moment, DX10 doesn't matter that much, and dropping all that money on a card that won't provide any real benefit without a larger monitor or some games that really take advantage of the advanced features just isn't something we can recommend.
