The Witcher

Near the beginning of the game, our main character must lift a gate so his friends can run through before a giant praying mantis eats them (or something). Anyway, this is a FRAPS benchmark of that cut scene. We test with 2xAA at 1680x1050 and 1920x1200, while our tests at 2560x1600 are performed without antialiasing, because the game removes the option to enable AA at certain resolutions on some cards. While we understand the need to keep gamers who don't know any better from cranking everything up on crappy hardware and then complaining, it would be great if people who know what they're doing had the option of forcing specific settings.
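For reference, a FRAPS run like this boils down to logging a timestamp for every rendered frame and dividing frame count by elapsed time. Here is a minimal sketch of that arithmetic, assuming a frametimes-style CSV of cumulative milliseconds per frame; the file name and parsing details are illustrative assumptions, not our capture pipeline:

```python
import csv

def average_fps(path):
    """Average fps from a FRAPS-style frametimes log.

    Assumes a CSV with a header row followed by rows of
    'frame number, cumulative elapsed time (ms)'.
    """
    times_ms = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            if len(row) >= 2:
                times_ms.append(float(row[1]))
    # n timestamps span (n - 1) frame intervals
    elapsed_s = (times_ms[-1] - times_ms[0]) / 1000.0
    return (len(times_ms) - 1) / elapsed_s

# hypothetical usage:
# print(f"{average_fps('witcher_gate_run1.csv'):.1f} fps average")
```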

 



The 9800 GTX holds its own against the onslaught of the 4850 in The Witcher. The two cards perform essentially the same, with the 4850 pulling slightly further ahead at lower resolutions. The differences aren't large, and for all intents and purposes these cards offer the same value in this test.

114 Comments

  • christianspoer - Friday, June 20, 2008 - link

    http://www.hisdigital.com/html/product_sp.php?id=3...

    Transistor count at HIS' homepage
  • darkryft - Friday, June 20, 2008 - link

    Now this is more like it! A single-slot card at a decent price point that provides excellent performance when doubled up. I like that AMD/ATI have most certainly not caved and simply let Nvidia dictate how it works.

    Mostly, though, I like to see cards like this come out to force the price point of the ultra-high end down. With this card or the 8800GT doubled up being able to outrun the GTX 280, it pretty much invalidates a $250 surcharge on that component for me.

    Bang for the buck. Like it should be.
  • Dobs - Friday, June 20, 2008 - link

    Seeing how these new cards from nVidia and ATI (AMD) perform with the new Intel processor will be the REAL test, IMHO.
    Also wondering which 3870X2 is being used... Isn't the GDDR4 version (by PowerColor) worth a try? Along the same lines... the GDDR5 4870X2 (?) will surely run at cooler temps than a GDDR3 version... and King or not, I'll be getting one just because I know DX10.1 compatibility means so much more than just nVidia's DX10. (I only upgrade about once every 7 years, so I want to make it last.)
    Excuse my brain dump!
  • TheJian - Friday, June 20, 2008 - link

    90% of us do NOT have SLI/CF, so we're all after the best single-card solution. HardOCP shows far different results for the GTX280/260: it blows away all other cards, including the 9800GX2. They turn EVERYTHING on, though, which is what I think most would do if they spent $650 on a card (which they won't be charging the day the 4870X2 ships in 8 weeks or so). You'd crank up AA AND AF. I see no mention of AF in the benchmarks here, which hardocp.com clearly does, and they show that both the GTX260 and GTX280 can run at 2560x1600 with both AA and AF turned on, and comment that no other Nvidia cards could do this. None of the others were playable, including the 9800GX2 and 9800GTX SLI. This card was built for 24/30in monitors and dominates there. Not to mention the SLI/CF issues you don't have to deal with (stuttering, not working at all as shown here, and how many other games will two cards get NOTHING in?). You need to change your benchmarks to show PLAYABLE resolutions like hardocp. I read here and Tom's Hardware and thought the GTX260/280 sucked. Then I read over at hardocp and found a completely different story when these cards are taxed MORE and all of them hit unplayable LOWS. It was quite enlightening. Also, what happens when we hit PhysX games in the future?

    Add to this that there will be a die shrink from Nvidia on both of these cards in two months (it already taped out a few weeks back, and 12 weeks from tapeout you have cards in hand), and AMD's 4870X2 (which looks like it's available the same month) is in trouble and takes TWO slots to compete. How many of you have SLI/CF? I don't know many; that does cost more, you know. Are all of you running out to buy new boards to run CROSSFIRE? So NV will shrink the chips, likely up the GPU clock 100MHz, and charge $100-150 less for both cards. So a 4870X2 ($600? it comes with GDDR5, so not cheap) will be facing a faster GTX280 (Ultra?) than today's, likely at $500. We've already heard they'll cut prices when the 4870 hits (I figure $50 from both in the next 2-3 weeks), so another $100 after the die shrink seems doable. The 9800GTX+ is an even bigger problem for AMD, and I'm sure an ULTRA version is right behind it if the 4870 is pretty good. A die shrink on G92 should net more than 60MHz, so expect another card based on it but faster shortly, for say $259/269? It should also knock off about 20% power at load, which should come out at around or below the 8800GT. AMD has a good card, no argument there, but that all seems about to change based on pricing, die shrinks, etc. Nvidia seems to have an answer to it all.

    The only thing good here for AMD is audio. But I don't know anyone with an HTPC, and NOBODY with their regular PC hooked to their big-screen TV. On top of that, DTS 6.1-channel (generally just DolbyD 6.1, actually) is the best audio setup among people I know. I don't even know a single person with 8-channel audio for their home theater system. That's about as rare as the idea of me buying any card that's $650. It's almost pointless to brag about something none of us has, or will have without spending a shedload of money to upgrade our living rooms. You guys buying 30-50ft HDMI cables and running them through your walls? I question the quality of the audio coming from this card anyway. My Klipsch PC speakers make anything but a high-end sound card pointless. Maybe the new AMD cards do an adequate job, but I'd want to hear more about that. I'm no audio expert, though. Besides, $300 for a card just for an HTPC makes me want an Xbox 360/PS3 instead, especially since you can already do 8-channel audio with integrated chipsets (8200/G35). Do I really want to play PC games on my 61in TV? In theory, cool. But not in practice. Too many downsides. I like them fine on my 24in Dell LCD.

    So nice job AMD, but Nvidia has easy answers. Unfortunately for AMD, they can't start a price war coming off a $4 billion loss last year. Meanwhile Nvidia makes $850 mil/year, and AMD has to make $250-300 mil just to pay the interest on their $5B debt. FYI, I'm an AMD fanboi... LOL. But I admit I was lured to the dark side by Intel's E4300 (hey, it hit 3GHz easy!), and now the Xeon 3110 (3.6GHz and it runs cold! How can I pass that up?). I hope AMD starts making money soon. Intel's killing them on one side, and Nvidia is keeping them down on the other. Don't forget 2x GTX280s destroy everything. Because NVDA actually MAKES money and has NO DEBT, they can price to hurt AMD if needed. AMD can't do this. I would buy NVDA stock (buy in May-June, sell in mid-December... ROFL, 30-80% for the last 5 years... ROFL); I would not buy AMD stock :)
  • goinginstyle - Saturday, June 21, 2008 - link

    "You'd crank up AA AND AF. I see no mention of AF in the benchmarks here, which clearly hardocp.com does and they show clearly that both the GTX260 and GTX280 can run at 2560x1600 with both AA/AF turned on, and comment that all other Nvidia cards could NOT do this. None of the others were playable, including 9800GX2, and 9800GTX SLI."

    Why even worry about AA at 2560x1600? Have you tried 4xAA at 2560x1600 (probably not, since you have a 24")? It is difficult to tell any difference with AA in most games past 1920x1200, and it is near impossible to tell the difference between 8xAF and 16xAF at that resolution unless you do screen captures for a living or watch slide shows. So who cares about cranking the settings up at high resolutions? You do it at lower resolutions, where it makes a difference. Not to mention, how many people actually have 30" monitors or really want to spend more on their video card than on the processor, RAM, HD, and board combined? I have a 30" monitor and regret it for gaming; it's great for other things, though.

    "You need to change your benchmarks to show PLAYABLE resolutions like hardocp. I read here and tom's hardware and thought the GTX260/280 sucks. Then read over at hardocp and find a completely different story when taxing these cards MORE and when all these cards hit unplayable LOWS. It was quite enlightening. Also what happens when we hit physx games in the future?"

    I am sorry, but the video card reviews at H suck, although judging by all of your posts it seems as though you want to increase their hit count. Their idea of a playable resolution is a joke. Also, their benchmark results are not repeatable or comparable between video card reviews, and knowing most of their advertisers sell nothing but NV-based product, I do not trust them at all. I have not since they started falsifying results a few years ago and got caught doing it. PhysX is only good at this point for increasing 3DMark Vantage scores, which is really important for somebody like you. Anyway, I would hope a $650 card could beat a $199 card at high resolutions, and bravo to NV for designing a card that can do it while draining my bank account.

    "ut, I don't know anyone with an HTPC, and NOBODY with their regular PC hooked to their big screen TV. On top of that DTS 6.1 channel (generally just DolbyD 6.1 actually) is the best audio I know of amongst people I know. I don't even know a single person with 8ch audio for their home theater systems."

    All the people you know own 7-channel systems? I will give you a hint: 6.1 is 7 channels, dude. Standard DTS and DD are 5.1 formats that are the most widely used behind good old stereo. You need DTS-ES or DD-EX for 6.1 output in those formats (or fake it with DPL IIx or DTS Neo:6, which sucks compared to true 8-channel output). It is obvious to me by now that you are some 13-year-old with a crush on the H who actually does not know what he is talking about. Just spewing trash to up hit counts, same as what Kyle did in his us-versus-AT video thread a few months ago. So collect your $10, go buy this month's Teen Throb, and play a game of Bejeweled on your 8800GT at 4xAA/16xAF to improve your gaming experience.

    "Because NVDA actually MAKE money and have NO DEBT they can price to hurt AMD if needed. AMD can't do this."

    Why not price the GTX280 at $199 and run AMD out of the video card business if they are so smart? NV will not do it, as they hope idiots like you continue to spend $650 on a video card.

    "So nice job AMD, but Nvidia has easy answers."
    Answers like: spend more money than you need to on our products, data corruption, crappy motherboard chipsets that overheat, buggy drivers under Vista, video freezing, image quality reductions to inflate benchmark scores, LAN problems, crappy product support, viral marketing, and the classic "since we do not make a CPU, the CPU is not important". I guess they do have an answer for everything.
  • TheJian - Saturday, June 21, 2008 - link

    The point is I want to see the cards taxed more so I have a better idea of what I'll be dealing with at some point in the future. Games are getting more complicated, or have you completely missed that over the years? I don't need my Xeon 3110 to run at 4GHz, but it sure is nice to know I can when I have to in the future. You might not think it benefits us to see benchmarks with the details on high, but if I can run there and you can't, my card is faster and can handle more workload, correct? In the end the extra power is always desired. Benchmarking without showing the extra power the cards have seems kind of pointless. There are plenty of pig games like Crysis still to come, so why not see which cards can handle more detail? What do you think benchmarks are for? Did you even read my posts? I don't advocate buying one, and won't until a price drop on the die shrink, if I even buy then. I clearly stated that. I have an 8800GT and you're calling me an idiot who spends $650 on cards... LOL. I paid $199. AFAIK the 8800GT has never been $650... ROFL. Get your facts straight.

    I didn't just quote hardocp, I also quoted PC Perspective. Apparently you really can't read. Anandtech can do the same type of benchmarking to prove them wrong if they can. It won't happen. PC Perspective got the same results. You dislike hardocp, so you freaked out (did you miss the part where I said you can search their forums for me dissing them?)... LOL. If you paid attention to my post you'd see PhysX is worth more than you're claiming. GRAW2 is a prime example of how promising it is. I never mentioned 3DMark Vantage and only care about game scores. But feel free to dream up some more evidence about me if needed... ROFL. Apparently you missed the entire point of my posts: REAL PLAYABLE SCORES are important to me. Or did you miss that point? The only reason AMD isn't charging $650 for their card is that it is NOT as fast as Nvidia's. The leaders charge the most for whatever their product is. Get over it. Don't buy it if you think it's not worth it. I made no claims about whether any card was worth X dollars. Lamborghini charges more than Ford. Sorry, it's a better car, so they charge more... You'd be driving one if you had the money.

    No idea why you're giving me hints about 6.1ch audio being 7 channels... ROFL? I never said it wasn't. I didn't say all the people I know own 6.1. READ IT AGAIN... jeez. I said that's the best setup anyone I know personally has. Pay more attention to my post and stop blathering. Still foaming at the mouth over hardocp?... LOL. You go on to prove my point that 8-channel audio isn't popular (I know, that was my point... are you blind?): "Standard DTS and DD are 5.1 formats that are the most widely used behind good old stereo." I TOTALLY agree, 8-channel isn't NORMAL. Again, that was my point.

    I'm guessing Nvidia doesn't price the GTX280 at $199 because giving away their margin wouldn't please their shareholders. I don't even understand how you can make stupid comments like that while calling me an idiot in the same sentence... WTF? I don't buy $650 cards, and I actually stated that a price drop and a die shrink are needed to get me interested. I state I own an 8800GT and you accuse me of being an idiot buying $650 cards... Whatever. Nvidia has pricing power because they have the better cards. Sorry. If that weren't the case, they couldn't charge what they do. The market dictates pricing. When AMD comes out with a better product, Nvidia responds (the GTX+ and today's price cut prove my point).

    Buggy products? Everyone is guilty of that; no company has a perfect record. ATI's drivers used to be ATROCIOUS, and AMD has had tons of problems with chipsets, CPU bugs, needed patches, etc. I guess you read the news selectively. Continue to live in your dream world. If Nvidia produced a crappy product they wouldn't be king of the hill for long. The only time this isn't the case is when monopolies exist (Intel/MS). When Nvidia released the FX 5800 we ran to ATI. Every company out there inflates benchmarks when they can... even ATI... sorry. I think iPods suck, but they still sell and Apple's stock flies higher, etc. My points about "ANSWERS" to AMD's products were more financial, but I guess you missed that. Performance generally dictates pricing, which is why I think Nvidia will make more money than AMD this year :) Feel free to prove me wrong next quarter... ROFLMAO. AMD is doing a good job of running themselves out of business; Nvidia doesn't need to help them. How many quarters have they lost money for now? FYI, I'm 37, not 13. Calling someone an idiot for buying a $650 card just means you can't afford one, so you complain about the people who can and do. Don't forget our bargain cards are made on the backs of those $650 buyers. I love these people :) Don't forget you wouldn't even have a JOB without rich people creating one for you... If these $650 buyers are idiots yet can afford that card, they must be doing better than you, eh? :) Flame on.
  • sigsauer228 - Friday, June 20, 2008 - link

    What a fucking moron
  • TheJian - Friday, June 20, 2008 - link

    Spoken like a true anandtech.com lover. Quit loving websites and pay attention to the data, no matter where it comes from.

    Let me know when you can argue with the data I just put up from hardocp or pc perspective.

    It appears you have nothing.

    Now take a good look in the mirror and repeat yourself. :) Go ahead and grin about it now. You're right...LOL.
  • DerekWilson - Friday, June 20, 2008 - link

    with the exception of crysis, all our benchmarks are tested with the highest possible quality settings in game with the AA setting noted for each test.

    higher than 4xAA at 2560x1600 with a 30" panel doesn't get you a good return on the performance you lose, and we wouldn't recommend it even if you spent $1300 on 2x GTX 280 cards.

    as for AF, we set it to the highest level possible in game where there is an option. for games that don't give us the option we force 16xAF in the driver (this pertains to oblivion for this article).
  • TheJian - Friday, June 20, 2008 - link

    I haven't seen anyone testing above 4xAA at 2560x1600. The numbers still look odd, even the averages. Take COD4 at 1600x1200: your test shows 64fps for the 9800GTX vs 78fps for the 4850, while PC Perspective shows 77fps vs 79fps respectively. FAR closer. I'd really rather see minimums anyway, as those are the numbers that actually affect my gameplay the most. As you can see from the other two sites, things REALLY look different if you're examining playability instead of just average fps (the frametime sketch after this thread shows how averages can hide bad minimums).

    The 9800GTX+ has already been benchmarked and is basically dead even. In Crysis it's a victory for the old one AND the new one.
    http://www.pcper.com/article.php?aid=580&type=...
    I pray for minimums in your future articles. :)

    I wouldn't recommend anyone spending $1300 on video cards... :)
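
To put numbers behind the minimums-versus-averages argument above: a few slow frames barely move a run's average fps while cratering its minimum, and the minimum is what you feel as stutter. A minimal sketch with made-up frametimes (all numbers illustrative, not benchmark data):

```python
def fps_summary(frametimes_ms):
    """Average and minimum fps from per-frame render times in ms."""
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    min_fps = 1000.0 / max(frametimes_ms)  # set by the slowest frame
    return avg_fps, min_fps

smooth = [16.7] * 60              # a steady ~60 fps second
spiky = [10.0] * 55 + [80.0] * 5  # mostly fast frames plus five hitches

for name, run in (("smooth", smooth), ("spiky", spiky)):
    avg, low = fps_summary(run)
    print(f"{name}: {avg:.1f} fps average, {low:.1f} fps minimum")
# smooth: 59.9 fps average, 59.9 fps minimum
# spiky:  63.2 fps average, 12.5 fps minimum -- better average, worse feel
```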
