Conclusion

After a wait that stretched to nearly six months, we finally got the head-to-head we were looking for. And with the scores in mind, we are extremely pleased with the way the Mobility Radeon 9600 turned out; it seems definitely ready for the next generation of games and benchmarks. In our various benchmark runs, we were even able to roughly gauge the difference in heat emission between the Mobility Radeon 9600 and the GeForce FX Go5650. While we can’t release full results, we can state that in our Half-Life 2 benchmark runs, the Mobility Radeon 9600 generated noticeably less heat. We are still waiting for all of the battery consumption benchmarks to finish, and we will report back as soon as they are completed.

Results aside, it was a bit frustrating to see NVIDIA and ATI take so long to get these chips to market. After all, we reported on both solutions back in March, and it took nearly six months before we started to see tangible retail systems. Granted, those appeared in overseas markets first, but the main technology market is still North America.

ATI isn’t completely without fault here either, as this product announcement breaks with the precedent it set with the Mobility Radeon 9000, which was touted as the first mobile graphics chip to be announced and shipping within a week. Hopefully, we will see the next generation of mobile graphics processors (M11 and NV36M) announced much closer to their full market release. (Of the two, we have only been able to see the M11, which is definitely something to keep your eyes peeled for as we near its official announcement.) Ideally, each company’s marketing should hold off until the launch date nears, rather than jumping the gun to respond to the other.

With the GeForce4 4200 Go ultimately replaced by the Go56xx, NVIDIA is starting to head in the right direction. Power consumption and heat emissions for GeForce FX Go based notebooks have improved in many areas where the GeForce4 4200 Go fell short. However, NVIDIA still has a fair way to go to bring its mobile graphics processors up to the speed of the Mobility Radeon 9600 in many of the next-generation games on the horizon.

Valve, the developer of Half-Life 2, is the first developer to voice its displeasure with the NV3x architecture so intensely, because the architecture has forced it to write an additional codepath specifically for NVIDIA hardware, costing time, money, and extra resources. No such work was needed for ATI hardware, which is why Valve entered into an agreement with ATI. The order of events ran from existing hardware benchmark scores to a marketing agreement, not the other way around, as some have speculated.

Now, the only way for NVIDIA hardware to run reasonably well in full DX9 games, such as Half-Life 2 and AquaMark 3, is to lower several image-quality-related settings: no fog, 32-bit floating-point precision dropped to 16-bit, low dynamic range, etc. The current selection of older DX8 games may suit GeForce FX based systems (desktop and notebook) just fine, but we are on the heels of a software shift to DX9, which is why we are in the process of revising our graphics benchmark suite. One consequence of benchmarking the GeForce FX in DX8 titles is that consumers are getting used to the higher frame rates of UT2003 and Jedi Knight 2. If Valve hadn’t programmed a special codepath for NVIDIA hardware, customers would be calling technical support and ultimately returning the title (RMA issues), which would cost Valve money. This leaves both the programmer and the NVIDIA customer dissatisfied, because neither gets to see the full DX9 experience. Don’t forget that programmers are also artists, and on a separate level, it is frustrating for them to see their hard work go to waste as those high-level settings get turned off. We can’t even begin to speculate how the Go5200, also a full DX9 part, would have performed had we included it in this review.
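
To give a rough idea of what such a vendor-specific codepath involves, below is a minimal sketch, written in C++ against the Direct3D 9 API, of how an engine might pick a rendering path at startup. The function and enum names (ChooseCodePath, PATH_DX9_MIXED, and so on) are hypothetical illustrations, not Valve's actual code; the caps structure and version macro are standard D3D9.

    #include <d3d9.h>

    // Hypothetical set of rendering paths an engine might ship.
    enum CodePath { PATH_DX7, PATH_DX8, PATH_DX9_FULL, PATH_DX9_MIXED };

    CodePath ChooseCodePath(IDirect3DDevice9* device, bool preferPartialPrecision)
    {
        D3DCAPS9 caps;
        device->GetDeviceCaps(&caps);

        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        {
            // ps_2_0-capable hardware gets the full-precision DX9 path, unless
            // it runs full precision slowly (e.g. NV3x); then a mixed path that
            // substitutes 16-bit "half" floats and trims effects such as fog.
            return preferPartialPrecision ? PATH_DX9_MIXED : PATH_DX9_FULL;
        }
        if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
            return PATH_DX8;   // ps_1_1 or better: DX8-class shaders

        return PATH_DX7;       // fixed-function fallback
    }

Every extra branch like PATH_DX9_MIXED means another set of shaders to write, test, and support, which is precisely the cost Valve is objecting to.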

Update 9/17: We have finished the battery consumption runs, and we can report that there is no noticeable difference between the two mobile graphics parts in this respect. We ran both under the highest battery conservation settings (PowerPlay and PowerMizer) and the standard MobileMark settings. For NDA reasons, we cannot release the numbers, but the margin between the two results was negligible.

Comments (47)

  • Anonymous User - Monday, September 15, 2003 - link

    That 30 FPS-eye-limit rubbish always comes up in this sort of thread; I can't believe there are people who think they can't tell the difference between a game running at 30 FPS and 60 FPS.

    Anyway, I'd like to ask about the HL2 benches - you mention the 5600 is supposed to drop down a code path, but don't specifically say which one was used in the tests. DX8? Mixed? The charts say "DX 9.0", so if that was indeed used then it's interesting from a theoretical point of view but doesn't actually tell us how the game will run on such a system, since the DX8 code path is recommended by Valve for the 5200/5600.
  • Anonymous User - Monday, September 15, 2003 - link

    The "car wheels not rotating right" effect is caused by aliasing, and you'll still get that effect even if your video card is running at 2000fps.

    Besides, you're limited by your monitor's refresh rate anyhow.
  • Anonymous User - Monday, September 15, 2003 - link

    #14, that is incorrect and totally misleading. Humans can tell the difference up to about 60fps (sometimes a little more).

    Have you ever seen a movie where the car's tires don't seem to rotate right? That's because at 29.97fps you notice things like that.

  • Anonymous User - Monday, September 15, 2003 - link

    #13, unless you're not human, the human eye can't see a difference at 30fps and up. 60fps is a goal for users because at that point, even if there is a slowdown to 30fps, you can't see the difference.
  • Anonymous User - Monday, September 15, 2003 - link

    Overall, I liked the article...

    However, whilst I understand that you wanted to run everything at maximum detail to show how much faster one chipset may be than another, it would have been helpful if some lower resolution benchmarks could have been thrown in.

    After all, what good does it do you to know that chip B may perform at 30fps whilst chip A performs at 10fps if both are unplayable?

    I don't mind whether I can play a game at an astoundingly good detail level or not - I care more about whether I can play the game at all! :)

    In the end, we'd all love to be able to play all our games in glorious mega-detail looks-better-than-real-life mode at 2000fps, but it's not always possible.

    A big question should be: can I play the game at a reasonable speed with merely acceptable quality? And that's the sort of information that helps us poor consumers! :)

    Thanks for your time and a great article.
  • Sxotty - Monday, September 15, 2003 - link

    Um, do you mean floating point (FP16) or 16-bit color? As opposed to FP32 on the NV hardware; ATI's doesn't even support FP32 (which is not 32-bit color). ATI supports FP24. LOL, and the no-fog thing was just funny; that is NV's fault. It's not like fog has to be dropped; they did it to gain a tiny fraction of performance.
  • rqle - Monday, September 15, 2003 - link

    I really like this comment:

    "Don’t forget that programmers are also artists, and on a separate level, it is frustrating for them to see their hard work go to waste, as those high level settings get turned off."

    Hope future articles on graphics cards/chipsets will offer more insight into how developers may feel.
  • Anonymous User - Monday, September 15, 2003 - link

    Please note: the Warcraft benchmark was done under Direct3D. NVIDIA cards perform badly under Direct3D with Warcraft, whereas ATI does a very fine job. It's a completely different story, however, if you start Warcraft 3 with the warcraft.exe -opengl command. So please take note of that; only very few people know about this anyway. My Quadro4 700 Go GL gets about 10fps more under OpenGL compared to D3D!
  • Pete - Monday, September 15, 2003 - link

    Nice read. Actually, IIRC, UT2003 is DX7, with some DX7 shaders rewritten in DX8 for minimal performance gains. Thus, HL2 should be not only the first great DX9 benchmark, but also a nice DX8(.1) benchmark as well.
  • Anonymous User - Monday, September 15, 2003 - link

    So Valve let you guys test out Half-Life 2 on some laptops, eh? Very nice. (Great review too, well written.)
