Introduction

One of the most controversial subjects when it comes to benchmarking graphics performance is undoubtedly Futuremark - specifically their "gaming" benchmarks, the 3DMark series. For 10 years now, we have seen graphics card reviewers bicker and argue about the viability of using 3DMark. On the one hand, we have those who insist the 3DMark tools are nothing more than synthetic graphics benchmarks, encouraging heavy optimizations from the various GPU companies in order to come out on top. The other side of the equation consists of people looking for an easy way to categorize performance, plus a group of diehard benchmarkers who are in constant competition to come out on top of the ORB (Online Results Browser) charts. As with so many things in life, the reality lies somewhere in the middle.

While there are occasions where the performance metrics generated by the Futuremark tools correlate well with certain real-world games, very few people are going to be interested in purchasing hardware based solely on 3DMark performance. On the other hand, there have been many occasions throughout the history of PC gaming where users have upgraded hardware purely to improve performance in the latest and greatest game. GLQuake helped to sell thousands (millions even?) of 3dfx graphics cards, which in turn helped to kick-start our modern obsession with 3D gaming.

Take a look at the images in this article for a moment; certainly we're not the only people in the world who, when first greeted by a new 3DMark, have thought, "Daaaaamn! That is a sweet looking benchmark, and it would make an awesome game. They should turn that concept into a real game rather than a 60-second benchmark scene." If you're with us on this one, the wait may be over... sort of.

It appears that Futuremark has been secretly hard at work on their first full retail game, and while we don't have any details on what sort of game it will be or when it will launch, they have announced the formation of Futuremark Games Studio. The plans sound ambitious, judging by the following statement: "For years, our fans have been asking us when we will start making games. Very soon they are going to get it - and then some!" If we're lucky, we may end up with not just one title but numerous cutting-edge titles over the coming years.

That's the core of the announcement, but let's take a minute to discuss exactly why we think this is at all meaningful.

Comments

  • bmaytum - Monday, February 4, 2008 - link

    Getting back on topic (instead of nVidia vs ATI arguments....):

    Futuremark has announced their forthcoming 3DMark Vantage benchmarking tool (and stated that 3DMark06 is the end of the line for DirectX9 testing from them) here:
    http://www.futuremark.com/3dmarkvantage/

    Since 3DMark Vantage is DX10 ONLY, it requires users to run Win Vista - but I'm staying with WinXP until it's dead sometime > 2009, unless Micro$oft fixes Vista substantially. But I digress....

    So I can only HOPE that when FGS does release some game titles, they won't be DX10-only. Please!
  • kdog03 - Friday, February 1, 2008 - link

    Where is 3DMark08!!!
  • Pirks - Thursday, January 31, 2008 - link

    Just think of it! The _benchmarking_ company releasing stuff for a _console_ - FOR CONSOLE, GUYS! Can you believe a console can be benchmarked?

    I mean, seriously - what was Jarred thinking when he stated "we don't know if this is going to be PC only or multiplatform"

    Hello? Reality check anyone?

    I know comments here can be VERY funny, but this is the first time I laughed HARD after reading the article itself. Thanks Jarred, you're the man!
  • JarredWalton - Thursday, January 31, 2008 - link

    Consoles don't need benchmarks - they're all the same. What are you going to do with a console benchmark? Find out how fast the console in question runs a specific game. Woohoo! Given how many games are multi-platform, I won't be surprised if FGS does port stuff to other platforms, but I also wouldn't be surprised if whoever does the porting ends up removing the benchmarking.

    "We benchmarked 50 different Xbox 360 machines, and discovered that unit number 42 was the fastest of the lot by 0.01%. So we recommend all of you looking at upgrading your console try for one that was manufactured around the same time as unit 42...."
  • perzy - Thursday, January 31, 2008 - link

    I'm sorry if I missed ATI/AMD's recent progress in OpenGL.
    I'm just amazed that even as an avid reader of hardware sites like AnandTech, I had to learn the hard way.
    Yes, UT99 is very old now, and it was optimized for Glide.
    But I still play it (the sniper variant), and the situation is the same in principle, though the cards have gotten so fast it doesn't matter anymore.
    There are just two APIs that matter for gaming, and two chip makers, so if one manufacturer makes cards that for years are substantially better than the other's on one API, why isn't it clearly stated?
    BTW, I'm no ATI/AMD-basher; I recently bought a used ATI X1900 GT to use for Unreal Tournament 3, which is DirectX...

    On-topic: I'm very much looking forward to Futuremark's games! With their technological skills and the usually very interesting scenarios in their tests, it should be exciting!
  • JarredWalton - Thursday, January 31, 2008 - link

    Okay, here's the past decade or so of GPUs:

    Voodoo > everything
    Voodoo2 > everything
    TNT2 comes out and is overall better than Voodoo2 (not always faster, but more flexible overall)
    Voodoo2 SLI still > everything
    GeForce 256 comes out; redefines 3D performance. DDR version is best
    GeForce 2 GTS continues to lead
    GeForce 3 = best in class
    GeForce 4 FTW

    So far, NVIDIA is kicking butt and chewing gum, but...

    Radeon 9700 Pro is launched and demolishes everything from NVIDIA
    5800 Ultra is a joke, and 5900 is better but still slower than the Radeon 9800 cards.
    GeForce 6800 eventually launches, going up against the X800. This series ends up being mostly a tie; NVIDIA has SM3.0, but nothing really uses it and by the time it becomes relevant everyone has upgraded.
    GeForce 7800 vs. Radeon X1800 is a tossup as well; certain areas favor one card or the other. The X1800 has one of the shortest lifetimes of any GPU, however...
    X1900/X1950 take the crown back from GeForce 7900. Some titles favor NVIDIA still, but many titles now supporting SM3.0 strongly favor ATI.
    GeForce 8800 launches and beats the tar out of everything else. NVIDIA rests on their laurels and here we are 18 months later with cards only incrementally faster than the original 8800 GTX.

    NVIDIA has held the crown a majority of the time (in my view) since the launch of the TNT2. That said, recent history was much closer competition up until the 8800. I've got a 7900 GTX and an X1900 XT/X1950 XTX, and I can tell you that the G70 has not aged well at all. All games are still easily playable on the X1950 XTX, though not always at max details (e.g. Crysis).
  • MrKaz - Friday, February 1, 2008 - link

    Very good timeline analysis, Jarred Walton; I completely agree with it. However, there is one very important point that many web sites like yours miss, which is the longevity of the card:
    http://www.gamespot.com/features/6182806/p-5.html?...

    As you can see, comparing products and using “your” timeline:
    GeForce 6800 128MB 9fps
    Radeon X800 256MB 30fps
    The GeForce 7600 GT 256MB at 16fps even loses to the “poor” Radeon X1300 XT 256MB at 22fps
    The GeForce 7900 GS 256MB at 24fps loses to the much older Radeon X800 256MB at 30fps and to the weaker Radeon X1650 XT 256MB at 32fps

    These pairs of cards were similar at the time:
    Radeon X1950 Pro 256MB 33fps
    GeForce 7900 GS 256MB 17fps
    ---
    GeForce 7900 GTX 512MB 17fps
    Radeon X1900 XT 256MB 30fps

    There is a huge performance difference between cards that at the time offered similar performance in the same games. But newer games run much faster on old ATI cards than on old NVIDIA cards, and a difference of 2x/3x is not something to be ignored (rough arithmetic below).
    I think ATI should get more credit for continuing to improve drivers for older cards, and NVIDIA the opposite.
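
    For reference, a minimal Python sketch of the arithmetic behind that 2x/3x claim, using only the fps figures quoted in this comment; the pairings follow the lists above, and nothing here comes from outside this thread:

    ```python
    # Rough arithmetic behind the "2x/3x" longevity claim, using the
    # fps figures quoted in this comment (from the GameSpot link).
    pairs = {
        "GeForce 6800 128MB vs. Radeon X800 256MB": (9, 30),
        "GeForce 7600 GT 256MB vs. Radeon X1300 XT 256MB": (16, 22),
        "GeForce 7900 GS 256MB vs. Radeon X1950 Pro 256MB": (17, 33),
        "GeForce 7900 GTX 512MB vs. Radeon X1900 XT 256MB": (17, 30),
    }

    for matchup, (nvidia_fps, ati_fps) in pairs.items():
        print(f"{matchup}: ATI leads by {ati_fps / nvidia_fps:.1f}x")
    ```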
  • Joepublic2 - Friday, February 1, 2008 - link

    That's a good point, but it's not a driver issue with the NVIDIA cards; it's the fact that DX9 shader stuff is faster on the older ATI cards.

    http://www.nordichardware.com/Reviews/?skrivelse=3...
  • JarredWalton - Friday, February 1, 2008 - link

    This is all related to what I was saying about the X1900 vs. G70 match-up. NVIDIA was faster in some games, but as more titles shifted to SM3.0 the disparity between R5xx and G7x became very large. So the 2-3 year old ATI stuff is doing much better now than the 2-3 year old NVIDIA stuff. The 8400/8600 cards aren't that great either, since they dropped to 32 or 16 SPs - though the HD 2400/2600 aren't really any better. Lucky for NVIDIA, G80 and G92 turn the tables and are faster than the AMD competition. Which is why we get stuff like the 3870X2 - not that NVIDIA won't take a similar approach soon.

    Quad-SLI was such a disaster the first time around, I'm not looking forward to seeing how quad does with CrossFire and SLI with newer parts. The drivers should be better, but I'll go on record here stating that I don't think a lot of the lesser known games will benefit. Crysis, World in Conflict, and any other super-demanding game will get profiled first; anything that doesn't regularly get benchmarked will be put on the back burner as far as optimizations for Tri/Quad CF/SLI. Hopefully AMD and NVIDIA can prove my skepticism wrong, though.
  • chizow - Wednesday, January 30, 2008 - link

    They've pushed some pretty cutting edge visuals for years in their demos, even if the art direction and design weren't always studio grade. I think another big knock on 3DMark was that its demos ran worse than games with comparable visuals released later, which further invalidated its use as a performance benchmark.

    I like 3DMark though; it definitely serves its purpose and has throughout the years. It gives a general idea of what you can expect in terms of performance, and for me it's an early indicator of whether a build is FUBAR or working within normal parameters. It's not as precise as other benchmarks out there, as the same rig on consecutive runs without changing a thing can see +/- 1000 point swings in ORB, but at the same time it's pretty easy to diagnose a potential system problem by simply knowing the GPU/CPU/3DMark score (a rough sketch of that kind of check follows this comment).

    I'm sure more fine-tuning could be achieved with the full version, but that would require you to pay for it (LOL!). I guess time will tell if people are actually willing to pay for any games they release.
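
    A sanity check of the kind described above might look something like the sketch below. The reference scores and the build_looks_fubar helper are hypothetical placeholders; only the roughly +/- 1000 point run-to-run swing comes from the comment itself.

    ```python
    # Hypothetical 3DMark sanity check: compare a measured score against
    # a rough expected score for the GPU/CPU combo, allowing for the
    # ~1000-point run-to-run variance mentioned above. All reference
    # numbers below are made-up placeholders, not real ORB data.
    EXPECTED_SCORES = {
        ("GeForce 8800 GTX", "Core 2 Duo E6600"): 11000,  # placeholder
        ("Radeon HD 3870", "Core 2 Quad Q6600"): 10000,   # placeholder
    }
    RUN_VARIANCE = 1000  # typical swing between consecutive runs

    def build_looks_fubar(gpu: str, cpu: str, measured: int) -> bool:
        """Return True if the score is suspiciously far below expectations."""
        expected = EXPECTED_SCORES[(gpu, cpu)]
        # Scores more than one normal run-to-run swing below the reference
        # suggest a misconfigured or broken build rather than noise.
        return measured < expected - RUN_VARIANCE

    print(build_looks_fubar("GeForce 8800 GTX", "Core 2 Duo E6600", 8200))  # True
    ```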
