Enter the Futuremark Games Studio

Why is it that the ideal review needs to look at performance in a large variety of applications/games? It is precisely because it is difficult (if not impossible) to predict performance without such a broad selection of performance results. If we run performance benchmarks on a dozen applications and a component comes out ahead of its competition in all 12 tests, it's reasonably safe to state that this component is going to be faster in the majority of applications. This is currently the case with Intel's Core 2 pitted against AMD's Athlon X2 — and now Phenom — processors. More often, we encounter situations where some applications perform better on one architecture and the remainder are faster on the competition. Depending on the margins of victory, and even more importantly depending on how individual users plan to use their systems, which component is "better" is a matter of perspective.
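
To make that concrete, consider how per-application results might be aggregated into an overall verdict. The sketch below is a hypothetical illustration (the application names and frame rates are invented, not from any real review): it counts outright wins and uses a geometric mean of the per-application ratios, so a single lopsided title cannot dominate the summary number.

```python
# A minimal sketch of aggregating benchmark results across applications.
# All application names and frame rates below are hypothetical.
from statistics import geometric_mean

# Average frame rates for component A and component B across a test suite.
results = {
    "app_01": (61.2, 55.8),
    "app_02": (143.0, 150.5),
    "app_03": (88.4, 79.1),
    "app_04": (47.6, 41.0),
}

# Per-application performance of A relative to B.
ratios = [a / b for a, b in results.values()]

wins = sum(r > 1.0 for r in ratios)
print(f"Component A wins {wins} of {len(ratios)} tests")

# The geometric mean weights each application's margin equally, so one
# outlier result cannot skew the overall number.
print(f"Overall relative performance: {geometric_mean(ratios):.3f}x")
```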

If Futuremark Games Studio (FGS) can remain true to their roots and release games that include useful benchmarking tools, even better. It's not that difficult to build benchmarking tools into a game that provide a very accurate overview of performance, but too few developers take the time to do so. Of course, there's a difference between benchmarking a good game and benchmarking junk. Before FGS can become relevant, they need to prove they can actually make games, and we'll have to hold off on rendering a verdict in that area for a while. Regardless, getting more titles from more developers is never a bad thing, and if the games have good graphics and use the graphics engines from the 3DMark utilities, ORB results take one large step away from being purely synthetic.
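
For reference, the core of such a tool is small. The sketch below is a hypothetical illustration of how a game could record per-frame times over a scripted run and report summary statistics; `render_frame` is a stand-in for the game's real render loop, not an API from any actual engine.

```python
# Hypothetical sketch of an in-game benchmark: time every frame during a
# scripted run, then summarize. render_frame is a stand-in for the game's
# real render loop, not a function from any actual engine.
import time

def run_benchmark(render_frame, duration_s=30.0):
    """Render frames for duration_s seconds, collecting per-frame times."""
    frame_times = []
    start = time.perf_counter()
    prev = start
    while prev - start < duration_s:
        render_frame()
        now = time.perf_counter()
        frame_times.append(now - prev)
        prev = now
    return frame_times

def report(frame_times):
    avg_fps = len(frame_times) / sum(frame_times)
    # The worst frame times capture stutter that an average hides, which is
    # why minimum-FPS and percentile numbers matter in reviews.
    p99 = sorted(frame_times)[int(len(frame_times) * 0.99)]
    print(f"Average: {avg_fps:.1f} FPS")
    print(f"99th percentile frame time: {p99 * 1000:.1f} ms")

# Demo with a fake 16 ms "frame" so the sketch runs standalone.
report(run_benchmark(lambda: time.sleep(0.016), duration_s=2.0))
```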

We're optimistic about what Futuremark Games Studio will be able to do in the gaming market, and with the resources of Futuremark behind them we will hopefully get to see creative new designs rather than cookie-cutter clones pushed out by corporate think-tanks. What we don't know (yet - we're trying to get more details and will update this article if/when they become available) is what sort of games they're planning to release, when they're planning to release them, and whether they'll be PC-exclusive or multi-platform. Given that the press release mentions "new IP", the door is wide open.

Comments

  • bmaytum - Monday, February 4, 2008 - link

    Getting back on topic (instead of nVidia vs ATI arguments....):

    Futuremark has announced their forthcoming 3DMark Vantage benchmarking tool (and stated that 3DMark06 is the end of the line for DirectX9 testing from them) here:
    http://www.futuremark.com/3dmarkvantage/

    Since 3DMark Vantage is DX10 ONLY, it requires users to run Win Vista - but I'm staying with WinXP until it's dead sometime > 2009, unless Micro$oft fixes Vista substantially. But I digress....

    So I can only HOPE that when FGS does release some game titles, they won't be DX10-only. Please!
  • kdog03 - Friday, February 1, 2008 - link

    Where is 3DMark08!!!
  • Pirks - Thursday, January 31, 2008 - link

    Just think of it! The _benchmarking_ company releasing stuff for a _console_ - FOR CONSOLE, GUYS! Can you believe a console can be benchmarked?

    I mean, seriously - what was Jarred thinking when he stated "we don't know if this is going to be PC only or multiplatform"?

    Hello? Reality check anyone?

    I know comments here can be VERY funny, but this is the first time I laughed HARD after reading the article itself. Thanks Jarred, you're the man!
  • JarredWalton - Thursday, January 31, 2008 - link

    Consoles don't need benchmarks - they're all the same. What are you going to do with a console benchmark? Find out how fast the console in question runs a specific game. Woohoo! Given how many games are multi-platform, I won't be surprised if FGS does port stuff to other platforms, but I also wouldn't be surprised if whoever does the porting ends up removing the benchmarking.

    "We benchmarked 50 different Xbox 360 machines, and discovered that unit number 42 was the fastest of the lot by 0.01%. So we recommend all of you looking at upgrading your console try for one that was manufactured around the same time as unit 42...."
  • perzy - Thursday, January 31, 2008 - link

    I'm sorry if I missed ATI/AMD's recent progress in OpenGL.
    I'm just amazed that even as an avid reader of hardware sites like AnandTech, I had to learn the hard way.
    Yes, UT99 is very old now, and it was optimized for Glide.
    But I still play it (the sniper variant), and the situation is the same in principle; the cards have just gotten so fast it doesn't matter anymore.
    There are just two APIs that matter for gaming, and two chip makers, so if one manufacturer makes cards that are substantially better than the other's on one API for years, why isn't it clearly stated?
    BTW, I'm no ATI/AMD-basher; I recently bought a used ATI X1900 GT to use for Unreal Tournament 3, which is DirectX...

    On-topic: I'm very much looking forward to Futuremark's games! With their technological skill and the usually very interesting scenarios in their tests, it should be exciting!
  • JarredWalton - Thursday, January 31, 2008 - link

    Okay, here's the past decade or so of GPUs:

    Voodoo > everything
    Voodoo2 > everything
    TNT2 comes out and is better overall than Voodoo2 (not always faster, but more flexible)
    Voodoo2 SLI still > everything
    GeForce 256 comes out; redefines 3D performance. DDR version is best
    GeForce 2 GTS continues to lead
    GeForce 3 = best in class
    GeForce 4 FTW

    So far, NVIDIA is kicking butt and chewing gum, but...

    Radeon 9800 Pro is launched and demolishes everything from NVIDIA
    5800 Ultra is a joke, and 5900 is better but still slower than the Radeon 9800 cards.
    GeForce 6800 eventually launches, going up against the X800. This series ends up being mostly a tie; NVIDIA has SM3.0, but nothing really uses it and by the time it becomes relevant everyone has upgraded.
    GeForce 7800 vs. Radeon X1800 is a tossup as well; certain areas favor one card or the other. The X1800 has one of the shortest lifetimes of any GPU, however...
    X1900/X1950 take the crown back from GeForce 7900. Some titles favor NVIDIA still, but many titles now supporting SM3.0 strongly favor ATI.
    GeForce 8800 launches and beats the tar out of everything else. NVIDIA rests on their laurels and here we are 18 months later with cards only incrementally faster than the original 8800 GTX.

    NVIDIA has held the crown a majority of the time (in my view) since the launch of the TNT2. That said, recent history saw much closer competition up until the 8800. I've got a 7900 GTX and an X1900 XT/X1950 XTX, and I can tell you that the G70 has not aged well at all. All games are still easily playable on the X1950 XTX, though not always at max details (e.g. Crysis).
  • MrKaz - Friday, February 1, 2008 - link

    Very good timeline analysis, Jarred Walton; I completely agree with it. However, there is one very important point that many websites like yours miss, which is the longevity of the card:
    http://www.gamespot.com/features/6182806/p-5.html?...

    As you can see, comparing products using “your” timeline:
    GeForce 6800 128MB: 9fps
    Radeon X800 256MB: 30fps
    The GeForce 7600 GT 256MB (16fps) even loses to the “poor” Radeon X1300 XT 256MB (22fps)
    The GeForce 7900 GS 256MB (24fps) loses to the much older Radeon X800 256MB (30fps) and to the weaker Radeon X1650 XT 256MB (32fps)

    Or compare pairs of cards that were similar at the time:
    Radeon X1950 Pro 256MB: 33fps
    GeForce 7900 GS 256MB: 17fps
    ---
    GeForce 7900 GTX 512MB: 17fps
    Radeon X1900 XT 256MB: 30fps

    There is now a huge performance difference between cards that offered similar performance in the games of their day. Newer games are much faster on old ATI cards than on old NVIDIA cards, and a difference of 2X/3X is not something to be ignored.
    I think ATI should get more credit for continuing to improve drivers on older cards, and NVIDIA the opposite.
  • Joepublic2 - Friday, February 1, 2008 - link

    That's a good point, but it's not a driver issue with the NVIDIA cards; it's the fact that DX9 shader stuff is faster on the older ATI cards.

    http://www.nordichardware.com/Reviews/?skrivelse=3...
  • JarredWalton - Friday, February 1, 2008 - link

    This is all related to what I was saying about the X1900 vs. G70 match-up. NVIDIA was faster in some games, but as more titles shifted to SM3.0 the disparity between R5xx and G7x became very large. So the 2-3 year old ATI stuff is doing much better now than the 2-3 year old NVIDIA stuff. The 8400/8600 cards aren't that great either, since they dropped to 32 or 16 SPs - though the HD 2400/2600 aren't really any better. Lucky for NVIDIA, G80 and G92 turn the tables and are faster than the AMD competition. Which is why we get stuff like the 3870X2 - not that NVIDIA won't take a similar approach soon.

    Quad-SLI was such a disaster the first time around that I'm not looking forward to seeing how quad CrossFire and SLI do with newer parts. The drivers should be better, but I'll go on record here stating that I don't think a lot of the lesser-known games will benefit. Crysis, World in Conflict, and any other super-demanding game will get profiled first; anything that doesn't regularly get benchmarked will be put on the back burner as far as Tri/Quad CF/SLI optimizations go. Hopefully AMD and NVIDIA can prove my skepticism wrong, though.
  • chizow - Wednesday, January 30, 2008 - link

    They've pushed some pretty cutting-edge visuals for years in their demos, though the art direction and design weren't always studio grade. I think another big knock on 3DMark was that its demos ran worse than games with comparable visuals released later, which further invalidated its use as a performance benchmark.

    I like 3DMark though; it definitely serves its purpose and has throughout the years. It gives a general idea of what you can expect in terms of performance, and for me it's an early indicator of whether a build is FUBAR or working within normal parameters. It's not as precise as other benchmarks out there, as the same rig can see +/- 1000 point swings in ORB on consecutive runs without changing a thing, but at the same time it's pretty easy to diagnose a potential system problem simply by knowing the GPU/CPU/3DMark score.

    I'm sure more fine-tuning could be achieved with the full version, but that would require you to pay for it (LOL!). I guess time will tell if people are actually willing to pay for any games they release.
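
As an aside, the diagnostic use chizow describes can be sketched in a few lines. The GPU/CPU pairings and score ranges below are made-up placeholders rather than real ORB data; the point is only the shape of the check:

```python
# Hypothetical sketch of the sanity check described above: compare a fresh
# 3DMark score against the range typically reported for the same GPU/CPU
# pairing. All reference numbers below are invented placeholders.
EXPECTED_RANGES = {
    ("GeForce 8800 GTX", "Core 2 Duo E6600"): (9500, 11500),
    ("Radeon HD 3870", "Athlon 64 X2 6000+"): (8000, 10000),
}

def sanity_check(gpu: str, cpu: str, score: int) -> str:
    low, high = EXPECTED_RANGES[(gpu, cpu)]
    # Run-to-run swings of +/- 1000 points are normal, so only flag scores
    # well outside the expected band as a potential misconfiguration.
    if score < low - 1000:
        return "Possible problem: check drivers, thermals, background tasks."
    if score > high + 1000:
        return "Suspiciously high: verify clocks and settings."
    return "Within normal parameters."

print(sanity_check("GeForce 8800 GTX", "Core 2 Duo E6600", 7800))
```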
