Let's talk Compilers...

Creating the perfect compiler is one of the more difficult problems in computing. Core compiler optimization problems such as instruction scheduling and register allocation are NP-complete, so we can't "solve" them optimally in the general case. Compounding the issue is that the best compiled code comes from a compiler written specifically for a certain processor, one that knows it inside and out. If we use a standard compiler to produce generic x86 code, our program will run much slower than if we tell the compiler we have a P4 with SSE2 and all the goodies that go along with it. I know this all seems pretty obvious, but allow me to illustrate a little.
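To make that concrete, here is a minimal sketch using GCC flag names (the principle, not the exact toolchain anyone was using back then; file and binary names are made up for illustration). Telling the compiler exactly which processor it is targeting changes the code it emits:

```shell
# -march tells the compiler which pipeline to schedule for and which
# instruction-set extensions (like SSE2) it is allowed to emit.
CFLAGS_GENERIC="-O2"                    # portable baseline x86 code
CFLAGS_P4="-O2 -march=pentium4 -msse2"  # schedule for the P4, allow SSE2

# Same source, compiled two ways. The P4 build can be much faster on a
# P4 -- and may not run at all on an older chip without SSE2.
echo "gcc $CFLAGS_GENERIC -o engine_generic engine.c"
echo "gcc $CFLAGS_P4 -o engine_p4 engine.c"
```

The trade-off is exactly the one described above: the generic build runs everywhere, while the tuned build trades portability for speed on one specific processor.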

Since I've always been interested in 3D graphics, back in 1998 I decided to write a 3D engine with a friend of mine for a project in our C++ class. It only did software rendering, but we implemented a software z-buffer and did back-face culling with flat shading. Back then, my dad had a top-of-the-line PII 300, and I had acquired an AMD K6 200. Using a regular Borland C++ compiler with no real optimizations turned on, our little software 3D engine ran faster on my K6 than it did on my dad's PII. Honestly, I have no idea why that happened. But the point is that the standard output of the compiler ran faster on my slower platform, even though both systems were producing the same output. Now, if I had had a compiler from Intel that was optimized for the PII and knew what it was doing (or if I had hand coded the program in assembly for the PII), my code could have run insanely faster on my dad's box.

So, there are some really important points here. Intel and AMD processors were built around the same ISA (Instruction Set Architecture) and had a great deal in common back in 1998. Yet performance favored the underpowered machine in my test. When you look at ATI and NVIDIA, their GPUs are completely and totally different. Sure, they both have to be able to run OpenGL and DirectX 9, but this just means their drivers map OpenGL or DX9 function calls to specific hardware routines (or even multiple hardware operations if necessary). It just so happens that the default Microsoft compiler generates code that runs faster on ATI's hardware than on NVIDIA's.

NVIDIA's current solution is to sit down with developers and help hand code shaders to run better on its hardware. Obviously this is an inelegant solution, and it has caused quite a few problems (*cough* Valve *cough*). NVIDIA's goal is to eliminate this extended development effort via its compiler technology.

Obviously, if NVIDIA starts "optimizing" its compiler to the point where the hardware is doing things not intended by the developer, we have a problem. I think it's very necessary to keep an eye on this, but it's helpful to remember that such things are not advantageous to NVIDIA. Over at Beyond3D, there is a comparison of the different compiler options (DX9 HLSL and NVIDIA's Cg) for NVIDIA's shaders.

We didn't have time to delve into comparisons with the reference rasterizer for this article, but our visual inspections confirm Beyond3D's findings. Since going from the game code to the screen is what this is all about, as long as image quality remains pristine, we think using the Cg compiler makes perfect sense. It is important to know that the Cg compiler doesn't improve performance (except for a marginal gain while using AA), but it does improve image quality considerably over the 45.xx Detonators.

  • Anonymous User - Wednesday, October 8, 2003 - link

    #67 I think the lightsaber glow is horrible on the Nvidia cards. The glow shines THROUGH the player's head. Looks to me like a bug. I like the ATI saber much better.
    (Most people's heads aren't empty, so light does not shine through. Maybe your experience is different? ;-)))

    #76 Couldn't agree more. The blurry AA in Aquamark is crystal clear even in those tiny images. So how could the authors possibly miss that and proclaim that there are no IQ issues? Especially since they have looked at the fullscreen images and spent days on the article?

    Also you can immediately see in all the small images that in general AA is better on the ATI card. This is nothing new, and not considered cheating by Nvidia. It's just that most know that there is a quality difference.
    But shouldn't that at least be mentioned in an article that is focused on image quality?

    Why no screenshots on splinter cell? We should just believe the authors on that? With the aquamark pictures they have shown that we can't take their word for it. So I'd really like to see those screenshots too. Same for EVE.

    And I was really surprised that they didn't know that the water issue in NWN was NOT ATI's fault. They claim that they have surfed forums on NWN issues. In that case they should have known that. (One look at Rage3D would have been enough.)

    And on top of this the TRAOD part. It seems they typed more text on TRAOD than they did in the entire rest of the article. No wonder that people frown at the TRAOD part.

    All in all, I can see that much work went into the article, but I feel that it could have been much better.
    As it is now it is left to the reader to find the image issues in the small pictures. But I would expect the author to point me to the image issues.
  • Anonymous User - Wednesday, October 8, 2003 - link

    #74, conclusions are one thing, objective journalism is another.
    There are clear differences in even the small and relatively badly chosen images posted with the article, yet all we get to read is "there are no IQ issues".

    Thus, either the authors of the article are not competent enough (maybe they were simply too tired after the testing...), or they are intentionally ignoring the differences.
  • Iger - Wednesday, October 8, 2003 - link

    I just can't stand aside and not thank the authors. The job they've done in this article is amazing, and the site was and will be my all-time favourite! Thank you! :)
  • Malichite - Wednesday, October 8, 2003 - link

    I am extremely confused by the posts here. Many ATI guys seem to think AT unfairly favored the nVidia cards. Did we read the same article? In the end I came away with the opinion that while the new Det 52.xx drivers help and things may get better for nVidia, ATI is still the better choice today. Did I miss something?

    Additionally, for all the guys claiming TR:AOD is a great game: yeah, we all know only the truly *great* games pull a 51% rating over on www.gamerankings.com (based on 21 media reviews).
  • Anonymous User - Wednesday, October 8, 2003 - link

    Just what kind of world do we live in where a guy has to explain why he's not a fanboy before he can express his opinion, anyway? The worst part is, you people who do this are completely justified in your actions, because if you don't explain why you're not an ATi/nVidia fanboy, then people call you one.

    God.. can't we argue without calling others fanboys for once?
  • Anonymous User - Wednesday, October 8, 2003 - link

    i forgot to add the j/k part... i don't want you taking my poor attempt at humor the wrong way... ;)

    anyways, i don't know what all the commotion is about.. shouldn't u (ATI-folk) be happy that nvidia is making vast improvements?

    i would feel sympathetic for people who THOUGHT they had wasted $400+ on a card that didn't seem to deliver the performance it promised...
  • Anonymous User - Wednesday, October 8, 2003 - link

    What kind of biased, crappy, unprofessional review shows percentage drops for enabling PS 2.0 without showing framerates? If fps are around 30 to begin with, the percentage drop makes no difference because the game is rendered unplayable! And who benchmarks beta drivers not available to the public on hardware not yet announced? This reeks of a $ payoff, and it seems like Anandtech has thrown its integrity to waste. I wish that on the 10th, when nVidia announces the NV38, they would also release these drivers to the public so that some serious review site can actually test the hardware (and software; forgive my skepticism, but nVidia sure earned it this past year) and show us what nVidia is bringing to the graphics field.
    Disappointed by nVidia and now by Anandtech.
  • Anonymous User - Wednesday, October 8, 2003 - link

    sure u do... ;)
  • Anonymous User - Wednesday, October 8, 2003 - link

    Not everyone talking about IQ differences here is a fanboy.

    Look at the images at the bottom of the Aquamark 3 IQ page (highest quality AA, 8xAF). The nVidia 52.14 image is blurred, much detail is lost especially around the explosion. The Catalyst 3.7 image is way sharper, yet its AA is smoother (look at the car body above the wheels), and it loses much less detail around the explosion. The differences are much more than "barely noticeable".

    The tiny images don't give much credit to the article, though.

    (Before anyone calls me an ATI fanboy: I have a GeForce FX 5600 dual DVI.)
  • Anonymous User - Wednesday, October 8, 2003 - link

    TR: AOD is a terrible game; most people just like it because it's such a hot spot for all this benchmarking shite.

    At times like this, I'm glad I use a Matrox Millennium II! .. okay, kidding, but still.
