Image Quality: Still Foggy

As Ryan pointed out in his more timely piece, image quality under OS X is noticeably worse than under Windows.

The Mac screenshots are foggier for some reason, and despite the fix that was applied to Portal, Half Life 2 Episode 2 appears to have worse texture filtering quality under OS X than under Windows. The difference is more pronounced than what we saw in Portal.


Half Life 2 Episode 2 - Windows 7 - Click to Enlarge


Half Life 2 Episode 2 - OS X - Click to Enlarge

It looks like something is wrong with the AF setting; reverting to trilinear filtering confirms my suspicion:


Half Life 2 Episode 2 - OS X - Trilinear Filtering - Click to Enlarge

But AF isn’t completely disabled. Using the Windows version for comparison, it looks like Half Life 2 Episode 2 just forces 4X AF regardless of what you set the texture filtering option to:


Windows 7 4X AF


OS X 16X AF
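For readers wondering what the AF setting actually changes: anisotropic filtering lets the sampler take up to N taps along the long axis of a pixel's texture footprint, so oblique surfaces (like ground receding toward the horizon) can use a sharper mip level than trilinear filtering would. A rough sketch of the level-of-detail math from the EXT_texture_filter_anisotropic extension spec, with made-up footprint sizes for illustration:

```python
import math

def lod(p_max, p_min, max_aniso=1.0):
    """Mip level-of-detail per EXT_texture_filter_anisotropic:
    take up to max_aniso samples along the footprint's major axis,
    shrinking the effective footprint to p_max / n."""
    n = min(math.ceil(p_max / p_min), max_aniso)
    return math.log2(p_max / n)

# A floor texture at a grazing angle: the pixel footprint covers
# 16 texels along one axis but only 1 along the other.
trilinear = lod(16, 1, max_aniso=1)   # no AF: LOD driven by the long axis
af_4x     = lod(16, 1, max_aniso=4)   # 4X AF: four taps along the long axis
af_16x    = lod(16, 1, max_aniso=16)  # 16X AF: sharpest mip

print(trilinear, af_4x, af_16x)  # 4.0 2.0 0.0
```

A lower LOD means a sharper mip level, which is consistent with what the shots above suggest: if the OS X build silently caps at 4X, the OS X screenshot set to 16X would resemble the Windows screenshot taken at 4X.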

The sky and muted colors are still a problem, and I can’t seem to track down the cause of that one. There’s some texture banding in the sky off in the distance that’s only visible in the OS X version.


95 Comments


  • Scali - Wednesday, June 09, 2010 - link

    I wonder if that test was done correctly.
    I know that Doom3 performed quite poorly on my GeForce 8800GTS as well...
    After experimenting with the driver settings a bit, I found that the problem was in the "Multiple display performance mode" setting.
    Somehow this had no measurable effect on D3D apps, but in Doom3 (and probably most other OpenGL apps), it was much MUCH faster with "Single display performance mode".
    Reply
  • Brian Klug - Friday, June 04, 2010 - link

    Honestly, I think that has more to do with the fact that most of the new Apple hardware uses NVIDIA rather than AMD, the exception being the iMac lineup.
    Reply
  • Setsunayaki - Friday, June 04, 2010 - link

    Personally, I side with common sense.

    I don't care about gaming benchmarks for some two-year-old game, OK? I was already able to play it on Windows at release, and anyone who can blow money on buying a Mac, along with iPhones, iPods, iPads (and the subscription fees that go with some of these devices), definitely has $800 - $1000 to spend on building a gaming PC to enjoy the latest games out there... without having to wait two years for their own platform to make a game "playable."

    The biggest difference I've seen as a computer scientist and a musician is that Apple offers programs that are not professional programs but are used by professionals.

    A lot of professionals get stuck using those non-professional programs because the professional programs' user interfaces, on both Mac and Windows, have a high learning curve. On Windows the same program costs a lot less to buy, but it still takes a long time to learn.

    Let's not forget that the professional is already a professional in his or her own field. Learning a program means spending more time on programs and computers and less time working in their field making a living (until they master the program)... call this "downtime" for professionals.

    The main difference between Windows, Mac and Linux, as I have seen it, is:

    Windows = Professional programs exist and are used by professionals on a wide scale, but those programs are not efficient in CPU and memory usage. They take longer to learn than the Mac versions of the same programs, but they offer more features through free updates and other measures, and they can be bought at lower prices.

    Mac = Professional programs exist, but the costs are so high that very few people attempt to learn them or even buy them. Unfortunately, a lot of professionals use non-professional programs in many areas to make things work. That gets them by until they need an extra feature available only in the professional program. Not wanting to learn the professional program, they wait for the next version of the non-professional program and pay more for it...

    Linux = Professional programs exist. Linux users learn to master the command-line interface, for speed, and Linux has a philosophy of "one program = one major function," which allows for a high level of efficiency. Linux users learn both the command-line interface and the GUI of their programs; that takes a long time as well, but in the end they get the best resource management and program efficiency. They wait on features, as Mac users do, but updates come much faster for many of the professional programs out there. The exception is server-side programs, which are light-years ahead of Windows or the Mac, since that is what Linux is built for. Linux suffers in availability over the long term: its professional programs only emerged in the last 5 - 10 years, while the Mac and Windows have had access to professional programs for 10 - 15+ years.

    In the end, rather than fight about it... it's always good to at least own two of the three major OSes and try the one you do not own, not for the sake of argument but for the sake of reality.

    If you truly love computers, it means you can prove how great you are by learning and trying different operating systems. It means that anywhere you find a computer, be it Windows, Mac or Linux, you will be able to access it, use it and know what it can and cannot do...

    That knowledge goes farther than crucifying one OS in favor of another.
    Reply
  • toast70 - Friday, June 04, 2010 - link

    Are there ATI drivers for the Mac? I have never seen any (that doesn't mean there aren't any; I just haven't run into any).
    I would believe that is why it hasn't been tested this way...
    Reply
  • Penti - Friday, June 04, 2010 - link

    Are you joking? You've been able to choose an ATI HD 4870 right in the system builder, installed and delivered with the Mac Pro, since the Mac Pro 4.1 (March last year). More to the point, every 27" iMac comes with a Mobility HD 4850 in the Core iX version and an HD 4670 or Mobility HD 4850 in the Core 2 based version. Plenty of people use ATI cards in hackintoshes on top of that. Drivers are in the OS, except when the vendors didn't have time to include them, as when the GTX 285 was released.

    Before that you had, HD3870, HD2600, Mobility HD2600, Mobility HD2400, X1950, Mobility X1600 and on G5, G4, G3 etc you've had X800 XT, 9800 Pro, 9700 Pro, 9600 Pro, 9200, 9000, 8500, 7500 and ATI Rage and so on.

    And it's not a bad thing that even Apple's integrated graphics handles Valve's Source-based games. It makes every MacBook and MacBook Pro 13, 15 & 17 a gaming machine, albeit a lightweight one. Every Mac with a dedicated graphics card from NVIDIA's 8600 series and up, and every ATI card since Apple released Intel Macs, should handle both Valve's games and StarCraft 2. Every iMac with a dedicated graphics card should do fine; it's certainly playable on any 24" and 27" iMac. And they were released when many of the competitors' all-in-ones had something like X3100 graphics. They certainly targeted the casual or old ex-Win/DOS gamers fine. Of course, many people primarily game on consoles nowadays, and there aren't many exclusive titles. It's confirmed that it runs fine on Core iX 27" iMacs with the Mobility HD 4850. It certainly doesn't hurt that the X1000 series is still supported in 10.6 when official support has been dropped for Windows 7 (I did install Vista drivers). Bugs still need to be ironed out, but once they have been, it should run fine for most people. So yes, ATI cards are prevalent.
    Reply
  • setzer - Friday, June 04, 2010 - link

    One thing that people are ignoring in these Windows vs. OS X Source engine comparisons is that although the hardware is the same, the APIs are not: the Windows version of Source runs on DX and the OS X version runs on OGL.
    Those two don't compare in IQ or performance. People keep expecting an open-source api to offer the same visual output/performance as a proprietary one, but I really don't see how you can manage that without code branching, which I suspect Valve doesn't really want to do beyond the point they are at now; they have separated their game logic/engine from the rendering part.
    If they invest enough time and money in this you might get better performance and on-par IQ, but I don't expect OGL performance to surpass DX.

    A valid benchmark of this issue would be to run both systems using the OpenGL renderer, though I have no idea whether that's possible on Windows. Aside from that, the only other benchmark I can think of is between the Linux client and the OS X client.
    Reply
  • bobvodka - Friday, June 04, 2010 - link

    Minor nitpick: OpenGL is NOT open source. It is an open standard; the OpenGL driver components are very much closed.
    Reply
  • CptTripps - Friday, June 04, 2010 - link

    This benchmark is perfectly valid in one sense: which one plays it better.
    Reply
  • Penti - Saturday, June 05, 2010 - link

    Haha, many vendor-specific extensions, functions and effects were developed on OpenGL. It's purely up to the graphics vendors to backport new features to it, even to older versions of the API, and to support the API and functions they want; it's just an API, and a vendor-extensible one at that. Some extensions are mandated, some are vendor-specific, some started out vendor-specific but became standard extensions, and some are backported from newer versions of OGL. The implementation is proprietary. Features get supported back and forth between DX and OGL; many of the most advanced features of DX were originally implemented on top of OGL or GLSL. If you want to try out new hardware features, it's easiest to implement a demo on top of OGL. So don't think NVIDIA and ATI don't care about OGL.

    You can get equal IQ, and you can get equal performance. But obviously you have to work on the engine and get driver issues sorted out, just as when games are released on Windows. On consoles you have to work out the kinks yourself, and OS X easily surpasses the capabilities of the consoles. OGL isn't a static thing. Even though OGL 2.0 is from 2004, it has been extended since by the vendors, and OGL 2.0 as of 2004 is still about equal to both the PS3 and Xbox 360. You've got OpenGL, GLSL, ARB extensions, NVIDIA Cg support, OpenCL and newer hardware than the consoles. There's no reason the console ports would run worse on OS X than on the intended consoles; it's the same companies that even write the drivers supporting the hardware. Lacking HDR, of course, the picture will look different. HDR itself is supported in OpenGL 2.0, and extensions might be useful to achieve it. It can also be done with vertex and fragment programs (GLSL). It's simply that no one has implemented it; remember, this is a new rendering layer, not a translation layer. All the same features are accessible. That's what the APIs are designed to do: be a way to use the hardware. Remember, OpenGL 2.0 is old, but it has access to all the same functions as DX9.0c and most of the functions from DX10, 10.1 and 11 (if the hardware supports them). Nothing more is needed for current-generation engines like Source. In both DX's and OGL's case it's actually the driver that implements the interface; the drivers are responsible for the pipeline, optimizations and performance, not some component from Apple or Microsoft.
    Reply
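The extension mechanism described above is how applications pick up vendor features at runtime: probe the driver's extension string rather than rely on the core API version. A minimal sketch of that feature-detection logic (the sample string below is hypothetical, not what any particular driver reports):

```python
def has_extension(gl_extensions: str, name: str) -> bool:
    """GL_EXTENSIONS is one space-separated string; matching whole
    tokens avoids false positives from prefix collisions (e.g. a
    query for GL_EXT_texture should not match
    GL_EXT_texture_filter_anisotropic)."""
    return name in gl_extensions.split()

# Hypothetical extension string, as glGetString(GL_EXTENSIONS)
# might return it on a 2010-era driver.
sample = ("GL_EXT_texture_filter_anisotropic "
          "GL_ARB_fragment_program GL_EXT_framebuffer_object")

print(has_extension(sample, "GL_EXT_texture_filter_anisotropic"))  # True
print(has_extension(sample, "GL_EXT_texture"))                     # False
```

An engine would typically run checks like this at startup and fall back to a simpler code path (here, plain trilinear filtering instead of AF) when an extension is absent.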
  • Ben90 - Thursday, June 10, 2010 - link

    After reading the first article I hypothesized the same conclusion: that the OpenGL implementation wasn't as optimized as the DirectX version. I did some Googling and found this article's results mirrored when comparing only OGL across the Windows/Linux/OS X platforms. It's something else.
    Reply
