By now enough time has passed that I can come back here and hopefully answer/not answer a few questions :)

In the 8+ years I've been running this place, I don't think I've ever pulled an article before. I can't be too specific here, but some folks needed to be kept anonymous and I had to make a decision for the greater good in the long run. I apologize for remaining quiet about it for so long, but it was necessary.

With that out of the way - there's a lot to talk about.

I finally managed to pry a pair of 7800 GTXs away from Derek's hands and I've been working to answer the question of how fast a CPU you need to feed these things. There are a number of variables that have to be taken into account, the most important being the resolution you're running at. What's truly new about a card as powerful as the G70 is that you really start being limited by what resolutions your monitor supports. While owners of large analog CRTs have a lot of flexibility in what resolutions they can run, LCD owners don't; so if you've got a G70 hooked up to a 1600 x 1200 panel you'll have to make different CPU decisions than if you have a 1920 x 1200 panel. I'm trying to simplify the decision making as much as possible, and for this round I'm only focusing on single card solutions, but if there's demand later I can tackle SLI requirements.

I finally hooked up the G70 to the 30" Cinema Display and gave Doom 3 a whirl at 2560 x 1600. What I find most interesting is that once you get far above 1600 x 1200, it's no longer about making the game look good, it's about making the game look good on your monitor. For example, there's not much difference playing Doom 3 at 1920 x 1200 vs. 2560 x 1600; it's just that the former looks great on a 24" monitor while the latter looks great on a 30" monitor. The quest for perfect image quality stops being about resolution and starts being about screen size, almost like consoles used to be, where your only hope for a "better" picture was to move to a larger screen, since you couldn't control the resolution.

The pendulum will swing away from ultra high resolutions as games become more and more demanding. There are still some titles that even the G70 can't handle at above 1280 x 1024.

Monday's Athlon 64 Memory Divider article has got me thinking a lot about multitasking and its impact on higher speed memory. Theoretically there should be some pretty big differences between DDR400 and DDR500 once we get into the heftier multitasking scenarios, but I want to get an idea of exactly how widespread that need is. My initial tests only revealed one scenario with a tangible performance boost, but I think they warrant some additional testing. After I'm done with this memory divider stuff I'll move on to that.
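For anyone who missed that article, the divider math itself is simple enough to sketch out; below is a rough Python illustration of how the Athlon 64's on-die controller derives the memory clock from the CPU clock (a simplification - exact behavior varies with BIOS and CPU revision, and the clock speeds here are just example numbers):

import math

def a64_memory_clock(cpu_mhz, target_mem_mhz):
    # The on-die memory controller divides the CPU clock by an integer,
    # rounded up, so the actual memory clock never exceeds the target.
    divider = math.ceil(cpu_mhz / target_mem_mhz)
    return cpu_mhz / divider

# Example: a 2.2GHz (11 x 200MHz) Athlon 64
for name, target in (("DDR400", 200), ("DDR466", 233), ("DDR500", 250)):
    actual = a64_memory_clock(2200, target)
    print(f"{name}: memory clock {actual:.1f}MHz ({2 * actual:.0f}MHz effective)")

That rounding is why faster memory doesn't always run at its full rated speed on these chips, and it's part of the reason the real-world gap between DDR400 and DDR500 needs to be measured rather than guessed at.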

Many of you have asked for a Battlefield 2 CPU scaling article and I'm more than happy to oblige, so I've started planning one. Right now I'm trying to figure out how best to keep it a manageable benchmarking task, as I'd like to be able to provide accurate CPU/GPU recommendations for each performance class. I'll inevitably have to limit which GPUs I cover, but I'll do my best to include the ones you guys want the most.

I've been on an H.264 kick for a while now, so I figured a CPU comparison involving H.264 encoding would be something interesting to do. My only question: other than QuickTime 7 and Nero, what are you folks using to encode H.264 on the PC?

Remember Gigabyte's i-RAM from Computex? Well, one should be in my hands very soon, and given the interest in it, it's going to receive top priority as soon as it arrives. Which raises the question: are there any particular tests you all would like to see? I'll admit, I am a bit surprised by the positive response the i-RAM has received; I expected people to be interested in it, just not this interested in it.
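One obvious test is a basic sequential transfer rate comparison against a regular hard drive; here's a rough sketch of what that measurement looks like (the drive paths below are placeholders, and a proper run would use unbuffered I/O so the OS file cache doesn't inflate the read numbers):

import os, time

def sequential_throughput(path, size_mb=256, block_kb=64):
    block = os.urandom(block_kb * 1024)
    blocks = (size_mb * 1024) // block_kb

    start = time.time()
    with open(path, "wb") as f:
        for _ in range(blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # make sure the data actually reaches the device
    write_mbs = size_mb / (time.time() - start)

    start = time.time()
    with open(path, "rb") as f:
        while f.read(block_kb * 1024):
            pass
    read_mbs = size_mb / (time.time() - start)

    os.remove(path)
    return write_mbs, read_mbs

# Placeholder paths: one file on the i-RAM volume, one on a conventional disk
for test_file in (r"E:\iram_test.bin", r"D:\hdd_test.bin"):
    w, r = sequential_throughput(test_file)
    print(f"{test_file}: write {w:.0f} MB/s, read {r:.0f} MB/s")
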
55 Comments

  • Anonymous - Friday, July 15, 2005 - link

    H.264 tests should be done with the High Profile of MPEG-4 AVC (i.e. the latest beta of the Ateme/Nero codec, or x264), since that is the cutting edge, highest quality, most CPU intensive codec.
  • AT - Friday, July 15, 2005 - link

    I agree - Anand has published a lot of findings that made companies squeal over the past 8 years, but this was the first time he took down an article. I am sure Anand has gotten threats over these 8 long years, and if he were trying to save himself, we would have seen the pattern by now.
  • Anonymous - Friday, July 15, 2005 - link

    Well said Creathir, that's exactly what Anand is saying.
  • sixpak - Friday, July 15, 2005 - link

    The i-RAM should have a jumper to split the memory sockets into two banks, plus SATA-II support. So if you had 4x512MB, for example, you could run the two banks in RAID 0 over two SATA-II outputs for 2x 300 MB/s -> 600 MB/s.

    Or an internal BIOS and PCIe x4 support.
  • dodjer42 - Friday, July 15, 2005 - link

    #17

    The 7800GTX has at least one dual-link DVI. I have two running in SLI on the 30" ACD. Works perfectly. Prior to the 7800GTX the only one I found that really worked was the 6800U 512MB.
  • Eug - Friday, July 15, 2005 - link

    Here is my H.264 playback performance benchmark table:

    http://episteme.arstechnica.com/groupee/forums/a/t...

    Includes a bazillion Macs and Windows PCs.
  • Creathir - Friday, July 15, 2005 - link

    Pete:
    I would have to say, Anand is not covering his own rear, but rather that of his source. It is one thing for pirates to continue to provide a bootleg copy of an article; it is another for Anand to CONTINUE to compromise his source.
    Just some thoughts.
    - Creathir
  • Insomniac - Thursday, July 14, 2005 - link

    Daniel: I think that is part of the problem. Everyone wants to compare consoles to PCs, so the CPUs look weaker than their PC counterparts. The bigger question is whether they are powerful enough for the console, and that answer is yes. Sure, everyone would like to have more, but what they have will be enough. Also, keep in mind that a lot of the overhead a PC has does not exist on a console.
  • MDme - Thursday, July 14, 2005 - link

    i-RAM - bench it with the swapfile on the i-RAM vs. using the RAM as system RAM. Example: a system with 1GB of RAM plus a 1GB i-RAM vs. a system with 2GB of RAM and no i-RAM.

    Would've been nice to know if it was Sony or MS who stopped you on the article. My thought is it's probably Sony (since you are able to cut through the hype) - just a thought.

  • Goosemaster - Thursday, July 14, 2005 - link

    Did I hear something about HD? :P
