By now enough time has passed that I can come back here and hopefully answer/not answer a few questions :)

In the 8+ years I've been running this place, I don't think I've ever pulled an article before. I can't be too specific here, but some folks needed to be kept anonymous and I had to make a decision for the greater good in the long run. I apologize for remaining quiet about it for so long, but it was necessary.

With that out of the way - there's a lot to talk about.

I finally managed to pry a pair of 7800 GTXs away from Derek's hands and I've been working to answer the question of how fast of a CPU you need to feed these things. There are a number of variables that have to be taken into account, the most important being the resolution you're running at. The thing that's truly new about a card as powerful as the G70 is that you really start being limited by what resolutions your monitor supports. While owners of large analog CRTs have a lot of flexibility in what resolutions they can run at, LCD owners don't; so if you've got a G70 hooked up to a 1600 x 1200 panel you'll have to make different CPU decisions than if you have a 1920 x 1200 panel. I'm trying to simplify the decision making as much as possible, and for this round I'm only focusing on single card solutions, but if there's demand later I can tackle SLI requirements.

I finally hooked up the G70 to the 30" Cinema Display and gave Doom 3 a whirl at 2560 x 1600. What I find most interesting is that once you get far above 1600 x 1200 it's no longer about making the game look good, it's about making the game look good on your monitor. For example, there's not much difference playing Doom 3 at 1920 x 1200 vs. 2560 x 1600; it's just that the former looks great on a 24" monitor while the latter looks great on a 30" monitor. The quest for perfect image quality stops being about resolution and starts being about screen size, much like how consoles used to be, where your only hope for a "better" picture was to go to a larger screen, since you couldn't control resolution.
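
For what it's worth, a quick back-of-the-envelope pixel density calculation shows why: the two panels above end up within a few pixels per inch of each other, so the extra resolution mostly buys you a bigger picture rather than a sharper one. A quick sketch of the math:

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_in):
    """Pixel density implied by a resolution and a diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 1920 x 1200 on a 24" panel vs. 2560 x 1600 on a 30" panel
print(round(pixels_per_inch(1920, 1200, 24), 1))   # ~94.3 ppi
print(round(pixels_per_inch(2560, 1600, 30), 1))   # ~100.6 ppi
```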

The pendulum will swing away from ultra high resolutions as games become more and more demanding. There are still some titles that even the G70 can't handle at above 1280 x 1024.

Monday's Athlon 64 Memory Divider article has got me thinking a lot about multitasking and its impact on the need for higher speed memory. Theoretically there should be some pretty big differences between DDR400 and DDR500 once we get into the heftier multitasking scenarios, but I want to get an idea of exactly how widespread that need is. My initial tests only revealed one scenario where there was a tangible performance boost, but I think they warrant some additional testing. After I'm done with this memory divider stuff I'll move on to that.
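
For those who missed Monday's article, the divider math itself is simple: the Athlon 64's on-die memory controller derives the memory clock by dividing the CPU clock by an integer divisor, rounded up, so the effective memory speed can fall a bit short of its nominal rating. A rough sketch of the calculation (the clock speeds here are just example figures):

```python
import math

def a64_memory_clock(cpu_mhz, target_mem_mhz):
    """Effective memory clock on an Athlon 64: the on-die controller
    divides the CPU clock by an integer divisor, rounded up."""
    divisor = math.ceil(cpu_mhz / target_mem_mhz)
    return cpu_mhz / divisor

# Example: a 2.2 GHz Athlon 64
print(a64_memory_clock(2200, 200))  # DDR400 target -> 200.0 MHz (divisor 11)
print(a64_memory_clock(2200, 250))  # DDR500 target -> ~244.4 MHz (divisor 9)
```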

Many of you have asked for a Battlefield 2 CPU scaling article and I'm more than happy to oblige, so I've started planning such an article. Right now I'm stuck trying to figure out how best to make it a manageable benchmarking task, as I'd like to be able to provide accurate CPU/GPU recommendations for each performance class. I think I'll inevitably have to limit what GPUs I cover, but I'll do my best to include the ones you guys want the most.
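
To give you an idea of how quickly the test matrix balloons, here's a trivial sketch; the CPU, GPU and resolution lists below are placeholders, not the final lineup:

```python
from itertools import product

# Placeholder lineups -- not the final test configuration
cpus = ["Athlon 64 3000+", "Athlon 64 3500+", "Athlon 64 4000+", "Athlon 64 FX-57"]
gpus = ["GeForce 6600 GT", "GeForce 6800 GT", "GeForce 7800 GTX"]
resolutions = ["1024x768", "1280x1024", "1600x1200"]

configs = list(product(cpus, gpus, resolutions))
print(len(configs), "runs per game setting")   # 4 x 3 x 3 = 36, before detail levels and repeat passes
```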

I've been stuck on an H.264 kick for a while now, so I figured that doing a CPU comparison involving H.264 would be something interesting to do. My only question: other than QuickTime 7 and Nero, what are you folks using to encode H.264 on the PC?
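
To frame what the comparison will actually measure, the core of the test is just timing the same source clip through each encoder on each CPU; something along these lines, where the command line is only a stand-in for whichever encoder ends up in the suite:

```python
import subprocess
import time

def time_encode(cmd):
    """Run an encoder command line and return the wall-clock encode time in seconds."""
    start = time.time()
    subprocess.run(cmd, check=True)
    return time.time() - start

# Stand-in command line -- swap in whichever encoder and settings end up being tested
elapsed = time_encode(["x264", "--bitrate", "1000", "-o", "out.264", "input.y4m"])
print(f"Encode took {elapsed:.1f} seconds")
```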

Remember Gigabyte's i-RAM from Computex? Well, one should be in my hands very soon and given the interest in it, it's going to receive top priority as soon as I've got it, which raises the question: are there any particular tests you all would like to see? I'll admit, I am a bit surprised by the positive response the i-RAM received; I expected people to be interested in it, just not this interested in it.
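
In the meantime, the most obvious synthetic starting point is raw sequential throughput; here's a crude sketch of that sort of test (the drive path is just a placeholder for wherever the i-RAM volume ends up mounted), with the real review naturally leaning on proper storage benchmarks as well:

```python
import os
import time

def sequential_write_mb_s(path, size_mb=512, block_mb=1):
    """Write size_mb of data to the target volume and return throughput in MB/s."""
    block = os.urandom(block_mb * 1024 * 1024)
    start = time.time()
    with open(path, "wb") as f:
        for _ in range(size_mb // block_mb):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())   # make sure the data actually reaches the device
    return size_mb / (time.time() - start)

# "I:/" is a placeholder path for wherever the i-RAM volume is mounted
print(f"{sequential_write_mb_s('I:/testfile.bin'):.0f} MB/s sequential write")
```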

55 Comments

  • Anand Lal Shimpi - Thursday, July 14, 2005 - link

    Ian

    It wasn't about who I mentioned; it's just that some of the things that were mentioned could have been traced back to those who were anonymous from the start.

    Take care,
    Anand

  • pio!pio! - Thursday, July 14, 2005 - link

    Oops, there's a typo there... I meant check out the MPEG-4 AVC FORUMS on doom9.net
  • Mark Little - Thursday, July 14, 2005 - link

    Freakin' Microsoft and Sony. This is why I like Nintendo. They care more about your gaming experience than who has the bigger Gflop. Xbox and Playstation owners, I envy you not.

    Sorry you had such a problem with big, corrupt corporations, Anand.
  • pio!pio! - Thursday, July 14, 2005 - link

    What was the article you pulled? I don't even remember.

    Other than QuickTime 7 and Nero, there is x264 (an open source encoder in the spirit of XviD), and some other lesser ones... check out the MPEG-4 AVC (another name for H.264) on www.doom9.net; they have a great FAQ on what is out there.

    If I were you I would do benchmarks with either x264 or the latest beta of the Ateme encoder (the one that goes into Nero... see if you can get on the beta). They are both regarded as the best in that field and both can be multithreaded if you want to test that.

    I also suggest, if you are to continue with MPEG-4 ASP (H.263, i.e. DivX and XviD) encoder tests, upgrading to the latest DivX or even the multithreaded beta at divxlabs.com
  • Ian - Thursday, July 14, 2005 - link

    Could you not have replaced the names in the articles with 'anonymous trusted source'? Wasn't Tim from Epic the only name you mentioned?
