Gaming's Future, Continued

So what can be done besides making a world look better? One of the big buzzwords in the gaming industry right now is physics. Practically every new game title seems to be touting amazing new physics effects. Modern physics simulations may be more accurate, but having moderate amounts of physics in a game engine is nothing new. Games over a decade ago allowed you to do such things as pick up a stone and throw it (Ultima Underworld), or shoot a bow and have the arrow drop with distance. Even if the calculations were crude, that still counts as having "physics" in a game. Of course, there's a big difference between the amount of physics present in Half-Life and that present in Half-Life 2, and most people would agree that Half-Life 2 was the better experience due to the improved interaction with the environment.

Going forward, physics can only become more important. One of the holy grails of gaming is still the creation of a world with a fully destructible environment. Why do you need to find a stupid key to get through a wooden door when you're carrying a rocket launcher? Why can't you blow up the side of a building and watch the entire structure fall to the ground, perhaps taking out any enemies inside? How about the magical fence that's four inches too tall to jump over - why not just break it down instead of going around? Various games have made attempts in this direction, but it's still safe to say that no one has yet created a gaming environment that allows you to demolish everything as you could (within reason) in the real world. Gameplay considerations still need to play a role in what is allowed, but the more the possibilities for interaction increase, the more likely we are to see revolutionary gameplay.

Hand in hand with physics and game world interactions, Valve spoke about the optimizations they've made to a structure called the spatial partition. The spatial partition is essentially a representation of the game world, and it is queried constantly to determine how objects interact. From what we could gather, it allows rough approximations to take place where they make sense, and it also helps determine where more complex (and accurate) mathematical calculations should be performed. One of the problems traditionally associated with multithreaded programming has been locking access to shared data structures in order to keep the world in a consistent state. For the spatial partition, the vast majority of accesses are read operations that can occur concurrently, and Valve was able to use lock-free and wait-free algorithms to greatly improve performance. A read/write log is used to make sure the return values are correct, and Valve emphasized that the lock-free algorithms were a huge design win when it came to multithreading.
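Valve didn't share implementation details beyond the read/write log, but the general pattern - many threads querying a read-mostly world structure concurrently, without taking locks - can be sketched. The following is a minimal, hypothetical illustration (the names and the snapshot scheme are ours, not Valve's):

```cpp
// Sketch of concurrent, non-blocking reads against a spatial-partition-like
// structure. NOT Valve's implementation; names and layout are hypothetical.
// Readers grab an immutable snapshot through an atomic pointer, so queries
// never block one another; a writer publishes a whole new snapshot at once.
#include <atomic>
#include <memory>
#include <vector>

struct Object { float x, y, z; int id; };

struct WorldSnapshot {
    std::vector<Object> objects;  // immutable once published
};

class SpatialPartition {
public:
    SpatialPartition() : snap_(std::make_shared<const WorldSnapshot>()) {}

    // Concurrent read: any number of threads may query simultaneously.
    std::shared_ptr<const WorldSnapshot> read() const {
        return snap_.load(std::memory_order_acquire);
    }

    // Writer builds a complete new snapshot off to the side, then
    // publishes it in one atomic store.
    void publish(std::shared_ptr<const WorldSnapshot> next) {
        snap_.store(std::move(next), std::memory_order_release);
    }

private:
    // C++20 atomic<shared_ptr>. Note: a standard-library implementation may
    // use an internal lock here; truly lock-free engines tend to rely on
    // hazard pointers or epoch-based reclamation instead.
    std::atomic<std::shared_ptr<const WorldSnapshot>> snap_;
};
```

The key property is that a reader never blocks another reader (or the writer): every query runs against a consistent, immutable view of the world, so no half-updated state is ever visible.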

Another big area that stands to see a lot of improvement is artificial intelligence. Oftentimes, AI has a tacked-on feel in current games. You want your adversaries to behave somewhat realistically, but you don't want the game to spend so much computational power figuring out what they should do that everything slows to a crawl. It's one thing to wait a few seconds (or more) for your opponent to make a move in a chess match; it's a completely different story in an action game being rendered at 60 frames per second. Valve discussed the possibility of having a greater number of simplistic AI routines running alongside a few more sophisticated AI routines (e.g. Alyx in Episode One).
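To make that 60 FPS constraint concrete: all of a frame's AI has to fit into a slice of roughly 16.7 ms. One simple way to mix many cheap agents with a few expensive ones is a hard per-frame time budget - a hypothetical scheme, as Valve didn't describe their scheduler:

```cpp
// Hypothetical per-frame AI budget: run cheap "swarm" updates for most
// agents and full deliberation for a few important ones, stopping when
// the frame's AI time slice is spent. The 2 ms figure is an assumption,
// not a number from Valve.
#include <chrono>
#include <vector>

struct Agent {
    bool sophisticated = false;  // e.g. a companion character like Alyx
    void think_cheap() { /* coarse steering / flocking update */ }
    void think_full()  { /* full planning, pathfinding, squad tactics */ }
};

void update_ai(std::vector<Agent>& agents) {
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::milliseconds(2);  // slice of a ~16.7 ms frame
    const auto start  = clock::now();

    for (Agent& a : agents) {
        if (clock::now() - start > budget)
            break;  // out of time; remaining agents coast until next frame
        if (a.sophisticated)
            a.think_full();
        else
            a.think_cheap();
    }
}
```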

[Screenshots: Valve's demonstration of AI creature swarms interacting with the environment]
Valve had some demonstrations of swarms of creatures interacting more realistically with the environment: avoiding dangerous areas, toppling furniture, swarming opponents, etc. (The action was more impressive than the above screenshots might indicate.) The number of creatures could also be increased depending on CPU power (number of cores as well as clock speed), so where a Core 2 Quad might be able to handle 500 creatures, a single-core Pentium 4 could start to choke on only 80 or so.
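The scaling itself is easy to imagine. A toy sketch - the per-core baseline echoes the article's Pentium 4 figure, while the clock-speed factor is our own invented tuning knob:

```cpp
// Hypothetical sizing of a creature swarm to the host CPU. The 80-per-slow-core
// baseline comes from the article's Pentium 4 example; everything else is
// an invented approximation.
#include <algorithm>
#include <cstddef>
#include <thread>

std::size_t pick_creature_count(double clock_ghz) {
    const std::size_t per_core_baseline = 80;  // what one slow core handles
    // hardware_concurrency() may return 0 if unknown; assume at least one core.
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    // Scale with core count and a rough clock-speed factor (2.0 GHz reference).
    double scale = static_cast<double>(cores) * (clock_ghz / 2.0);
    return static_cast<std::size_t>(per_core_baseline * std::max(1.0, scale));
}
```

With these assumptions a 2.4 GHz quad-core lands around 384 creatures and a 3 GHz single core around 120 - the same order of magnitude as the demo's 500-versus-80 spread, though the real engine presumably tunes this empirically.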

In the past, getting other creatures in the game world to behave even remotely realistically was sufficient -- "Look, he got behind a rock to get shelter!" -- but there's so much more that can be done. With more computational power available to tackle AI problems, we can only hope that more companies will decide to spend time improving their AI routines. Certainly, without spare processor cycles, it is difficult to imagine action games spending as much time on artificial intelligence as they spend on graphics.

There are a few less critical types of AI that could be added as well. One of these is called "Out of Band AI" -- AI routines that run independently of the core AI. The example given was a Half-Life 2 scene where Dr. Kleiner is playing chess: a real chess algorithm could actually be running in the background using spare CPU cycles. Useful? Perhaps not in that particular example, unless you're really into chess, but these are all tools for creating a more immersive game world, and there is almost certainly someone out there who can come up with more interesting applications of such concepts.
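Mechanically, out-of-band AI is just work that never blocks the frame loop. A minimal sketch, where search_best_move is a hypothetical stand-in for a real chess engine (Valve showed no code):

```cpp
// Sketch of out-of-band AI: a chess search running entirely on spare
// cycles, independent of the core game AI.
#include <chrono>
#include <future>
#include <string>

std::string search_best_move() {
    // A real engine would search the board position here.
    return "Nf3";
}

int main() {
    // Launch on another thread; the frame loop never waits on it.
    std::future<std::string> pending =
        std::async(std::launch::async, search_best_move);

    bool done = false;
    while (!done) {
        // ... render the frame, run the core game AI, etc. ...

        // Poll without blocking: a zero timeout returns immediately.
        if (pending.wait_for(std::chrono::seconds(0)) == std::future_status::ready) {
            std::string move = pending.get();
            // e.g. animate Dr. Kleiner playing `move` on the board
            done = true;
        }
    }
}
```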

Comments

  • yacoub - Wednesday, November 8, 2006 - link

    sorry about that. got a little too excited.
  • Ruark - Tuesday, November 7, 2006 - link

    Page 6: ". . . everything crawls to a slow."
  • duploxxx - Tuesday, November 7, 2006 - link

    Page 8 test setup: clearly a cut/paste job... all of the CPUs are the same Athlon.

    Put an Allendale in the benchmark - lots of people want to see how cache-dependent this multithreaded software is.

    Valve talks about 64-bit... tests are 32-bit? Since some competitors are also talking about 64-bit code in their games, it should be interesting to see what the difference is vs. 32-bit.
  • JarredWalton - Tuesday, November 7, 2006 - link

    Sorry 'bout that - I've fixed the CPU line. All systems were tested at stock and 20% OC. The problem with Allendale is that it has different stock clock speeds, so you're not just comparing different CPU speeds (unless you use a lower multiplier on Conroe). Anyway, this is a first look, and the next CPU article should have additional benches.

    The tests are all 32-bit. This is not full Episode 2 code, and most people are waiting for Vista to switch to 64-bit anyway. All we know is that Episode 2 will support 64-bit natively; we weren't given any information on how it will perform in such an environment yet.
  • brshoemak - Tuesday, November 7, 2006 - link

    Nice to know where things are headed. Great article.

    Jarred, 2nd page, 2nd paragraph

    quote:

    place it into a pan and start beating the oven


    should be 'heating the oven' - although quite funny as is, you may want to keep it ;)
  • JarredWalton - Tuesday, November 7, 2006 - link

    Darn speech recognition. And "b" and "h" looked close enough that I missed it. Heh. No ovens were harmed in the making of this article.
  • MrJim - Tuesday, November 7, 2006 - link

    This was a very interesting article to read, the future looks bright for us with multi-core systems and Valve games. Excellent work Mr Walton!
  • timmiser - Tuesday, November 7, 2006 - link

    Shoot, I didn't make it past the dinner description. Got too hungry!
  • George Powell - Tuesday, November 7, 2006 - link

    I quite agree. Top notch article there. It is great to see how Valve are committed to giving us the best gaming experience.
  • Regs - Tuesday, November 7, 2006 - link

    The future looks bright for people willing to buy Valve games and multi-core CPUs!!
