Closing Thoughts

At this point, we almost have more questions than we have answers. The good news is that it appears Valve and others will be able to make good use of multi-core architectures in the near future. How much those improvements will really affect the gameplay remains to be seen. In the short term, at least, it appears that most of the changes will be cosmetic in nature. After all, there are still a lot of single core processors out there, and you don't want to cut off a huge portion of your target market by requiring multi-core processors in order to properly experience a game. Valve is looking to offer equivalent results regardless of the number of cores for now, but the presentation will be different. Some effects will need to be scaled down, but the overall game should stay consistent. At some point, of course, we will likely see the transition made to requiring dual core or even multi-core processors in order to run the latest games.
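As a rough illustration of what "scaling effects with core count" might look like in practice -- a minimal sketch only, not Valve's code; the tier names and thresholds here are our own assumptions -- cosmetic detail could be chosen from the number of available hardware threads while the underlying gameplay simulation stays identical:

    #include <cstdio>
    #include <thread>

    // Hypothetical cosmetic detail tiers: the same game, different amounts of eye candy.
    enum class EffectTier { Baseline, Enhanced, Maximum };

    // Pick a presentation level from the number of available hardware threads.
    // Gameplay-relevant simulation would stay identical across tiers.
    EffectTier ChooseEffectTier(unsigned cores) {
        if (cores >= 4) return EffectTier::Maximum;   // quad core and beyond
        if (cores >= 2) return EffectTier::Enhanced;  // dual core
        return EffectTier::Baseline;                  // single core fallback
    }

    int main() {
        // hardware_concurrency() is a modern stand-in for however the engine
        // actually detects core count; it can return 0 when unknown.
        unsigned cores = std::thread::hardware_concurrency();
        switch (ChooseEffectTier(cores ? cores : 1)) {
            case EffectTier::Maximum:  std::puts("Extra particles, cloth, and debris enabled"); break;
            case EffectTier::Enhanced: std::puts("A moderate helping of extra effects enabled"); break;
            case EffectTier::Baseline: std::puts("Cosmetic extras scaled down");                 break;
        }
        return 0;
    }

The key design point is that only presentation depends on the core count, which is how a single title can target everything from single core machines up to Kentsfield without diverging gameplay.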

We had a lot of other questions for Valve Software, so we'll wrap up with some highlights from the discussions that followed their presentations. First, here's a quick summary of the new features that will be available in the Source engine in the Episode Two timeframe.



We mentioned earlier that Valve is looking for things to do with gaming worlds beyond simply improving the graphics: more interactivity, a better feeling of immersion, more believable artificial intelligence, etc. Gabe Newell stated that he felt the release of Kentsfield was a major inflection point in the world of computer hardware. Basically, the changes this will bring about are going to be far-reaching and will stay with us for a long time. Gabe feels that the next major inflection point is going to be related to the post-GPU transition, though guessing exactly what it will be and when it will occur would require a much better crystal ball than we have access to. If Valve Software is right, however, the work they are doing right now to take advantage of multiple cores will dovetail nicely with future computing developments. As we mentioned earlier, Valve is now committed to buying only quad core (or better) systems for its own employees going forward.

In regards to quad core processing, one of the questions raised was how the various platform architectures will affect overall performance. Specifically, Kentsfield is simply two Core 2 Duo dies placed within a single CPU package (similar to Smithfield and Presler relative to Pentium 4). In contrast, Athlon X2 and Core 2 Duo place two processor cores on a single die. There are quite a few options for getting quad core processing these days: Kentsfield offers four cores in one package; in the future we should see four cores on a single die; or you could go with a dual socket motherboard with each socket containing a dual core processor. On the extreme end of things, you could even have a four socket motherboard with each socket housing a single core processor. Valve admitted that they hadn't done any internal testing of four socket or even two socket platforms -- they are primarily interested in desktop systems that will see widespread use -- and much of their testing so far has been focused on dual core designs, with quad core being the new addition. Returning to the question of CPU package design (two dies in a package vs. a single die), the current indication is that the performance impact was "not enough to matter".

One area in which Valve has a real advantage over other game developers is their delivery system. With Steam comes the ability to roll out significant engine updates; we saw Valve add HDR rendering to the Source engine last year, and their episodic approach to gaming allows them to iterate towards a completely new engine rather than trying to rewrite everything from scratch every few years. That means Valve is able to get major engine overhauls into the hands of consumers more quickly than other gaming companies. Steam also gives them other interesting options; for example, they talked about the potential to deliver builds optimized specifically for quad core to people who are running quad core systems. Because Steam knows what sort of hardware you are running, Valve can make sure in advance that people meet the recommended system requirements.

Perhaps one of the most interesting things about the multi-core Source engine update is that it should be applicable to previously released Source titles like the original Half-Life 2, Lost Coast, Episode One, Day of Defeat: Source, etc. The current goal is to get the technology released to consumers during the first half of 2007 -- sometime before Episode Two is released. Not all of the multi-core enhancements will make it into Episode Two or the earlier titles, but as time passes Source-based games should begin adding additional multi-core optimizations. It is still up to the individual licensees of the Source engine to determine which upgrades they want to use, but at least they will have the ability to add multithreading support.


With all of the work that has been done by Valve during the past year, what is their overall impression of the difficulty involved? Gabe put it this way: it was hard, painful work, but it's still a very important problem to solve. Talented people have been working on getting multithreading right rather than on other features, so there has been a significant monetary investment. The performance gained is going to be worth it, however, as it should allow them to create games that other companies are currently unable to build. Even more surprising is how well the threading framework is able to insulate other programmers from the complexities involved. These "leaf coders" (i.e. junior programmers) are still able to remain productive within the framework, without having to be aware of the low-level threading problems that are being addressed. One of these programmers, for example, was able to demonstrate new AI code running on the multithreaded engine, and it only took about three days of work to port the code from a single-threaded design. That's not to say there aren't additional bugs that need to be addressed (there are certain race conditions and timing issues that become a problem with multithreading that you simply don't see in single-threaded code), but over time the programmers simply become more familiar with the new way of doing things.
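To make the "leaf coder" idea concrete, here is a minimal sketch of the general pattern -- a simple job queue where gameplay code submits ordinary functions and never touches threads or locks directly. This is purely illustrative and assumes a basic std::thread worker pool; it is not Valve's actual framework, and the names are made up.

    #include <algorithm>
    #include <condition_variable>
    #include <functional>
    #include <mutex>
    #include <queue>
    #include <thread>
    #include <vector>

    // A tiny job system: "leaf" code just submits plain functions, while the
    // framework owns all of the threads, locks, and scheduling. (Illustrative only.)
    class JobSystem {
    public:
        explicit JobSystem(unsigned workers) {
            for (unsigned i = 0; i < workers; ++i)
                threads_.emplace_back([this] { WorkerLoop(); });
        }
        ~JobSystem() {
            { std::lock_guard<std::mutex> lock(m_); done_ = true; }
            cv_.notify_all();
            for (auto& t : threads_) t.join();  // remaining jobs drain before exit
        }
        void Submit(std::function<void()> job) {
            { std::lock_guard<std::mutex> lock(m_); jobs_.push(std::move(job)); }
            cv_.notify_one();
        }
    private:
        void WorkerLoop() {
            for (;;) {
                std::function<void()> job;
                {
                    std::unique_lock<std::mutex> lock(m_);
                    cv_.wait(lock, [this] { return done_ || !jobs_.empty(); });
                    if (done_ && jobs_.empty()) return;
                    job = std::move(jobs_.front());
                    jobs_.pop();
                }
                job();  // the "leaf" code runs here, unaware of the machinery above
            }
        }
        std::vector<std::thread> threads_;
        std::queue<std::function<void()>> jobs_;
        std::mutex m_;
        std::condition_variable cv_;
        bool done_ = false;
    };

    int main() {
        JobSystem jobs(std::max(1u, std::thread::hardware_concurrency()));
        // A hypothetical AI update: the gameplay programmer writes only this loop.
        for (int entity = 0; entity < 8; ++entity)
            jobs.Submit([entity] { /* ThinkFor(entity); -- ordinary-looking code */ });
        return 0;
    }

The race conditions and timing bugs the article mentions live inside a framework like this (and in whatever shared data the jobs touch), which is exactly why concentrating that complexity in one place pays off for everyone else.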

Another area of discussion that brought up several questions was in regards to other hardware being supported. Specifically, there is now hardware like PhysX, as well as the potential for graphics cards to do general computations (including physics) instead of just pure graphics work. Support for these technologies is not out of the question according to Valve. The bigger issue is going to be adoption rate: they are not going to spend a lot of man-hours supporting something that less than 1% of the population is likely to own. If the hardware becomes popular enough, it will definitely be possible for Valve to take advantage of it.

As far as console hardware goes, the engine is already running on Xbox 360, with support for six simultaneous threads. The PC platform and Xbox 360 are apparently close enough that getting the software to run on both of them does not require a lot of extra effort. PS3, on the other hand.... The potential to support PS3 is definitely there, but it doesn't sound like Valve has devoted any serious effort to that platform as of yet. Given that the hardware isn't available for consumer purchase yet, that might make sense. The PS3 Cell processor does add some additional problems in terms of multithreading support. First, unlike Xbox 360 and PC processors, the processor cores available in Cell are not all equivalent. That means Valve will have to spend additional effort making sure the software knows which cores can handle which sorts of tasks best (or at all, as the case may be). Another problem Cell creates is that there is no coherent view of memory: each core has its own dedicated high-speed local memory, so all of that has to be managed along with worrying about threading and execution capabilities. Basically, PS3/Cell takes the problems inherent in multithreading and adds a few more, so getting optimal utilization of the Cell processor is going to be even more difficult.
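In rough terms, the difference looks something like the following sketch (generic C++ standing in for the idea, not actual PS3 SDK code): on a shared-memory core a worker can read and write the world data in place, whereas a Cell SPE-style core has to stage data into its small local memory, work on the copy, and then write the results back.

    #include <algorithm>
    #include <cstring>
    #include <vector>

    // Shared-memory style (PC / Xbox 360 cores): work directly on the data.
    void BlurSharedMemory(float* samples, std::size_t count) {
        for (std::size_t i = 1; i + 1 < count; ++i)
            samples[i] = (samples[i - 1] + samples[i] + samples[i + 1]) / 3.0f;
    }

    // Local-store style (illustrative of Cell SPEs): data must be explicitly
    // copied into a small private buffer, processed there, and copied back out --
    // the "DMA in, compute, DMA out" pattern.
    void BlurLocalStore(float* samples, std::size_t count) {
        constexpr std::size_t kLocalStore = 256;  // pretend local memory capacity
        float local[kLocalStore];
        for (std::size_t offset = 0; offset < count; offset += kLocalStore) {
            std::size_t chunk = std::min(kLocalStore, count - offset);
            std::memcpy(local, samples + offset, chunk * sizeof(float));   // "DMA in"
            for (std::size_t i = 1; i + 1 < chunk; ++i)
                local[i] = (local[i - 1] + local[i] + local[i + 1]) / 3.0f;
            std::memcpy(samples + offset, local, chunk * sizeof(float));   // "DMA out"
            // Note: samples at chunk boundaries are glossed over here -- exactly
            // the sort of extra bookkeeping the local-store model forces on you.
        }
    }

    int main() {
        std::vector<float> a(1000, 1.0f), b(1000, 1.0f);
        BlurSharedMemory(a.data(), a.size());
        BlurLocalStore(b.data(), b.size());
        return 0;
    }

Both functions compute the same thing; the second simply has to manage the data movement itself, which is the extra burden Cell adds on top of ordinary multithreading concerns.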

One final area that was of personal interest is Steam. Unfortunately, one of the biggest questions in regards to Steam is something Valve won't discuss; specifically, we'd really like to know how successful Steam has been as a distribution network. Valve won't reveal how many copies of Half-Life 2 (or any of their other titles) were sold on Steam. Given how steadily the number of titles showing up has been increasing, however, it's safe to say Steam will be with us for a while. This is perhaps something of a no-brainer, but the benefits Valve gets by selling a game via Steam rather than at retail are quite tangible. Valve did confirm that any purchases made through Steam are almost pure profit for them. Before you get upset with that revelation, though, consider: whom would you rather support, the game developers or the retail channel, publishers, and distributors? There are plenty of people that will willingly pay more money for a title if they know the money will all get to the artists behind the work. Valve also indicated that a lot of the "indie" projects that have made their way over to Steam have become extremely successful compared to their previously lackluster retail sales -- and the creators see a much greater percentage of the gross sales vs. typical distribution (even after Valve gets its cut). So not only does Steam do away with scratched CDs, lost keys, and disc swapping, but it has also started to become a haven for niche products that might not otherwise be able to reach their target audience. Some people hate Steam, and it certainly isn't perfect, but many of us have been very pleased with the service and look forward to continuing to use it.

Some of us (author included) have been a bit pessimistic about the ability of game developers to properly support dual core, let alone multi-core, processors. We would all like it to be easy to take advantage of additional computational power, but multithreading certainly isn't easy. After the first few "SMP optimized" games came and went, the concerns seemed to be validated: Quake 4 and Call of Duty 2 both showed performance improvements courtesy of dual core processors, but for the most part those gains only appeared at lower resolutions. After Valve's Hardware Day presentations, we have renewed hope that games will actually become more interesting again, beyond simply improving visuals. The potential certainly appears to be there, so now all we need to see is some real game titles that actually make use of it. Hopefully, 2007 will be the year that dual core and multi-core gaming really makes a splash. Looking at the benchmark results, are four cores twice as exciting as two cores? Perhaps not yet for a lot of people, but in the future they very well could be! The performance offered definitely holds the potential to open up a lot of new doors.


As a closing comment, there was a lot of material presented, and in the interest of time and space we have not covered every single topic that was discussed. We did try to hit all the major highlights, and if you have any further questions, please join the comments section and we will do our best to respond.

Comments

  • yacoub - Wednesday, November 8, 2006 - link

    sorry about that. got a little too excited.
  • Ruark - Tuesday, November 7, 2006 - link

    Page 6: ". . . everything crawls to a slow."
  • duploxxx - Tuesday, November 7, 2006 - link

    page 8 test setup, clearly a cut/paste job... all of the CPUs are the same Athlon.

    put an Allendale in the benchmark; lots of people want to see how cache-related this multithreaded software is.

    Valve talks about 64-bit... tests are 32-bit? Since some competitors are talking about 64-bit code in their games also, it should be interesting to see what the difference is vs. 32-bit.
  • JarredWalton - Tuesday, November 7, 2006 - link

    Sorry 'bout that - I've fixed the CPU line. All systems were tested at stock and 20% OC. The problem with Allendale is that it has different stock clock speeds, so you're not just comparing different CPU speeds (unless you use a lower multiplier on Conroe). Anyway, this is a first look, and the next CPU article should have additional benches.

    The tests are all 32-bit. This is not full Episode 2 code, and most people are waiting for Vista to switch to running 64-bit anyway. All we know is that Episode 2 will support 64-bits natively, but we weren't given any information on how it will perform in such an environment yet.
  • brshoemak - Tuesday, November 7, 2006 - link

    Nice to know where things are headed. Great article.

    Jarred, 2nd page, 2nd paragraph

    quote:

    place it into a pan and start beating the oven


    should be 'heating the oven' - although quite funny as is, you may want to keep it ;)
  • JarredWalton - Tuesday, November 7, 2006 - link

    Darn speech recognition. And "b" and "h" looked close enough that I missed it. Heh. No ovens were harmed in the making of this article.
  • MrJim - Tuesday, November 7, 2006 - link

    This was a very interesting article to read, the future looks bright for us with multi-core systems and Valve games. Excellent work Mr Walton!
  • timmiser - Tuesday, November 7, 2006 - link

    Shoot, I didn't make it past the dinner description. Got too hungry!
  • George Powell - Tuesday, November 7, 2006 - link

    I quite agree. Top notch article there. It is great to see how Valve are committed to giving us the best gaming experience.
  • Regs - Tuesday, November 7, 2006 - link

    The future looks bright for people willing to buy valve and multi-core CPUs!!
