Mainstream Graphics Today

Much of the time, integrated graphics solutions are enough for business or casual computing needs, but when they aren't, there needs to be an affordable next step up within the current generation of hardware. Low-end cards are generally not designed with gaming in mind, but that doesn't mean they are unimportant to gaming.

Many buyers are interested in this lowest tier of add-in graphics card, and volumes on these parts are much higher than at other price points. Some people simply don't need to worry about 3D, while others (non-gamers and the less tech-savvy) aren't interested in looking past the marketing and just want a low price. They either don't know or don't care about the kinds of applications that will have a tough time performing well on their systems.

This impacts PC gaming and game development because publishers are not going to limit the potential sales of their games to consumers who are already gamers with at least mid-range graphics cards. To attract the money studios need to develop games on the scale that is the current trend, the target market needs to include a much greater slice of PC owners. It needs to reach down to at least the 9500 GT, if not integrated graphics.

We've said this just about every time we've published an article on lower end hardware: the low end is an anchor tied to the neck of game developers. Well, that's not entirely fair; the previous generation acts the same way, since not everyone with a graphics card upgrades every 18 months. But that just means the low end of the previous generation is the real problem.

There are a lot of amazing things possible with the latest and greatest hardware, but not many developers have the time and energy to really focus on getting the most out of today's $400+ graphics solutions. The bulk of their time needs to go into making the game playable on the vast majority of hardware that is currently out there. Sure, sliders and settings exist that produce prettier pictures on more powerful computers, but if the baseline were more powerful it would improve performance and quality at every level.

The worst offenders are certainly still integrated solutions. These parts are notoriously slow. Even the faster offerings from AMD and NVIDIA, while superior to Intel's dismal graphics components, don't do us any favors. And the lowest end add-in cards, while offering quite a boost over integrated graphics in raw performance, are still underperformers when it comes to gaming.

Yes, we know the hardware makers can't just give performance away for free. But while casual computer users with this level of hardware will be less frustrated than integrated graphics users, they still aren't likely to be inspired by anything that is possible on their systems either.

The value of delivering real gaming experiences with high production value games on mid-range to high end hardware cannot be overstated. Ever since the days of the Commodore 64 and the Atari ST, the general public has been shown ridiculous things that don't reflect actual gameplay. Even today, an overabundance of time is spent showing off cut scenes and other non-interactive parts of games. People who don't already know what is and isn't possible aren't going to buy into the hype.

The best thing that could happen for gaming would be for lower end hardware to offer more power, so that anyone with an add-in graphics card could download any demo out there and experience a real taste of what is possible rather than what they currently get.

Don't get us wrong here -- the industry has seen good steps up in performance at the low end, and game developers have fit a good range of features into titles that are at least playable on low end parts. We just want more. Intel integrated graphics seems to be a lost cause at this point, and even NVIDIA and AMD need to step up their integrated segments as well. The low end is where the real war is fought, and the future of PC gaming is, in large part, defined by the capabilities of this segment.

Ever wonder why consoles come out of the gate with better looking, better performing games than PCs seem capable of offering? It's because game programmers know exactly what hardware they have to work with and develop exclusively for it. Over the following five years or so, the lowest end part that PC game developers target eventually meets or exceeds the performance of the console hardware, and for a short time PC game quality leads the curve (until the next generation of consoles comes out with current generation mid-range parts and quality that blows the PC away once again).

If the graphics hardware industry is serious about PC gaming (and we firmly believe it needs to be going forward), then for the next round of console launches every player in the graphics market needs to be willing to come out with a low end part that meets, or comes really close to, the performance and capabilities of whatever graphics hardware is inside the consoles.

Yes, there will still be a significant amount of older hardware in systems that developers will target for a time. But the time consoles spend on top would be significantly reduced, and if people could get console quality on the computer they already own for about $75, rather than the three to four hundred dollars (and up) of next generation consoles, imagine how many would opt to upgrade their PC rather than run out and buy a new console.

Leveling the playing field in terms of production value is one thing; time and energy still need to be spent on quality and gameplay. Imagine how much more time could be spent there if shoehorning graphical effects onto low quality hardware weren't necessary, and developers could make their games competitive by making them good rather than by making them look as good as possible on the crappy hardware they need to target.

Anyway, things aren't always the way they should be. We can dream all we want, but reality probably won't shift just because we believe it to be in the best interest of the industry. We'll leave this topic and move on to the hardware that stands to limit the innovation of game developers for the next couple of years.

The NVIDIA GeForce 9500 GT goes up against the previous range of NVIDIA hardware from the 8500 GT to the 8600 GTS. As for competition from AMD, we are looking at the Radeon HD 2400 series. Though we have yet to see AMD's refresh part, we will certainly be waiting and hoping to get more out of it than we expect from the 9500 GT.

Comments

  • poohbear - Friday, September 5, 2008

    I think your first paragraph in the article sums up how a lot of us feel about graphics hardware: we want to see the boundaries pushed and see what the next gen brings in terms of graphics. The 9500 GT hardly does anything in that regard for most of your audience. I was hoping the low end would get a nice boost in performance, but it's depressing to see it has the same performance as the 8600 GT, and the 8600 GT had the same performance as the 7600 it replaced. Is this segment of the graphics market even moving forward!?!?

    Seriously, you should just recommend people spend the extra $40 and pick up an 8800 GT, which obliterates these low end cards. Just my 2 cents.
  • epyon96 - Friday, September 5, 2008

    It has just enough sarcasm, cynicism, and content to captivate the audience during a boring release.

    The thesis that improving $100 graphics cards would induce more developers to target the PC is interesting. I respectfully disagree that it will actually bring developers back to the PC. The simple reality is that PC games do not sell anywhere near as well. There are other factors pushing console sales that are completely beyond the control of graphics card manufacturers. Granted, they are not helping the situation by creating an unbalanced performance/value curve at the lower end. However, even if the situation with $100 cards improves, it is simply too little, too late.

    Too many people and too many system builders are already set on shipping computers with integrated graphics, and those parts are simply pathetic. Furthermore, with the recent rapid shift from desktops to laptops, I imagine integrated graphics will only increase in share, since it offers much better battery life. Mobility seems to be the future of PCs. People no longer regard the PC as an entertainment device for games but as a tool for business or home entertainment where they can connect their Xbox, Wii, and PS3.

    Furthermore, piracy is such a big deal with PCs that developers use it as an excuse even when it isn't the real problem. It has become a catch-22: when a product launch fails, developers automatically blame piracy. It has gotten to the point where they automatically shy away from PCs in favor of consoles. Even the classic FPS genre that PCs are known for is shifting to consoles. Developers would much rather build an elaborate and expensive new control interface to cater to hardcore ex-PC gamers if it means sticking to a console. Look at Gears of War, the Halo franchise, and Metal Gear Solid; all games that would have benefited from keyboard and mouse on the PC. What we have left are the premium tier 1 developers like Blizzard with WoW. EA Sports is a joke if you look at their track record for developing actual original games. They rely on purchasing other smaller shops like Westwood and Take Two for C&C and Grand Theft Auto. However, those great smaller shops are disappearing, and the trend does not seem to stop.

    If Intel's Larrabee succeeds in creating better value in integrated graphics (we are talking about 30-40x improvements over current parts at the same price point), I see it only benefiting developers and professional graphics engineers. Another factor working against PC game developers is cost. Game development costs seem to have skyrocketed ever since developers equated eye candy with game sales. Some may argue that they enjoy gameplay and point towards the Wii as an example of gameplay winning over graphical quality. However, the correlation between eye candy and game sales is undeniable. Prettier games with good marketing will sell, just not necessarily as top sellers. For some developers it is an easy way out for a lack of innovative gameplay.

    Without major shifts in consumer behavior, something not currently evident in the 5-year marketing plans of the industry leaders, the PC gaming industry seems to be on a downward spiral without much hope of recovering. However, the glimmer of hope is that the games actually left on the PC will likely be of a higher calibre. It is a feeding frenzy where inferior developers will only develop for easy-sell consoles, while the remaining PC developers are the Blizzards, ids, and Valves of the world. That raises the average quality of PC games.
  • poohbear - Saturday, September 6, 2008

    Oh, and nice post, epyon96. It's great to read an articulate post like that. Cheers for the info.
  • whatthehey - Saturday, September 6, 2008

    Yet another doom and gloom "PC gaming is dying" commentary that is, as usual, completely off the mark. Gaming platforms are cyclical, and there are too many advantages to PCs to totally ignore them. GoW and Halo (and Metal Gear) have always - ALWAYS - been console titles. Halo and GoW were both 360 exclusives, which means Microsoft paid big money to keep them off other platforms in order to entice people to the 360. EA Sports? Don't make me laugh. Sure, the latest copy of Madden will sell like hotcakes on consoles, but sports games have never been huge sellers on the PC... at least not in the past 10+ years.

    The cost of developing games is so high that eliminating a platform entirely is a terrible decision. If it costs $10-20 million (or more) to make a game, it typically costs a small fraction of that to port the game after it's tested. The reason certain ports sell poorly on PCs usually has more to do with them being available on consoles for months or even years prior to the PC release. Halo 2 is the perfect example: not only was it 3 years late, but MS made it a Vista exclusive. Gee, I wonder why it didn't sell?

    PC gaming might be in a down cycle right now, but PCs are fast outpacing the current consoles (again), so as the gap widens we will start to see more devs return to PC in order to showcase the latest and greatest. 40 years from now, we'll still have pundits claiming the latest consoles are the death of PC gaming, but until PCs stop existing there will always be a large market.
  • wicko - Sunday, September 7, 2008

    I agree with quite a few things said here, like piracy being an "issue" but not really... but I disagree that PC gaming is doing poorly. Actually, PC gaming has apparently been as strong as ever; the difference is that PC gaming is not just for the hardcore anymore. A lot of casual games exist for the PC and it's been a very successful venture. Sure, there have been fewer shooters for the PC, and when some come out they tend to be utter shit ports (Mercs 2), late (Halo 2), or buggy (GoW). But the MMOs and the RTS games have made up for that. I mean, look at WoW for crying out loud, a HUGE number of people playing that (with a monthly fee no less, I do not understand that really...).

    I think there is a major convenience with consoles, and that is that the games are guaranteed (99% of the time) to work. You don't have to worry about how good your console is, because they're all equal. But going off cost is bullshit. The cost of a console + TV is roughly equal to the cost of a game-capable PC + monitor. For instance, a 360 and a 32" TV is likely to cost $900 ($300 + $600) before taxes. A decent gaming PC and a 20" monitor is going to cost $900 as well ($700 + $200), and you could even build a cheaper PC without sacrificing much power.
  • JPForums - Monday, September 8, 2008

    Another thing to consider is the fact that many people will buy a PC anyway. So all you really need to do is take the cost difference between that PC and a gaming PC. Most systems I've upgraded to be game capable only really needed a video card. With 8800 GTs under $130 and 9800 GTX/4850s under $180, it isn't very expensive to make a perfectly game-capable PC (assuming you don't use a 24+ inch monitor).

    Crysis was mentioned in another post. I run Crysis on an Athlon 64 X2 processor with a single 8800 GTS and a 22" monitor. Sure, I can't crank the graphics settings to the absolute max, but I can easily get them high enough to compare to current consoles. Athlon 64 X2s are selling in entry level systems. Most people (gamer or otherwise) get a system with a Core 2 Duo or better processor anyway, so they have even more headroom.

    Further, the upgrade cycle argument for a console over a PC is a load of crap. I don't have to upgrade my PC at all to maintain this level of performance. The graphics don't get worse; there is just a larger disparity between what a PC is capable of and what the game can do. If a PC can outperform a console, it always will unless it breaks. You only need to upgrade if you want more. The advantage of the PC is that you have that option, where with a console you have to wait another 5 years. Case in point: I bought a Radeon 9700 back in the day and it served me well for 7 years. (Still fine, just not so good for new games.) Naturally, I upgraded my primary system long before the 7 year mark, but the card was still very useful in my secondary system for LAN parties and the like. Oh, and for the price of a new console when it first comes out, you can upgrade to an upper mid-range or lower high-end video card three times, or upgrade the video card, CPU, and memory with cash to spare. Doing this, you can maintain an increasingly large advantage over the console. Finally, consoles are sold at low or nonexistent profit margins because the makers earn the money back by charging up to a 50% premium on games. Big title releases are usually $40-$50 (retail) on PC where they are $50-$60 (retail) on the Xbox 360/PS3.

    That said, I respect the advantage consoles have in the lack of installs and games just working. However, the lack of an install limits one of the best features of PC games: the ability to modify or add to the game. While consoles are picking up on this, they aren't at parity yet.

    I also agree that many PC titles are buggy upon release. For me, it isn't a big issue, as I hardly ever buy games (PC or console) right at release. The few PC games that have been appealing enough to actually buy on release haven't been that buggy (e.g., C&C 3 Tiberium Wars: 1680x1050 + 16X AF + 4X AA produced artifacts, so I had to drop down to 2X AA... fixed in a graphics driver update about 2 weeks later). However, these issues are a big deal to many people (especially technically disinterested people). It should be noted that some so-called "bugs" are actually hardware issues in particular systems. As consoles have their own share of hardware issues (RROD) and PCs are much easier and often cheaper to repair (if you know someone who can do it), I'm inclined to excuse a small number of these.

    So what it really comes down to is whether or not you are willing to put up with the PC's quirks for the additional power and flexibility it gives you. For most people it seems the answer is no. However, there will always be a crowd willing to put in extra work to get extra benefit.
  • mmntech - Sunday, September 7, 2008

    The convenience part sums up why I abandoned PC gaming. I got sick of buying games that were essentially public betas, and I got sick of sinking money into a system just to play them at a reasonable frame rate. With my PS3, I just pop the disc in and it works. No endless patches to download, no intrusive DRM to worry about, and the graphics are better than what my current PC can deliver.

    The biggest mistake PC developers have made is that they've spent far too much time catering to the hardcore crowd. Thus you get games (*cough* Crysis *cough*) that require high end, $1000+ systems to run on. By doing this, they've developed themselves into a corner. There's a reason EA has been successful with titles like The Sims: they're accessible to casual gamers and don't require a Core 2 Extreme and Quad SLI to be properly enjoyed. Hardcore gamers account for only a small proportion of PC gaming, and staying on the cutting edge doesn't matter to most gamers. If you don't target these people, games aren't going to sell. Hardware cost is a huge thing. What you forget is that with a console, you're guaranteed at least five years of stability, whereas a gaming PC may only get half that lifespan, if that. For example, over the PS2's lifespan, I've had three separate gaming systems and have made countless upgrades to them. Games should be developed from a middle-up perspective, rather than top-down as they currently are. I should expect more advanced features with higher end hardware, but I should also be able to buy any midrange system off the shelf at Best Buy (equipped with GPUs such as the one being reviewed here) and expect to play any game at a reasonable frame rate and detail level; in other words, 30 fps at medium settings and a resolution of at least 1024x768. Only Enemy Territory was able to accomplish this.
