ATI Radeon HD 4350 and 4550: Great HTPC Solutions
by Derek Wilson on September 30, 2008 12:45 AM EST
The Benefits Over Integrated Graphics
For our comparison to integrated graphics, we looked at two games: Crysis and Oblivion. These games tend to cover the spectrum fairly well from DX9 to DX10, and they tell the same story: integrated graphics suck.
First up is Crysis. For integrated graphics, we needed to test everything at the absolute lowest settings, and even that was painful. It's too bad we couldn't test 640x480, as that might have given some of this hardware a chance at playability. But in the tests we did run, none of our integrated solutions were really playable at 1024x768, and only the AMD 780G did anything useful at 800x600. By contrast, the 4350 and the 4550 were both very playable at this very low quality setting. Pushing up to 1280x1024 wasn't as kind, but the 4550 still hung on to playable framerates at that resolution.
As for Oblivion, we see nearly the same behavior as with Crysis. There is a huge performance gap between integrated graphics and even the lowest end of the add-in cards we are testing today. And these Oblivion settings are insanely ugly; we would never recommend playing at very low quality. It's a horrendous experience.
As usual, Intel's integrated graphics are the biggest joke of the bunch. But that's not any sort of feather in AMD or NVIDIA's cap here: Intel's G35 is just really horrible hardware for 3D.
So, with the benefit over integrated graphics well established, how do these parts stack up against the slightly higher price bracket right next door? Let's take a look at how they compare to NVIDIA's 9500 GT DDR2 and AMD's Radeon HD 4670.