AMD 690G: Performance Review

by Gary Key on March 6, 2007 8:00 AM EST
Chipset Overview

The AMD 690G/690V chipset consists of an RS690 Northbridge and SB600 Southbridge. AMD's intent with this chipset is to provide an attractive alternative to the NVIDIA 6100 family, but more importantly to offer a total platform solution that is very competitive against the current Intel G965 family. The 690G is directed towards the consumer market with a heavy emphasis on multimedia capabilities via the X1250 graphics core, while the X1200 core in the 690V chipset targets the business market, where AVIVO capabilities are not as important.


In the case of the X1250, it is no surprise that AMD has reached back to previous generation hardware for the base design of their new integrated GPU. Lower transistor counts mean smaller die sizes and lower costs, and the X100 series fits the bill with its lack of SM3.0 support and use of 24-bit floating point precision. The basic design for the X1250 is taken from the X700, with some modifications. While we would love to see Shader Model 3.0 support (which NVIDIA hardware has in their 6100 chipset and current Intel hardware claims to include), developers writing DX9 apps will still be designing for the SM2.0 target which the X1250 meets.

Many AVIVO features (including 10-bit per component processing) have been implemented on the X1250, bringing higher quality video decoding to integrated graphics. Unfortunately, with this improvement comes some sacrifice, as the number of pipelines on the X1250 is cut down from the X700. The X1250 weighs in at 4 pixel shaders, and like other X100 series hardware this also means 4 texture units, z-samples, and pixels per clock. The other major change compared to the X700 is that the number of vertex shader units has gone from 6 to 0; all vertex shader operations are handled by the CPU.

The core clock operates at 400MHz and can be increased to 500MHz within the BIOS, depending upon the board manufacturer. We have also overclocked one of our boards to 550MHz with a third-party utility, but performance unfortunately does not scale well in most games: we have seen average improvements of anywhere from 3% to 12% depending upon the application.
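
As a rough illustration of how little of that overclock shows up in games, the back-of-the-envelope calculation below simply reuses the 400MHz stock clock, the 550MHz overclock, and the 3%-12% gains quoted above; it is an illustrative sketch, not part of our test methodology.

    # Rough scaling estimate using the clocks and gains quoted in this section.
    stock_clock_mhz = 400
    oc_clock_mhz = 550
    clock_gain = oc_clock_mhz / stock_clock_mhz - 1.0   # 37.5% higher core clock

    for perf_gain in (0.03, 0.12):                      # 3% to 12% observed in games
        efficiency = perf_gain / clock_gain             # share of the clock gain realized
        print(f"{perf_gain:.0%} faster from a {clock_gain:.1%} overclock "
              f"-> roughly {efficiency:.0%} scaling efficiency")

In other words, only around 8% to 32% of the extra core clock translates into frame rate, which suggests that shared system memory bandwidth and CPU-side vertex processing, rather than the core clock, are the limiting factors.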

As for memory, the GPU can handle up to 1GB, but support is once again dependent on the BIOS. AMD uses an optimized unified memory architecture (UMA) design, and all graphics memory is shared with system memory. For our tests, we found 256MB to be the sweet spot, as performance seemed to degrade with a 512MB or 1GB frame buffer, especially under Vista, where the base memory requirements are significantly higher than under XP. This may end up being different depending on the implementation, but we will stick with the 256MB recommendation for now.
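
As a purely illustrative example of why the larger carve-outs hurt, the arithmetic below shows how much system memory a given UMA frame buffer leaves behind; the 2GB total and the OS footprint figure are assumptions for the sake of the example, not our test configuration.

    # Illustrative only: system memory remaining after the UMA frame buffer carve-out.
    total_ram_mb = 2048        # assumed 2GB of installed system memory
    os_footprint_mb = 512      # assumed rough figure for Vista's base memory usage

    for frame_buffer_mb in (256, 512, 1024):
        remaining_mb = total_ram_mb - frame_buffer_mb - os_footprint_mb
        print(f"{frame_buffer_mb}MB frame buffer -> about {remaining_mb}MB left for applications")

Every additional megabyte assigned to the frame buffer is a megabyte the OS and applications cannot use, which is why the larger settings hurt more under Vista than under XP.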

Looking beyond the architecture, most people who will actually be using integrated graphics won't be bothered with games or high-end 3D applications. This hardware will mostly be used for 2D and video applications. Let's take a look at the features we can expect in these areas.

Supporting a maximum resolution of 2560x1600, the X1250 can easily drive any CRT at its highest resolution. This tops NVIDIA's 6150 maximum of 1920x1440 and Intel's G965 at 2048x1536. As for output features, the video hardware supports S-Video, YPbPr, HDMI 1.3, and dual-link DVI. Of course, the actual interfaces available will depend on the implementation, but the HDMI and DVI ports will also support HDCP.

The GPU supports two independent display outputs, and both the DVI and HDMI outputs can be used at the same time. The only caveat is that HDCP will only work over one digital output at a time. This isn't a huge issue, as most people won't be watching two different protected movies at the same time on a single computer. Also, in spite of the single-display limitation, HDCP can be used over either HDMI or DVI. This gives the X1250 an advantage over graphics cards that initially supported HDCP, many of which only allowed HDCP over one HDMI or DVI port while the other always remained unprotected.

As for HDMI, the audio support is enabled through an interface in the RS690 Northbridge, while the SB600 Southbridge handles the HD audio controller interface. The standard HD audio codec is supplied by Realtek, which has developed a driver package that allows the user to control both the HDMI and HD audio interfaces from a single application. The HDMI audio solution is capable of 32kHz, 44.1kHz, and 48kHz output in 2-channel plus AC3 (5.1) formats.

For video acceleration, the X1250 is capable of hardware acceleration of MPEG2 and WMV playback. MPEG4 decoding is not hardware accelerated, but it is supported in software via the driver. DVD and TV (both SD and HD resolution) playback can be offloaded from the CPU, but we have seen some severe choppiness or blank-screen issues with HD media formats at 1080p, although 720p worked fine. AMD has indicated that this issue will be addressed in a future driver and that the chipset is fully capable of 1080p output with an upper-end CPU and proper software support.

For those who wish to use discrete graphics alongside their integrated solution, AMD supports a feature they call SurroundView. This enables support for three independent monitors in systems with integrated and discrete AMD graphics. The feature works as advertised and may be useful for business users who want more than two monitors at a low price. Gamers who want more than two monitors will certainly have to take a different route.

The AMD 690G/690V utilizes the SB600 Southbridge that was introduced last May, and it continues to be a competitive offering, although both Intel's and NVIDIA's latest chipsets offer six SATA ports along with RAID 5 capability. LAN choices, meanwhile, are left to the motherboard manufacturer's discretion. In general, the SB600 offers very good SATA and IDE performance, while USB throughput slightly trails the Intel and NVIDIA offerings.

Comments

  • goinginstyle - Tuesday, March 6, 2007 - link

    I think most of the people missed the comments or observations in the article. The article was geared to proving or disproving the capabilities of the 690g and in a way the competing platforms. It was obvious to me the office crowd was not being addressed in this article and it was the home audience that the tests were geared towards. I think the separation between the two was correct.
    The first computer I bought from Gateway was an IGP unit that claimed it would run everything and anything. It did not and pissed me off. After doing some homework I realized where I went wrong and would never again buy an IGP box unless the video and memory are upgraded, even if it is not for gaming. I have several friends who bought computers for their kids when World of Warcraft came out and bitched non-stop at work because their new Dell or HP would not run the game. At least the author had the balls to state what many of us think. The article was fair and thorough in my opinion although I was hoping to see some 1080p screenshots. Hint Hint
  • Final Hamlet - Tuesday, March 6, 2007 - link

    Too bad one can't edit one's comments...

    My point (besides correcting a mistake) is that I think this test is gravely imbalanced... you are testing - as you have said yourself - an office chipset - so why do you do it with an overpowered CPU?
    Office PCs in small businesses are bought on price, and where is the difference in using a mail program between a Core 2 Duo for $1000 and the smallest and cheapest AMD offering for less than $100?
  • Gary Key - Tuesday, March 6, 2007 - link

    quote:

    My point (besides correcting a mistake) is that I think this test is gravely imbalanced... you are testing - as you have said yourself - an office chipset - so why do you do it with an overpowered CPU?


    We were not testing an office chipset. We were testing chipsets marketed as an all-in-one solution for the home, home/office, multimedia, HTPC, and casual gaming crowd. The office chipsets are the Q965/963 and 690V solutions. The G965 and 690G are not targeted at office workers and were not tested as such. Our goal was to test these boards in the environment and with the applications they are marketed to run.
  • JarredWalton - Tuesday, March 6, 2007 - link

    We mentioned this above, but basically we were looking to keep platform costs equal. Sure, the X2 3800+ is half as expensive and about 30% slower than the 5200+. But since the Intel side was going to get an E6300 (that's what we had available), the use of a low-end AMD X2 would have skewed results in the other direction. We could have used an X2 4800+ to keep costs closer, but that's an odd CPU choice as well, since we would recommend spending the extra $15 to get the 5200+.

    The intent was not to do a strict CPU-to-CPU comparison as we've done that plenty (as recently as the X2 6000+ launch: http://www.anandtech.com/cpuchipsets/showdoc.aspx?...). We wanted to look at platforms and keep them relatively equal in the cost department. All you have to do is look at the power numbers to see that the 5200+ with 690G compares quite well (and quiet well) to the E6300 with G965.

    The major selling point of this chipset is basically that it supports HDMI output. That's nice, and for HTPC users it could be a good choice. Outside of that specific market, though, there's not a whole lot to put this IGP chipset above other offerings. That was what we were hoping to convey with the article. It's not bad, but neither is it the greatest thing since sliced bread.

    If you care at all about GPU performance, all of the modern IGP solutions are too slow. If you don't care, then they're all fast enough to do whatever most people need. For typical business applications, the vast majority of companies are still running Pentium 4, simply because it is more than sufficient. New PCs are now coming with Core 2 Duo, but I know at least a few major corporations that have hundreds of thousands of P4 and P3 systems in use, and I'm sure there are plenty more. Needless to say, those corporations probably won't be touching Vista for at least three or four years - one of them only switched to XP as recently as two years back.
  • JarredWalton - Tuesday, March 6, 2007 - link

    Perhaps it's because the companies releasing these products make so much noise about how much better their new IGP is compared to the older offerings from their competitors? If AMD had released this and said, "This is just a minor update to our previous IGP to improve features and video quality; it is not dramatically faster and is not intended for games" then we would cut them some slack. When all of the companies involved are going on about how much faster percentage-wise they are than the competition (never mind that it's 5 FPS vs. 4 FPS), we're inclined to point out how ludicrous this is. When Intel hypes the DX9 capability of their G965 and yet it still can't run most DX9 applications, maybe someone ought to call them on the carpet?

    Obviously, these low performance IGPs have a place in the business world, but Vista is now placing more of a demand on the GPU than ever before, and bare minimum functionality might no longer be adequate for a lot of people. As for power, isn't it interesting that the HIGHEST PERFORMANCE IGP ends up using the least amount of power? Never mind the fact that Core 2 Duo already has a power advantage over the X2 5200+!

    So, while you might like to pull out the names and call us inane 15-year-olds, there was certainly thought put into what we said. Just because something works okay doesn't mean it's great, and we are going to point out the flaws in a product regardless of marketing hype. Given how much effort Intel puts into their CPUs, a little bit more out of their IGP and drivers is not too much to ask for.
  • TA152H - Wednesday, March 7, 2007 - link

    Jarred,

    Maybe they didn't intend their products to be tested in the way you did. As someone pointed out, playing at 800x600 isn't that bad, and doesn't ruin the experience unless you have an obsession. Incredibly crude games were incredibly fun, so the resolution isn't going to make or break a game; it's the ideas behind it that will.

    You can't be serious about what you want AMD to say. You know they can't; they are in competition, and stuff like that would be extremely detrimental to them. Percentages are important, because they may not be running the same games as you are, at the same settings. Would you prefer they use absolutes, as if those would give more information? Did AMD actually tell anyone these were excellent for all types of games? I never saw that.

    With regards to CPUs and GPUs, you are trying to obfuscate the point. Everyone uses a CPU, some more than others. But they do sell lower power ones, and even single core ones. Not everyone uses 3D functionality. In case you don't get it, I DON'T want it on certain machines of mine. I don't run stuff like that on them, and I don't want the higher power use or heat dissipation problems from it. What you call effort isn't at all; it's a tradeoff. Don't confuse it with getting something for nothing if Intel puts more into it. You pay for it, and that's the problem. People who use it should pay; people who don't shouldn't have to, just so the kiddies can play their shoot 'em ups.

    Just so you know, I'm both. I have mostly work machines, but two play machines. I like playing some games that require a good 3D card, but just don't like the mentality that the whole world should subsidize a bunch of gameplayers when they don't need it. That's what add-in cards are for. I would be equally against it if no one made 3D cards because most people didn't need them. I like choices, and I don't want to pay for excessive 3D functionality on something that will never use it, to help gameplayers out. Both existing is great, and IGPs will creep up as they always have, when it becomes inexpensive (both in power and initial cost) to add capability, so the tradeoff is minor.
  • StriderGT - Tuesday, March 6, 2007 - link

    Does this chipset support 5.1 LPCM over HDMI or not??? Or, more plainly, can someone send 5.1 audio (games, HD movies, etc.) digitally to a receiver with the 690G? According to your previous article on the 690G, 5.1 at 48kHz was supported over the HDMI port. Now it's back to 2 channel and AC3 bitstream. Which is it?
  • Gary Key - Wednesday, March 7, 2007 - link

    It is two channel plus AC3 over HDMI. That is the final spec on production level boards and drivers. We will have a full audio review up in a week or so that also utilizes the on-board codec.
  • StriderGT - Thursday, March 8, 2007 - link

    Why is this happening? Why on earth can't they produce a PC HDMI audio solution that outputs up to 7.1 LPCM (96kHz/24-bit) for ALL sources!?! They already do that for 2-channel sources!!!! Do you have any info from the hardware vendors regarding the reason(s) they will not produce such a straightforward and simple solution?!?

    PS There are lots of people demanding a TRUE PC HDMI audio solution, not these S/PDIF hacks...
  • Renoir - Tuesday, March 6, 2007 - link

    I'm also interested to know more specifics about the audio side of this chipset. The support of HDMI v1.3 suggests that, with an appropriate driver and supporting playback software, Dolby TrueHD and DTS-HD bitstreams could be sent via HDMI to a v1.3 receiver with the necessary decoders. Is this a possibility?
