µATX Part 2: Intel G33 Performance Review

by Gary Key on 9/27/2007 3:00 AM EST

26 Comments


  • tooter2 - Sunday, September 30, 2007 - link

    Hi all. I had just ordered the DS2R board when I read your review and saw how poorly this board overclocked, exceeding an FSB of 400, contrary to what I had read elsewhere. I was a bit concerned to say the least. Well, I just spent an hour running the newest Memtest86 using this board with an E6750 at 7 x 500 = 3.50 GHz at default vcore, using 2 x 1GB of DDR2-6400 G.Skill at 5-5-5-15 with vdimm at +0.2, and all other settings at default except for the power management settings, so as to be sure that I was running at the high speeds. This was with the Intel stock cooler. I've also run Memtest at 8 x 463 = 3.70 GHz, default vcore. CPU temp never exceeded 38C. And I've used an older Antec NeoPower 480 for my PSU. I should add that this is with onboard video in a bare-bones setup, i.e., no case, no HDDs, IDE optical drive. This board appears to be an overclocking monster, not at all like your results.
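    The clock speeds quoted above are just multiplier times FSB; a quick sketch to check that arithmetic (the core_clock helper is purely illustrative, not from the review):

```python
# Core clock (MHz) = CPU multiplier x front-side bus speed (MHz).
def core_clock(multiplier, fsb_mhz):
    return multiplier * fsb_mhz

print(core_clock(7, 500))  # 3500 MHz = 3.50 GHz
print(core_clock(8, 463))  # 3704 MHz, i.e. roughly 3.70 GHz
```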

    And I plan to use a video card in this board, but I bought it for its mATX size plus the fact that I can get a video card later. I want to see how the new AMD cards pan out, plus what NVIDIA comes back with. This will be used in an HTPC setup, but a setup where I can play games as well. Hence, the E6750.
    Reply
  • tooter2 - Sunday, September 30, 2007 - link

    In the above post, I meant to say "not exceeding 400 fsb". Reply
  • jonp - Saturday, September 29, 2007 - link

    The Asus P5K-VM feature set chart shows only 1 PCI when it should say 2. Reply
  • overzealot - Saturday, September 29, 2007 - link

    The G33M-DS2R comes with an eSATA expansion-slot bracket. Gigabyte also makes it quite clear that eSATA is supported, both on the product's site and on the box it comes in. Reply
  • falacy - Friday, September 28, 2007 - link

    Something I have noticed over the years is that this site doesn't really take an objective look at "low end" hardware from the perspective of those of us who would purposely purchase these items - even though we're "tech savvy". For instance, though I do agree that the absence of a DVI port isn't great, I find it hard to believe that I'm the only person who is still happily using a 17" CRT monitor at 1024x768, and it's pretty insulting to hear that anything without a DVI port isn't worth looking at. Did everyone forget that CRT monitors have better visual quality than LCDs - unless you're able to shell out far, far more money? I digress...

    Here is my path to the P5K-VM:

    When I moved in 2003, after losing my great job, I had to sell my good computer, and when I finally got settled I was far too poor to replace it. That was 2004. Anyhow, I needed a computer, so I bought a Dell desktop (P4 2.8) and used it until 2005 with an ATI 9550SE graphics card. This was good enough to play Star Wars Galaxies, Everquest, and a whack of other games that I played at the time. Of course, it ran everything office-like too. Later on I was given an ATI 9800XT video card, which was very expensive when it first came out. Anyhow, in 2006 I upgraded to an ASRock board that could handle a Core 2, yet still had the AGP slot so I could make use of the 9800XT. At the time, there weren't any cheap Core 2 processors, so I bought a P4 531, and it was a decent upgrade from the Dell. All this was awesome (and for the games I played I was happy), until recently when I bought a Pentium Dual-Core 2160 and then was lucky enough to have the fan on my 9800XT fail, which awesomely fried the GPU. Yay. I was back to using the i865G graphics, as I had given away my 9550SE, and the only other cards in my collection of junk weren't any better than the onboard video.

    And this brings me to yesterday, when I set up my new system.

    I bought 1GB of RAM and a P5K-VM, and after testing it out, I found that the graphics capabilities trounce the i865G onboard video, both in the practical test of playing World of Warcraft and in 3DMark2000 scores; the G33 is smooth and playable in WoW at 1024x768, whereas the i865G was somewhat choppy at 800x600. Also, apart from playing at 1x AA rather than 4x AA, the G33 scored the same in 3DMark2000 as the old ATI 9550SE that I used to have. And finally, it really isn't that much of a downgrade from the ATI 9800XT (which sucked up so much power even at idle that the air from my PSU went from HOT to cool when I stopped using it!) in World of Warcraft (the only game I really play now). Sure, AA is nice, but I like the electricity/heat/noise savings better. Down the road, I may purchase a fanless PCI-E graphics card if the NEED arises.

    Altogether, I believe a lot more credit should be given to the value of these motherboards. In fact, I have felt ever since the first onboard video chipsets to offer full AGP support that, so long as you're not giving up any important features, it's pretty stupid for the average person to buy a motherboard without onboard video - you never know when you're going to need it! There is a huge list of fun gaming titles that the onboard graphics can play with PlayStation 2 quality (or better) graphics, and I think that this information is lost on the Anandtech crowd. Also, these systems can run with Windows XP and 1GB of RAM and be completely amazing in comparison to what was available just two short years ago!

    The P5K-VM is a perfect motherboard for a person like me, who has some disposable income to build a computer over time and enough patience to make that happen. Eventually, I can add 8GB of RAM, or a wicked graphics card (if ever I feel like playing more than WoW or Neverwinter Nights), or 4 SATA drives to run software RAID 5 (or 4 IDE drives to use with my Promise controller), or a camcorder to use the 1394 port, or a super-mega quad-core, low-power-consumption CPU.

    Seems to me that anyone with a 17" CRT monitor, which often has better visual quality than the crappy LCDs people peddle these days, would be very wise to buy one of these boards now and upgrade as "the itch" and their budget allow!
    Reply
  • lopri - Saturday, September 29, 2007 - link

    Dunno whether I should laugh or cry over your post. Are you being serious or sarcastic? Sorry, it was a long day and I'm not that sharp a person. Reply
  • falacy - Sunday, September 30, 2007 - link

    That's exactly what I am talking about: the inability of so many people on Anandtech to see from the "Average Person"'s perspective. Funny enough, the "Average Person" happens to make up the majority of computer purchasers in North America.

    As a person who has managed a "ma & pa" computer store in a small town, I can tell you that even the most inept of "boony noobs" out there has some computer knowledge these days. And many people still have some decent hardware kicking around that, considering the things they actually USE a computer for, they can squeeze a bit more value out of. Heck, it was just two years ago that I replaced an AMSTRAD 200 portable computer with a laptop that ended up frustrating the hell out of the customer, because it didn't do all the things her ancient computer did (such as print to her equally ancient printer). In fact, my computer is housed in a modified 486 AT server tower that we took in on trade, which was being used as an office server until we replaced it in 2005. For "Average People" doing average things with average expectations, it's amazing how long a computer can last (personally, I used my Celeron 300A Malay @450MHz & Abit BH6 Rev2 for over 3 years). Look at it this way: if all you're doing is crunching numbers and typing, my 486SX 25MHz laptop with WordPerfect 5.1 and Lotus 1-2-3 will still get the job done (and it will boot faster than anything else out there today).

    Anyhow, it may come as a surprise that not everyone has enough money to just buy whatever the heck they want, whenever they feel like it. No, most of us have to set priorities in life - I believe that has something to do with being an adult and/or a parent. Consequently, "Average People" like me (in wealth, as opposed to computer knowledge) have to weigh the pros and cons a little more carefully, and someone like myself would rather throw some spare money at more storage space for my movies, or a camcorder, or a better TV to watch said movies on, than at an uber graphics card.

    The plain truth of the matter is that the G33 under Windows XP can play fun games like Quake 3, Neverwinter Nights, World of Warcraft, and many other great 1999-2004 titles. All the while, it can do so using that CRT you probably already own, which likely still looks as good as new and will give you a sharper image at 1024x768 than the low-end LCD you'd likely buy. Finally, a board like the P5K-VM is amazing, because should a person strike it rich they could upgrade the hell out of their computer without ever needing to consider buying a new motherboard - DDR3, 45nm CPU, Gigabit LAN, 8-channel audio? Boy, that seems pretty "bleeding edge" from my vantage point on ye ol' interweb!

    There's a lot of potential (for the "Average Person") in these boards.
    Reply
  • strikeback03 - Tuesday, October 02, 2007 - link

    I don't think a CRT can ever give a "sharper" image than an LCD - kinda the nature of the beast with discrete pixels vs a scanning electron beam. Now your CRT probably has better colors and almost certainly has better viewing angles than the average cheap LCD, but is almost certainly not sharper.

    also, 1024x768 is REALLY small. Even using the internet is cramped, and forget the average programming environment or photo editing program.

    finally, it does not appear the P5K-VM supports DDR3. The chipset can, but most motherboard makers are choosing either DDR2 or DDR3, as the slots are different, and they cannot be used simultaneously.
    Reply
  • ltcommanderdata - Friday, September 28, 2007 - link

    Can we please get a review of the 14.31.1 XP driver for the GMA X3000 that enables hardware DX9.0c SM3.0 acceleration? I know you've switched over to Vista, but the 15.6 driver release notes don't mention that they added hardware acceleration so it looks like only the 14.31 and the newer 14.31.1 XP drivers have it. I would love to see a comparison between the GMA X3000, Xpress X1250, Geforce 7150, and a discrete X1300HM and 8500GT.

    You're probably saving the new drivers for an IGP review when the G35 GMA X3500 comes out (October 21?), but it would be nice to have numbers for the GMA X3000 too for comparison.
    Reply
  • IntelUser2000 - Tuesday, October 02, 2007 - link

    quote:

    Can we please get a review of the 14.31.1 XP driver for the GMA X3000 that enables hardware DX9.0c SM3.0 acceleration? I know you've switched over to Vista, but the 15.6 driver release notes don't mention that they added hardware acceleration so it looks like only the 14.31 and the newer 14.31.1 XP drivers have it. I would love to see a comparison between the GMA X3000, Xpress X1250, Geforce 7150, and a discrete X1300HM and 8500GT.

    You're probably saving the new drivers for an IGP review when the G35 GMA X3500 comes out (October 21?), but it would be nice to have numbers for the GMA X3000 too for comparison.


    I agree, they should run XP driver tests. Better yet, they should test the G965 to get a taste of the G35.

    Here's my results:

    From Gary-"We set our quality settings to medium or low where applicable except for the first two are set to high and the sliders are set in the middle spot."

    With that in mind, I did a test. However, I wasn't sure whether Object Scarring and Post Processing were on or off, so I ran it both ways.

    AT settings+Object Scarring/Post Processing Off-12.5
    AT settings+Object Scarring/Post Processing On-11.6

    I also use dual-channel DDR2-800 RAM at 5-5-5-15 and an E6600. I found that in Company of Heroes, performance increased by 10% going from 5-6-6-18 to 5-5-5-15.

    Supreme Commander: 8.381

    I am using 14.31.1 driver and XP SP2.
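    For what it's worth, the gap between the two runs above can be expressed as a percentage the same way as the memory-timing comparison; a tiny sketch using the frame rates quoted in this comment (the pct_gain helper is illustrative):

```python
# Percentage gain of `new` over `old` (e.g. average frame rates).
def pct_gain(old, new):
    return (new - old) / old * 100.0

# Post Processing off (12.5 fps) vs. on (11.6 fps) at the AT settings:
print(round(pct_gain(11.6, 12.5), 1))  # 7.8 (% faster with it off)
```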



    Reply
  • strikeback03 - Friday, September 28, 2007 - link

    While building a couple of computers for work using the MSI P35 Platinum board, I found they don't support eSATA hot-swapping, at least not in XP. I know my Foxconn G965 board at home can do it. Is this behavior still present in the MSI board here? Is it a P35 limitation, or BIOS, or what? Reply
  • hans007 - Friday, September 28, 2007 - link

    I don't get it...

    An SDVO card (ADD2-N) with DVI output costs what, $6 on eBay? Why don't you guys just buy one, so you can test these with digital output?

    Also, the video driver in 32-bit Vista is still not as mature as the recently released GMA 3000-compatible XP driver.

    Most people actually have XP, so could another round of benchmarks be run in XP? I probably won't even get Vista for at least another year, since, well, it's pointless and has no reason for being bought at this point.
    Reply
  • lopri - Saturday, September 29, 2007 - link

    Well... it seems like you're using Windows XP and a monitor via VGA. Then why bother with these new IGP-based mATX boards? The 915G/945G (or GeForce 6100) series would be a better choice for you. They are a lot cheaper (~$50 probably) and XP support is as mature as can be.

    At the center of this new wave of IGPs is the advent of HD content. Vista is kind of a necessary evil in a sense, but in general it handles HD and multimedia content a lot better than XP and has a more intuitive/prettier UI for a living-room environment. CRTs have long been dead in living rooms, and if you prefer CRT over LCD for some reason (professional gaming, maybe?) an IGP wouldn't be an option to begin with.

    I'd say DVI is the minimum requirement, with HDMI w/HDCP being the preferred solution these days.
    Reply
  • veritronx - Thursday, September 27, 2007 - link

    One thing that may have been overlooked: the MSI board is the only one suitable for people looking to use a dual-slot graphics card as well as, say, a Creative sound card, with some space between them. For that reason, the only board reviewed that I would look at buying would be the MSI. Reply
  • Ajax9000 - Thursday, September 27, 2007 - link

    From page 1:
    quote:

    The innovation in the IGP market has been lagging for some time but has picked up in recent months with the introduction of the AMD 690G, AMD Radeon X1250, and now the NVIDIA MCP73 series. All of these solutions offer native DVI/HDMI output, HD decode and playback, ...


    Read the following Nvidia pages and the news is somewhat disappointing re HD video.

    Summary PDF -- http://www.nvidia.com/object/IO_35712.html
    AMD (MCP78) features -- http://www.nvidia.com/object/mobo_gpu_features_ben...
    AMD (MCP78) specs -- http://www.nvidia.com/object/mobo_gpu_tech_specs.h...
    Intel (MCP73) features -- http://www.nvidia.com/object/mcp_features_benefits...
    Intel (MCP73) specs -- http://www.nvidia.com/object/mcp_intel_techspecs.h...

    PureVideo is only listed for the MCP78 (7050PV+630a) combination. None of the other AMD chipsets, and none of the Intel chipsets, have PureVideo HD.

    If, in the future, they release an MCP73 using (say) 7050PV+630i then memory will be limited to DDR667.

    There are no details thus far, but it would be good if the new chipset fixed the HD audio problem that all current HDMI video cards seem to suffer from (i.e., the chipset supports HD audio, but the video cards can only accept S/PDIF-grade audio for HDMI pass-through).
    Reply
  • BansheeX - Thursday, September 27, 2007 - link

    Everyone who is letting these boards have it for not including HDMI/DVI is completely right. It makes no sense. Sure, I could buy a cheap DVI graphics card and stick it in there, but if I have to do that, why would I buy a board with onboard graphics in the first place?

    Sadly missing from this review is the board that DOES include onboard DVI, Intel's own DG33TL. Even sadder is that it takes Intel to make the feature-full board while the OEM companies go for the minimum.
    Reply
  • Emma - Thursday, September 27, 2007 - link

    I agree with the others. As most computers I build have IGPs, being able to directly compare all of the available IGPs on the market would be about the best thing to come from a review in a long time.

    The 6100/6150SE should also be included as this is still widely sold.

    Also of interest would be a summary of what other nVidia and AMD IGP's are on the horizon.

    Thanks!
    Reply
  • Owls - Thursday, September 27, 2007 - link

    "We generally feel that users like to install games into the same colored slots for dual channel operation, but MSI chooses to color channel A orange and channel B green."

    I wasn't aware you could install games into DIMM slots.
    Reply
  • JarredWalton - Thursday, September 27, 2007 - link

    Sorry 'bout that - I was helping Gary out a bit and managed to mangle the text. Blame the speech recognition. That or I'm just slurring my words a bit. :) Reply
  • 8steve8 - Thursday, September 27, 2007 - link

    Great article, tackling most of the issues that we care about!


    question #1: why bother reviewing boards without DVI or HDMI?

    Whether we are building PCs for friends/offices etc., or an office/server box for ourselves, or we want it to find a home in its post-gaming life when we ditch it for something better... DVI will be key. It's inexcusable that they pinch pennies there, and frankly these boards aren't worth your time. Gigabyte has a G33 board with DVI/HDMI, as does Intel...
    http://www.newegg.com/Product/Product.aspx?Item=N8...
    http://www.newegg.com/Product/Product.aspx?Item=N8...

    question #2: this would have been muuuch more useful like 5 months ago when G33 was new. now it's about to be eclipsed by the 7150 and g35.



    Overall, I can't wait to see a similar roundup with modern chipsets like the G35 and NVIDIA 7150, plus the 690G and 7050PV for AMD...

    It would be interesting to see a cost/performance comparison of integrated platforms including CPU costs...
    motherboard + CPU costs... it seems AMD has some good cheap 690G boards out there with DVI/HDMI for around $75 (almost $50 cheaper than a G33 board with DVI)...
    Reply
  • sprockkets - Friday, September 28, 2007 - link

    Sad how an AMD 7050 board can be had for $80 - $40 cheaper, with the same features. That's the premium you pay for having DVI.

    Oddly enough, the Gigabyte board you quote doesn't use all solid caps, yet the lower-end board does. And of course, they didn't bother with solid caps on their new AMD boards at all, because "AMD is second tier."
    Reply
  • tayhimself - Thursday, September 27, 2007 - link

    Preposterous!! Why do they even bother making this junk without DVI? More and more I find that I don't want a leet board that overclocks 100MHz higher, but a stable board with the right features. -sigh- Reply
  • 8steve8 - Thursday, September 27, 2007 - link

    And on top of it, these IGPs are not well suited for gaming or videos... (the two applications where you may not notice the difference between a digital and an analog interface), so they will be used for text/office work... an application where the discrepancies in the user experience of analog vs. digital interfaces with an LCD are undeniable.

    Again, great article... but in the end, I sort of wonder why waste your time exploring these boards when your time is better spent on solutions that deserve our money?
    Reply
  • JarredWalton - Thursday, September 27, 2007 - link

    I think both of those G33 + SDVO models launched long after Gary had started work on this uATX stuff. Good to see that some people are including the necessary chip, as uATX without DVI is simply unacceptable. Unfortunately, testing some of this stuff takes a lot more time than we would like. We're working to address that, however. Reply
  • jenli - Thursday, September 27, 2007 - link

    I would love to see a review of motherboards with IGPs that can be converted to RAID servers by using the lone PCIe x16 slot.

    Have fun,
    Reply
  • CK804 - Thursday, September 27, 2007 - link

    I'm doing exactly what you mention with an Intel DG965RY. I have an Areca ARC-1210 fitted in there with 3 320GB WD Caviar SE16s in RAID 5. Reply
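    As a rough rule of thumb, a RAID 5 array yields (n - 1) x drive size of usable space, since one drive's worth of capacity goes to parity. A small sketch applying that to the setup above (the raid5_usable_gb helper is illustrative, not from any vendor tool):

```python
# Usable capacity of a RAID 5 array: one drive's worth is lost to parity.
def raid5_usable_gb(num_drives, drive_size_gb):
    assert num_drives >= 3, "RAID 5 needs at least three drives"
    return (num_drives - 1) * drive_size_gb

print(raid5_usable_gb(3, 320))  # 640 GB usable from 3 x 320GB drives
print(raid5_usable_gb(4, 320))  # 960 GB if a fourth drive is added
```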
