
  • ultimatex - Wednesday, August 22, 2012 - link

    I got this mobo from Newegg the first day they had it available; I couldn't believe the price since it offered x8/x8/x8/x8. Picked it up that day and haven't looked back. Doesn't look as cool as the ASRock Extreme9, but it still looks good. Awesome job, Gigabyte. AnandTech should have given them a Gold, not Bronze, though, since the fan issue is minor.
  • Arbie - Wednesday, August 22, 2012 - link

    For gaming, at least, how many people are really going to build a 2x GPU system? Let alone 3x or 4x. There are so few PC games that can use anything more than one strong card AND are worth playing for more than 10 minutes. I actually don't know of any such games, but tastes differ. And some folks will have multi-monitor setups, and possibly need two cards. But overall I'd think the target audience for these mobos is extremely small.

    Maybe for scientific computing?
  • Belard - Wednesday, August 22, 2012 - link

    Yep.... considering that most AAA PC games are just console ports, having 3-4 GPUs is pointless. The returns diminish after the first 2 cards.

    Only those with 2~6 monitors can benefit with 2-3 cards.

    Also, even $80 Gigabyte boards will do x8/x8 SLI/CF just fine.

    But hey, someone wants to spend $300 on a board... more power to them.
  • cmdrdredd - Wednesday, August 22, 2012 - link

    "Only those with 2~6 monitors can benefit with 2-3 cards."

    Oh really? 2560x1440 on a single card is garbage in my view. I am not happy with 50fps average.
  • rarson - Wednesday, August 22, 2012 - link

    If you're going multi-GPU on a single monitor, you're wasting money.
  • Sabresiberian - Wednesday, August 22, 2012 - link

    Because everyone should build to your standards, O god of all things computer.

    Do some reading; get a clue.
  • Steveymoo - Thursday, August 23, 2012 - link


    If you have a 120Hz monitor, 2 GPUs make a tonne of difference. And before you come back with a "no one can see 120Hz" jibe: that is also incorrect. My eyes have orgasms every once in a while during those ultra-detail 100+ fps moments in Battlefield that look great!
  • von Krupp - Friday, August 24, 2012 - link

    No. Metro 2033 is not happy at 2560x1440 with just a single HD 7970, and neither are Battlefield 3 or Crysis. The Total War series also crawls at maximum settings.

    I bought the U2711 specifically to take advantage of two cards (and for accurate colours, mind you). I have a distaste for multi-monitor gaming and will continue to have such as long as they keep making bezels on monitors.

    So please, don't go claiming that multi-card is useless on a single monitor because that just isn't true.
  • swing848 - Monday, December 08, 2014 - link

    At this date, December 2014, with maximum eye candy turned on, there are games that drop a reference AMD R9 290 below 60 fps on a single monitor at 1920x1080 [using an Intel i5-3570K at 4GHz to 4.2GHz].
  • Sabresiberian - Wednesday, August 22, 2012 - link

    This is not 1998; there are many games built for the PC only, and even previously console-oriented publishers aren't just making PC ports: they are developing their games to take advantage of the goodness only PCs can bring to the table. Despite what console fanboys continue to spew, PC gaming is on the rise, and console gaming is in relative decline.
  • mayankleoboy1 - Wednesday, August 22, 2012 - link

    Where are the GPGPU benchmarks? AFAIK, those are affected by PCIe 3.0 bandwidth, as shown in the HD 7970 review.

    Games are more or less happy with PCIe 2.0 x8.
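    For context on that claim, here is a back-of-the-envelope sketch of per-direction PCIe bandwidth (these encoding/rate figures are general PCIe spec numbers, not from the review):

```python
# Rough per-direction PCIe bandwidth. PCIe 2.0 signals at 5 GT/s with
# 8b/10b encoding; PCIe 3.0 signals at 8 GT/s with 128b/130b encoding.
def pcie_bandwidth_gbps(gen, lanes):
    rates = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}  # (GT/s, encoding efficiency)
    gt_per_s, efficiency = rates[gen]
    return gt_per_s * efficiency * lanes / 8  # bits -> bytes, in GB/s

print(f"PCIe 2.0 x8:  {pcie_bandwidth_gbps(2, 8):.1f} GB/s")   # ~4.0 GB/s
print(f"PCIe 3.0 x16: {pcie_bandwidth_gbps(3, 16):.2f} GB/s")  # ~15.75 GB/s
```

    So a PCIe 3.0 x16 slot offers roughly 4x the bandwidth of PCIe 2.0 x8, which is why bandwidth-hungry GPGPU workloads can show a difference while most games do not.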
  • MrSpadge - Thursday, August 23, 2012 - link

    A few GP-GPU apps are affected, most aren't. Even PCIe 3 is slow as hell from the perspective of the GPU, so you try to avoid external communication as much as possible.
  • TimoKyyro - Wednesday, August 29, 2012 - link

    I was hoping to see some GPU rendering too. I'm using CUDA in Blender and I really need more GPU power. It would be nice to know if there is a difference between 4x 680 and 2x 690 on different PCIe setups.
  • extide - Wednesday, August 22, 2012 - link

    Thanks for providing the diagrams of lane routing. I wish ALL manufacturers would supply a diagram with their boards so you know how to set it up when you are building a system. Sadly, these diagrams are the exception, not the rule. :(
  • processinfo - Wednesday, August 22, 2012 - link

    For me only EVGA seems worth consideration (I don't like a fan on the chipset, though).

    I have a few requirements that the others don't meet.

    I want a PS/2 keyboard port (don't care about mouse). I don't see it as legacy; it is still superior to USB for a keyboard. It works on interrupts instead of polling, and allows as many simultaneous key presses as you wish without ghosting (I know it probably doesn't matter in real life, but I like that anyway).

    A DisplayPort output is mandatory for me these days. While it is true that this kind of mobo will run a dedicated graphics card (or more than one, for that matter), I like having the output there so I can fall back to the CPU graphics if my graphics card breaks and needs replacement (that happened to me, and I waited almost two weeks for a new one). HDMI is a no-go because it does not support a high enough resolution.

    Gigabyte is out for me because of the audio chip. Maybe it is better, but it does not do 7.1, and I would lose two channels of my Tiamat 7.1 headset.
  • rwpritchett - Wednesday, August 22, 2012 - link

    You should check out some of the newer USB keyboards. I don't know how they do it, but some of them can now do full NKRO without PS/2. My Nighthawk X9 can do full NKRO over USB.
  • processinfo - Thursday, August 23, 2012 - link

    Interesting, but this is not possible with the standard USB keyboard protocol. If it does that, it has to use some tricks, most likely a custom keyboard driver.

    Also, I have a Thermaltake Meka G1 that I like and purchased because I got tired of replacing membrane keyboards, so I'd rather buy a motherboard with PS/2 than a new keyboard.

    My point is that at this price point, on a board clearly meant for gamers (who else is using more than one graphics card in a non-workstation PC?), they should think about such details, especially when they go overboard with other ports. E.g., who needs all 4 kinds of display output on a gaming mobo, or 10 USB ports on the back plate alone (if you need plenty, you can add them via a bracket connected to a header)?
  • MacGyverSG1 - Wednesday, August 22, 2012 - link

    I loved the review. The G1.Sniper 3 was on my short list for a while. It could get back on, though.

    I'm waiting for the ASUS Maximus V Extreme to get tested next.

    I only need a motherboard to complete my new build. I plan on running this new rig for 6+ years so I want a board that can keep up with the times.
  • just4U - Thursday, August 23, 2012 - link

    I am staying away from the Rampage/Maximus lines from Asus this time out, as Gigabyte has pretty much brought better value across the board on their gamer boards. I don't expect Asus to catch up till the next chipset.
  • goinginstyle - Thursday, August 23, 2012 - link

    I tried the G1 Sniper 3 and returned it a few days later. The audio was a significant downgrade from the Assassin series, the EFI is clunky at best, and the board had serious problems with a G.Skill 16GB 2666 kit, not to mention the lousy fan controls.

    Purchased a Maximus Formula V and never looked back, as the EFI, fan controls, clocking, and audio are much better in every way compared to the Sniper board. There is no way Gigabyte has brought better value than ASUS with the Z77 chipset. You get what you pay for, and the GB board is overpriced once you actually use it and compare it to ASUS or even ASRock.
  • JohnBS - Thursday, November 01, 2012 - link

    I am looking for a rock-solid MB, so of course I turned to ASUS. However, the reviews from verified buyers showed multiple issues with USB 3.0 ports losing power, system instability after months of use, and multiple instances of the board not working in one or more memory slots. Bent pins from the factory and complete DOA issues as well. A few reports of complete failure when the Wi-Fi card was inserted, yet gone with the card removed. This was mainly the Maximus IV series. Then I thought I'd look into the Maximus V series, because I really wanted ASUS, and was kinda sad to read the reviews. Same issues from verified buyers of the Maximus V, more so with the USB 3.0 problems and the Wi-Fi/Bluetooth add-on card failures. In common were multiple complaints about customer service.

    So I emailed the ASUS rep who was replying to everyone's post, with specific attention on the recurring problems and how I was concerned about buying a MB. I got the email back, stating they were aware of the recurring problems listed on the user reviews, but that they are isolated occurrences.

    I really need a rock-solid x16/x16 PCIe mobo right now, and that's why I'm still searching. I'm planning on overclocking an i7-2700K with a GTX 690 and a 120Hz monitor for high-res gaming. The Sniper 3 looks good, but the front audio plug reaching the board's bottom audio header might be something I can't work around.

    Just want something reliable. If there's a known issue, I'm always in that percentile that gets hit with the RMA process. I'm trying so hard to avoid that.

    (Went with 690 instead of dual 680 for heat, noise, power draw considerations).
  • jonjonjonj - Friday, October 26, 2012 - link

    You mean Gigabyte in the EVGA conclusion?

    "the EVGA does not keep pace with ASUS and EVGA even at stock speeds."
  • couchassault9001 - Friday, November 02, 2012 - link

    So for the gaming benchmarks, is it correct that the CPU multipliers were at 40 on the G1.Sniper and 36 on the EVGA? If so, it seems a rather unfair comparison, given that the Sniper's CPU is running 11% faster.

    I'd be amazed if someone was looking at these boards with no intent to overclock like crazy. I'm trying to decide between these 2 boards myself, and I'm sure I'll be pushing my 3770K as far as it will go.

    The EVGA consumed ~8% less power than the Sniper under load.

    Dirt 3 showed a 9% drop in frame rate going from the G1 to the EVGA; Metro 2033 showed a 3.6% drop. Both of these are on the 4x 7970 benchmarks. With 3 cards and below, the gap is much tighter, falling under 1% with one card.

    I know this may be nitpicking to some, but I plan on running 5760x1080 3D, so 4x 7970 performance on an i7-3770K is exactly what I'm looking at.
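    The percentages quoted above can be sanity-checked with quick arithmetic (the multipliers are the ones cited in this thread; the frame-rate helper uses illustrative numbers, not the article's actual data):

```python
# Sanity-check of the percentage gaps quoted above.
sniper_mult, evga_mult = 40, 36  # CPU multipliers reportedly used per board
clock_gap = (sniper_mult - evga_mult) / evga_mult * 100
print(f"G1.Sniper CPU clock advantage: {clock_gap:.1f}%")  # ~11.1%

def fps_drop_percent(g1_fps, evga_fps):
    """Percent frame-rate drop going from the G1.Sniper to the EVGA."""
    return (g1_fps - evga_fps) / g1_fps * 100

# Illustrative example: a 100 -> 91 fps change is a 9% drop,
# matching the Dirt 3 gap described above.
print(f"Example drop: {fps_drop_percent(100, 91):.1f}%")
```

    Note that the 9% and 3.6% frame-rate gaps are both smaller than the ~11% clock advantage, which is consistent with the games being only partly CPU-bound at these settings.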
