Conclusion: Gigabyte G1.Sniper 3

The Gigabyte G1.Sniper 3 follows the trend set by its sister board, the Z77X-UD5H (sister rather than brother, because they are ‘mother’ boards, right?).  The key to getting a good motherboard to market is to make it perform well, and then make it feel like good value.  It is not enough to undercut the competition on price; the package must also be as complete as the boards it will be compared against.

This is what the G1.Sniper 3 does for $280.  We have a package that includes the PLX PEX 8747 solution for multiple GPUs, a Qualcomm Atheros Killer 2201-B gaming network port alongside an Intel NIC, ten USB 3.0 ports, ten SATA ports, all the video outputs on the rear I/O, some legacy connectivity in the form of PS/2 and IEEE 1394, and a TPM for business users.  Inside the box we have a plethora of SATA cables, along with a USB 3.0 bracket, a WiFi card and antenna, an eSATA bracket with cables, and a pair of SLI bridges.

Where the G1.Sniper 3 ends up being very sneaky is the default speed at which it runs the CPU.  Currently both ASUS and Gigabyte motherboards enable a feature that ASUS calls MultiCore Enhancement: instead of an i7-3770K running at 39x/39x/38x/37x under a 1/2/3/4 core load, the board runs it at 39x/39x/39x/39x.  The Gigabyte G1.Sniper 3 goes one step further, setting the CPU one multiplier above the maximum turbo bin – 40x/40x/40x/40x in this case.  As a result, the G1.Sniper 3 takes an effective clean sweep in all of our 2D testing, which leans most heavily on the CPU and memory.  For your money, Gigabyte provides an out-of-the-box overclock that will beat the competition, on the understanding that it technically voids your CPU warranty (and probably the motherboard warranty as well).  Not that they advertise this, of course – it all comes out in the reviews.
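
To put those multipliers in context, here is a minimal, illustrative Python sketch (not part of the review) that converts each policy’s per-core multipliers into clock speeds, assuming the stock 100 MHz base clock:

```python
# Illustrative only: effective i7-3770K frequency under each turbo policy,
# assuming the stock 100 MHz base clock (BCLK).
BCLK_MHZ = 100

# Multiplier applied at 1/2/3/4 active cores under each policy.
policies = {
    "Intel specification":   [39, 39, 38, 37],
    "MultiCore Enhancement": [39, 39, 39, 39],
    "G1.Sniper 3 default":   [40, 40, 40, 40],
}

for name, multipliers in policies.items():
    ghz = [m * BCLK_MHZ / 1000 for m in multipliers]
    print(f"{name:22s}", "  ".join(f"{f:.1f} GHz" for f in ghz))
```

At a full four-core load that is the difference between 3.7 GHz at Intel specification and 4.0 GHz as shipped, which is where the lead in our CPU-bound 2D benchmarks comes from.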

My take on this situation has varied over the past few chipsets where MultiCore Enhancement has been a factor.  With X79, I disabled it and made the boards run at Intel specifications.  Then I realized that most users will run these boards at stock, so it is up to the manufacturer how adventurous they want their default settings to be, much in the same way manufacturers may ship aggressive memory settings.  Even with Gigabyte taking it to a new level, I will still treat it as a ‘feature’ – Gigabyte are clearly willing to take on the inherent risk in their product.

This extra boost on the CPU translates into gains in certain GPU tests as well, which will be music to the ears of gamers.  Overclocking on the G1.Sniper 3 mirrored what we have achieved with other Z77 boards, as Ivy Bridge’s temperature limitations really kick in at voltages above 1.2 V.

If one area of the package lets Gigabyte down, it is the eternal issue of its software and fan controls, which have remained stagnant over the past 18 months or so.  I do hope Gigabyte gives R&D a fresh injection to develop the software’s potential, combined with a bit more investment in the fan headers, as Gigabyte’s main competitor has this area tied up for now.

It is easy to recommend the Gigabyte G1.Sniper 3 – for the price it provides the performance, the functionality and the extras in the box that a user needs.  Somehow Gigabyte has been able to undercut the competition to good effect and pass this good-value package on to consumers.  There are of course some rough edges, as with any product, but out of this roundup it is the Gigabyte I would recommend for most usage scenarios that require the PLX PEX 8747.

With this in mind, I would like to give the Gigabyte G1.Sniper 3 the AnandTech Editors’ Choice Bronze Award.  For its combination of price, performance, and overall value, the G1.Sniper 3 offers one of the most price-competitive PLX PEX 8747 packages available today.

Editors’ Choice Bronze Award
Gigabyte G1.Sniper 3

Comments

  • ultimatex - Wednesday, August 22, 2012

    I got this mobo from Newegg the first day they had it available; I couldn't believe the price since it offers x8/x8/x8/x8. Picked it up on day one and haven't looked back. Doesn't look as cool as the ASRock Extreme9, but it still looks good. Awesome job, Gigabyte. AnandTech should have given them a Gold, not Bronze, though, since the fan issue is a minor one.
  • Arbie - Wednesday, August 22, 2012

    For gaming, at least, how many people are really going to build a 2x GPU system? Let alone 3x or 4x. There are so few PC games that can use anything more than one strong card AND are worth playing for more than 10 minutes. I actually don't know of any such games, but tastes differ. And some folks will have multi-monitor setups, and possibly need two cards. But overall I'd think the target audience for these mobos is extremely small.

    Maybe for scientific computing?
  • Belard - Wednesday, August 22, 2012

    Yep.... considering that most AAA PC games are just ports from consoles... having 3-4 GPUs is pointless. The returns get worse after the first 2 cards.

    Only those with 2~6 monitors can benefit with 2-3 cards.

    Also, even $80 Gigabyte boards will do x8/x8 SLI/CF just fine.

    But hey, someone wants to spend $300 on a board... more power to them.
  • cmdrdredd - Wednesday, August 22, 2012

    "Only those with 2~6 monitors can benefit with 2-3 cards."

    Oh really? 2560x1440 on a single card is garbage in my view. I am not happy with 50fps average.
  • rarson - Wednesday, August 22, 2012

    If you're going multi-GPU on a single monitor, you're wasting money.
  • Sabresiberian - Wednesday, August 22, 2012

    Because everyone should build to your standards, O god of all things computer.

    Do some reading; get a clue.
  • Steveymoo - Thursday, August 23, 2012

    Incorrect.

    If you have a 120 Hz monitor, 2 GPUs make a tonne of difference. And before you come back with a "no one can see 120 Hz" jibe: that is also incorrect.... My eyes have orgasms every once in a while when you get those ultra-detail 100+ fps moments in Battlefield that look great!
  • von Krupp - Friday, August 24, 2012

    No. Metro 2033 is not happy at 2560x1440 with just a single HD 7970, and neither are Battlefield 3 or Crysis. The Total War series also crawls at maximum settings.

    I bought the U2711 specifically to take advantage of two cards (and for accurate colours, mind you). I have a distaste for multi-monitor gaming and will continue to have such as long as they keep making bezels on monitors.

    So please, don't go claiming that multi-card is useless on a single monitor because that just isn't true.
  • swing848 - Monday, December 8, 2014

    At this date, December 2014, with maximum eye candy turned on, there are games that drop a reference AMD R9 290 below 60 fps on a single monitor at 1920x1080 [using an Intel i5-3570K at 4 GHz to 4.2 GHz].
  • Sabresiberian - Wednesday, August 22, 2012

    This is not 1998, there are many games built for the PC only, and even previously console-oriented publishers aren't just making ports for the PC, they are developing their games to take advantage of the goodness only PCs can bring to the table. Despite what console fanboys continue to spew, PC gaming is on the rise, and console gaming is on the relative decline.
