Gigabyte G1.Sniper 3 BIOS

The BIOS of a system is a good measure of how much effort a manufacturer puts into every aspect of its design.  A good BIOS reflects a commitment to producing a polished product rather than simply getting one out into the market.  Even a company that produces only one motherboard cannot get by with a single BIOS engineer on staff.  The design of the BIOS has to draw on multiple sources, and the result should be something creative that improves the user experience and the functionality, rather than being merely another checkpoint on the list.

So here we are with the Gigabyte G1.Sniper 3 BIOS, which is essentially identical to the BIOS we saw on the recently reviewed Z77X-UD5H, despite this motherboard being part of Gigabyte's gaming range.  Our opening screen is Gigabyte's '3D' mode, showcasing a representative motherboard image.  Different parts of the image are clickable, each presenting menus for that specific area of the motherboard.

While this is a good way to introduce users to the BIOS, it does leave several questions, as we raised with the Z77X-UD5H.  We have no text declaring which motherboard is being used (useful if the board is in a case), the BIOS version, or the CPU.  The motherboard point is the most pertinent, as Gigabyte uses a generic motherboard image in its 3D BIOS rather than one specific to the motherboard at hand.  Other manufacturers also include information such as temperatures, memory count, memory speeds, voltages, and fan speeds on the front page – with Gigabyte's model, we have to click through to find this information.

Users can also adjust the fan controls here, by clicking on the ‘Fan Control’ option on the menu at the bottom.  As mentioned in previous Gigabyte reviews, the fan controls on these boards are not the best by any stretch: the options available to users consist of choosing a fan ramp in terms of PWM values per degree.  We would prefer options that relate fan speed percentages to temperatures, with the ability to select initial and final temperatures and speeds so that the ramp is calculated automatically by the system.  In our review of Biostar's Z77 motherboards, we at least got an automatic testing option to suggest which PWM values should be set, even if the scale remains arbitrary to the majority of users should they wish to delve in and understand what is being set.
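To illustrate the kind of control scheme we would prefer, here is a minimal sketch (in Python, with hypothetical temperature and duty-cycle values) of how a BIOS could derive the ramp automatically from two user-set points, rather than asking the user for raw PWM-per-degree values:

```python
def fan_duty(temp_c, t_min=30.0, t_max=70.0, duty_min=20.0, duty_max=100.0):
    """Linearly interpolate fan duty (%) between two user-chosen points.

    Below t_min the fan idles at duty_min; at or above t_max it runs
    flat out at duty_max.  The ramp in between is calculated by the
    system, so the user only ever picks temperatures and percentages.
    """
    if temp_c <= t_min:
        return duty_min
    if temp_c >= t_max:
        return duty_max
    frac = (temp_c - t_min) / (t_max - t_min)  # 0.0 at t_min, 1.0 at t_max
    return duty_min + frac * (duty_max - duty_min)

# e.g. at 50C, halfway between 30C and 70C, the fan runs at 60% duty
print(fan_duty(50.0))
```

The two endpoints are assumptions for the example; the point is that four intuitive numbers fully define the curve, which the firmware can then translate into PWM values internally.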

The main section of the Gigabyte BIOS is found in the Advanced option on the bottom row.  This pulls up a more traditional BIOS layout, easily navigable by both the mouse and the keyboard.  The first screen is labeled ‘MIT’, and we also get information regarding the BIOS version, the BCLK, memory size, temperatures, and voltages here.  The MIT screen also has a ‘Current Status’ option which gives a more detailed overview of some of the more important numbers relating to the hardware in the system.

The overclocking options are spread across a series of three menus from the MIT screen.  The CPU and memory frequencies are adjusted in the ‘Advanced Frequency Settings’ menu.  In the ‘Advanced Memory Settings’ screen, the memory frequency is again adjustable, along with the subtimings.  The voltages for the CPU, the memory, and the rest of the system, as well as the Load Line Calibration settings, are found in the ‘Advanced Voltage Settings’ menu.  It is a little frustrating having to navigate back and forth between several menus to pick the CPU speed and then set the appropriate voltage and LLC.  I hope that in the future Gigabyte will make an all-in-one menu with all the appropriate options visible together.

Elsewhere in the BIOS are the fan settings in a different menu format, as well as boot order selection, advanced peripheral management, and the BIOS flash utility.


23 Comments


  • ultimatex - Wednesday, August 22, 2012 - link

    I got this mobo from Newegg the first day they had it available. I couldn't believe the price since it offered 8x/8x/8x/8x. Picked it up the first day and haven't looked back. Doesn't look as cool as the ASRock Extreme9 but it still looks good. Awesome job Gigabyte. AnandTech should have given them a Gold not Bronze though, since the fan issue is a minor issue.
  • Arbie - Wednesday, August 22, 2012 - link

    For gaming, at least, how many people are really going to build a 2xGPU system? Let alone 3x or 4x. There are so few PC games that can use anything more than one strong card AND are worth playing for more than 10 minutes. I actually don't know of any such games, but tastes differ. And some folks will have multi-monitor setups, and possibly need two cards. But overall I'd think the target audience for these mobos is extremely small.

    Maybe for scientific computing?
  • Belard - Wednesday, August 22, 2012 - link

    Yep.... considering that most AAA PC games are just ports from consoles... having 3-4 GPUs is pointless. The returns get worse after the first 2 cards.

    Only those with 2~6 monitors can benefit with 2-3 cards.

    Also, even $80 Gigabyte boards will do 8x x 8x SLI/CF just fine.

    But hey, someone wants to spend $300 on a board... more power to them.
  • cmdrdredd - Wednesday, August 22, 2012 - link

    "Only those with 2~6 monitors can benefit with 2-3 cards."

    Oh really? 2560x1440 on a single card is garbage in my view. I am not happy with 50fps average.
  • rarson - Wednesday, August 22, 2012 - link

    If you're going multi-GPU on a single monitor, you're wasting money.
  • Sabresiberian - Wednesday, August 22, 2012 - link

    Because everyone should build to your standards, O god of all things computer.

    Do some reading; get a clue.
  • Steveymoo - Thursday, August 23, 2012 - link

    Incorrect.

    If you have a 120Hz monitor, 2 GPUs make a tonne of difference. Before you come back with a "no one can see 120Hz" jibe: that is also incorrect. My eyes have orgasms every once in a while when you get those ultra-detail 100+ fps moments in Battlefield, that look great!
  • von Krupp - Friday, August 24, 2012 - link

    No. Metro 2033 is not happy at 2560x1440 with just a single HD 7970, and neither are Battlefield 3 or Crysis. The Total War series also crawls at maximum settings.

    I bought the U2711 specifically to take advantage of two cards (and for accurate colours, mind you). I have a distaste for multi-monitor gaming and will continue to have such as long as they keep making bezels on monitors.

    So please, don't go claiming that multi-card is useless on a single monitor because that just isn't true.
  • swing848 - Monday, December 08, 2014 - link

    At this date, December 2014, with maximum eye candy turned on, there are games that drop a reference AMD R9 290 below 60 fps on a single monitor at 1920x1080 [using an Intel i5-3570K at 4GHz to 4.2GHz].
  • Sabresiberian - Wednesday, August 22, 2012 - link

    This is not 1998. There are many games built for the PC only, and even previously console-oriented publishers aren't just making ports for the PC; they are developing their games to take advantage of the goodness only PCs can bring to the table. Despite what console fanboys continue to spew, PC gaming is on the rise, and console gaming is in relative decline.
