Gigabyte GA-Z77X-UD5H Overclocking

Note: Ivy Bridge does not overclock like Sandy Bridge.  For a detailed report on the effect of voltage on Ivy Bridge (and thus temperatures and power draw), please read Undervolting and Overclocking on Ivy Bridge.

Experience with Gigabyte GA-Z77X-UD5H

Overclocking on the UD5H was a mixed bag of results.  The automatic overclocks worked, but only if the system liked the memory being used - the automatic overclock options apply changes to the memory settings that caused our system to fail with our default G.Skill DDR3-2400 9-11-11 kit.  However, when we used a Patriot DDR3-2133 kit, all was well.

Manual overclocking was clear-cut, as the system applies the overclock at the start of POST rather than at the end.  This meant that if the system was very unstable, a BSOD would appear while the OS was loading and we could enter the BIOS to adjust the voltages.

Overall results were a little disappointing, given the other good performances we had with the motherboard.

Methodology:

Our standard overclocking methodology is as follows.  We select the automatic overclock options and test for stability with PovRay and OCCT to simulate high-end workloads.  These stability tests aim to catch any immediate causes for memory or CPU errors.

For manual overclocks, based on the information gathered from previous testing, we start at a nominal voltage and CPU multiplier, and the multiplier is increased until the stability tests fail.  The CPU voltage is then increased gradually until the stability tests are passed, and the process is repeated until the motherboard reduces the multiplier automatically (due to safety protocols) or the CPU temperature reaches an unreasonably high level (100ºC+).
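
In code form, the procedure amounts to a simple search loop.  The Python sketch below is purely illustrative: is_stable() and cpu_temp_c() are hypothetical stand-ins for the PovRay/OCCT runs and a temperature readout, with toy numbers loosely based on the results later in this page.

```python
# Minimal sketch of our manual overclocking loop.  The two helpers are
# hypothetical stand-ins: is_stable() for a PovRay + OCCT pass,
# cpu_temp_c() for a sensor readout.  Voltages are in millivolts to
# keep the arithmetic exact.

def is_stable(multiplier: int, vcore_mv: int) -> bool:
    """Toy model: each extra multiplier bin wants ~50 mV more."""
    return vcore_mv >= 1100 + (multiplier - 44) * 50

def cpu_temp_c(vcore_mv: int) -> float:
    """Toy model: load temperature rises with core voltage."""
    return 68 + (vcore_mv - 1068) * 0.15

def find_max_overclock(mult: int = 44, vcore_mv: int = 1100,
                       step_mv: int = 25, temp_limit: float = 100.0,
                       vmax_mv: int = 1400):
    best = None
    while vcore_mv <= vmax_mv and cpu_temp_c(vcore_mv) < temp_limit:
        if is_stable(mult, vcore_mv):
            best = (mult, vcore_mv)   # remember the last passing combo
            mult += 1                 # then try the next multiplier bin...
        else:
            vcore_mv += step_mv       # ...or add a little more voltage
    return best

print(find_max_overclock())  # -> (47, 1250) under this toy model
```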

Our test bed is not in a case, which should push overclocks higher thanks to fresher (cooler) air.  We are also using Intel's all-in-one liquid cooler with its stock fan.  This is a 120mm-radiator liquid cooler, designed to mimic a medium-to-high-end air cooler.

Automatic Overclock:

For our automatic overclocking, we had to use a Patriot DDR3-2133 2x2GB memory kit, as the system failed to boot with our G.Skill DDR3-2400 4x4GB kit when the automatic overclock settings were selected.

EasyTune6 offers three levels of automatic overclocking, along with an Auto Tuning option that stresses the system while raising speeds and voltages.  Here are our results.

At ET Level 1, the system applied a 102.3 MHz BCLK and 41x multiplier, giving a final CPU speed of 4198 MHz.  This gave a +0.150 volt offset to the CPU core, and set the memory to XMP but reduced the speed back one strap due to the enhanced BCLK.  Maximum temperatures for this setting were 78ºC during PovRay and 80ºC during OCCT.

At ET Level 2, the system applied a 103.4 MHz BCLK and 43x multiplier, giving a final CPU speed of 4446.2 MHz.  In the OS, a load voltage of 1.248 volts was reported, and stress testing gave maximum temperatures of 84ºC during PovRay and 88ºC during OCCT.  Memory was also adjusted to one strap below XMP.

At ET Level 3, the system applied a 104.3 MHz BCLK and 45x multiplier, giving a final CPU speed of 4693.9 MHz.  In the OS, a load voltage of 1.284 volts was reported, and stress testing gave maximum temperatures of 96ºC during PovRay and 98ºC during OCCT.  Memory was again adjusted to one strap below XMP.
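
As a sanity check, the final CPU speed in each case is simply the base clock multiplied by the CPU multiplier; the quoted BCLK values are rounded, which accounts for the few-MHz differences:

```python
# Final CPU speed = BCLK (MHz) x CPU multiplier.  Small differences from
# the quoted speeds come from the BCLK values being rounded.
for level, bclk, mult in [("ET Level 1", 102.3, 41),
                          ("ET Level 2", 103.4, 43),
                          ("ET Level 3", 104.3, 45)]:
    print(f"{level}: {bclk} MHz x {mult} = {bclk * mult:.1f} MHz")
```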

The Auto Tuning option gave the following experience:

- The system rebooted and loaded a stress-testing program.
- This program gradually raised the multiplier and BCLK.
- The stress-testing part of the program crashed at 47x103.5, but the tests continued.
- The system hard reset at 49x103.5.
- The system booted into the OS at 48x104 and loaded the program again, which crashed and shut down.
- The system rebooted at 46x103.3 for full loads and 48x103.3 for single-core loads.

This overclock showed 1.296 volts at full load, giving 93ºC during PovRay and 95ºC during OCCT.  However, running the single-core PovRay benchmark caused a memory error.
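
From that sequence, the tool appears to use a simple back-off strategy: push until the system crashes, then settle a few bins lower for all-core load and one bin below the crash point for single-core load.  A hypothetical sketch - the back-off sizes are inferred from this one run, not documented by Gigabyte:

```python
# Hypothetical back-off logic, inferred from the Auto Tuning run above;
# Gigabyte does not document the actual algorithm.
def settle(crash_mult: int, backoff_all: int = 3, backoff_single: int = 1):
    return crash_mult - backoff_all, crash_mult - backoff_single

all_core, single_core = settle(49)  # the hard reset occurred at 49x103.5
print(f"Settled at {all_core}x all-core, {single_core}x single-core")
# -> 46x all-core, 48x single-core, matching the settings the tool chose
```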

Manual Overclock:

Manual overclocking was performed in the BIOS, where the CPU voltage was fixed at 1.100 volts and the multiplier started at 44x.  Here are the results:

At 44x, the system was stable with a BIOS voltage set at 1.100 volts, which led to a load voltage of 1.068 volts in the OS.  Peak temperatures during stability testing were 68ºC during PovRay and 71ºC during OCCT.

At 45x, the system was stable with a BIOS voltage set at 1.125 volts, which led to a load voltage of 1.096 volts in the OS.  Peak temperatures during stability testing were 71ºC during PovRay and 72ºC during OCCT.

At 46x, the system was stable with a BIOS voltage set at 1.175 volts and Load Line Calibration set to Extreme, which led to a load voltage of 1.176 volts in the OS.  Peak temperatures during stability testing were 80ºC during PovRay and 81ºC during OCCT.

At 47x, the system was stable with a BIOS voltage set at 1.225 volts and Load Line Calibration set to Extreme, which led to a load voltage of 1.224 volts in the OS.  Peak temperatures during stability testing were 88ºC during both PovRay and OCCT.

At 48x, the system was stable with a BIOS voltage set at 1.275 volts and Load Line Calibration set to Extreme, which led to a load voltage of 1.284 volts in the OS.  Peak temperatures during stability testing were 97ºC during PovRay and 96ºC during OCCT.
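
Pulling the load voltages and peak OCCT temperatures out of the results above shows what each extra multiplier bin costs in voltage and heat - the Ivy Bridge behavior flagged at the top of this page:

```python
# Load voltage (V) and peak OCCT temperature (C) at each stable
# multiplier, taken from the results above.
results = {44: (1.068, 71), 45: (1.096, 72), 46: (1.176, 81),
           47: (1.224, 88), 48: (1.284, 96)}

mults = sorted(results)
for lo, hi in zip(mults, mults[1:]):
    dv = results[hi][0] - results[lo][0]   # extra voltage for this bin
    dt = results[hi][1] - results[lo][1]   # extra heat for this bin
    print(f"{lo}x -> {hi}x: +{dv:.3f} V, +{dt} C")
```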

Comments

  • Belard - Thursday, July 26, 2012 - link

    It's in the BIOS power settings. I've been building some systems with its smaller sister boards. You can go to Gigabyte, track down the manual and look it up... it should be there. Also, it gives you the option to power up with a mouse.

    Even a wireless USB keyboard managed to power up the system (cool).
  • shin0bi272 - Wednesday, July 25, 2012 - link

    I know Intel is capable of doing on-chip video, and there have been boards with onboard video forever, but the trend of putting 9001 video ports on the back of the thing instead of, oh say, 1 is disturbing.

    Let's be honest: if you're a gamer and you want 3-way SLI, you don't need onboard video. Likewise, if you're not a gamer and you want to plug your monitor into the motherboard, you don't need 3-way PCIe 3.0 SLI. Pick one and go with it!

    Plus, if they wanted to, they could include a couple of adapters to go from DVI to VGA or DVI to HDMI and have one plug on the board itself, which would save space on the back I/O panel and allow for more important things like more eSATA, more USB 3.0, or even that WiFi the review alluded to.

    This is a case of a motherboard manufacturer trying to please everyone with one board instead of making a gamer board, an HTPC board, and a file server board. It saves them money but screws the consumer.
  • shin0bi272 - Wednesday, July 25, 2012 - link

    Oh, and if USB 3.0 is backwards compatible with 2.0... why include 2.0 at all?
  • Dustin Sklavos - Wednesday, July 25, 2012 - link

    USB 3.0 support is still a little bit hinky; a fresh install of Windows 7 may not recognize your keyboard if it's plugged into a USB 3.0 port without drivers.

    And uh...I use two of the display outputs on the back of my motherboard. Multi-monitor isn't that uncommon these days.
  • IanCutress - Wednesday, July 25, 2012 - link

    Each USB 3.0 controller has an associated bill of materials cost. You only get four USB 3.0 ports from the chipset, but twelve USB 2.0. USB 3.0, as Dustin says, is a bit flaky at times - technically the Intel USB 3.0 ports should work at boot, but they do not always; it depends on how the motherboard traces are routed.

    Regarding boards and video outputs: if the CPU has the capability, motherboard manufacturers get slammed if they don't include at least one or two video outputs, just in case a user wants them. Imagine I had this board and strapped in a few NVIDIA GPUs for CUDA programming. If I could, I'd use the onboard IGP for my display, then have the GPUs purely for computational needs, and still have all the PCIe 3.0 bandwidth I would need.

    Ian
  • Grok42 - Wednesday, July 25, 2012 - link

    I don't think I've ever agreed and disagreed with a post so much before.

    I think it is about time that motherboards shipped with the ability to run multi-monitor setups out of the box. Hopefully all four outputs can drive a monitor at once! What is crazy is that they are shipping with 3-4 DIFFERENT connectors! I think all current graphics connectors are completely terrible. Not one of them could drive an iPad3 screen even if the DVI were dual-link. This is why Apple is moving to Thunderbolt, I think, but it still isn't clear to me that a Thunderbolt port could drive a hi-res display like an iPad's. The next connector should have the ability to drive a 16K display so we can live with one connector for a decade. Monitors last 2x-4x the lifespan of a computer. Build a connector that will last!

    Of all the things we need more of, USB isn't one of them. At work we drive 24-48 USB devices on standard low-end Dell computers. Most DIY motherboards like this support at least 6, and more typically 10. If you need more than that, a simple hub, which you already have in your monitor/keyboard/mouse/toaster, will give you all you need.

    Now for the part where I think you're spot on: they are trying to please everyone with one board. I would expand this to the entire industry. If you've seen any of my screeds about computer cases, you know that there is really only one type of case for sale, the one that sorta works for everyone but isn't great for anyone. The MB market is better but still a mess. Right now they have lines that are broken into feature grades, with each higher-level board simply adding more stuff. Instead they should be aimed at what consumers want to build.

    If you are making a highly overclockable board with support for three PCIe graphics cards, do you really need/want 10 SATA ports? Who is overclocking their file server and running an SLI setup on it? The problem is that if you back off to a lower-grade board you lose something you do need, so your file server ends up with SLI support even if you don't want it.
  • epobirs - Thursday, July 26, 2012 - link

    You are completely wrong. The iPad3 display is merely 2048x1536. That is not a big deal for dual-link DVI, which had been used to drive 2560x1600 displays for many years before the 'Retina' designation came out of Apple's marketing department. The idea that the iPad3 display is somehow the bleeding edge of screen tech is laughter-inducing. The only remarkable thing about it is the small size. Such resolutions are old news for large displays, especially in the professional markets. Keep in mind, the Retina designation is about pixel density, not just resolution.

    The only port on that panel that cannot drive a Retina display without breaking a sweat is the legacy VGA. DVI is showing its age, but we already have two successors in HDMI and DisplayPort. Both of those are capable of driving 4K displays, which won't be common in the consumer sector for several years. More importantly, the on-board GPU tops out at 4K, so equipping the board to drive anything greater is an utter waste.

    The newer ports are already designed with monitors most people won't be able to consider buying for a very long time. Nor do today's displays have the same longevity as CRTs did. Fortunately, they compensate by rapidly improving in bang for the buck. When my $300 1680x1050 22" monitor, which seemed an amazing bargain when first purchased, died after a bit over three years, I replaced it with a 27" 1080p screen for around $250. On another desk I put in an ACER 32" HDTV as the monitor for $250, just because I could. (I remember paying close to $1,000 for my first 17" CRT that weighed close to 80 pounds.)

    Trying to design for what will be called for a decade from now is just a waste of time. Extremely few consumers will benefit and there is a good chance NOBODY will benefit because something came along that changed things so much as to render your long term plan badly obsolete. The payoff just isn't there. VGA has been around since 1987 but there aren't any displays from that era or ten years later that are worth the trouble to use today.

    I'm reminded of 1999, when a certain type of Apple snob loved to go on about how the original 128K Mac had no Y2K issues. Who cares? If you were still relying on an 80s Mac in 1999, your life would have to be so miserable as to make Y2K terribly low on your list of troubles.

    As for USB and SATA ports, in a full sized ATX board I'd far rather have ports going unused than have to add more later. If I want minimalist I'll build with a smaller board and case. They each have their place.
  • aaronb1138 - Wednesday, September 5, 2012 - link

    Even VGA can drive up to around 2560x1600 @ 60 Hz, but cable quality and length become a factor (you need a $15-30 shielded cable instead of a $5 one). I run a Sony FW900 at 2048x1280 @ 85 Hz over VGA cleanly (BNC or VGA connectors; both are equal with good cables).

    Dual link DVI can run 2560x1600 @ 60 Hz at 10 bits per color (30 bit color).
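
    For reference, some quick back-of-the-envelope numbers, assuming roughly 15% blanking overhead (CVT-RB style) - the single-link and dual-link DVI pixel-clock limits are the only hard figures here:

    ```python
    # Rough pixel-clock estimates, assuming ~15% blanking overhead.
    # Single-link DVI tops out at a 165 MHz pixel clock, dual-link at 330 MHz.
    def pixel_clock_mhz(w, h, hz, blanking=1.15):
        return w * h * hz * blanking / 1e6

    for name, w, h, hz in [("iPad3-class 2048x1536@60", 2048, 1536, 60),
                           ("30-inch 2560x1600@60", 2560, 1600, 60),
                           ("FW900 2048x1280@85", 2048, 1280, 85)]:
        print(f"{name}: ~{pixel_clock_mhz(w, h, hz):.0f} MHz")
    ```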
  • Belard - Thursday, July 26, 2012 - link

    This is not a true 3-way SLI board. It has the slots, but not the lanes to do full-blown x16/x16/x16 or even x8/x8/x8. At $140~170, it's an upper mid-range board.

    I build systems with Gigabyte mATX boards... they'll support x8/x8 SLI or two CrossFire cards, and they also have three x16 slots. Not bad for $80 (Microcenter discounts).

    So having the various types of video ports is very good for typical people who only use a single monitor. With the four types, everyone is covered. For a dual-monitor setup with DVI inputs, I used a DVI-to-DVI cable and spent $15 on an HDMI-to-DVI cable... not a big deal.

    Even $400 video cards will require adapter cables to use in multi-monitor setups.
  • rickon66 - Wednesday, July 25, 2012 - link

    I still think that this board has great bang for the buck, especially since it is often available for $139.99 at Micro Center.
