Overvolting & The EVGA EVBot

So much of the GTX 680 Classified is geared around overvolting that we wanted to break the subject out into its own section.

For some time now NVIDIA has kept a tight leash on their partners’ designs, with partners required to clear their designs with NVIDIA before they can sell them. NVIDIA’s interest in this matter is that video card manufacturing is a true partnership – their name and brand are at stake as much as the partners’ – so they want to be sure that GeForce cards meet their standards. This includes cooling, build quality, and noise limits (the reason the GTX 680C is limited to a 55% fan speed). NVIDIA doesn’t publish what those requirements are, so we can only speculate from the outside about what’s going on, but clearly NVIDIA has been unhappy with some partners’ custom designs in the past.

With the GeForce 600 series things have become a bit harder for NVIDIA’s partners. As we briefly mentioned before, NVIDIA is shying away from hardcore overclocking with the GeForce 600 series. Specifically, there are two things going on:

  1. Partners wishing to sell a card with a TDP over 195W (i.e. a base power target of 170W) must use a custom board with suitable power circuitry. NVIDIA won’t allow partners to ship higher-power cards using the reference board.
  2. Software overvoltage control is forbidden.

These rules impact two classes of cards. The first is heavily factory overclocked cards built on the reference PCB, which goosed the GPU voltage to hit their high factory overclocks; partners that wish to keep shipping such cards will now need to bin their GPUs to stay within NVIDIA’s power requirements. Meanwhile the second class of cards impacted is of course overclocking-focused cards like the GTX 680 Classified. Voltage control becomes necessary beyond a certain point, and since NVIDIA is requiring custom PCBs and disallowing software voltage control, EVGA had to get creative.

The end result of that creativity is that EVGA has fallen back to their EVBot controller. By using a beefed up PCB along with an external controller, EVGA can offer voltage control while meeting NVIDIA’s guidelines.

The problem with having to resort to this kind of creativity is that it is unquestionably a step back for enthusiasts. Truth be told, we don’t mind the voltage lock on reference cards – letting users play with voltage control on cards that aren’t meant for it can definitely lead to problems (e.g. the GTX 590). However the lack of software voltage control makes the whole thing very messy, as we’ll see in a moment. No doubt NVIDIA has their reasons for going this route, but we don’t believe anyone benefits from the lack of software voltage control on these premium overclocking cards. NVIDIA could and should do better here, since this is effectively an arbitrary restriction that offers no benefit on cards such as the GTX 680 Classified.

Moving on, before getting into the nitty-gritty of voltage control let’s quickly discuss EVGA’s EVBot.

EVBot

Introduced a few years ago, EVGA’s EVBot is a controller intended to offer external control and monitoring capabilities for EVGA’s high-end video cards and motherboards. The device itself is a simple controller with an almost iPod-like design, and runs a small firmware that tells the EVBot how to interact with various devices. The EVBot draws power from the device it’s controlling, meaning it’s basically just a small controller (in the electronics sense) with buttons and a backlit monochrome screen. Finally, 4 headers are found on the bottom, allowing one EVBot to control up to 1 motherboard plus 3 video cards.

Aside from an EVGA motherboard or video card to control, the EVBot package includes everything else necessary to use it. EVGA includes 4 EVBot cables, so you can control up to the controller’s limit of 4 devices without hunting down any further parts. The 4ft cables are reasonably generous, but if you have a large case (like we do) then you’ll want to make sure you have an opening for the cable fairly close to where you intend to keep the controller, as there’s not a ton of slack to run it behind a full size tower.

Once powered up, the EVBot is pretty simple to navigate. Depending on the device being controlled different functions become available, with motherboards in particular getting a full range of voltage and clock adjustments, along with hardware monitoring. Video cards on the other hand get a much smaller feature set that is limited to voltage adjustment, which means the EVBot needs to be used in concert with software utilities such as EVGA’s Precision X in order to actually overclock and to monitor that overclock.

Ultimately the functionality the EVBot provides for the GTX 680 Classified is essential, but because voltage control is all it can be used for, it’s not otherwise all that useful. If the EVBot offered full clock control so that you could load a clock and voltage profile at the same time it would be far more useful, and for that matter would be a good hardware analog to what the EVGA Precision X software does.

Finally, if you’re purchasing an EVBot you’ll want to make sure you’re purchasing it directly from EVGA. While EVGA uses a standard hardware interface for the EVBot, the device needs hardware-specific profiles to operate. The EVBot’s firmware is upgradable, but only through a motherboard, so if your unit has an earlier firmware and the GTX 680 Classified is the only EVGA product you own, you won’t be able to update it for the card. At the same time, owners with both an EVGA motherboard and a GTX 680 Classified will want to pay close attention to the EVBot’s firmware limitations – because the EVBot’s firmware is so small, it can only store a couple of profiles. The GTX 680 Classified firmware (P15) doesn’t hold profiles for EVGA’s X79 motherboards, for example.

Overvolting With EVBot

So how does overvolting with EVBot work? For better or worse it’s actually very simple.

EVBot exposes 5 settings on the GTX 680 Classified: NVVDD, FBVDD, PEXVDD1, PEXVDD2, and OCP. These control the GPU voltage, the memory voltage, the two PCIe voltages, and OverCurrent Protection respectively. As is typically the case for video card overclocking, it’s the GPU and memory voltages that are going to be the important settings for most users.

EVBot Function Table
Name      Function
NVVDD     GPU Voltage
FBVDD     Memory Voltage
PEXVDD1   PCIe Voltage #1
PEXVDD2   PCIe Voltage #2
OCP       OverCurrent Protection

Overvolting is simply a matter of dialing in the desired voltage, in the usual 0.0625v increments we’ve come to expect on GTX 680 products. EVGA tells us that GPU voltages up to 1.3v are safe, but of course your mileage may vary and the ultimate goal is to reach the desired clockspeed on as little voltage as possible.
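
To put those numbers in perspective, here’s a minimal sketch in Python of the voltage steps that increment gives you between a stock boost voltage and EVGA’s suggested ceiling. The 1.175v starting point is our assumption (a typical GTX 680 maximum boost voltage) rather than a figure from EVGA; the 0.0625v step and 1.3v limit are the values discussed above.

# Minimal sketch: enumerate the EVBot GPU voltage steps between an assumed
# stock boost voltage and EVGA's suggested 1.3v ceiling.
# The 1.175v starting point is an assumption; the step size and ceiling
# are the figures discussed in the text.

STEP_V = 0.0625      # EVBot adjustment increment
STOCK_V = 1.175      # assumed stock maximum boost voltage
CEILING_V = 1.300    # EVGA's suggested safe limit

def evbot_steps(start=STOCK_V, ceiling=CEILING_V, step=STEP_V):
    """Return the selectable GPU voltages from start up to the ceiling."""
    steps = []
    v = start
    while v <= ceiling + 1e-9:   # epsilon guards against float rounding
        steps.append(round(v, 4))
        v += step
    return steps

print(evbot_steps())   # [1.175, 1.2375, 1.3]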

Note that once the EVBot is used to take control of the GPU voltage, it overrides NVIDIA’s standard control mechanism and leaves you in full control of the voltage. This happens completely transparently to NVIDIA’s software, which means that any overclocking tools using NVAPI (which is to say, all of them) will continue to report the voltage the drivers are asking for rather than the real voltage. This makes the voltage monitoring points on the GTX 680 Classified all the more important, since they’re the only way to get a real voltage reading once EVBot voltage control is in use.

However this also means that voltages are decoupled from NVIDIA’s clock domains. Consequently the GTX 680 Classified will idle at the higher voltage, increasing the idle power consumption of the card. This isn’t a huge problem, but it is a tradeoff of overvolting.
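
To get a rough sense of why that idle voltage matters, here’s a back-of-the-envelope sketch. It leans on the common first-order approximation that dynamic power scales with the square of voltage at a fixed clock, and both voltage figures are placeholders we picked for illustration rather than measured values.

# Back-of-the-envelope sketch of the idle power tradeoff from overvolting.
# Assumes dynamic power scales roughly with V^2 at a fixed idle clock;
# leakage behaves differently, so treat this strictly as a rough estimate.
# Both voltages below are illustrative placeholders, not measured values.

IDLE_V_STOCK = 0.99    # placeholder stock idle voltage
IDLE_V_EVBOT = 1.30    # EVBot-set voltage that now persists at idle

def dynamic_power_ratio(v_new, v_old):
    """Approximate multiplier on dynamic power from a voltage change."""
    return (v_new / v_old) ** 2

ratio = dynamic_power_ratio(IDLE_V_EVBOT, IDLE_V_STOCK)
print(f"~{ratio:.2f}x the idle dynamic power")   # roughly 1.7x with these inputs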

Gallery: EVGA EVBot

On that note, it’s unfortunate that EVBot voltage settings are not fully persistent. They will survive a soft reboot, but if the card is fully powered down you’ll need to re-enter the desired voltages the next time the card is powered up. For anyone intending to use an overvolted card on a regular basis, this means keeping an EVBot plugged in at all times so that the voltages can be reset. Note that this also effectively precludes having EVGA Precision X or other utilities apply an overclock on startup, since you’ll want to set the voltage before turning up the clockspeeds.

With all of that said, at the end of the day the EVBot does its job well enough given the limitations imposed by NVIDIA. The fact that this is a separate purchase on top of the GTX 680 Classified is unfortunate, but after thinking it through it wouldn’t make much sense to bundle the EVBot with the card, since all anyone will ever need is the one EVBot. Still, the $80 price tag on top of the $660 card means that GTX 680 overvolting is not for the thrifty at this time.

Comments

  • HisDivineOrder - Tuesday, July 24, 2012 - link

    To be fair, AMD started the gouging with the 7970 series, its moderate boost over the 580 series, and its modest mark-up over that line.

    When nVidia saw what AMD had launched, they must have laughed and rubbed their hands together with glee. Because their mainstream part was beating it and it cost them a LOT less to make. So they COULD have passed those savings onto the customer and launched at nominal pricing, pressuring AMD with extremely low prices that AMD could probably not afford to match...

    ...or they could join with the gouging. They joined with the gouging. They knocked the price down by $50 and AMD's pricing (besides the 78xx series) has been in a freefall ever since.
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    You people are using way too much tin foil, it's already impinged bloodflow to the brain from its weight crimping that toothpick neck... at least the amd housefire heatrays won't further cook the egg under said foil hat.

    Since nVidia just barely has recently gotten a few 680's and 670's in stock on the shelves, how pray tell, would they produce a gigantic 7 billion transistor chip that it appears no forge, even the one in Asgard, could possibly currently have produced on time for any launch, up to and perhaps past even today ?

    See that's what I find so interesting. Forget reality, the Charlie D semi accurate brain fart smell is a fine delicacy for so many, that they will never stop inhaling.

    Wow.

    I'll ask again - at what price exactly was the 680 "midrange" chip supposed to land at ? Recall the GTX580 was still $499+ when amd launched - let's just say since nVidia was holding back according to the 20lbs of tinfoil you guys have lofted, they could have released GTX680 midrange when their GTX580 was still $499+ - right when AMD launched... so what price exactly was GTX680 supposed to be, and where would that put the rest of the lineups on down the price alley ?

    Has one of you wanderers EVER contemplated that ? Where are you going to put the card lineups with GTX680 at the $250-$299 midrange in January ? Heck ... even right now, you absolute geniuses ?
  • natsume - Sunday, July 22, 2012 - link

    For that price, I prefer rather the Sapphire HD 7970 Toxic 6GB @ 1200Mhz
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    Currently unavailable it appears.

    And amd fan boys have told us the 7970 overclocks so well (to 1300mhz, they claim), so who cares.

    Toxic starts at 1100, and no amd fan boy would admit the run of the mill 7970 can't do that out of the box, as it's all we've heard now since January.

    It's nice seeing 6GB on a card though that cannot use even 3GB and maintain a playable frame rate at any resolution or settings, including 100+ Skyrim mods at once attempts.
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    Sad how it loses so often to a reference GTX680 in 1920 and at triple monitor resolutions.

    http://www.overclockersclub.com/reviews/sapphire__...
  • Sabresiberian - Sunday, July 22, 2012 - link

    One good reason not to have it is the fact that software overclocking can sometimes be rather wonky. I can see Nvidia erring on the cautious side to protect their customers from untidy programs.

    EVGA is a company I want to love, but they are, in my opinion, one that "almost" goes the extra mile. This card is a good example, I think. Their customers expressed a desire for unlocked voltage and 4GB cards (or "more than 2GB"), and they made it for us.

    But they leave the little things out. Where do you go to find out what those little letters mean on the EVBot display? I'll tell you where I went - to this article. I looked in the EVBot manual, looked up the manual online to see if it was updated - it wasn't; scoured the website and forums, and nowhere could I find a breakdown of what the list of voltage settings meant from EVGA!

    I'm not regretting my purchase of this card; it is a very nice piece of hardware. It just doesn't have the 100% commitment to it a piece of hardware like this should.

    But then, EVGA, in my opinion, does at least as good as anybody, in my opinion. MSI is an excellent company, but they released their Lightning that was supposed to be over-voltable without a way to do it. Asus makes some of the best stuff in the business - if their manufacturing doesn't bungle the job and leave film that needs to be removed between heatsinks and what they should be attached to.

    Cards like this are necessarily problematic. To make them worth their money in a strict results sense, EVGA would have to guarantee they overclock to something like 1400MHz. If they bin to that strict of a standard, why don't they just factory overclock to 1400 to begin with?

    And, what's going to be the cost of a chip guaranteed to overclock that high? I don't know; I don't know what EVGA's current standards are for a "binning for the Classified" pass, but my guess is it would drive the price up, so that cost value target will be missed again.

    No, you can judge these cards strictly by value for yourself, that's quite a reasonable thing to do, but to be fair you must understand that some people are interested in getting value from something other than better frame rates in the games they are playing. For this card, that means the hours spent overclocking - not just the results, the results are almost beside the point, but the time spent itself. In the OC world that often means people will be disappointed in the final results, and it's too bad companies can't guarantee better success - but if they could, really what would be the point for the hard-core overclocker? They would be running a fixed race, and for people like that it would make the race not worth running.

    These cards aren't meant for the general-population overclocker that wants a guaranteed more "bang for the buck" out of his purchase. Great OCing CPUs like Nehalem and Sandy Bridge bring a lot of people into the overclocking world that expect to get great results easily, that don't understand the game it is for those who are actually responsible for discovering those great overclocking items, and that kind of person talks down a card like this.

    Bottom line - if you want a GTX 680 with a guaranteed value equivalent to a stock card, then don't buy this card! It's no more meant for you than a Mack truck is meant to be a family car. However, if you are a serious overclocker that likes to tinker and wants the best starting point, this may be exactly what you want.

    ;)
  • Oxford Guy - Sunday, July 22, 2012 - link

    Nvidia wasn't happy with the partners' designs, eh? Oh please. We all remember the GTX 480. That was Nvidia's doing, including the reference card and cooler. Their partners, the ones who didn't use the awful reference design, did Nvidia a favor by putting three fans on it and such.

    Then there's the lack of mention of Big Kepler on the first page of this review, even though it's very important for framing since this card is being presented as "monstrous". It's not so impressive when compared to Big Kepler.

    And there's the lack of mention that the regular 680's cooler doesn't use a vapor chamber like the previous generation card (580). That's not the 680 being a "jack of all trades and a master of none". That's Nvidia making an inferior cooler in comparison with the previous generation.
  • CeriseCogburn - Tuesday, July 24, 2012 - link

    I, for one, find the 3rd to the last paragraph of the 1st review page a sad joke.

    Let's take this sentence for instance, and keep in mind the nVidia reference cooler does everything better than the amd reference:
    " Even just replacing the cooler while maintaining the reference board – what we call a semi-custom card – can have a big impact on noise, temperatures, and can improve overclocking. "

    One wonders why amd epic failure in comparison never gets that kind of treatment.

    If nVidia doesn't find that sentence I mentioned a ridiculous insult, I'd be surprised, because just before that, they got treated to this one: " NVIDIA’s reference design is a jack of all trades but master of none "

    I guess I wouldn't mind one bit if the statements were accompanied by flat out remarks that despite the attitude presented, amd's mock up is a freaking embarrassingly hot and loud disaster in every metric of comparison...

    I do wonder where all these people store all their mind bending twisted hate for nVidia, I really do.

    The 480 cooler was awesome because one could simply remove the gpu sink and still have a metal covered front of the pcb card and thus a better gpu HS would solve OC limits, which were already 10-15% faster than 5870 at stock and gaining more from OC than the 5870.

    Speaking of that, we're supposed to still love the 5870, this site claimed the 5850 that lost to 480 and 470 was the best card to buy, and to this day our amd fans proclaim the 5870 a king, compare it to their new best bang 6870 and 6850 that were derided for lack of performance when they came out, and now 6870 CF is some wonderkin for the fan boys.

    I'm pretty sick of it. nVidia spanked the 5000 series with their 400 series, then slammed the GTX460 down their throats to boot - the card all amd fans never mention now - pretending it never existed and still doesn't exist...
    It's amazing to me. All the blabbing stupid praise about amd cards and either don't mention nVidia cards or just cut them down and attack, since amd always loses, that must be why.
  • Oxford Guy - Tuesday, July 24, 2012 - link

    Nvidia cheaped out and didn't use a vapor chamber for the 680 as it did with the 580. AMD is irrelevant to that fact.

    The GF100 has far worse performance per watt, according to techpowerup's calculations than anything AMD released in 40nm. The 480 was very hot and very loud, regardless of whether AMD even existed in the market.

    AMD may have a history of using loud inefficient cooling, but my beef with the article is that Nvidia developed a more efficient cooler (580's vapor chamber) and then didn't bother to use it for the 680, probably to save a little money.
  • CeriseCogburn - Wednesday, July 25, 2012 - link

    The 680 is cooler and quieter and higher performing all at the same time than anything we've seen in a long time, hence "your beef" is a big pile of STUPID dung, and you should know it, but of course, idiocy never ends here with people like you.

    Let me put it another way for the faux educated OWS corporate "profit watcher" jack***: " It DOESN'T NEED A VAPOR CHAMBER YOU M*R*N ! "

    Hopefully that penetrates the inbred "Oxford" stupidity.

    Thank so much for being such a doof. I really appreciate it. I love encountering complete stupidity and utter idiocy all the time.
