Meet The EVGA GeForce GTX 1070 Ti FTW2: Precision XOC

Because many iCX features are neutered without it, Precision XOC is nigh-mandatory for an iCX graphics card, essentially acting as a software extension of the hardware; this is not necessarily a negative. What makes this particularly relevant today is how the GTX 1070 Ti was positioned with respect to standardized shipping clocks and overclocking prowess: several board partners ended up using their in-house overclocking utilities to advertise the higher potential performance of their GTX 1070 Ti cards.

Naturally, EVGA incorporated Precision XOC as their semi-manual overclocking solution, developing a new single-step feature called “EVGA Precision XOC Scanner,” or simply “XOC Scanner,” exclusive to the GTX 1070 Ti. Rather than putting OC ScannerX through its usual manual workflow, XOC Scanner runs a series of automated OC ScannerX preset tests and applies the resulting overclock. Simply starting Precision XOC for the first time will bring up a prompt to auto-overclock your GTX 1070 Ti and, after asking for the serial number, offer three different choices.

However, the in-application descriptions of the three options don’t make the end results very clear. Running XOC Scanner on our GTX 1070 Ti FTW2, the Quick Test results in an offset overclock, while the Full Test results in an overclocked voltage/frequency curve, without adjusting any other element in either case. Selecting Manual sends you to the regular Precision XOC application. But once there, running a Basic scan brings up the same three options again, while a Manual scan goes straight into voltage-frequency curve testing.

In effect, the XOC Scanner is applying the default OC Scanner settings almost like a preset, but changing the settings actually does change the parameters of the Quick and Full Test options.

In the overclocking section, we will see how these presets fare in benchmarks.

As discussed earlier, the myriad other features of the iCX system can be controlled from Precision XOC as well, namely the configuration of LED colors and behavior.

On top of that, the LED Sync tool can coordinate lighting across EVGA graphics cards, closed-loop coolers, and cases. Capping it off is asynchronous fan control via separate sliders and fan curves.


LED Sync in action, posted by EVGA Product Manager Jacob Freeman (EVGA Forums)

But because the GTX 1070 Ti FTW2 iCX is the one EVGA GTX 1070 Ti card where eschewing Precision XOC means paying a premium for nothing, it is safe to assume that Precision XOC will be used. In that case, the auto-prompting XOC Scanner may as well be considered an inherent part of the GTX 1070 Ti FTW2 iCX package, which again is not necessarily a negative. As for how XOC Scanner works in practice, the process is rather hands-off: you click a choice, a few tests run, and once it’s over your card runs faster. That sounds like exactly what was intended.


  • DnaAngel - Tuesday, May 22, 2018 - link

    I wouldn't hold your breath if you think a simple die shrink of the same architecture is going to be "a decent bump in performance". It will be slight (~10%), as typical refreshes are.

    To get a "decent bump in performance" (>20%) you have to wait till the next architecture generation. Navi/Volta in this case.
  • DnaAngel - Monday, May 21, 2018 - link

    AMD has Navi. Yea, and? Vega was supposed to be the "Pascal killer," and yet a 475 dollar 1070 Ti matches or outperforms their 800 dollar Vega 64 at 1080p/1440p in most titles LOL.

    Navi will just be playing catchup to Volta anyway.
  • Hixbot - Thursday, February 1, 2018 - link

    Soo.. what you're saying is mining is the problem. OK got it.
  • JoeyJoJo123 - Monday, February 5, 2018 - link

    Sure, if you want to be obtuse about it. I clearly explained that miner demand is just _one_ of many facets of the GPU pricing issue. Miner demand is no different from gamer demand, at least in terms of how it affects supply and therefore pricing. 1 GPU bought for mining or gaming is 1 less GPU in circulation, and when there's a low enough amount of GPUs on the market, the price is going to go up.

    And like I already explained, supply could be "fixed" by ordering many more cards to be produced, but because the demand isn't necessarily stable, AIB partners are hesitant to supply more on the market, because they'll be the ones on the losing end when they're stuck on supply that won't sell, should alternative coins tank in price.
  • Tetracycloide - Friday, February 2, 2018 - link

    TLDR of your 3 point explanation is simply "Miners." All the things you've said are just extra details of how "Miners" is the explanation.
  • JoeyJoJo123 - Monday, February 5, 2018 - link

    Nice reading comprehension. It's a supply side issue that won't be fixed since suppliers aren't confident in the sustainability of demand. And because of that, the supply side won't be burned out (since they're running a business and generating excess supply has a large risk associated with it) and would rather let the GPU pricing handle itself in a low supply/high demand market.

    There's also the GPU scalpers and 3rd party seller market making the pricing worse than it is, since they're draining supply even though they're not the end-users demanding the product. (And these guys are the ones marking up the GPU prices, not Newegg, Amazon, or most brick and mortar retailers.)

    Look, I hate memecoin miners, too. They're wasting a shitload of energy to mine fictitious and worthless money to then put it on a highly volatile stock market like rollercoaster simulator, and they like to brag about how if every pleb had invested in memebucks they'd be "millionaires" when the fact of any volatile market is that very few are big winners, and most are incurring losses.

    But the problem is more than just the miners themselves. There's supply side that won't ramp up production. There's 3rd party market and scalpers selling the GPUs at exorbitant prices, and even memory manufacturers like Samsung playing a part due to rising price of GDDR5(x), which increases the BOM cost for any GPU made.

    If you had even a single brain cell in your head you would've understood from my post that "Oh, yeah, miners are just one piece of the problem. I get ya."
  • mapesdhs - Tuesday, February 6, 2018 - link

    I gave up trying to convey the nuance about these issues last week. Some people just want to believe in simplistic answers so they can blame a certain group and vocally moan, even though they're often part of the problem. There are other factors as well, such as game devs not making games more visually complicated anymore, review hype/focus on high frequency gaming & VR (driven by gamers playing mostly FPS titles and others that fit this niche), and just the basic nature of current 3D tech being a natural fit for mining algorithms (shaders, etc.). In theory there is a strong market opportunity for a completely new approach to 3D gfx, a different arch, a proper GPU (modern cards are not GPUs; their visual abilities are literally the lowest priority), because atm the cards AMD/NVIDIA are producing are far more lucratively targeted at Enterprise and AI, not gamers; the latter just get the scraps off the table now, something The Good Old Gamer nicely explained a few months ago with a pertinent clip from NVIDIA:

    https://www.youtube.com/watch?v=PkeKx-L_E-o

    When was the last time a card review article even mentioned new visual features for 3D effects? It's been many years. Gamers are not playing games that need new features, they're pushing for high refresh displays (a shift enhanced by freesync/gsync adoption) so game devs aren't adding new features as that would make launch reviews look bad (we'll never have another Crysis in that way again), and meanwhile the products themselves are mathematically ideal for crypto mining tasks, a problem which makes (as the above chap says) both the AIBs and AMD/NVIDIA very reluctant to increase supply as that would create a huge supply glut once the mining craze shifts and the current cards get dumped, damaging newer product lines (miners have no brand loyalty, and AIBs can't risk the unsold stock potential, though in the meantime they'll happily sell to miners directly).

    I notice toms has several articles about mining atm. I hope AT doesn't follow suit. I didn't read the articles, but I bet they don't cover the total environmental cost re the massive e-waste generated by mining conglomerates. I'd rather tech sites that say they care about their readers didn't encourage this mining craze, but then it's a bandwagon many want to jump on while the rewards appear attractive. Ironically, at least LTT is doing a piece intended to show just how much of a con some of these mining setups can be.
  • boozed - Wednesday, January 31, 2018 - link

    Magic beans
  • StevoLincolnite - Wednesday, January 31, 2018 - link

    I bought my RX 580 for $400AUD almost a year ago. It actually hit $700 AUD at one point. Was nuts.

    Normally I would buy two... But this is the first time I have gone single GPU since the Radeon x800 days where you needed a master GPU.
    The costs are just out of control. Glad I am only running a 1440P display so I don't need super high-end hardware.
  • IGTrading - Wednesday, January 31, 2018 - link

    What I find the most interesting is that AMD Fury X absolutely destroys the GeForce 980 in absolutely all benches :) .

    I guess all those nVIDIA buyers feel swindled now ....
