Meet the EVGA GeForce GTX 1070 Ti FTW2: iCX

First things first: there are no surprises with the specifications and physical design of EVGA’s GeForce GTX 1070 Ti FTW2 iCX. But as the top card in EVGA’s GTX 1070 Ti stack, the GTX 1070 Ti FTW2 is the most feature-packed model. It is also the only card of the bunch with RGB LEDs, though the lighting serves a practical purpose in addition to decoration.

Nevertheless, the centerpiece of the GTX 1070 Ti FTW2 is the iCX thermal sensing and cooling system, first introduced with EVGA’s GTX 1080, 1070, and 1060 iCX series. An array of onboard thermal sensors and microcontrollers dots the PCB, allowing for enhanced monitoring and regionally temperature-adaptive behavior. All of this can be displayed, logged, and adjusted through EVGA’s Precision XOC utility. Initially only Precision XOC could read the iCX thermal sensor data, but at this point GPU-Z and other standard monitoring tools can display and log that data as well.
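For those who want to dig through the logged sensor data outside of Precision XOC, a quick script will do. The sketch below assumes a GPU-Z style CSV log; the file name and column headers are placeholders, as the exact names vary by tool and version, so adjust them to match your actual log.

```python
# Minimal sketch: pull peak temperatures out of a GPU-Z style CSV log.
# The file name and column headers below are assumptions; match them to
# whatever your monitoring tool actually writes.
import csv

SENSOR_COLUMNS = [
    "GPU Temperature [C]",      # assumed header names
    "Memory Temperature [C]",
    "VRM Temperature [C]",
]

peaks = {name: None for name in SENSOR_COLUMNS}
with open("gpu-z-log.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        for name in SENSOR_COLUMNS:
            value = (row.get(name) or "").strip()
            if value:
                temp = float(value)
                if peaks[name] is None or temp > peaks[name]:
                    peaks[name] = temp

for name, peak in peaks.items():
    print(f"{name}: peak {peak}")
```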

So while the dual fan open air construction looks very similar to EVGA’s ACX 3.0 cooler, the fans do not operate the same way. iCX features asynchronous fan control, dictated by the nine thermal sensors distributed among the GPU, PWM/VRMs, and memory. The GPU fan responds to the reported GPU temperature, while the VRM/memory fan responds to VRM/memory temperatures. While modern GPUs all have built-in temperature sensors – as do most high-end VRM setups, for that matter – it's very rare for a card's fan speed to be tied to anything besides the GPU, despite the fact that VRMs in graphics cards and motherboards alike can get quite toasty on their own.
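To illustrate the general idea – and this is only a rough sketch, not EVGA's firmware logic or curve values – asynchronous fan control boils down to each fan interpolating its duty cycle from its own temperature domain:

```python
# Illustrative sketch of asynchronous fan control: two fans, each following
# its own temperature domain. All curve points are hypothetical values,
# not EVGA's actual fan curves.

def fan_duty(temp_c, curve):
    """Linearly interpolate a fan duty cycle (%) from a (temp_c, duty) curve."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

GPU_CURVE = [(40, 0), (60, 35), (75, 60), (85, 100)]        # hypothetical
VRM_MEM_CURVE = [(50, 0), (70, 40), (90, 75), (105, 100)]   # hypothetical

# Each fan reacts only to its own sensor group:
gpu_fan = fan_duty(68, GPU_CURVE)         # driven by the GPU diode
vrm_fan = fan_duty(81, VRM_MEM_CURVE)     # driven by the VRM/memory sensors
print(f"GPU fan: {gpu_fan:.0f}%, VRM/memory fan: {vrm_fan:.0f}%")
```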

EVGA, for that matter, is no stranger to overheating VRMs. In 2016, their GeForce 10-series FTW models with the ACX 3.0 cooler began suffering from overheating VRMs, due precisely to the fact that the cooling system focused on GPU temperatures while ignoring VRM temperatures. In the end, EVGA released an updated VBIOS and offered free thermal mod kits to those affected. But the episode also spurred the company into developing the iCX family of cards, where not only was VRM cooling beefed up with more passive components (better heat transfer from the FETs to the heat dissipation surfaces), but a fan was also made to respond to VRM temperatures. As a result the iCX series can be seen as the pendulum swinging the other way – for most users it's probably a bit overkill, but you can't accuse EVGA of not learning their lesson or taking the matter more seriously these days.

Within Precision XOC, the asynchronous fans may be controlled separately and can be assigned separate fan curves for more granular control. The idea is that intelligent fan speeds can reduce excess fan noise and improve hotspot cooling. Combined with the pre-existing semi-passive zero fan speed idling, iCX cards – and users – can tune fan operation for noise and fan lifespan. In theory, this could also be used to trade fan power for extra GPU power on TDP-limited NVIDIA-approved designs, something we saw with EVGA’s GTX 970 FTW ACX 2.0, though it would mean less for the GTX 1070 Ti FTW2 and its 180W TDP.
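The zero fan speed idle behavior itself amounts to a simple hysteresis gate sitting in front of whatever curve the user has set. A minimal sketch, with hypothetical spin-up and spin-down thresholds rather than EVGA's actual values:

```python
# Sketch of semi-passive (zero fan speed) idle with hysteresis.
# Thresholds are hypothetical; the real values live in the card's
# firmware and Precision XOC settings.
class SemiPassiveFan:
    def __init__(self, spin_up_c=60, spin_down_c=50):
        self.spin_up_c = spin_up_c      # assumed spin-up temperature
        self.spin_down_c = spin_down_c  # assumed spin-down temperature
        self.running = False

    def update(self, temp_c, curve_duty):
        # Stay off until the spin-up threshold, then follow the normal
        # curve until the temperature drops back below the spin-down point.
        if not self.running and temp_c >= self.spin_up_c:
            self.running = True
        elif self.running and temp_c <= self.spin_down_c:
            self.running = False
        return curve_duty if self.running else 0

fan = SemiPassiveFan()
for temp, duty in [(45, 30), (62, 35), (55, 30), (48, 25)]:
    print(temp, "C ->", fan.update(temp, duty), "% duty")
```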

The GPU, PWM, and memory thermal sensors also drive the “G”, “P”, and “M” RGB LEDs on the side of the card. By default, these three status LEDs change color according to temperature; the behavior can be customized to taste, or the LEDs simply set to different colors for aesthetic lighting purposes.
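Conceptually, the default behavior is just a lookup from temperature to color. A small sketch with hypothetical thresholds and colors – the real mapping is user-configurable in Precision XOC:

```python
# Sketch of a temperature-to-color mapping for the G/P/M status LEDs.
# Thresholds and colors are assumptions for illustration only.
def status_led_color(temp_c):
    if temp_c < 50:
        return (0, 0, 255)    # blue: cool
    if temp_c < 75:
        return (0, 255, 0)    # green: normal
    return (255, 0, 0)        # red: hot

for sensor, temp in {"G (GPU)": 62, "P (PWM)": 48, "M (Memory)": 79}.items():
    print(sensor, status_led_color(temp))
```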

Returning to the actual cooler, the GTX 1070 Ti FTW2 incorporates a form-fitted multi-part baseplate and backplate, with thermal pads in the appropriate areas. And as part of the iCX cooling solution, some of the plates have pin-like protrusions, or “pin fins”, that increase surface area and in turn improve heat dissipation. Additionally, the iCX heatsink has half-open and L-shaped fins, and the fins themselves have small holes; the openings are meant to improve airflow, while the L-shape is intended to increase surface contact.

Finally, as one last layer of protection, the iCX solution includes a safety fuse on the PCB, which protects the board against more damage should a component fail. So as a whole, iCX, or “Interactive Cooling Xtreme”, is much more focused on user-facing features, rather than a significant reworking of ACX 3.0’s pure cooling performance.

Elsewhere on the GTX 1070 Ti FTW2 board is its 10+2 power phase design, which is presumably a doubled 5-phase design as observed on the GTX 1080 Ti FTW2. Along the top, the board features SLI HB connectors as usual, and adjacent to the dual 8-pin PCIe power connectors is the small Dual BIOS toggle.

EVGA GTX 1070 Ti FTW2 BIOSes
                    Fan Curve     Zero Idle Fan Speed    Power Limit    Temperature Target
BIOS 1 (Default)    Standard      Yes                    120% (216W)    83C
BIOS 2              Aggressive    No                     130% (235W)    93C

Now a staple higher-end EVGA feature, Dual BIOS is exactly what it sounds like: two usable VBIOSes, selected via that small switch on the top of the card. The purpose of the Dual BIOS function is then two-fold: to offer a second configuration for less common use cases, and to provide a fallback so that further BIOSes can safely be flashed without rendering the card unbootable, which is handy if you're into BIOS modifications.

In the case of the FTW2, these BIOSes are fairly different, and as a result the BIOS selection can have a material impact on overclocking performance. The default, out-of-the-box BIOS, BIOS 1 (Master), runs the card as described thus far, with a typical low-noise fan curve, the standard 83C temperature target, a maximum power limit of 216W (120%), and zero fan speed idle enabled. BIOS 2 (Slave) is far more overclocking oriented, dropping zero fan speed idle while utilizing a more aggressive fan curve, and on the power/temperature side the card's temperature target is raised to the 93C maximum and, most importantly, the power limit is raised to 235W (130%).
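The power limit percentages are straightforward to sanity-check against the GTX 1070 Ti's 180W reference TDP:

```python
# Both BIOS power limits are percentages of the 180W reference TDP.
base_tdp_w = 180
print(base_tdp_w * 1.20)  # 216.0 W -> BIOS 1's 120% limit
print(base_tdp_w * 1.30)  # 234.0 W -> BIOS 2's 130% limit (reported as 235W, presumably rounded)
```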

It goes without saying then that BIOS 2 is meant to offer the better out-of-the-box overclocking experience – in particular, the increased power limit can't be achieved in software – but it still has to respect NVIDIA's voltage limitations, not to mention the card's power limitations. Speaking of which, considering that the GTX 1070 Ti is presumably still unfriendly to actual voltage increases, the 375W nominally available from the dual 8-pin connectors and PCIe slot is overkill for either BIOS.
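For reference, the nominal power delivery budget works out as follows (150W per 8-pin PCIe connector plus 75W from the slot):

```python
# Nominal input power budget vs. the higher BIOS 2 power limit.
eight_pin_w, slot_w, bios2_limit_w = 150, 75, 235
total_w = 2 * eight_pin_w + slot_w
print(total_w, total_w - bios2_limit_w)  # 375 W available, 140 W of headroom
```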

As far as display output goes, the GTX 1070 Ti FTW2 is fairly straightforward with 3 DisplayPorts, 1 HDMI, and 1 DVI-D, covering the usual bases of TV and VR with HDMI, as well as DVI-only monitors and legacy uses with DVI-D.

Comments

  • DnaAngel - Tuesday, May 22, 2018 - link

    I wouldn't hold your breath if you think a simple die shrink of the same architecture is going to be "a decent bump in performance". It will be slight (~10%), as typical refreshes are.

    To get a "decent bump in performance" (>20%) you have to wait till the next architecture generation. Navi/Volta in this case.
  • DnaAngel - Monday, May 21, 2018 - link

    AMD has Navi. Yea, and? Vega was supposed to be the "Pascal killer" and yet a 475 dollar 1070Ti matches or outperforms their 800 dollar Vega 64 at 1080/1440p in most titles LOL.

    Navi will just be playing catchup to Volta anyway.
  • Hixbot - Thursday, February 1, 2018 - link

    Soo.. what you're saying is mining is the problem. OK got it.
  • JoeyJoJo123 - Monday, February 5, 2018 - link

    Sure, if you want to be an obtuse retard about it. I clearly explained that miner demand is merely just _one_ of many facets of the GPU pricing issue. Miner demand is no different from Gamer demand, at least in terms of how it affects supply and therefore pricing. 1 GPU bought for mining or gaming is 1 less GPU in circulation, and when there's a low enough amount of GPUs on the market, the price is going to go up.

    And like I already explained, supply could be "fixed" by ordering many more cards to be produced, but because the demand isn't necessarily stable, AIB partners are hesitant to supply more on the market, because they'll be the ones on the losing end when they're stuck on supply that won't sell, should alternative coins tank in price.
  • Tetracycloide - Friday, February 2, 2018 - link

    TLDR of your 3 point explanation is simply "Miners." All the things you've said are just extra details of how "Miners" is the explanation.
  • JoeyJoJo123 - Monday, February 5, 2018 - link

    Nice reading comprehension. It's a supply side issue that won't be fixed since suppliers aren't confident in the sustainability of demand. And because of that, the supply side won't be burned out (since they're running a business and generating excess supply has a large risk associated with it) and would rather let the GPU pricing handle itself in a low supply/high demand market.

    There's also the GPU scalpers and 3rd party seller market making the pricing worse than they are, since they're draining supply even though they're not the end-users demanding the product. (And these guys are the ones marking up the GPU prices, not Newegg, Amazon, or most brick and mortar retailers.)

    Look, I hate memecoin miners, too. They're wasting a shitload of energy to mine fictitious and worthless money to then put it on a highly volatile stock market like rollercoaster simulator, and they like to brag about how if every pleb had invested in memebucks they'd be "millionaires" when the fact of any volatile market is that very few are big winners, and most are incurring losses.

    But the problem is more than just the miners themselves. There's supply side that won't ramp up production. There's 3rd party market and scalpers selling the GPUs at exorbitant prices, and even memory manufacturers like Samsung playing a part due to rising price of GDDR5(x), which increases the BOM cost for any GPU made.

    If you had even a single brain cell in your head you would've understood from my post that "Oh, yeah, miners are just one piece of the problem. I get ya."
  • mapesdhs - Tuesday, February 6, 2018 - link

    I gave up trying to convey the nuance about these issues last week. Some people just want to believe in simplistic answers so they can blame a certain group and vocally moan, even though they're often part of the problem. There are other factors as well, such as game devs not making games more visually complicated anymore, review hype/focus on high frequency gaming & VR (driven by gamers playing mostly FPS titles and others that fit this niche), and just the basic nature of current 3D tech being a natural fit for mining algorithms (shaders, etc.). In theory there is a strong market opportunity for a completely new approach to 3D gfx, a different arch, a proper GPU (modern cards are not GPUs; their visual abilities are literally the lowest priority), because atm the cards AMD/NVIDIA are producing are far more lucratively targeted at Enterprise and AI, not gamers; the latter just get the scraps off the table now, something The Good Old Gamer nicely explained a few months ago with a pertinent clip from NVIDIA:

    https://www.youtube.com/watch?v=PkeKx-L_E-o

    When was the last time a card review article even mentioned new visual features for 3D effects? It's been many years. Gamers are not playing games that need new features, they're pushing for high refresh displays (a shift enhanced by freesync/gsync adoption) so game devs aren't adding new features as that would make launch reviews look bad (we'll never have another Crysis in that way again), and meanwhile the products themselves are mathematically ideal for crypto mining tasks, a problem which makes (as the above chap says) both the AIBs and AMD/NVIDIA very reluctant to increase supply as that would create a huge supply glut once the mining craze shifts and the current cards get dumped, damaging newer product lines (miners have no brand loyalty, and AIBs can't risk the unsold stock potential, though in the meantime they'll happily sell to miners directly).

    I notice toms has several articles about mining atm. I hope AT doesn't follow suit. I didn't read the articles, but I bet they don't cover the total environmental cost re the massive e-waste generated by mining conglomerates. I'd rather tech sites that say they care about their readers didn't encourage this mining craze, but then it's a bandwagon many want to jump on while the rewards appear attractive. Ironically, at least LLT is doing a piece intended to show just how much of a con some of these mining setups can be.
  • boozed - Wednesday, January 31, 2018 - link

    Magic beans
  • StevoLincolnite - Wednesday, January 31, 2018 - link

    I bought my RX 580 for $400AUD almost a year ago. It actually hit $700 AUD at one point. Was nuts.

    Normally I would buy two... But this is the first time I have gone single GPU since the Radeon x800 days where you needed a master GPU.
    The costs are just out of control. Glad I am only running a 1440P display so I don't need super high-end hardware.
  • IGTrading - Wednesday, January 31, 2018 - link

    What I find the most interesting is that AMD Fury X absolutely destroys the GeForce 980 in absolutely all benches :) .

    I guess all those nVIDIA buyers feel swindled now ....
