Power, Temperature, & Noise

As expected given the NVIDIA-standardized GTX 1070 Ti clocks, the standard benchmarks for the GTX 1070 Ti FTW2 were rather humdrum in terms of raw performance, sticking closely to the reference Founders Edition card. Nevertheless, the FTW2's custom features certainly come into play for power, temperature, and noise, factors not to be underrated, especially against the typically louder and hotter reference blower design. Modern GPU boost technology will typically exploit a board's better power and temperature characteristics for longer and higher boosts, but with mandated reference clockspeeds the GTX 1070 Ti FTW2 simply operates cooler and quieter.

GeForce Video Card Average Clockspeeds
Game                       EVGA GTX 1070 Ti FTW2   GTX 1070 Ti   GTX 1070
Max Boost Clock            1898MHz                 1898MHz       1898MHz
Battlefield 1              1860MHz                 1826MHz       1797MHz
Ashes: Escalation          1850MHz                 1838MHz       1796MHz
DOOM                       1847MHz                 1856MHz       1780MHz
Ghost Recon Wildlands      1860MHz                 1840MHz       1807MHz
Dawn of War III            1860MHz                 1848MHz       1807MHz
Deus Ex: Mankind Divided   1855MHz                 1860MHz       1803MHz
Grand Theft Auto V         1862MHz                 1865MHz       1839MHz
F1 2016                    1860MHz                 1840MHz       1825MHz
Total War: Warhammer       1855MHz                 1832MHz       1785MHz

Though the GTX 1070 Ti FTW2 does appear to boost a little higher and more consistently, out-of-the-box performance changes little compared to the Founders Edition. For the majority of the standard benchmarks, the difference was within the margin of error.
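For readers curious how per-game averages like those in the table are produced, a minimal sketch of the post-processing step is below. The sample values and the warmup-discard heuristic are illustrative assumptions, not our actual logging methodology; in practice the clocks would be polled from a tool such as nvidia-smi during the benchmark run.

```python
import statistics

def average_clock(samples_mhz, warmup=5):
    """Average the sustained clockspeed, discarding the initial samples
    taken before boost behavior settles."""
    settled = samples_mhz[warmup:]
    return round(statistics.mean(settled))

# Illustrative once-per-second samples from a benchmark run (not measured data)
samples = [1898, 1898, 1885, 1873, 1860, 1860, 1847, 1860, 1860, 1847]
print(average_clock(samples))
```

Discarding the first few samples matters because GPU Boost opens at its maximum bin and settles downward as the card heats up, which is exactly why the "Max Boost Clock" row sits above every in-game average.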

As for power, the GTX 1070 Ti FTW2's extra capabilities are rather muted at stock. At idle, the board turns off its fans below a temperature threshold – 60 degrees on the default master BIOS – and technically speaking, the LEDs pull some power, but total system consumption rarely reflects such small differences and adjustments.

Idle Power Consumption

While the stated TDP remains 180W, the GTX 1070 Ti FTW2 does come with dual 8-pin PCIe power connectors over the Founders Edition's single 8-pin. With the default 100% power limit, this extra power delivery capacity goes largely unused in most applications, and for Battlefield 1 system consumption only ends up around 8W higher. But a power virus like FurMark has far fewer qualms about taking as much as it can, with the GTX 1070 Ti FTW2 immediately pulling a little extra, in the region of 30W at the wall.
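As a rough sanity check on that headroom, the arithmetic works out as follows. The connector ceilings are the standard PCIe delivery ratings; the card's actual power limit is whatever the BIOS configures, so these figures are theoretical upper bounds rather than EVGA's enforced limits.

```python
# Standard PCIe power delivery ceilings
PCIE_SLOT_W = 75      # motherboard slot
EIGHT_PIN_W = 150     # each 8-pin PCIe connector

capacity = PCIE_SLOT_W + 2 * EIGHT_PIN_W   # FTW2: slot + dual 8-pin
tdp = 180                                  # stated TDP

print(capacity)        # 375
print(capacity - tdp)  # 195
```

In other words, the board could theoretically draw more than double its TDP before running out of connector capacity, which is why the dual 8-pin layout only pays off once the power limit slider comes into play.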

Load Power Consumption - Battlefield 1

Load Power Consumption - FurMark

Like most high quality custom boards, the GTX 1070 Ti FTW2 can maintain a typical idling temperature with passive cooling. Under load, the fans kick in and the card settles just below its default 72 degree throttle point, even while running FurMark.

Idle GPU Temperature

Load GPU Temperature - Battlefield 1

Load GPU Temperature - FurMark

EVGA GTX 1070 Ti FTW2 iCX Readings
  Battlefield 1 (1440p) FurMark
GPU Temperature 68°C 70°C
iCX GPU2 Temp. 65°C 70°C
iCX MEM1 Temp. 50°C 53°C
iCX MEM2 Temp. 59°C 64°C
iCX MEM3 Temp. 70°C 78°C
iCX PWR1 Temp. 65°C 70°C
iCX PWR2 Temp. 66°C 72°C
iCX PWR3 Temp. 66°C 72°C
iCX PWR4 Temp. 66°C 72°C
iCX PWR5 Temp. 68°C 75°C
Left Fan Speed (GPU) 962 RPM 1191 RPM
Right Fan Speed (PWM/MEM) 1066 RPM 1320 RPM

At idle, of course, the card's fans stay off entirely. Under load, the cooling design proves capable enough with the fans at relatively low speeds, resulting in a rather quiet profile. The two fans ramp up asynchronously, and both unsurprisingly ramp up higher in FurMark, which pushes higher temperatures across all the iCX sensors. The general idea behind asynchronous fans can be seen in how the right fan speeds up in response to higher memory and PWM temperatures.
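The asynchronous behavior can be sketched as each fan tracking the hottest sensor in its own zone. To be clear, the zone groupings, fan-curve breakpoints, and RPM mapping below are illustrative assumptions, not EVGA's actual iCX firmware logic:

```python
def fan_rpm(zone_temps_c, zero_rpm_below=60, min_rpm=800, max_rpm=2200,
            ramp_start=60, ramp_end=90):
    """Map the hottest sensor in a fan's zone to a target RPM.
    Below the zero-RPM threshold the fan stays off entirely."""
    hottest = max(zone_temps_c)
    if hottest < zero_rpm_below:
        return 0
    # Linear ramp between ramp_start and ramp_end
    frac = min(1.0, (hottest - ramp_start) / (ramp_end - ramp_start))
    return round(min_rpm + frac * (max_rpm - min_rpm))

# Left fan follows the GPU sensors; right fan follows memory/PWM sensors
left = fan_rpm([68, 65])         # GPU, GPU2 under Battlefield 1
right = fan_rpm([70, 68, 66])    # hotter MEM/PWR zone -> higher RPM
print(left, right)
```

The key property this sketch captures is that the two fans need not agree: a hot spot on the memory or VRM side pushes only the right fan harder, matching the higher right-fan RPMs in the table above.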

Idle Noise Levels

Load Noise Levels - Battlefield 1

Load Noise Levels - FurMark

This kind of power, temperature, and noise profile will suit some just fine: a quiet card with purposeful temperature LEDs, all without user intervention. Others will immediately notice the unutilized headroom. With XOC Scanner, EVGA looks to court the latter with a single-step, automatically applied overclock. And on that note, we move on to the overclocking…

47 Comments

  • Dr. Swag - Wednesday, January 31, 2018 - link

    Fury x always was faster than a 980
  • DnaAngel - Tuesday, May 22, 2018 - link

    Heck my old R9 390 would match or outperform the 980 in a good amount of titles. Which was always comical, as the 390 was supposed to compete with the 970, which it did at launch with launch drivers, but after a few months of driver improvements, it was going toe to toe with not the 970, but the 980 lol.
  • masouth - Wednesday, January 31, 2018 - link

    Either I'm just missing your sarcasm or you are dredging up 2.5 year old GPU news regarding the Fury X being faster than a GPU released 9 months before it while simultaneously ignoring that the GTX 980ti which was released 2 weeks before the Fury X was faster than the Fury X?

    Sounds like another day in the world of GPUs. Anybody that thinks buying the top tier is going to stay there for much more than 6-12 months either caught the market at the absolute perfect time or is basically delusional.
  • masouth - Wednesday, January 31, 2018 - link

    and yes I realize you are commenting on the charts but how data ends up on charts seems like old news as well.
  • Makaveli - Wednesday, January 31, 2018 - link

    So winning by 3-5 fps in a game is destroying now??

    nice troll bait.
  • CiccioB - Thursday, February 1, 2018 - link

    You made my day... comparing the Fury X with the 980 to make a point... wait.. the new flagship Vega64 is faster than the 1050Ti!!!!!!!!!
    Yes, all 1050Ti owners are crying out for that result!
    AMD fanboys can really become ridiculous
  • BurntMyBacon - Thursday, February 1, 2018 - link

    I completely agree that the Fury X vs 980 is a bad comparison and I have no intention of defending what should not be defended.

    Regarding ridiculousness however, the 980 is a single SKU below the proper competition (980Ti) and still in the same high end classification. While not the case here, there have often in the past been price disparities between the top end SKUs from ATi and nVidia that warrant such a comparison.

    1080Ti > 1080 > 1070Ti > 1070 > 1060 > 1050Ti
    N/A > Vega64 > Vega56 > N/A > 580 > 570

    On the other hand, your Vega64 vs 1050Ti comparison is pitting AMDs top GPU against a card 5 SKUs below nVidia's top GPU (4 SKUs below the proper competition) and classified lower mid range at best. Then you proceed to suggest these comparisons are somehow similar and label IGTrading as a "ridiculous" AMD fanboy. IGTrading's comparison was certainly the wrong comparison, but who's more deserving of the "ridiculous fanboy" label?
  • DnaAngel - Tuesday, May 22, 2018 - link

    That analogy doesn't really work. Maybe in 4K, but only a small fraction of users are at 4K right now. The majority are in 1080/1440p and at those resolutions, the 1070Ti is not only trading blows, but sometimes outperforming not the Vega 56 it was meant to compete against, but the 800 dollar Vega 64.
  • Jad77 - Wednesday, January 31, 2018 - link

    ...the Lights and Sensors and the weirdest graphics card cosmetics I've ever seen. Why didn't EVGA use the new 1080 ti shrouds? Whatever, I've gone without Pascal this long I'm waiting for Volta.
  • Gunbuster - Wednesday, January 31, 2018 - link

    I'm confused on that as well. It's got a half-hearted Tron sorta thing going on with the light diffusers, but then has the steampunk rivet looking screw things, then the dollar store electronics looking badges laid over and blocking two of the lights...
