Power, Temperature, & Noise

As expected given the NVIDIA-standardized GTX 1070 Ti clocks, the standard benchmarks for the GTX 1070 Ti FTW2 were rather humdrum in terms of raw performance, sticking closely to the reference Founders Edition card. Nevertheless, the FTW2's custom features certainly come into play for power, temperature, and noise, factors not to be underrated, especially compared against typically louder and hotter reference blower models. Modern GPU boost technology will generally take advantage of a board's better power and temperature characteristics for longer and higher boosts, but with mandated reference clockspeeds the GTX 1070 Ti FTW2 simply operates cooler and quieter.
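For intuition, boost behavior amounts to the card running at the highest clock bin that stays under all of its limits at once; more power and thermal headroom simply means fewer bins get shaved off. The toy model below is purely illustrative: the ~13MHz bin size is roughly how Pascal steps its clocks, but the linear scaling and every specific number here are assumptions, not NVIDIA's actual GPU Boost algorithm.

```python
# Toy model of a boost limiter: step down from the maximum boost bin until
# the (naively scaled) power and temperature readings fit under their limits.
# All numbers are illustrative; this is not NVIDIA's real GPU Boost logic.

def settled_clock(max_boost_mhz, base_mhz, power_w_at_max, power_limit_w,
                  temp_c_at_max, temp_target_c, bin_mhz=13):
    """Return the clock a card would settle at for the given readings."""
    clock = max_boost_mhz
    while clock > base_mhz:
        scale = clock / max_boost_mhz
        # Naive assumption: power and temperature scale linearly with clock.
        if (power_w_at_max * scale <= power_limit_w
                and temp_c_at_max * scale <= temp_target_c):
            break
        clock -= bin_mhz
    return max(clock, base_mhz)

# Same workload: a board with more power/thermal headroom holds a higher bin.
print(settled_clock(1898, 1607, power_w_at_max=185, power_limit_w=180,
                    temp_c_at_max=74, temp_target_c=83))   # steps down a few bins
print(settled_clock(1898, 1607, power_w_at_max=185, power_limit_w=217,
                    temp_c_at_max=68, temp_target_c=83))   # holds the 1898MHz bin
```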

GeForce Video Card Average Clockspeeds
Game                        EVGA GTX 1070 Ti FTW2   GTX 1070 Ti   GTX 1070
Max Boost Clock             1898MHz                 1898MHz       1898MHz
Battlefield 1               1860MHz                 1826MHz       1797MHz
Ashes: Escalation           1850MHz                 1838MHz       1796MHz
DOOM                        1847MHz                 1856MHz       1780MHz
Ghost Recon Wildlands       1860MHz                 1840MHz       1807MHz
Dawn of War III             1860MHz                 1848MHz       1807MHz
Deus Ex: Mankind Divided    1855MHz                 1860MHz       1803MHz
Grand Theft Auto V          1862MHz                 1865MHz       1839MHz
F1 2016                     1860MHz                 1840MHz       1825MHz
Total War: Warhammer        1855MHz                 1832MHz       1785MHz

Though it does appear that the GTX 1070 Ti FTW2 boosts a little higher and more consistently, there's little change in out-of-the-box performance compared to the Founders Edition. For the majority of the standard benchmarks, the difference was within the margin of error.
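As a side note, per-game averages like those in the table above can be approximated at home by polling the driver during a benchmark run. The sketch below leans on nvidia-smi's query interface; the one-second polling interval and the idle-clock cutoff are arbitrary choices for illustration, not the logging methodology behind this review's numbers.

```python
# Poll nvidia-smi once a second during a benchmark run and report the average
# SM clock. Illustrative only; not the logging setup used for this review.
import subprocess
import time

def sample_clock_mhz(gpu_id=0):
    """Read the current SM clock (MHz) from the given GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", f"--id={gpu_id}", "--query-gpu=clocks.sm",
         "--format=csv,noheader,nounits"],
        text=True)
    return int(out.strip())

def average_clock(duration_s=300, interval_s=1.0, idle_cutoff_mhz=1000):
    """Average the SM clock over a run, skipping idle samples (arbitrary cutoff)."""
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        mhz = sample_clock_mhz()
        if mhz >= idle_cutoff_mhz:   # ignore loading screens / desktop idle
            samples.append(mhz)
        time.sleep(interval_s)
    return sum(samples) / len(samples) if samples else 0

if __name__ == "__main__":
    print(f"Average SM clock: {average_clock(duration_s=120):.0f}MHz")
```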

As for power, the GTX 1070 Ti FTW2's extra capabilities are rather muted at stock. At idle, the board turns off the fans below a certain temperature – the default master BIOS uses a 60°C threshold – and technically speaking, the LEDs pull some power, but total system consumption rarely reflects differences and adjustments this small.

Idle Power Consumption

While the stated TDP remains 180W, the GTX 1070 Ti FTW2 does possess two 8-pin PCIe power connectors over the Founders Edition's single 8-pin. At the default 100% power limit, this extra power delivery capacity goes largely unused in most applications, and in Battlefield 1 system consumption only ends up around 8W higher. But a power virus like FurMark has far fewer qualms about taking as much as it can, with the GTX 1070 Ti FTW2 immediately pulling a little extra, in the region of 30W more at the wall.
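For the curious, the power limit side of this is straightforward to inspect on any Pascal board, since the driver exposes the current, default, and maximum limits. The sketch below uses standard nvidia-smi queries; the GPU index and the commented-out limit change are illustrative, and the FTW2's actual maximum is whatever its VBIOS reports.

```python
# Query the current, default, and maximum board power limits via nvidia-smi.
# Raising the limit (nvidia-smi -pl <watts>) needs administrator rights and is
# shown only as a commented example, not a recommendation.
import subprocess

def power_limits(gpu_id=0):
    """Return (current, default, maximum) power limits in watts."""
    out = subprocess.check_output(
        ["nvidia-smi", f"--id={gpu_id}",
         "--query-gpu=power.limit,power.default_limit,power.max_limit",
         "--format=csv,noheader,nounits"],
        text=True)
    current, default, maximum = (float(v) for v in out.strip().split(", "))
    return current, default, maximum

current, default, maximum = power_limits()
print(f"Power limit: {current:.0f}W (default {default:.0f}W, max {maximum:.0f}W)")
print(f"Headroom over default: {100 * maximum / default - 100:.0f}%")

# To raise the limit to the VBIOS maximum (requires root/admin):
# subprocess.run(["nvidia-smi", "-i", "0", "-pl", str(int(maximum))], check=True)
```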

Load Power Consumption - Battlefield 1

Load Power Consumption - FurMark

Like most high-quality custom boards, the GTX 1070 Ti FTW2 can maintain a typical idle temperature with passive cooling. Under load, the fans kick in and the card settles just below its default 72°C throttle point, even while running FurMark.

Idle GPU Temperature

Load GPU Temperature - Battlefield 1

Load GPU Temperature - FurMark

EVGA GTX 1070 Ti FTW2 iCX Readings
Sensor                      Battlefield 1 (1440p)   FurMark
GPU Temperature             68°C                    70°C
iCX GPU2 Temp.              65°C                    70°C
iCX MEM1 Temp.              50°C                    53°C
iCX MEM2 Temp.              59°C                    64°C
iCX MEM3 Temp.              70°C                    78°C
iCX PWR1 Temp.              65°C                    70°C
iCX PWR2 Temp.              66°C                    72°C
iCX PWR3 Temp.              66°C                    72°C
iCX PWR4 Temp.              66°C                    72°C
iCX PWR5 Temp.              68°C                    75°C
Left Fan Speed (GPU)        962 RPM                 1191 RPM
Right Fan Speed (PWR/MEM)   1066 RPM                1320 RPM

At idle, of course, the card's zero-fan-speed mode keeps the fans off entirely. Under load, the cooling design proves capable enough with the fans at relatively low speeds, resulting in a rather quiet profile. The two fans ramp up asynchronously, and both unsurprisingly spin faster in FurMark, which shows higher temperatures across all of the iCX sensors. The general idea behind asynchronous fans can be seen in how the right fan speeds up in response to higher memory and PWR temperatures.
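That asynchronous behavior amounts to each fan answering to its own group of sensors rather than both chasing the GPU diode. The sketch below is a rough, hypothetical illustration of that control scheme: only the 60°C fan-off threshold and the roughly 72°C target come from this page, while the curve shape and every other number are assumptions, not EVGA's actual iCX firmware logic.

```python
# Rough sketch of asynchronous fan control in the spirit of iCX: each fan
# follows the hottest sensor in its own group instead of the GPU diode alone.
# Curve points are hypothetical; only the 60C fan-off threshold and the ~72C
# target come from the behavior described in this review.

FAN_OFF_BELOW_C = 60   # zero-RPM idle threshold (default master BIOS)

def fan_duty(temp_c, target_c=72):
    """Map a temperature to a fan duty cycle (0.0-1.0) with a zero-RPM band."""
    if temp_c < FAN_OFF_BELOW_C:
        return 0.0
    # Ramp linearly from a low floor at 60C toward full speed past the target.
    span = (temp_c - FAN_OFF_BELOW_C) / (target_c + 10 - FAN_OFF_BELOW_C)
    return min(1.0, max(0.25, span))

def icx_fan_duties(sensors):
    """Left fan tracks the GPU sensors; right fan tracks memory/VRM sensors."""
    gpu_temp = max(sensors["GPU"], sensors["GPU2"])
    mem_pwr_temp = max(sensors[k] for k in sensors if k.startswith(("MEM", "PWR")))
    return fan_duty(gpu_temp), fan_duty(mem_pwr_temp)

# Readings close to the FurMark column of the table above:
left, right = icx_fan_duties({
    "GPU": 70, "GPU2": 70, "MEM1": 53, "MEM2": 64, "MEM3": 78,
    "PWR1": 70, "PWR2": 72, "PWR3": 72, "PWR4": 72, "PWR5": 75,
})
print(f"left (GPU) duty: {left:.0%}, right (MEM/PWR) duty: {right:.0%}")
```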

Idle Noise Levels

Load Noise Levels - Battlefield 1

Load Noise Levels - FurMark

This kind of power, temperature, and noise profile will suit some just fine: a quiet card with purposeful temperature LEDs, all without user intervention. Others will immediately notice the unutilized headroom. With XOC Scanner, EVGA looks to court the latter with a single-step, automatically applied overclock, the gist of which is sketched below. And on that note, we move on to overclocking.
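Conceptually, a one-button scanner is little more than a guarded trial-and-error loop: nudge the offset up, stress, verify, and keep the last setting that passed. The sketch below is a hypothetical illustration of that idea only; stress_stable() is a stand-in for a real workload-plus-error-check, and none of this reflects EVGA's actual XOC Scanner implementation.

```python
# Hypothetical sketch of what a single-step, automatic overclock boils down to:
# raise the core offset in small increments, stress-test each step, and apply
# the last offset that passed. This is NOT EVGA's actual XOC Scanner code.

def stress_stable(offset_mhz):
    """Stand-in for a real stability test (a real tool runs a render/compute
    workload and checks for artifacts, errors, or driver resets)."""
    return offset_mhz <= 150   # pretend this sample chip tops out near +150MHz

def scan_for_offset(step_mhz=25, max_offset_mhz=300):
    best = 0
    for offset in range(step_mhz, max_offset_mhz + step_mhz, step_mhz):
        if not stress_stable(offset):
            break           # first failure ends the scan
        best = offset       # remember the last known-good offset
    return best

print(f"Applying core offset: +{scan_for_offset()}MHz")   # -> +150MHz here
```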

Comments

  • DnaAngel - Tuesday, May 22, 2018 - link

    I wouldn't hold your breath if you think a simple die shrink of the same architecture is going to be "a decent bump in performance". It will be slight (~10%), as typical refreshes are.

    To get a "decent bump in performance" (>20%) you have to wait till the next architecture generation. Navi/Volta in this case.
  • DnaAngel - Monday, May 21, 2018 - link

    AMD has Navi. Yea, and? Vega was supposed to be the "Pascal killer" and yet a 475 dollar 1070Ti matches or outperforms their 800 dollar Vega 64 at 1080/1440p in most titles LOL.

    Navi will just be playing catchup to Volta anyway.
  • Hixbot - Thursday, February 1, 2018 - link

    Soo.. what you're saying is mining is the problem. OK got it.
  • JoeyJoJo123 - Monday, February 5, 2018 - link

    Sure, if you want to be an obtuse retard about it. I clearly explained that miner demand is merely just _one_ of many facets of the GPU pricing issue. Miner demand is no different from Gamer demand, at least in terms of how it affects supply and therefore pricing. 1 GPU bought for mining or gaming is 1 less GPU in circulation, and when there's a low enough amount of GPUs on the market, the price is going to go up.

    And like I already explained, supply could be "fixed" by ordering many more cards to be produced, but because the demand isn't necessarily stable, AIB partners are hesitant to supply more on the market, because they'll be the ones on the losing end when they're stuck on supply that won't sell, should alternative coins tank in price.
  • Tetracycloide - Friday, February 2, 2018 - link

    TLDR of your 3 point explanation is simply "Miners." All the things you've said are just extra details of how "Miners" is the explanation.
  • JoeyJoJo123 - Monday, February 5, 2018 - link

    Nice reading comprehension. It's a supply side issue that won't be fixed since suppliers aren't confident in the sustainability of demand. And because of that, the supply side won't be burned out (since they're running a business and generating excess supply has a large risk associated with it) and would rather let the GPU pricing handle itself in a low supply/high demand market.

    There's also the GPU scalpers and 3rd party seller market making the pricing worse than it is, since they're draining supply even though they're not the end-users demanding the product. (And these guys are the ones marking up the GPU prices, not Newegg, Amazon, or most brick and mortar retailers.)

    Look, I hate memecoin miners, too. They're wasting a shitload of energy to mine fictitious and worthless money to then put it on a highly volatile stock market like rollercoaster simulator, and they like to brag about how if every pleb had invested in memebucks they'd be "millionaires" when the fact of any volatile market is that very few are big winners, and most are incurring losses.

    But the problem is more than just the miners themselves. There's supply side that won't ramp up production. There's 3rd party market and scalpers selling the GPUs at exorbitant prices, and even memory manufacturers like Samsung playing a part due to rising price of GDDR5(x), which increases the BOM cost for any GPU made.

    If you had even a single brain cell in your head you would've understood from my post that "Oh, yeah, miners are just one piece of the problem. I get ya."
  • mapesdhs - Tuesday, February 6, 2018 - link

    I gave up trying to convey the nuance about these issues last week. Some people just want to believe in simplistic answers so they can blame a certain group and vocally moan, even though they're often part of the problem. There are other factors as well, such as game devs not making games more visually complicated anymore, review hype/focus on high frequency gaming & VR (driven by gamers playing mostly FPS titles and others that fit this niche), and just the basic nature of current 3D tech being a natural fit for mining algorithms (shaders, etc.). In theory there is a strong market opportunity for a completely new approach to 3D gfx, a different arch, a proper GPU (modern cards are not GPUs; their visual abilities are literally the lowest priority), because atm the cards AMD/NVIDIA are producing are far more lucratively targeted at Enterprise and AI, not gamers; the latter just get the scraps off the table now, something The Good Old Gamer nicely explained a few months ago with a pertinent clip from NVIDIA:

    https://www.youtube.com/watch?v=PkeKx-L_E-o

    When was the last time a card review article even mentioned new visual features for 3D effects? It's been many years. Gamers are not playing games that need new features, they're pushing for high refresh displays (a shift enhanced by freesync/gsync adoption) so game devs aren't adding new features as that would make launch reviews look bad (we'll never have another Crysis in that way again), and meanwhile the products themselves are mathematically ideal for crypto mining tasks, a problem which makes (as the above chap says) both the AIBs and AMD/NVIDIA very reluctant to increase supply as that would create a huge supply glut once the mining craze shifts and the current cards get dumped, damaging newer product lines (miners have no brand loyalty, and AIBs can't risk the unsold stock potential, though in the meantime they'll happily sell to miners directly).

    I notice toms has several articles about mining atm. I hope AT doesn't follow suit. I didn't read the articles, but I bet they don't cover the total environmental cost re the massive e-waste generated by mining conglomerates. I'd rather tech sites that say they care about their readers didn't encourage this mining craze, but then it's a bandwagon many want to jump on while the rewards appear attractive. Ironically, at least LLT is doing a piece intended to show just how much of a con some of these mining setups can be.
  • boozed - Wednesday, January 31, 2018 - link

    Magic beans
  • StevoLincolnite - Wednesday, January 31, 2018 - link

    I bought my RX 580 for $400AUD almost a year ago. It actually hit $700 AUD at one point. Was nuts.

    Normally I would buy two... But this is the first time I have gone single GPU since the Radeon x800 days where you needed a master GPU.
    The costs are just out of control. Glad I am only running a 1440P display so I don't need super high-end hardware.
  • IGTrading - Wednesday, January 31, 2018 - link

    What I find the most interesting is that AMD Fury X absolutely destroys the GeForce 980 in absolutely all benches :) .

    I guess all those nVIDIA buyers feel swindled now ....
