Final Words

Wrapping up this review, it’s safe to say that we’re in a bind about what kind of conclusion we can reasonably draw right now. The EVGA GeForce GTX 1070 Ti FTW2 exists in a time of video card oblivion, where any mid-range and above desktop GPU is worth its weight in Ethereum. On a purely hardware level, the card implements the iCX solution as expected, operates quietly, and has RGB blinkies. When manually overclocked, it can perform near or at the level of a GTX 1080.

These were hardly areas of concern to begin with: iCX has featured in Pascal GeForces for some time now, while GTX 1070 Ti designs can reuse tried-and-true GTX 1080 and 1070 coolers and PCBs. iCX itself traces back to the overheating issues with the ACX 3.0 cooler on the GTX 1080 and 1070 FTWs, which eventually brought about a VBIOS update and a free thermal mod kit. And the iCX cooler was not a radical departure from EVGA's ACX design in the first place.

Unfortunately, the success and popularity of all GTX 1070 Ti boards were always going to come down to pricing. Squeezed into the price window between the GTX 1080 and 1070 to block out the Radeon RX Vega 56, the GTX 1070 Ti had its clocks standardized to keep performance from threatening the GTX 1080. Given the November release date, now would have been the perfect time to see how the GTX 1080/1070 Ti/1070 fared in practice, if not for cryptomining demand. Of course, with this level of demand across all vendors, the GTX 1070 Ti no longer threatens anything.

Refocusing on the GTX 1070 Ti FTW2, the iCX functionality and Precision XOC’s XOC Scanner go well together in offering a factory-guided overclock that is about as straightforward as it gets, while going the manual route still allows for reasonable overclocks. The possibility has been raised that the single-step XOC Scanner could be extended to more products, and it would be interesting to see whether the idea of reference-clock-only GPUs is related to this.
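For those curious what a "single-step" scanner actually automates, the loop is conceptually simple: raise the clock offset, stress test, and back off at the first sign of instability. The sketch below is purely illustrative; the function names are hypothetical placeholders, not Precision XOC's actual API.

```python
# Illustrative sketch of an automated overclock scan loop.
# apply_offset and run_stress_test are hypothetical callables standing in
# for vendor-specific clock tuning and stability-testing routines.

STEP_MHZ = 25          # clock offset increment per scan iteration
MAX_OFFSET_MHZ = 250   # safety ceiling for the scan

def scan_for_stable_offset(apply_offset, run_stress_test):
    """Raise the GPU clock offset until the stress test fails,
    then settle on the last offset that passed."""
    best = 0
    offset = STEP_MHZ
    while offset <= MAX_OFFSET_MHZ:
        apply_offset(offset)         # push the core clock up one step
        if not run_stress_test():    # artifact/crash detection pass
            break                    # instability reached: stop scanning
        best = offset                # this offset passed; remember it
        offset += STEP_MHZ
    apply_offset(best)               # fall back to the highest stable offset
    return best
```

Presumably the real tool also weighs temperature and power data from the iCX sensors, but scan-and-verify is the essence of why the process needs only a single click.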

On the other hand, the GTX 1070 Ti FTW2 seems to have a split focus. The casual click-and-play user would dabble with LEDs and opt for the XOC Scanner factory-guided overclock, but would find less value in the detailed sensor data, asynchronous fan control, Dual BIOS, and power system: features that they may not use at all. An overclocking-inclined user would prefer to ignore XOC Scanner and use all the iCX features, but in that case, what would be appealing about the clock-standardized GTX 1070 Ti except the price? And for those after a quiet card, there is the alternative of the GTX 1070 Ti FTW Ultra Silent, not to mention the GTX 1080 and 1070 options.

Pricing would normally be the arbiter of these scenarios. At its $500 MSRP, the GTX 1070 Ti FTW2 is priced close to cheaper GTX 1080s, rather than being a straightforward proportional option like the Founders Edition. Tentatively speaking, the FTW2's iCX feature set may be more valuable to you than the performance difference, which might be recouped with some luck and manual overclocking.

But today, our regular turn of phrase, ‘there’s no such thing as a bad card, only bad prices’, takes on new shades of meaning. The EVGA GeForce GTX 1070 Ti FTW2 is not a bad card by any means; indeed, it's quite a good card, as is usually the case with EVGA. But even though it is well-built and well-featured, can good cards even exist when all prices are this bad? At least on a relative basis, if you can pick up a GTX 1070 Ti FTW2 at or near MSRP, or if you can part with the extra cash at current market prices, then it's a compelling enough option.

On an absolute basis, however, the market price of the card will give anyone a good reason to do a double-take. A buyer's market it is not; that is not EVGA's fault, but it is a reality we must all live with. Meanwhile, we're getting increasingly worried that by the time the GPU market has normalized, it might already be time for the next generation of consumer graphics cards.

Comments

  • DnaAngel - Tuesday, May 22, 2018 - link

Don't hold your breath if you think a simple die shrink of the same architecture is going to be "a decent bump in performance". It will be slight (~10%), as typical refreshes are.

    To get a "decent bump in performance" (>20%) you have to wait till the next architecture generation. Navi/Volta in this case.
  • DnaAngel - Monday, May 21, 2018 - link

AMD has Navi. Yeah, and? Vega was supposed to be the "Pascal killer", and yet a $475 1070 Ti matches or outperforms their $800 Vega 64 at 1080p/1440p in most titles LOL.

    Navi will just be playing catchup to Volta anyway.
  • Hixbot - Thursday, February 1, 2018 - link

    Soo.. what you're saying is mining is the problem. OK got it.
  • JoeyJoJo123 - Monday, February 5, 2018 - link

Sure, if you want to be obtuse about it. I clearly explained that miner demand is just _one_ of many facets of the GPU pricing issue. Miner demand is no different from gamer demand, at least in terms of how it affects supply and therefore pricing. 1 GPU bought for mining or gaming is 1 less GPU in circulation, and when there's a low enough number of GPUs on the market, the price is going to go up.

And like I already explained, supply could be "fixed" by ordering many more cards to be produced, but because the demand isn't necessarily stable, AIB partners are hesitant to put more supply on the market, because they'll be the ones on the losing end when they're stuck with supply that won't sell, should alternative coins tank in price.
  • Tetracycloide - Friday, February 2, 2018 - link

    TLDR of your 3 point explanation is simply "Miners." All the things you've said are just extra details of how "Miners" is the explanation.
  • JoeyJoJo123 - Monday, February 5, 2018 - link

    Nice reading comprehension. It's a supply side issue that won't be fixed since suppliers aren't confident in the sustainability of demand. And because of that, the supply side won't be burned out (since they're running a business and generating excess supply has a large risk associated with it) and would rather let the GPU pricing handle itself in a low supply/high demand market.

There's also the GPU scalpers and the 3rd party seller market making prices worse than they would otherwise be, since they're draining supply even though they're not the end-users demanding the product. (And these guys are the ones marking up the GPU prices, not Newegg, Amazon, or most brick and mortar retailers.)

Look, I hate memecoin miners, too. They're wasting a shitload of energy to mine fictitious and worthless money, then putting it on a highly volatile, rollercoaster-simulator of a market, and they like to brag about how if every pleb had invested in memebucks they'd be "millionaires", when the fact of any volatile market is that very few are big winners and most are incurring losses.

But the problem is more than just the miners themselves. There's the supply side that won't ramp up production. There's the 3rd party market and scalpers selling GPUs at exorbitant prices, and even memory manufacturers like Samsung playing a part due to the rising price of GDDR5(X), which increases the BOM cost for any GPU made.

If you had even a single brain cell in your head, you would've understood from my post that "Oh, yeah, miners are just one piece of the problem. I get ya."
  • mapesdhs - Tuesday, February 6, 2018 - link

I gave up trying to convey the nuance about these issues last week. Some people just want to believe in simplistic answers so they can blame a certain group and vocally moan, even though they're often part of the problem. There are other factors as well, such as game devs no longer making games more visually complicated, review hype/focus on high frequency gaming & VR (driven by gamers playing mostly FPS titles and others that fit this niche), and just the basic nature of current 3D tech being a natural fit for mining algorithms (shaders, etc.). In theory there is a strong market opportunity for a completely new approach to 3D gfx, a different arch, a proper GPU (modern cards are not GPUs; their visual abilities are literally the lowest priority), because atm the cards AMD/NVIDIA are producing are far more lucratively targeted at Enterprise and AI, not gamers; the latter just get the scraps off the table now, something The Good Old Gamer nicely explained a few months ago with a pertinent clip from NVIDIA:

    https://www.youtube.com/watch?v=PkeKx-L_E-o

When was the last time a card review article even mentioned new visual features for 3D effects? It's been many years. Gamers are not playing games that need new features; they're pushing for high refresh displays (a shift enhanced by FreeSync/G-Sync adoption), so game devs aren't adding new features, as that would make launch reviews look bad (we'll never have another Crysis in that way again). Meanwhile, the products themselves are mathematically ideal for crypto mining tasks, a problem which makes (as the above chap says) both the AIBs and AMD/NVIDIA very reluctant to increase supply, as that would create a huge supply glut once the mining craze shifts and the current cards get dumped, damaging newer product lines (miners have no brand loyalty, and AIBs can't risk the unsold stock potential, though in the meantime they'll happily sell to miners directly).

I notice Tom's has several articles about mining atm. I hope AT doesn't follow suit. I didn't read the articles, but I bet they don't cover the total environmental cost re the massive e-waste generated by mining conglomerates. I'd rather tech sites that say they care about their readers didn't encourage this mining craze, but then it's a bandwagon many want to jump on while the rewards appear attractive. Ironically, at least LTT is doing a piece intended to show just how much of a con some of these mining setups can be.
  • boozed - Wednesday, January 31, 2018 - link

    Magic beans
  • StevoLincolnite - Wednesday, January 31, 2018 - link

I bought my RX 580 for $400 AUD almost a year ago. It actually hit $700 AUD at one point. Was nuts.

Normally I would buy two... But this is the first time I have gone single GPU since the Radeon X800 days, when you needed a master GPU.
The costs are just out of control. Glad I am only running a 1440p display, so I don't need super high-end hardware.
  • IGTrading - Wednesday, January 31, 2018 - link

What I find most interesting is that the AMD Fury X absolutely destroys the GeForce 980 in all benches :) .

I guess all those nVIDIA buyers feel swindled now ...
