Synthetics

As always we’ll also take a quick look at synthetic performance. These tests mainly serve as a canary for finding important architectural changes, and with the exception of pixel throughput we are not expecting any major changes for GTX 980 and GM204.

Synthetic: TessMark, Image Set 4, 64x Tessellation

GM204 is designed to have an ever-so-slightly higher triangle throughput rate than GK110 (16 tris/clock versus 15 tris/clock), and sure enough the GTX 980 comes out on top in TessMark, slightly edging out the GTX 780 Ti. The difference is only very slight here; though GM204 should be a bit more powerful than GK110 on paper, in practice it's a dead heat.
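As a quick sanity check, peak geometry throughput is just the setup rate multiplied by the clock speed. A minimal sketch, assuming the reference boost clocks (1216MHz for GTX 980, 928MHz for GTX 780 Ti):

```python
# Theoretical peak triangle throughput: setup rate (tris/clock) x clock.
# The boost clocks below are the reference-card figures and are used
# here purely as an assumption; real TessMark scores depend on much more.

def peak_tris_per_sec(tris_per_clock: int, clock_mhz: int) -> float:
    """Peak geometry setup rate in billions of triangles per second."""
    return tris_per_clock * clock_mhz * 1e6 / 1e9

gtx_980 = peak_tris_per_sec(16, 1216)    # GM204: 16 tris/clock
gtx_780_ti = peak_tris_per_sec(15, 928)  # GK110: 15 tris/clock

print(f"GTX 980:    {gtx_980:.2f} Gtris/s")    # ~19.46
print(f"GTX 780 Ti: {gtx_780_ti:.2f} Gtris/s")  # ~13.92
```

That the measured gap is far smaller than these theoretical peaks is a reminder that TessMark stresses much more than raw setup rate.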

Moving on, we have our 3DMark Vantage texture and pixel fillrate tests, which present our cards with massive amounts of texturing and color blending work. These aren’t results we suggest comparing across different vendors, but they’re good for tracking improvements and changes within a single product family.

Synthetic: 3DMark Vantage Texel Fill

Beginning with Maxwell, NVIDIA reduced their texture unit density, raising the compute-to-texture ratio from 12:1 to 16:1 (one texture unit per 16 CUDA cores rather than one per 12). As a result of this change, Maxwell GPUs have fewer texture units than comparable Kepler GPUs. Compounding this effect, Maxwell CUDA cores are more efficient than Kepler CUDA cores, so NVIDIA can place fewer cores overall, further reducing the texture unit count and with it the texture fillrate.

As a result the GTX 980 is not competitive in texture fillrate with any of the GK110 cards. It is competitive with the GK104 cards, but only because those cards carry the same 128 texture units. NVIDIA has told us that they believe this new ratio is a better fit for modern workloads, and judging from the performance we're seeing elsewhere it would appear that NVIDIA is right.
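The ratios and fillrates above can be checked with a little arithmetic. This sketch assumes the published per-SM unit counts (Kepler SMX: 192 cores, 16 texture units; Maxwell SMM: 128 cores, 8 texture units) and the reference base clocks:

```python
# Core-to-texture ratios per SM, plus the resulting peak texel fill.
# Unit counts are the published SMX/SMM figures; the clocks are the
# reference base clocks, used as an assumption for illustration.

def cores_per_texture_unit(cores_per_sm: int, tex_per_sm: int) -> float:
    return cores_per_sm / tex_per_sm

def peak_texel_fill(tex_units: int, clock_mhz: int) -> float:
    """Peak bilinear texel rate in gigatexels per second."""
    return tex_units * clock_mhz / 1000

print(cores_per_texture_unit(192, 16))  # Kepler SMX:  12.0 -> 12:1
print(cores_per_texture_unit(128, 8))   # Maxwell SMM: 16.0 -> 16:1

print(peak_texel_fill(240, 875))   # GTX 780 Ti (240 units): 210.0
print(peak_texel_fill(128, 1126))  # GTX 980   (128 units): 144.128
```

Despite GTX 980's higher clocks, 128 texture units simply cannot close the gap to GK110's 240.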

Synthetic: 3DMark Vantage Pixel Fill

On the other hand, thanks to NVIDIA’s newer 3rd generation delta color compression technology, our 3DMark pixel fillrate performance is through the roof. GTX 980 comes very close to doubling the throughput of our GK110 cards and more than doubles the throughput of the GK104 cards, reflecting the fact that it has 64 ROPs and more importantly has the available bandwidth to put them to good use.

This benchmark in a nutshell is why NVIDIA can deliver chart-topping performance despite having only 2/3rds the memory bandwidth of GTX 780 Ti. By improving their color compression to this point, NVIDIA can significantly reduce Maxwell 2's memory bandwidth requirements, allowing them to do more with less. In real games the result won't be anywhere near this remarkable since this is a pure pixel fillrate test, but it goes to show that NVIDIA has been able to expand their effective memory bandwidth in concert with their ROP and shader performance improvements.
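The effect of compression on effective bandwidth can be sketched in a few lines. The raw bandwidth figures are the cards' published specs; the ~25% average traffic reduction used below is an illustrative assumption only, and real savings vary heavily from scene to scene:

```python
# Effective memory bandwidth under lossless delta color compression.
# Raw bandwidth figures are the published card specs; the ~25% traffic
# reduction is an assumed average, not a measured value.

def effective_bandwidth(raw_gb_s: float, traffic_reduction: float) -> float:
    """Apparent bandwidth if compression removes a fraction of the
    traffic that would otherwise have to cross the memory bus."""
    return raw_gb_s / (1.0 - traffic_reduction)

gtx_980_raw = 224.0     # 256-bit bus @ 7GHz GDDR5
gtx_780_ti_raw = 336.0  # 384-bit bus @ 7GHz GDDR5

print(f"GTX 980 effective: ~{effective_bandwidth(gtx_980_raw, 0.25):.0f} GB/s")
print(f"GTX 780 Ti raw:     {gtx_780_ti_raw:.0f} GB/s")
```

Under that assumption, a 224GB/s card behaves more like a ~299GB/s one for color traffic, which goes a long way toward explaining how the narrower bus keeps up.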

274 Comments


  • Kutark - Sunday, September 21, 2014 - link

    I'd hold on to it. That's still a damn fine card. Honestly you could probably find a used one on eBay for a decent price and SLI it up.

    IMO though I'd splurge for a 970 and call it a day. I've got dual 760s right now, the first time I've done SLI in probably 10 years. And honestly, the headaches just aren't worth it. Yeah, most games work, but some games will have weird graphical issues (BF4 near release was a big one, DOTA 2 doesn't seem to like it), others don't utilize it well, etc. I kind of wish I'd just stuck with the single 760. Either way, my 2p.
  • SkyBill40 - Wednesday, September 24, 2014 - link

    @ Kutark:

    Yeah, I tried to buy a nice card at that time despite wanting something higher than a 660 Ti. But, as my wallet was the one doing the dictating, it's what I ended up with and I've been very happy. My only concern with a used one is just that: it's USED. Electronics are one of those "no go" zones for me when it comes to buying second hand, since you have no idea about the circumstances surrounding the device, and seeing as it's a video card and not a Blu-ray player or something, I'd like to know how long it's run, if it's been OC'd or not, and the like. I'd be fine with buying another one new, but not for the prices I'm seeing that are right in line with a 970. That would be dumb.

    In the end, I'll probably wait it out a bit more and decide. I'm good for now and will probably buy a new 144Hz monitor instead.
  • Kutark - Sunday, September 21, 2014 - link

    Psshhhhh.... I still have my 3dfx Voodoo SLI card. Granted it's just sitting on my desk, but still!!!

    In all seriousness though, my roommate, who is NOT a gamer, is still using an old 7800 GT card I had lying around because the video card in his ancient computer decided to go out and he didn't feel like building a new one. Can't say I blame him; Core 2 Quads are juuust fine for browsing the web and such.
  • Kutark - Sunday, September 21, 2014 - link

    Voodoo 2, I meant; realized I didn't type the 2.
  • justniz - Tuesday, December 9, 2014 - link

    >> the power bills are so ridiculous for the 8800 GTX!

    Sorry but this is ridiculous. Do the math.

    Best info I can find is that your card is consuming 230W.
    Assuming you're paying 15¢/kWh, even gaming for 12 hours a day every day for a whole month will cost you $12.59. Doing the same with a GTX 980 (165W) would cost you $9.03/month.

    So you'd be paying maybe $580 to save $3.56 a month.
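The arithmetic in that comment can be reproduced in a few lines, using the same assumptions (15¢/kWh, 12 hours of gaming per day, a month averaged as 365/12 days; the 230W draw is the commenter's own estimate, taken as-is):

```python
# Reproducing the comment's electricity math: 15 cents/kWh, 12 hours of
# gaming per day, averaged over a 365/12-day month. The 230W figure is
# the commenter's estimate for the older card, taken as-is.

def monthly_cost(watts: float, hours_per_day: float = 12,
                 rate_per_kwh: float = 0.15) -> float:
    days_per_month = 365 / 12
    kwh = watts * hours_per_day * days_per_month / 1000
    return kwh * rate_per_kwh

print(round(monthly_cost(230), 2))                      # 12.59
print(round(monthly_cost(165), 2))                      # 9.03
print(round(monthly_cost(230) - monthly_cost(165), 2))  # 3.56
```

The numbers check out: the monthly saving from the 65W difference is a few dollars at most, so power bills alone are a weak argument for the upgrade.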
  • LaughingTarget - Friday, September 19, 2014 - link

    There is a major difference between market capitalization and available capital for investment. Market Cap is just a rote multiplication of the number of shares outstanding by the current share price. None of this is available for company use and is only an indirect measurement of how well a company is performing. Nvidia has $1.5 billion in cash and $2.5 billion in available treasury stock. Attempting to match Intel's process would put a significant dent into that with little indication it would justify the investment. Nvidia already took on a considerable chunk of debt going into this year as well, which would mean that future offerings would likely go for a higher cost of debt, making such an investment even harder to justify.

    While Nvidia is blowing out AMD 3:1 on R&D and capacity, Intel is blowing both of them away, combined, by a wide margin. Intel is dropping $10 billion a year on R&D, which is a full $3 billion beyond the entire asset base of Nvidia. It's just not possible to close the gap right now.
  • Silma - Saturday, September 20, 2014 - link

    I don't think you realize how many billions of dollars you need to spend to open a 14 nm factory, not even counting R&D and yearly costs.
    It's humongous; there's a reason why there are so few foundries in the world.
  • sp33d3r - Saturday, September 20, 2014 - link

    Well, if the NVIDIA/AMD CEOs are blind enough that they cannot see it coming, then Intel is going to manufacture their next integrated graphics on a 10 or 8 nm chip, and though immature it will be tough competition for them in terms of power and efficiency and even weight.

    Remember, PCs currently include Intel's integrated graphics as a must, and people add third-party graphics only because Intel's isn't good enough, literally adding the weight of two graphics solutions (Intel's and the third party's) to the product. It would be far more convenient if integrated graphics could outperform or at least challenge third-party GPUs; we would just throw away NVIDIA, and guess what, they wouldn't remain a monopoly anymore, but rather be completely wiped out.

    Besides, Intel's integrated graphics are getting more mature with every launch, and not just in die size; just compare the 4000 series with the 5000 series. It won't be long before they catch up.
  • wiyosaya - Friday, September 26, 2014 - link

    I have to agree that it is partly not about the verification cost breaking the bank. What I think is the more likely reason is that since the current node works, they will try to wring every penny out of that node. Look at the prices for the Titan Z. If this is not an attempt to fleece the "gotta have it" buyer, I don't know what is.
  • Ushio01 - Thursday, September 18, 2014 - link

    Wouldn't paying to use the 22nm fabs be a better idea, as they're about to become underused and all the teething troubles have been fixed?
