Final Fantasy XV (DX11)

Upon arriving on PC earlier this year, Final Fantasy XV: Windows Edition was given a graphical overhaul as it was ported over from console, the fruit of Square Enix's partnership with NVIDIA, with hardly any hint of the troubles of Final Fantasy XV's original production and development.

In preparation for the launch, Square Enix opted to release a standalone benchmark that it has since updated. The Final Fantasy XV standalone benchmark gives us a lengthy standardized sequence on which to utilize OCAT. Upon release, the benchmark received criticism for performance issues and general bugginess, as well as for confusing graphical presets and for measuring performance by 'score'. In its original iteration, the graphical settings could not be adjusted, leaving the user with presets that were tied to resolution and to hidden settings such as GameWorks features.

Since then, Square Enix has patched the benchmark with custom graphics settings and bugfixes for better accuracy in profiling in-game performance and graphical options, though it has kept the 'score' measurement. For our testing, we enable or adjust settings to their highest values except for NVIDIA-specific features and 'Model LOD', the latter of which is left at standard. Final Fantasy XV also supports HDR, and it will support DLSS at some later date.

[Charts: Final Fantasy XV - Ultra Quality - 3840x2160, 2560x1440, and 1920x1080]

At 1080p and 1440p, the RTX 2060 (6GB) returns to its place between the GTX 1080 and GTX 1070 Ti. Final Fantasy XV is less favorable to the Vega cards, so the RTX 2060 (6GB) is already faster than the RX Vega 64. With the relative drop in 4K performance, there are more hints that 6GB is potentially insufficient.

[Charts: Final Fantasy XV - 99th Percentile - Ultra Quality - 3840x2160, 2560x1440, and 1920x1080]
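For reference, the 99th percentile figures in these charts are derived from the per-frame frametime data captured with OCAT, not from a simple FPS average. The sketch below shows the calculation in Python, assuming a PresentMon-style CSV capture with an 'MsBetweenPresents' frametime column; the column name and example filename are assumptions about the capture format rather than anything specified in this article.

# Minimal sketch: average and 99th-percentile framerates from an OCAT capture.
# Assumes OCAT's PresentMon-style CSV output with an "MsBetweenPresents"
# column holding per-frame frametimes in milliseconds.
import csv
import statistics

def summarize_capture(path):
    with open(path, newline="") as f:
        frametimes_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    # 99th-percentile frametime (the slowest 1% of frames), reported as FPS.
    p99_ms = statistics.quantiles(frametimes_ms, n=100)[98]
    return avg_fps, 1000.0 / p99_ms

# Example (hypothetical filename): avg, p99 = summarize_capture("OCAT-ffxv_s.exe-run1.csv")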

 


134 Comments


  • Storris - Tuesday, January 8, 2019 - link

    The RTX2060 game bundle includes RTX showcases Battlefield 5 and Anthem, yet you haven't tested either of those games.

    What's the point of an RTX review, if the RTX doesn't actually get reviewed?

    Also, what's the point of a launch, and the day 1 driver, when no-one can buy the card yet?
  • catavalon21 - Thursday, March 7, 2019 - link

    Paper launches are nothing new for either Nvidia or AMD GPUs.
  • eastcoast_pete - Tuesday, January 8, 2019 - link

    My take-home is: the 2060 is a good, maybe even very good graphics card. Price-performance wise, it's not a bad proposition, if (IF) you're reasonably sure that you won't run into the memory limit. The 6 GB the 2060 comes with is vintage Nvidia: it'll keep the 2060 off the 2070's back even for games that wouldn't require the 2070's bigger GPU brawn, and give Nvidia an easy way to make a 2060 Ti model in the near future; just add 2 GB for a full 8.
    That's my biggest beef with this card: it could have gone from a good to a great mid-to-upper-level card just by giving it 8 GB of VRAM to start with. Now, it's not so clear how future-proof it is.
  • TheJian - Tuesday, January 8, 2019 - link

    Going to do this in a few posts, since I was writing while reading a dozen or more reviews and piling up a TON of data. I own AMD stock (NV soon too), so as a trader, you HAVE to do this homework, PERIOD (or you're dumb and like to lose money...LOL). Don't like data or own stock? Move along.

    Why are DLSS and RT or VRS benchmarks not shown? They should have been the HIGHLIGHT of the entire article. NO HD textures in Far Cry (would a user turn this off before testing it)?
    https://www.youtube.com/watch?v=9mxjV3cuB-c
    CLEARLY DLSS is awesome. Note how many times DLSS makes the 2060 run like a 2080 with TAA. 39% improvement with DLSS, he says. WOW. At 6:29 you see 2060+DLSS BEATING the 2080 with TAA. Note he has MULTIPLE tests here and a very good vid review with many useful data points tested. Why can't Anandtech show any of these games that use NEW TECH? Ah right, sold out to AMD as a portal site. Same as Tomshardware (your sister site, no DLSS or RT there either, just COMING soon...LOL). Note he also says in there that it would be INSANE to do RTX features and not have the 2060 capable, as it will be the BASE of RTX cards likely for years (the poor will just get them next year at 7nm or something for a little cheaper than this year), kind of how Intel screwed base graphics with, well, CRAP integrated graphics so devs didn't aim higher. Same with last-gen console stuff, which held us back on PC for how long? At 9:19, 60% faster than the 1060 for 40% more MONEY (in older Crysis 3 even). It was 42% faster than the RX 590 in the same game. Next game, Shadow of the Tomb Raider: 59% faster than the 1060, 40% faster than the RX 590. Out of 11 titles tested it's better than Vega 56 in 10 of them; only Far Cry 5 was better on Vega 56 (and only because of a perf spurt at the beginning of the benchmark, or that one is lost too). Beats Vega 64 in many too, even re-benched with the latest drivers, as he notes.

    At 14:30 of the vid above, Wolfenstein II: The New Colossus with VRS performance mode turned on is 92% higher fps vs. the 1060 (again for 40% more cash)! Vega 56 just died, 64 not far behind, as you get RT+DLSS on NV, which just adds to the above info. Cheapest Vega on Newegg is $369, Vega 64 higher at $399. Tough sell against a 2060 WITH RT+DLSS+VRS and fewer watts (210 V56, 295 V64, 160 for 2060 RTX - that's bad). The power bill for 50w at 8hrs a day is ~$19 a year @ $.12/kWh (and many places are over $.20 in the USA, never mind elsewhere). So double that for V64 at best (or less than 8hrs) if you game and have a kid etc. that does too on that PC. Easy to hit an 8hr average even alone if you game heavily just on weekends. You can easily put in 20hrs on a weekend if you're single and a gamer, and again easily another 4 a night during the week. Got kids, you'll have more people doing damage. My current old Dell 24 (11yr-old Dell wfp2407-hc) uses ~110w. Simply replacing it pays for Gsync, as I'd save the same $19 a year (for a decade? @ $.12/kWh, many places in the USA over $.20, so savings are higher for some) just by buying a new 30in model at 50w. Drop that to a 27in Dell and it goes to 35w! Think TCO here people, not today's price. So simply replacing your monitor+gpu (say a 2060) might save you $39 a year for 5-10yrs. Hey, that's a free 2060 right there ;) This is why I'll laugh at paying $100 more for 7nm with 1070ti perf (likely much higher) with better watts/more features. I know I'll play on it for 5yrs probably, then hand it to someone else in the family for another 3-5. I game more on my main PC than TV (HTPC), so the 1070ti can move to the HTPC and I'll save on the main PC with 7nm more. Always think TCO.
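    The power-bill math above works out roughly as follows; a quick sketch in Python, where the 50 W delta, 8 hours/day, and $0.12-$0.20 per kWh rates are the figures assumed in this comment rather than measured values:

    # Rough yearly electricity cost for a given extra power draw.
    def yearly_cost(extra_watts, hours_per_day, dollars_per_kwh):
        kwh_per_year = extra_watts * hours_per_day * 365 / 1000.0
        return kwh_per_year * dollars_per_kwh

    print(yearly_cost(50, 8, 0.12))   # ~$17.5/yr at $0.12/kWh (the ~$19 ballpark cited above)
    print(yearly_cost(50, 8, 0.20))   # ~$29/yr at $0.20/kWh
    print(yearly_cost(135, 8, 0.12))  # ~$47/yr for the quoted 295 W Vega 64 vs. 160 W 2060 gap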

    https://www.ign.com/boards/threads/variable-rate-s...
    “The end result is that instead of the RTX 2080 improving on the GTX 1080 by an average of around 25 to 30% in most titles, the 2080 outperforms the 1080 by around 60% in Wolfenstein II using VRS.”
    “So in essence, turning the VRS to Performance mode gets you half way between a 1080 Ti and a 2080 Ti, as opposed to basically matching the 1080 Ti in performance.” And again mentions next gen consoles/handheld to have it.

    https://store.steampowered.com/hwsurvey/Steam-Hard...
    Again, why are you even bothering with 4K at 1.42% usage on Steam (125 MILLION GAMERS)? NOBODY is using it. Yeah, I call 1.42% NOBODY. Why not test more USEFUL games at resolutions people actually use? This is like my argument with Ryan on the 660ti article, where he kept claiming 1440p was used by enthusiasts...LOL. Not even sure you can claim that TODAY, years later, as 1080p is used by 60.7% of us and only 3.89% are on 1440p. Enthusiasts are NOT playing 4K or even 1440p, unless you think gaming enthusiasts are only 5% of the public? Are you dense? 4K actually dropped .03%...ROFLMAO. 72% of MULTI-MONITOR setups are not even 4K…LOL. Nope, just 3840x1080. Who is paying you people to PUSH 4K when NOBODY uses it? You claimed 1440p in 2012 for the GTX 660ti. That is stupid or ignorant even TODAY. The TOTAL of 1440p+4K is under 5%. NOBODY is 5%. Why is 4K listed before 1080p? NOBODY is using it, so the focus should be 1080p! This is like setting POLICY in your country based on a few extremists wanting X passed. IE, FREE COLLEGE for everyone, without asking WHO PAYS FOR IT? Further, NOT realizing MANY people have no business going to college as they are bad at learning; vocational training for them at best. You are wasting college on a kid with a GPA under 3.0 (attach any GPA you want, you get the point). 4K, but, but, but…So what, NOBODY USES IT. Nobody = 1.42%...LOL.

    MipsToRemove - again lowering quality? Why not test full-on, as many users don't even know what .ini files are?...LOL. I'm guessing settings like this make it easier for AMD. MipsToRemove=0 sets ground textures to max and taxes video memory. What are you guys using? If it's not 0, why?

    “The highest quality preset, "Mein leben!", was used. Wolfenstein II also features Vega-centric GPU Culling and Rapid Packed Math, as well as Radeon-centric Deferred Rendering; in accordance with the preset, neither GPU Culling nor Deferred Rendering was enabled. NVIDIA Adaptive Shading was not enabled.”
    So in this case you turn off tech for both sides that NOBODY would turn off if buying EITHER side (assuming quality doesn't drop when it's SET properly). You buy AMD stuff for AMD features, and you do the same for RTX stuff on NV, etc. Who goes home and turns off the MAIN features of their hardware? Unless it makes a game UNPLAYABLE, why the heck would ANYONE do this? So why test like this? Who tests games in a way WE WOULD NOT PLAY them? Oh, right, Anandtech. Providing you with the most useless tests in the industry, “Anandtech”. We turn off everything you'd use in real life, you're welcome…LOL.

    Anandtech quote:
    “hidden settings such as GameWorks features” for Final Fantasy and no DLSS. Umm, who buys NV cards to turn off features?
    “For our testing, we enable or adjust settings to the highest except for NVIDIA-specific features and 'Model LOD', the latter of which is left at standard.”
    WTH are you doing here? Does it speed up NV cards? If yes, why would anyone turn it off (trying to make NV look weaker?)? I paid for that crap, so I'd definitely turn it on if it is FASTER or higher QUALITY.
  • TheJian - Tuesday, January 8, 2019 - link

    Crap, have to reply to my own post to keep info in order (5.5 pages in word...LOL). Oh well, I'll just title 2nd post and on so I don't have to care ;)

    https://www.techpowerup.com/reviews/NVIDIA/DLSS_Su...
    If someone can test DLSS in EARLY DECEMBER, why can't Anandtech in January? You could at least show it NV vs. NV, with and without it, so people see how FAST it is (39%, as noted before in the DigitalFoundry YouTube vid above). Ah, right, you don't want people to know there is a 39% bump in perf coming for many games, huh? I see why AMD will skip it; it takes a lot of homework to get it right for each game, as the article from techpowerup discusses. Not feasible on 50mil net income, maybe next year:
    “DLSS is possible only after NVIDIA has generated and sampled what it calls a "ground truth" images—the best iteration and highest image quality image you can engender in your mind, rendered at a 64x supersampling rate. The neural network goes on to work on thousands of these pre-rendered images for each game, applying AI techniques for image analysis and picture quality optimization. After a game with DLSS support (and NVIDIA NGX integration) is tested and retested by NVIDIA, a DLSS model is compiled. This model is created via a permanent back propagation process, which is essentially trial and error as to how close generated images are to the ground truth. Then, it is transferred to the user's computer (weighing in at mere MBs) and processed by the local Tensor cores in the respective game (even deeper GeForce Experience integration). It essentially trains the network to perform the steps required to take the locally generated image as close to the ground truth image as possible, which is all done via an algorithm that does not really have to be rendered.”
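    To make the quoted description more concrete, here is a toy sketch of the general idea: train an upscaling network against supersampled “ground truth” frames, then ship the small trained model for local inference. This is NOT NVIDIA's actual DLSS pipeline; the network, sizes, and random stand-in data are invented purely for illustration:

    # Conceptual sketch only: a tiny upscaler trained toward "ground truth"
    # frames, standing in for the DLSS workflow described in the quote above.
    import torch
    import torch.nn as nn

    class ToyUpscaler(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
                nn.Conv2d(32, 3, 3, padding=1),
            )

        def forward(self, x):
            return self.net(x)

    model = ToyUpscaler()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    for step in range(100):
        low_res = torch.rand(4, 3, 270, 480)       # stand-in for lower-res rendered tiles
        ground_truth = torch.rand(4, 3, 540, 960)  # stand-in for supersampled "ground truth"
        opt.zero_grad()
        loss = loss_fn(model(low_res), ground_truth)
        loss.backward()                            # the "back propagation" the quote refers to
        opt.step()

    # The resulting weights are the small file that would ship to users for local inference.
    torch.save(model.state_dict(), "toy_upscaler.pt")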

    Yeah, AMD can't afford this on a PER GAME basis at under $50mil NET income yearly. Usually AMD posts a LOSS, BTW: a 400mil loss or MORE in 3 of the last 4 years, and 8B lost over the life of AMD as a company. 2018 should be ~400mil vs. NV's 3B-4B+ NET per year now (3.1B in 2017, plus over 4B for 2018 at 1B+ per quarter NET INCOME).

    “With DLSS enabled, the game runs at a nice and steady 67 FPS (a 33% performance increase). That's a huge difference that alone can be the deciding factor for many.”
    Agreed, I guess that's why Anandtech/Toms refuse to delve into this tech? ;)
    “You can now increase settings on models and textures that would have previously driven FPS down below playable levels. DLSS will in turn bring the framerates back up.” OK then, it's great; he says he likes the quality too, stating he's “pleased”. I'm sure someone will say it's not as good as native 4K. Well no, the point is 4K-“LIKE” quality on hardware that can't support native 4K ;) As noted in the previous quote: turn on options that would normally kill your fps, and use DLSS to get those fps back, making it playable again. DLSS will always look better than your original res, as again, it's turning 1080p into 1440p/4K (at 4K, so far in this game, it's just fps boosting). From what I've seen it's pretty much 1440p for free without a monitor upgrade, or reasonable 4K-“LIKE” quality again, on 1080p or 1440p. It also enables playable 4K for some who would normally turn stuff down to get there or not play at all.

    I could go on, but I can't even be bothered to read the rest of the article, as I keep having to check whether the benchmarks include something that makes the data USELESS to me yet again. You should ALWAYS be testing AMD's best advantages vs. NV's best advantages, unless it changes QUALITY (which would be like cheating if you lower it for better perf). IE, turn on everything that boosts speed for BOTH sides; unless, again, QUALITY is dropping, then turn it off. USERS will turn on everything unless it HURTS them, right? At 8:02 in that YouTube vid above, he gains 3-4% PER step up in adaptive shading settings. This lets the 2060 best the 1080 by up to 15%. Besides, there are so many OTHER reviews to read that maybe didn't do all the dumb things pointed out here.
    https://videocardz.com/79627/nvidia-geforce-rtx-20...

    I did skip to the end (conclusions on GPU reviews are usually ridiculous). Not quite mainstream, but that mark has moved up; just ask Intel/NV (HEDT, and the top 2080ti selling faster than lower models, though that might change with the 2060). “The biggest question here isn't whether it will impact the card right now, but whether 6GB will still be enough even a year down the line.”...LOL. Uh, I can say that about EVERY card on the market at some point, right? <15% of 125mil Steam users have 8GB+. Don't you think game devs will aim at 85% of the market first (maybe an HD texture pack pushes a few over later, but the main game aims at the MAIN audience)? This is like the consoles etc. mentioned above holding us back, and Intel iGPUs doing the same. 6GB probably will too, I guess.

    "Now it is admittedly performing 14-15% ahead of the 8GB GTX 1070, a card that at MSRP was a relatively close $379"...LOL. Anything to take a shot, even when the conclusion is asinine as you're not even talking what happens NEXT YEAR when all the games are using DLSS+RT (either or both), and maybe VSR which got 92% boost vs. 1060 here. That is a LOT more than 15% over 1070 too! FAR higher than 59% you quote without using it right??). I'd guess 20-30mil will be sold from RTX line in the next year (NV sells 65-70% of the discrete cards sold yearly, surely 1/2 of NV sales will be RTX after 7nm), and much of those will be 6GB or less? No doubt the 3050, next year or whatever will then have 6GB too and sell even larger numbers. If you are claiming devs will aim at 15% of the market with 8GB, I will humbly guess they will DEFINITELY aim at 6GB which IS already 11% (same as 8GB % BTW), and will double in the next year or less. Most are on 1080p and this card does that handily in everything. With only 3.5% on 1440p, I don’t think many people will be aiming at 8GB for a while. Sure you can max something the crap out of a few to CAUSE this, but I doubt many will USE it like this anyway (you benchmark most stuff with features disabled!).

    There are ~92-100mil discrete cards sold yearly now (36% of ~260-280mil PCs sold yearly), and 70% currently are NV cards. How many cards with 6GB OR LESS do you think will sell from the ~62-70mil NV GPUs sold in 2018? AMD might release a 6GB card in the next year or two also. Navi has a 4GB model at under $200 (rumored $129, I think higher but…), so plenty of new stuff will sell with under 6GB. Hopefully the 2050 etc. will have 6GB of GDDR5X (cheaper than GDDR6 for now) or something too, to get more 6GB out there, raising the 40% on 2GB/4GB (~20% each; a great upgrade for the 2GB people). Your site is such a fan of TURNING OFF features or turning graphics DOWN to play 1440p/4K, I don't get why this is a problem anyway. Can't you just turn off HD textures like you already do to avoid it (or something else)? Never mind, I know you can. So "something to revisit" is just hogwash. People can just turn down one or two things and magically it will be using 6GB or less again...LOL. Again, 6GB or LESS is 85% of the gamer market! See Steam. How many games hit 8GB in your benchmarks? LOL. You already test with stuff OFF that would cause an 8GB hit (HD textures etc.). What are you complaining about here? The card doesn't stop functioning because a game goes over 6GB; you just turn something down, right? LOL.

    “ What makes the $350 pricing at least a bit more reasonable is its Radeon competition. Against RX Vega at its current prices the RTX 2060 (6GB) is near-lethal”
    So is it “a bit more reasonable” or LETHAL? LETHAL sounds VERY reasonable to anyone looking at AMD before July, if Navi even hits by then, and they don't have RT+DLSS AFAIK, so that's worth something given the numbers from the YouTube guy at DigitalFoundry. His results testing the NEW features (well, duh, Anandtech) are quite amazing.

    “That driver is due on the same day of the RTX 2060 (6GB) launch, and it could mean the eventual negation of AMD’s FreeSync ecosystem advantage.”

    Uh, no, it DEFINITELY ENDS IT AS OF NOW. It is no ADVANTAGE if the other guy has it on all current cards right? It’s just a matter of checking the top ~100 freesync monitors for approval as NV will surely aim at the best sellers first (just like game devs vs. 85% of the market). Oh wait, they tested 400 already, 12 made it so far, but you can turn it on in all models if desired:
    “Owners of other Adaptive-Sync monitors will be able to manually enable VRR on Nvidia graphics cards as well, but Nvidia won't certify how well that support will work.”
    So maybe yours works already even if it's NOT on the list, it's just a matter of how well; but heck, that describes FreeSync anyway (2nd-rate gen1 at least, gen2 not much better as AMD still isn't forcing quality components) vs. G-Sync, which is easily the best solution (consider TCO over the monitor's life). You didn't even mention DLSS in the conclusion, and that is a MASSIVE boost to perf, basically offsetting the hit from RT. But you guys didn't even go there…ROFLMAO. Yet again, you mention the 6GB in the final paragraph…LOL. How many times can you mention a “what if >6GB” scenario (which can simply be fixed by turning something down slightly or OFF, like HD textures) while completely IGNORING DLSS, a main feature of RTX cards? A LOT, apparently. Even beta benchmarks of the tech are AWESOME. See the DigitalFoundry guy above. He and the techpowerup VRS/DLSS info both say a 32%/33% gain from turning either one on. You don't think this should be discussed at all in your article? That is MASSIVE for EITHER tech, right? As techpowerup notes in their DLSS article:
    “Our RTX 2080 Ti ran at 50 FPS, which was a bit too low for comfort. With DLSS enabled, the game runs at a nice and steady 67 FPS (a 33% performance increase). That's a huge difference that alone can be the deciding factor for many.”
    OK, so the 2080ti shows 33% from DLSS, and the 2060 shows 32% at DigitalFoundry with VRS. Seems like even games that patch it in will do it for this perf increase. Why would a dev ignore a ~33% increase across the board? AMD had better get this tech soon, as I totally agree this alone would persuade MANY buyers. Note DF did their testing of this at 1080p; techpowerup did it at 4K, which has less value here IMHO, really just making the unplayable playable, but at lower res it adds a 4K-LIKE look too.
  • TheJian - Tuesday, January 8, 2019 - link

    Screw it, all in a row, doesn't look bad...LOL. 3rd and final (assuming the post takes; each one grows):

    https://devblogs.nvidia.com/turing-variable-rate-s...
    How can you ignore something that is SIMPLE to integrate, with NV adding VRS plugins for game engines soon? Do you even have to do much work if NV includes it as a plugin? I doubt it. Also note: the more COMPLEX your pixel shaders are, the MORE you GAIN in perf using the tech. So your WHAT IF scenario (games always needing more stuff) works in reverse here, right? Games will NOT pass this up if it's easy to add, especially with a plugin in game engines. But you guys didn't even mention a 33% perf add, and as he also noted, when the GTX 980 launched it wasn't a massive improvement, but over its life, “as Maxwell matured, it left previous gens in the DUST” with driver updates!

    https://www.youtube.com/watch?v=edYiCE0b8-c
    For those who want to know the VRS tech. This vid is Dec 4…LOL. A month later, Anandtech has never heard of it. Also note the info regarding handhelds, as VRS tech really helps when your GPU resources are already stretched to the limit (think Nintendo Switch etc.). Note devs have been asking for this tech for ages, so he thinks the next consoles will support it.
    https://hothardware.com/reviews/nvidia-geforce-rtx...
    Discussion here of Content Adaptive, Foveated, Motion Adaptive, and Lens Optimized shading, and VRS as a whole. NVIDIA claims developers can implement content-based shading rate reductions without modifying their existing rendering pipeline and with only small changes to shader code. This is HUGE for VR too, as you can avoid rendering pixels that would be DISCARDED anyway before going to the VR headset.
    “Turing’s new Variable Rate Shading techniques, as well as its more flexible Multiview Rendering capabilities will take time for developers to adopt, but the net gain could be over a 20 percent speed-up in graphically rich scenes and game engines, but with comparable or higher image quality as a result of these optimizations.”
    Actually we’ve already seen 33% in Wolfenstein right? So he’s a little low here, but point made.

    https://developer.nvidia.com/vrworks/graphics/vari...
    “Variable Rate Shading is a new, easy to implement rendering technique enabled by Turing GPUs. It increases rendering performance and quality by applying varying amount of processing power to different areas of the image.”
    Ahh, well, it's only a 32% boost feature (netting a 92% boost over the 1060…LOL), so who cares about this, right? Never mind that it could be even more depending on shader complexity, as noted before. But AMD doesn't have it, so turn it all off and ignore free perf…LOL.
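    For what it's worth, a back-of-the-envelope model shows where gains of that size come from: coarser shading rates run one shader invocation per 2 or 4 pixels instead of per pixel. The screen-coverage fractions below are made up purely for illustration, and the real speedup also depends on how shader-bound the scene is:

    # Fraction of pixel-shader work left after applying coarser VRS rates to
    # parts of the screen (1x1 = full rate, 2x1 = half, 2x2 = quarter).
    def shading_work(coverage):
        invocations_per_pixel = {"1x1": 1.0, "2x1": 0.5, "2x2": 0.25}
        return sum(frac * invocations_per_pixel[rate] for rate, frac in coverage.items())

    baseline = shading_work({"1x1": 1.0})
    vrs = shading_work({"1x1": 0.5, "2x1": 0.3, "2x2": 0.2})  # hypothetical scene split
    print(f"shading work vs. baseline: {vrs / baseline:.2f}x")  # 0.70x -> ~30% less shader work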

    https://www.tomshardware.com/reviews/nvidia-turing...
    Tomshardware knew what it was at the 2080 review in SEPT and noted THIS:
    “But on a slower card in a more demanding game, it may become possible to get 20%+-higher performance at the 60ish FPS level. Perhaps more important, there was no perceivable image quality loss.”

    OK so why did they turn it all off in 2060 review?
    https://www.tomshardware.com/reviews/nvidia-geforc...
    “To keep our Wolfenstein II benchmarks fair, we disable all of the Turing cards' Adaptive Shading features.”
    ER, UM, if QUALITY is NOT lost, as they noted before, WTH would you turn it OFF for? Is it UNFAIR that NV is far faster due to BETTER tech that does NOT degrade quality? NO, it's AMD's problem that they don't have it. ALL users will turn EVERY feature on if those features do NOT drop QUALITY, right? Heck, OK, if you're not going to test it AGAINST AMD, then at least test the card against itself, so people can see how massive the boost can be. Why would you ignore that? Ah right, the sister site is just as bad as Anandtech…ROFL. Acting like one company doesn't have features that they are CLEARLY spending R&D on (and REMEMBER, as noted before, devs wanted this tech!), just because the OTHER guy hasn't caught up, is DUMB or MISLEADING for both sites, which have gone down the toilet for reliable data on how you'd actually use your product. It only counts if AMD has it too…Until then, THESE FEATURES DON'T EXIST, we SWEAR…LOL. The second AMD gets it too, we'll see Toms/Anand bench it…LOL. Then it won't be “turned off Turing's adaptive features”, it will be “we turned on BOTH cards' Adaptive features”, because AMD now won't be left in the DUST. They screw up their conclusion page too…LOL:
    “No, GeForce RTX 2060 needs to be faster and cheaper than the competition in order to turn heads.”
    Uh, Why do they have to be CHEAPER if they are FASTER than AMD?
    “Nvidia’s biggest sin is probably calling this card a GeForce RTX 2060. The GeForce GTX 1060 6GB launched at $250.”
    Uh, no, it started at $299 as a Founders Edition, just as this one is (as PCWorld, Guru3d, etc. call it), so again, expect a price drop once NV is done selling them direct.

    https://hothardware.com/reviews/nvidia-geforce-rtx...
    “GeForce RTX 2060 - Taking Turing Mainstream”
    I guess some think $350 is mainstream…LOL.

    https://www.pcworld.com/article/3331247/components...
    As PCWorld says, “Not only does the card pack the dedicated RT and tensor core hardware that gives RTX GPUs their cutting-edge ray tracing capabilities, it trades blows in traditional game performance with the $450 GTX 1070 Ti rather than the $380 GTX 1070.”
    https://www.pcworld.com/article/3331247/components...
    “The only potential minor blemish on the spec sheet: memory capacity. The move to GDDR6 memory greatly improves overall bandwidth for the RTX 2060 versus the GTX 1060, but the 6GB capacity might not be enough to run textures and other memory-intensive graphics options at maximum settings in all games if you’re playing at 1440p resolution.”
    OK, so according to them, who cares, as 1440p cards have 8GB, and 1440p is only 3.5% of the market anyway…LOL.
    NV's response to why 6GB: “Right now the faster memory bandwidth is more important than the larger memory size.” They could have used 8GB of cheaper GDDR5, but they chose faster memory rather than more of it to hit $349 (and probably $300 next month from other vendors with non-Founders models). Though they seem to think maybe all will be $349.
    “We focused our testing on 1440p and 1080p, as those are the natural resolutions for these graphics cards.”
    LOL, agreed… Testing 4K here with ~1.5% usage is dumb.
    “We use the Ultra graphics preset but drop the Shadow and Texture Quality settings to High to avoid exceeding 8GB of VRAM usage”
    Ahh, so PCWorld proves Anandtech is misleading people, as if cards just die when you hit 8GB. Nope, you turn something down in a game like Middle-earth: Shadow of War…LOL. Anandtech acts like this is a SHOWSTOPPER. OMG, OMG…LOL. Hmm, 18 degrees cooler than Vega 64, 8 below Vega 56, ouch.

    Should all make sense, but it is almost 10am and I've been up all night...LOL. No point in responding, Anandtech; that will just end with me using DATA to beat you to death like the 660ti article (where I used Ryan's own data against his own conclusions, plus tons of other data from elsewhere, to prove him an outright liar), which ended with Ryan and company calling me names/attacking my character (not the data...LOL), which just looks bad for professionals ;) Best to just change how you operate, or every time a big product hits and I have time, boom, data on how ridiculous this site has become since Anand left (Toms since Tom left...ROFL). I love days off. Stock homework all day, game a bit, destroy a few dumb reviews if I have time left over. :) Yeah, I plan days off for "EVENTS", like launches. Why does my review seem to cover more RELEVANT data than yours? ROFL. I own AMD stock, NOT Nvidia (yet; this year…LOL, wait for the Q1 down report first, people, IMHO).

    One more point: Vulkan already has VRS support, and PCWorld mentions they are working with MSFT on DX support for VRS, but “Until then, it'll expose Adaptive Shading functionality through the NVAPI software development kit, which allows direct access to GPU features beyond the scope of DirectX and OpenGL.” OH OK ;) Back to reading OTHER reviews, done trashing this useless site (for GPUs at least; too much info is missing that buyers would LIKE to know about perf).
  • PeachNCream - Wednesday, January 9, 2019 - link

    TL;DR
  • boozed - Friday, January 11, 2019 - link

    JTFC...
  • LSeven777 - Tuesday, January 22, 2019 - link

    It's a shame you don't use all that energy to do something useful.
  • El Sama - Tuesday, January 8, 2019 - link

    This company needs to be brought back to reality (where are you, AMD?); at this rate we will have a 500 USD RTX 2260 in a few years.
