Best Video Cards: March 2014

by Ryan Smith on 3/24/2014 9:00 AM EST

53 Comments


  • apappas - Monday, March 24, 2014 - link

    If you intend to use the video card on a Linux/Steambox system, do not buy AMD! Their driver support is atrocious whereas NVIDIA is at parity with their Windows driver with the exception of Optimus which needs a small but easy workaround.

    P.S Please add some Linux coverage. I beg you!
    Reply
  • themelon - Monday, March 24, 2014 - link

    Linux is the reason why I never even consider AMD graphics. Reply
  • augiem - Monday, March 24, 2014 - link

    That's really sad. I haven't had an ATI/AMD graphics card in my main computer since... gosh, I don't know... 2000? I forget. It was a LONG time ago. I've considered them every time I need an upgrade, but I always see the same old bad driver issues come up in so many forum posts, etc. Really sad. Nvidia has always been rock solid for me even for high end 3D work. I want to try AMD again someday, but always hearing this repeated year after year doesn't inspire much confidence. Reply
  • Gigaplex - Monday, March 24, 2014 - link

    On the flip side, if you're using Linux and care about open source drivers, then avoid NVIDIA. Reply
  • stmok - Tuesday, March 25, 2014 - link

    I don't use AMD graphics cards in Linux because Blender doesn't properly support it for GPU accelerated rendering. OpenCL isn't supported for reasons described in the following link...
    => http://wiki.blender.org/index.php/Dev:2.6/Source/R...

    If you want GPU acceleration under this particular scenario, you must buy Nvidia and use the closed source driver, as only CUDA is properly supported in Blender's rendering engine.
    Reply
  • anandreader106 - Monday, March 24, 2014 - link

    Excellent performance guide and market summary Ryan. Thank you! Reply
  • jeremymcdev - Monday, March 24, 2014 - link

    Waiting for something in the $300-350 range to do high settings at 1440p. I figure I'll get an 870 when they come out. Hoping for May. Sold my two 5970s and now I'm roughing it with an integrated HD 4600. At least it gives me a chance to catch up on some console games. Reply
  • ddriver - Monday, March 24, 2014 - link

    Hell must have frozen over; an AnandTech article has recommended more Radeons than GeForces :D Reply
  • BPM - Monday, March 24, 2014 - link

    Come on. They said AMD cards offered more bang for your buck. The AMD 290 is loud but extremely well priced. Reply
  • Flunk - Monday, March 24, 2014 - link

    I'd say the big problem with Nvidia's current lineup is price. If the Radeons were all selling at MSRP the board would be almost entirely red. Reply
  • Homeles - Monday, March 24, 2014 - link

    Huh, it's almost as if AT isn't biased :) Reply
  • Hrel - Thursday, March 27, 2014 - link

    You mean like they always do? If anything you should be surprised that there's Nvidia in there at all. Reply
  • slickr - Monday, March 24, 2014 - link

    I'm waiting for $150 AMD Radeon R9 270X and GTX 760.

    The GTX 460 1GB was $150; now, in order to get that kind of performance, you must spend $200+, and to get the $250 performance from back then you must now spend $330+. Most cards are custom, so on average $350.

    These are not realistic prices, and I for one am not buying a new GPU until prices drop. I don't know what AMD and Nvidia are thinking, as the PC market is shrinking: 2013 was down 16% from 2012, and 2012 was down 12% from 2011. 2014 is said to be down anywhere from 10% to 20% from 2013.

    With these GPU and CPU prices it's not surprising, though; they are too expensive for what they offer. Games are only going to become more demanding now that the new consoles are out. Even though the consoles lack performance right now, as the development process matures and companies start abandoning development for the PS3 and Xbox 360, we are going to see much more demanding games for the consoles, and a current $150 card will not be able to keep up with those games.
    Reply
  • nathanddrews - Monday, March 24, 2014 - link

    "The GTX 460 1GB was $150, now in order to get the kind of performance you must spend $200+"

    The GTX 460 1GB was $230 at launch in 2010 ($250 adjusted for inflation), which is the same launch price as the GTX 660, which is much more powerful.

    The new 750 Ti is $150. Using AT Bench, it performs quite close to the GTX 480, which launched at $499 ($540 adjusted for inflation), while consuming 60% less power:
    http://www.anandtech.com/bench/product/1135?vs=113...
    Reply
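    The inflation adjustment quoted above can be sanity-checked with a quick script. This is only a sketch: the CPI-U annual index values are approximate assumptions added here for illustration, not figures from the comment itself.

    ```python
    # Sanity check of the inflation-adjusted launch prices quoted above.
    # The CPI-U annual averages are assumptions (approximate BLS values),
    # not figures from the original comment.
    CPI = {2010: 218.1, 2014: 236.7}

    def adjust(price, from_year, to_year):
        """Convert a nominal price into to_year dollars using the CPI ratio."""
        return price * CPI[to_year] / CPI[from_year]

    gtx460 = adjust(230, 2010, 2014)  # GTX 460 1GB launch price
    gtx480 = adjust(499, 2010, 2014)  # GTX 480 launch price
    print(round(gtx460), round(gtx480))  # roughly matches the $250 / $540 figures
    ```

    Both results land within a few dollars of the figures the comment cites, so the claim holds under these assumed index values.
    
    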
  • minijedimaster - Monday, March 24, 2014 - link

    "we are going to see a lot more demanding games for the consoles and a current $150 card will not be able to keep up with those games."

    http://www.anandtech.com/bench/product/1034?vs=112...

    I don't get it; the PS4/XBONE uses a modified AMD 7870 GPU. The 7870 performs pretty close to the $150-recommended R7 265.

    Considering the GPU in the consoles will never be changed, games made for them will be capped in performance and capabilities right there. If I buy the R7 265 today, it will run all upcoming next-gen console games for the next couple of years at least. Even then, if I really wanted to, I can always upgrade it after a couple of years at the same price point and be even faster, costing me a total of $300 as opposed to the $500/$600 of a console. Your argument about it being too expensive doesn't make sense. (Let's not even mention how much I save on the games themselves over consoles with Steam sales and the like.)
    Reply
  • Penti - Monday, March 24, 2014 - link

    Your computer is not a console; all its software is different. The PC port (or original platform) will not be optimised around the R7 265 (GCN 1.0) and Windows, and storage isn't as limited on PCs. The games will get harder to drive: a late PS3 port is fine on an R7 265 or thereabouts, but running newer AAA titles will be tougher, and you don't really want to go the route the consoles do; you do not want to go 30 fps. Consoles have low-level access and teams of developers that fix issues and do optimisation on that platform alone. PCs have different drivers, changes and so on. Reply
  • Coztomba - Monday, March 24, 2014 - link

    For people outside the US who aren't affected by the coin mining craze, like here in Australia, I can get a 3GB 280X $40 or so cheaper than a 2GB GTX 770. Am I right in assuming the AMD will be a better buy for me? Reply
  • piroroadkill - Monday, March 24, 2014 - link

    Yes. I feel this article should have made that clearer, because not everyone who reads this is in the US. Reply
  • Ryan Smith - Tuesday, March 25, 2014 - link

    Sorry about that. In case it's not clear, all of these buyers guides are going to be US-centric. Reply
  • HighTech4US - Monday, March 24, 2014 - link

    Great, another site recommending a ghost card. You can hear about it but never touch it.

    The R7 265 is never in stock but for a few hours. Even the one $20 higher for $169.99 isn't in stock.

    http://www.newegg.com/Product/ProductList.aspx?Sub...
    Reply
  • Ryan Smith - Tuesday, March 25, 2014 - link

    Noted. But for what it's worth, that card was in stock when this was written on Friday. Reply
  • boot318 - Tuesday, April 01, 2014 - link

    You're not using Newegg's search feature right for the R7-R9 series.

    http://www.newegg.com/Product/ProductList.aspx?Sub...
    Reply
  • dblkk - Monday, March 24, 2014 - link

    I'll take two GTX Titan Blacks, please! Hoping the price falls a little below $1,000, and hoping it's a nice bridge between workstation-class and gaming capabilities. Reply
  • rickon66 - Monday, March 24, 2014 - link

    My EVGA GTX 780 Ti OC w/ACX cooler is the best $730 I've spent in a long time; hopefully it will last for a couple of years before upgrading. I've been playing Far Cry 3 at max settings on a Dell U3011 and it looks amazing; now if it was only a little faster. Reply
  • rhx123 - Monday, March 24, 2014 - link

    I wish you/other sites would stop using the reference 770 picture when you refer to one.
    It is impossible to buy, and I find sampling only the press with a very limited run of units with the Titan cooler very naughty.
    Reply
  • nightryder21 - Monday, March 24, 2014 - link

    I would love to see this article for other tasks such as video transcoding or a Lightroom workflow. Reply
  • apertotes - Monday, March 24, 2014 - link

    I'd like to see a bit more focus on 120 Hz gaming. But thanks a lot for your outstanding job.

    I just built a new computer and your past article was a huge help.
    Reply
  • Ryan Smith - Tuesday, March 25, 2014 - link

    For 120Hz gaming, you can generally follow our 1440p suggestions, and this is something I'll be sure to point out in our next guide. In the case of 1440p the resolution is doubled rather than the frame rate. Reply
  • apertotes - Tuesday, March 25, 2014 - link

    Thanks! Reply
  • Anders CT - Monday, March 24, 2014 - link

    I'll wait for bigger Maxwell cards. Reply
  • Alchemy69 - Monday, March 24, 2014 - link

    I've yet to encounter a game that isn't perfectly playable at ultra settings @1080p on an HD 7770, provided that you turn off AA, and I find the benefits of AA at that resolution to be questionable at best. Reply
  • cobrax5 - Wednesday, April 02, 2014 - link

    I'm not entirely sure about that, unless you have the 2GB version.

    The 7770 is pretty close to a 6850 in terms of performance (albeit much more efficient). I have a 6850 1GB card in my media computer that hooks up to my 1080p flatscreen. Even in games like Guild Wars 2, my wife has to turn down a few settings other than AA to get frame rates above 30 FPS. When there are a lot of characters on screen casting spells, it will bog down to the high teens/low 20s.

    I have a 6950 2GB (no firmware hack on this one) connected to a 1080p monitor. In CryEngine games like Crysis 3 and Far Cry 3, I get bogged down at the highest settings with no AA. Granted, these are probably not the best examples. In Battlefield 3 multiplayer I can stay above 40-50 FPS with everything turned up and a decent bit of action.

    I suppose it depends on what you consider playable. Some people are happy with 30 FPS and crappy frame time variance because the occasional stutter doesn't bother them. For others, like those who play competitive multiplayer shooters, 60 FPS and up with good frame time variance when there are 64 players on screen is the minimum acceptable performance.

    Generally, though, I agree with what you are saying. Some things like depth of field and ambient occlusion can hurt a mid level card like the 7770, but removing AA completely and possibly using SSAO instead of HBAO can let you run almost any modern title at max settings at 1080P.
    Reply
  • cobrax5 - Wednesday, April 02, 2014 - link

    I meant Battlefield 4. Battlefield 3 was great with my 5770 on a 1680x1050 monitor. Since I replaced that 5770 1GB with a 6950 2GB, I upgraded the monitor to 1080P since I had ample VRAM. Reply
  • techn0mage - Monday, March 24, 2014 - link

    I signed up just to ask this noob question.
    In this section: "Extreme Performance with Refinement ($499): NVIDIA GeForce GTX 780"
    The article states: "As an added bonus, even at the $499 base price this gets access to NVIDIA’s amazing metal cooler"

    However the picture of this card shows a similar looking cooler: "Reaching For 1440p ($329): NVIDIA GeForce GTX 770"

    My question is, what is different about the cooler shown on the GTX 770, since it appears to be very similar to the highly-recommended "metal cooler" shown on the GTX 780? The cooler on the GTX 770 certainly looks "metal" so I must be missing something in not understanding the point about the GTX 780.
    Reply
  • A5 - Monday, March 24, 2014 - link

    As far as I know, there are no 770s available at retail with the stock NVidia cooler.

    You're right that the picture shows the same cooler, because it is. The reference cards sent to reviewers had that cooler on them, but none of the OEMs used it due to cost.
    Reply
  • techn0mage - Monday, March 24, 2014 - link

    Very well explained, thank you. Reply
  • A5 - Monday, March 24, 2014 - link

    You're welcome.

    I do wish sites would update their pictures and use one of the actual cards that is out there, though :P
    Reply
  • Magichands8 - Monday, March 24, 2014 - link

    I'm a little perplexed by this:

    Taking the Single-GPU Crown For Gaming ($679): NVIDIA GeForce GTX 780 Ti

    When the Sapphire 290 Tri-X is $494.00. With few exceptions it seems to run neck-and-neck with the 780 Ti when it's not beating it. Even when it's overclocked it still runs cooler and quieter than the 780 Ti.
    Reply
  • Chaser - Monday, March 24, 2014 - link

    Um, that's one single derivative of a 290X, against most of the others, that might, according to you, benchmark better than a GTX 780 Ti. Meh. Reply
  • Magichands8 - Tuesday, March 25, 2014 - link

    Or at least according to the review. I've only read the one done here at AT though. I just received mine. Reply
  • username609 - Wednesday, March 26, 2014 - link

    Remember that the 290 Tri-X is a 290, not a 290X. The 290X Tri-X should run close to the 780Ti with the better cooler in place; the 290 Tri-X can beat a 290X with a stock cooler, but it's still a notch below the 290X Tri-X. (The 290X Tri-X has a $749 pricetag at the Egg; the 290 Tri-X kills it in price/performance.)

    Now that most of the GPU manufacturers have moved to aftermarket coolers, it would be nice to see another roundup of high-end AMD cards. Maybe throw in an update of Nvidia cards too, and compare how everything runs with the latest drivers?
    Reply
  • Ryan Smith - Tuesday, March 25, 2014 - link

    Since we can't possibly review every single card released, these guides are focused on suggesting what class of card to get, e.g. a GTX 770 or R9 280X, rather than suggesting specific cards. The 290 Tri-X is a very interesting card, but it would be poor form for us to suggest the 290 overall based on a card only offered by a single vendor. Reply
  • Magichands8 - Tuesday, March 25, 2014 - link

    Yes, I did notice that you guys were talking in general terms. Also, I see your point about making such a recommendation based upon a single vendor's card. But can other vendor cards be expected to perform all THAT differently? And if not then doesn't the reference R9 290 become itself the exception to the rule in decisively under-performing the 780 Ti? Just a thought. It also occurred to me that the Tri-X can only defeat the 780 Ti by a good margin when it's been overclocked past its factory overclock and it wouldn't be fair to compare it under those conditions. Still though, my eyes just about popped out of my head when I saw $679 for the 780 Ti. I'm very glad I ordered a Tri-X :) Reply
  • doubledeej - Tuesday, March 25, 2014 - link

    Where does one go for a single-slot card with decent performance? Reply
  • jimjamjamie - Tuesday, March 25, 2014 - link

    Getting excited for the GTX 8xx series cards if the 750 ti is anything to go by. Reply
  • cgalyon - Tuesday, March 25, 2014 - link

    I'm surprised the 760 isn't recommended in place of the 270. Is that just due to price (the 760 is $50 more)? My sense was that the 760 outperformed the 270 overall.

    I've been waffling between those two models for my upgrade (from a 470). Really, I would like to get more than 2GB RAM, though there are variants to address that regardless of GPU.
    Reply
  • Barnassey - Tuesday, March 25, 2014 - link

    Is it possible to get benchmarks comparing older cards against newer ones? Bench only lets you select the year for older cards, which makes the whole table useless if you can't directly compare older to newer. Reply
  • cgalyon - Friday, March 28, 2014 - link

    Agreed, I would like to be able to choose to restrict the Bench by year *or* simply view all years. For those of us who are kind of getting far behind in GPU's, this would be very helpful in determining which cards would yield a significant increase in performance. Though I've also been kind of disappointed with how infrequently the Bench data are updated. I imagine it takes a lot of time to enter, though, so I don't really blame them. Just wish it was updated more frequently. :) Reply
  • Hrel - Thursday, March 27, 2014 - link

    "Since our guide is written on the assumption that most buyers are looking for the best performance at a given price, our performance recommendations are going to favor AMD, as they’re more willing to throw out larger, more powerful cards at these sub-$200 price bands. NVIDIA on the other hand isn’t going to be able to directly compete with AMD on price/performance," ... you do realize electricity costs money right?

    Not to mention heat/noise is less on Nvidia.

    I appreciate that you guys provide objective, scientific data. But your editorials, recommendations and conclusions are FAR too often colored by personal preference, emotion. I can't fault you for bias, since you do present the data your "conclusions" are based on. I simply ask that you attempt to be more mindful of your biases.
    Reply
  • hangfirew8 - Tuesday, April 01, 2014 - link

    Electricity cost is not the buyer's consideration here, because Mom pays the electricity bill. Reply
  • draperw86 - Friday, March 28, 2014 - link

    Can anyone tell me how Intel's Broadwell processor and the subsequent motherboards released in its wake will affect these GPUs? And what about the DDR4 RAM that's coming around the same time? Reply
  • cobrax5 - Wednesday, April 02, 2014 - link

    I've been thinking about upgrading from a 6950 2GB to something newer.

    I waited to see if the new R7/R9s would drop prices a lot, but crypto mining dashed those dreams. Since I'm on a 1080p monitor, I don't see an urgent need to upgrade, but my wife's computer (6850 1GB on a 1080p TV) would get my 6950.

    My price range is around $300 (the 770 could be a candidate, but it's pushing the price). Nvidia-wise, I'd like to see how the higher-end Maxwell cards shake out. I really, really wanted an aftermarket-cooled 290 and would pay the extra $100, but the prices are artificially high (~$500 on Newegg for an aftermarket 290 instead of the supposed ~$400 launch price kills me).

    Thoughts?
    Reply
  • beck2050 - Friday, April 04, 2014 - link

    Quiet and cool is a big plus, IMHO. Reply
