  • AmdInside - Tuesday, May 31, 2011 - link

    Summer is the wrong time to release these suckers. My house gets hot enough as it is with a single graphics card. Release these in the winter time when my house gets cold and it will double as a home heater system.
  • StevoLincolnite - Tuesday, May 31, 2011 - link

    Thankfully it's Winter in the Southern Hemisphere. (Australia.) xD

    Don't really see any need to upgrade my 2x unlocked 6950's at this stage, roll on the Radeon 7000 series!
  • tipoo - Tuesday, May 31, 2011 - link

    "6950X2 which should enter into the market between the 6970 and 6990"

    There was a time when GPU naming conventions made sense. The X2 part is slower than the 6990? Adding the X2 back in is on Powercolor and not AMD of course, but they shouldn't have dropped it in the first place.
  • KineticHummus - Wednesday, June 01, 2011 - link

    It does make sense here. 6950X2 essentially means two 6950s. A 6990 performs better than two 6950s. Therefore, a 6990 is better than a 6950X2. X2 does not mean it is better; it means there are two of that chip on one card.
  • tipoo - Wednesday, June 01, 2011 - link

    I know. I just think the 6990 should have kept the X2 moniker.
  • StormyParis - Tuesday, May 31, 2011 - link

    600 possible customers Worldwide.

    I'm actually mildly irritated by all this focus on the extreme high end. It distracts from where the action really is, which is the low to middle segment.
  • qhoa1385 - Wednesday, June 01, 2011 - link

    Powercolor was just trying to show off those bad boys anyway:
    " ... Powercolor don’t have intentions to bring this product to market unless they see a demand for it – they were more inclined to show off a 6950X2 ..."
  • lurk1n g00d - Wednesday, June 01, 2011 - link

    Makes sense; there really aren't that many games that would make it a worthwhile purchase atm, IMO.
  • nyran125 - Sunday, June 19, 2011 - link

    Doesn't make any sense to me except to just go "wow, look at that." Crysis 2 is backwards technology, and Modern Warfare 3 or Battlefield 3 isn't going to require anything more than an AMD 6870, so I don't see any reason for ultra high-end video cards.
  • nyran125 - Sunday, June 19, 2011 - link

    And I don't mean two 6870s either. One 6870 can run all the games out today on MAX, including The Witcher 2. So it's a waste of time.
  • nyran125 - Sunday, June 19, 2011 - link

    It's just a big waste of electricity money.
  • campdude - Sunday, June 19, 2011 - link

    It's all about future proofing your system...
    It's all about running the highest anti-aliasing setting (forced in the control panel) in all your games.
    It's about running high resolutions (1080p is now pretty much the norm for new monitors).

    Personally, I bought a 4870 for a pretty penny right when it was released.
    But it has lasted me three years so far and is still going strong.

    I can't justify upgrading to a 6970, simply because it isn't fast enough.

    Don't get me wrong... the 6970 is a supreme video card compared to mine, BUT it's just not fast enough to make me feel good about the upgrade.

    In a little while I'll be upgrading from a dual core to an octa core... I need that kind of performance gap from my video card upgrade as well.

    So far the only contender is the 6990 or the Nvidia equivalent.

    However, if my current video card exploded, I would purchase a 6850 X2 or 6950 X2 if it was reasonably cheaper than the 6990. But that is an unlikely scenario... there will probably be a single GPU in the HD 7000 series that runs as fast as a 6850 X2, so I can wait.
  • tzhu07 - Saturday, July 02, 2011 - link

    Maybe it's because I'm just an occasional gamer, but I'd rather upgrade my video card from mid-range to mid-range than from high end to high end.

    You save a ton of money on both the initial purchase price and electric bills, and you only have to sacrifice some fps or some image quality.

    Not to mention that for people who are sensitive to sound, mid-range cards are quieter, and some company often makes a passive version of one of them.
