44 Comments

  • havoti97 - Thursday, April 28, 2011 - link

    AMD pulling a fast NVIDIA eh? Reply
  • MrSpadge - Thursday, April 28, 2011 - link

    Yeah, that's so NVIDIA. But at least they're not trying to hide it..

    MrS
    Reply
  • Leyawiin - Thursday, June 09, 2011 - link

    And that somehow makes it better. lol Reply
  • Zap - Thursday, April 28, 2011 - link

    A 2-page thread in the forums was started last night, and now it shows up on the front page. Hmmm... Reply
  • Ryan Smith - Thursday, April 28, 2011 - link

    The news got out early; the cards were already showing up in retail. We had already scheduled a briefing before any of that. Reply
  • happymedium - Thursday, April 28, 2011 - link

    Thanks, for answering all my questions Ryan, perfect timing, good mini review. :) Reply
  • marc1000 - Thursday, April 28, 2011 - link

    I wonder if we will see a die shrink of Juniper to 28nm, but keeping the same specs. I really like this GPU, I own one... a really good cost-benefit ratio. hehe... Reply
  • Ryan Smith - Thursday, April 28, 2011 - link

    It would be interesting, that's for sure. Traditionally (and I use this term loosely since it's a recent tradition) AMD does a pipecleaner part on a new process ahead of a full lineup. That's where the 4770 came from. Given what happened with the 40nm process I'd think it's in AMD's best interests to run another pipecleaner for 28nm, but we'll see. Reply
  • marc1000 - Thursday, April 28, 2011 - link

    I remember the 4770 launch. I kept waiting for the 5000 series instead of buying any 4x00 because that 4770 launched on a new process, telling everyone that new things would come soon.

    I believe that Juniper is small enough (with its 128bit bus) that if AMD indeed does a shrink of it to 28nm, the resulting GPU would be really cheap and could consume under 75W - and drop the need for that extra PCIE power connector. Of course, I'm assuming that they can keep leakage under control on the new process (which I believe they can, given that they've known this GPU for over 2 years now).

    Maybe this could be one of the 28nm tape-outs AMD is planning this year.... =D
    Reply
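marc1000's sub-75W guess can be sanity-checked with a back-of-envelope dynamic-power estimate. This is only a sketch: the 108 W figure is the stock HD 5770 board power, while the capacitance and voltage scaling factors for a 40nm-to-28nm shrink are illustrative assumptions, not foundry data.

```python
# Rough dynamic-power scaling: P ~ C * V^2 * f.
# The scaling factors below are illustrative assumptions, not process data.

def scaled_tdp(tdp_w, cap_scale, v_old, v_new, f_scale=1.0):
    """Estimate board power after a die shrink from C*V^2*f scaling."""
    return tdp_w * cap_scale * (v_new / v_old) ** 2 * f_scale

# Juniper (HD 5770) is rated at 108 W board power on 40nm.
# Assume the 40nm -> 28nm shrink roughly halves switching capacitance
# and lets core voltage drop from ~1.1 V to ~1.0 V at the same clock.
estimate = scaled_tdp(108, cap_scale=0.5, v_old=1.1, v_new=1.0)
print(round(estimate, 1))  # lands well under 75 W
```

Even with these rough assumptions the estimate comes in comfortably under the 75 W a PCIe slot can supply on its own, which is what makes the no-connector scenario plausible.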
  • Ota-kun - Thursday, April 28, 2011 - link

    The most compelling thing I see between these 2 chips is that you can crossfire both of them..... Reply
  • marc1000 - Thursday, April 28, 2011 - link

    You mean, do a CF with one 5770 and one 6770???

    Wouldn't we have BIOS/driver problems?
    Reply
  • T0morrow - Thursday, April 28, 2011 - link

    What about flashing a 5770 to a 6770 with the necessary BIOS? Reply
  • Ryan Smith - Thursday, April 28, 2011 - link

    It's entirely possible. However the cards would need to be nearly identical to work (otherwise hardware such as VRMs may not match what the BIOS is programmed for), which may be a problem for older cards. Reply
  • silverblue - Thursday, April 28, 2011 - link

    "What is NIVIDA’s lineup?" :P Reply
  • lifeblood - Thursday, April 28, 2011 - link

    The engineer and tech enthusiast in me says this is deceptive and very low of AMD. The businessman and salesman in me says they kinda needed to do it. Well, now the nVidia fanboys get to flame the AMD fanboys in the forums for AMD doing what they accused nVidia of doing. Should be fun reading for the next day or two. Reply
  • Stuka87 - Thursday, April 28, 2011 - link

    It is a bummer that they are the same cards. But it's not like AMD is the first to do this. Just about the whole nVidia 300 series lineup was rebadged 200 series chips.

    I actually have a 5750 now and I am looking to upgrade for when BF3 hits the stores. Going to have to do that extra step of verifying benchmarks and such before choosing a card.
    Reply
  • marc1000 - Thursday, April 28, 2011 - link

    I own a 5770 and will OC the hell out of it and wait until some 28nm mainstream gpus launch... I really want to upgrade, but these boards are "good enough" to run one monitor at 1680x1050 or 1920x1080 resolutions. So I believe I can wait until next-gen gpus... Reply
  • Arbie - Thursday, April 28, 2011 - link

    I don't really have an opinion on the issue itself, at least none that you haven't very well examined from both sides already. This needed to be brought out and given just the right amount of perspective and column space, and I think you did that.

    Thanks for the good journalism.
    Reply
  • Shadowmaster625 - Thursday, April 28, 2011 - link

    It is a good card. I just bought one recently. A billion transistors, for under $100. I hope they put this design in the new nintendo console. I hope they put it in Trinity. Everyone should have one. lol Reply
  • marc1000 - Thursday, April 28, 2011 - link

    I believe this is the exact GPU that will be in the new Nintendo console. The rumor says "a R700-like gpu with DX11", and this GPU already exists: it is Juniper. I guess it has a little over triple the performance of Xenos, and if they manage to put some embedded RAM on it like the Xbox did... then it would have really high performance! Reply
  • Shadowmaster625 - Thursday, April 28, 2011 - link

    The amount of embedded RAM on Xenos is so small I wonder if any of these modern lazy shovelware developers even make use of it. Reply
  • marc1000 - Friday, April 29, 2011 - link

    yes, they do. that small amount of RAM is also so fast that it enables Xbox to do 2xAA "for free" on a really old architecture. some embedded RAM in a new gpu would work wonders... Reply
  • blountmatt - Thursday, April 28, 2011 - link

    I just ordered a rig through HP two weeks ago (that has yet to arrive, mind you) with this card. I did pretty extensive research on each component and the pricing of the overall setup and I was pretty happy until this very moment. Of course, the only component I didn't research was the HD 6770M. I still have 14 days from delivery to return the laptop, so thank you for the article, AnandTech!

    On a side note, since everyone on this site seems to be more intelligent than I, would someone care to comment to see if this was a good buy? I would appreciate any feedback.

    • Genuine Windows 7 Home Premium 64-bit
    • 2nd generation Intel(R) Quad Core(TM) i7-2630QM (2.0 GHz, 6MB L3 Cache) w/Turbo Boost up to 2.9 GHz
    • 1GB GDDR5 Radeon(TM) HD 6770M Graphics [HDMI, VGA]
    • FREE Upgrade to 6GB DDR3 System Memory (2 Dimm)
    • FREE Upgrade to 750GB 5400RPM Hard Drive with HP ProtectSmart Hard Drive Protection
    • 6-Cell Lithium-Ion Battery (standard) - Up to 5.25 hours of battery life +++
    • 17.3" diagonal HD+ HP BrightView LED Display (1600 x 900)
    • FREE Upgrade to Blu-ray player & SuperMulti DVD burner
    • HP TrueVision HD Webcam with Integrated Digital Microphone and HP SimplePass Fingerprint Reader
    • Intel 802.11b/g/n WLAN and Bluetooth(R) with Wireless Display Support

    $982.79 after tax with a two year warranty
    Reply
  • GodisanAtheist - Thursday, April 28, 2011 - link

    Laptop parts ARE NOT the same as their desktop equivalents. Laptops exist in their own naming and hierarchy bubble, which is both good and bad news in a sense. The good news is that this rebadging has absolutely nothing to do with the HD 6770M in the laptop you ordered. The bad news is that the 6770M isn't going to perform anything like a desktop 6770; it's more like a 6670, because the underlying architecture of the 6770M is based on the desktop 6670.

    This behavior is not exclusive to either side, Nvidia or AMD, and can be extremely frustrating for the end user. For example, the GTX 580M from Nvidia is a far cry from the desktop GTX 580, and actually more like a GTX 460.

    All that being said, you got a pretty good deal on that laptop there, and while the 6770m isn't what you thought it is, it is definitely a very solid laptop GPU and more than capable at the resolution the laptop offers. I'd say stick with it.
    Reply
  • marc1000 - Thursday, April 28, 2011 - link

    yep. pretty good deal. the 6770m is NOT the 6770. but it seems just fine to run the 1600x900 display. I would stick with it too. Reply
  • JarredWalton - Thursday, April 28, 2011 - link

    Interesting to note is that AMD did the same rebadging on the mobile side, just with different chips:
    http://www.anandtech.com/show/4109/amd-and-globalf...

    Basically, 6300M, 6500M, and 6800M are using the older Redwood cores, though at least they *did* get some clock speed increases. Meanwhile, 6400M, 6600M, 6700M, and 6900M all use Barts cores. But then, that sort of makes sense as the 6600M/6700M are really equivalent to the 6500/6600 desktop parts.

    Anyway, for a laptop, the 6770M should actually be a good midrange GPU, capable of running 768p for sure at ~high detail, probably 900p at medium to high, and 1080p at ~medium.
    Reply
  • mczak - Thursday, April 28, 2011 - link

    errm, you've fallen into another marketing trap...
    Mobile chips get higher numbers than desktop ones (that's true both for nvidia and amd). So your HD 6770M corresponds to a Desktop HD 6570 gddr5 at best (same chip, but clocks could still be lower), which is indeed a new chip.
    There's nothing wrong with this chip, just keep in mind it's still one performance category below your desktop HD 5750, even though it's newer. But outside of pure gaming notebooks, this is probably as fast as it gets (just make sure it has gddr5 memory).
    Reply
  • Roland00 - Thursday, April 28, 2011 - link

    The 6770M has 480 shaders, a 725 MHz core clock, 1 GB of 1600 MHz GDDR5, and a 128-bit bus.
    The 6670 desktop has 480 shaders, an 800 MHz core clock, 1 GB of 1000 MHz GDDR5, and a 128-bit bus.
    The 6570 desktop has 480 shaders, a 650 MHz core clock, 1 GB of 900 MHz GDDR3, and a 128-bit bus.
    Reply
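Those spec lines map to peak memory bandwidth in a straightforward way. A sketch, with the caveat that vendors quote GDDR5 "memory clock" in different conventions (base clock vs. effective transfer rate), so the numbers are illustrative:

```python
# Peak memory bandwidth = bus width in bytes * effective transfer rate.
# GDDR5 moves 4 bits per pin per base-clock cycle, so a 1000 MHz base
# clock means 4000 MT/s on each pin.

def bandwidth_gbps(bus_bits, transfer_mtps):
    """Peak bandwidth in GB/s for a bus_bits-wide bus at transfer_mtps MT/s."""
    return bus_bits / 8 * transfer_mtps * 1e6 / 1e9

# Desktop HD 6670: 128-bit bus, 1000 MHz GDDR5 base clock.
print(bandwidth_gbps(128, 4 * 1000))  # 64.0 GB/s
```

The same formula shows why the 128-bit bus is the real constraint on all three parts: at equal memory type, only the clock separates them.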
  • mczak - Friday, April 29, 2011 - link

    Not necessarily. There's a reason I wrote "Desktop HD 6570 gddr5" as it comes in both ddr3 and gddr5 versions. If it has gddr5 RAM, that RAM will be clocked slightly higher (the official word is 1800-2000MHz) than the mobile's gddr5 memory. In that case the mobile 6770M would be very close to a desktop 6570.
    There's also a reason I wrote the clocks "could" be lower. Mobile chips, unlike desktop ones, quite often come with lower clocks than reference. That said, the core clock is probably unlikely to be lower than that of the desktop 6570.
    So all in all, it should likely be about as fast as a desktop 6570 with gddr5 memory. Faster than the desktop 6570 ddr3 for sure, but slower than the 6670.
    Reply
  • blountmatt - Thursday, April 28, 2011 - link

    Thank you, all of you, for all of the input as it was all extremely informative. You guys are an awesome resource for the less knowledgeable! Reply
  • Battleflame - Thursday, April 28, 2011 - link

    Does the UVD still make any sense now that there are plenty of players with support for DXVA? Reply
  • ViRGE - Thursday, April 28, 2011 - link

    Eh? UVD is the fixed function video decoder that DXVA interfaces with. Reply
  • Belard - Thursday, April 28, 2011 - link

    Don't play the idiotic nVidia game. It's confusing, stupid, and hurts the reputation of the company.

    The 6790 is only slightly faster than the 5770 - yet it's still $150 for the 6790!?

    The 6670 costs as much as a 5770, but is slower than the 5750.

    They should have found a way to make a cheaper/slightly slower 6790, call it the 6770, and sell it for $100.

    But this is the cheaper route - trying to fill in the $100~120 price hole.

    We are not impressed.
    Reply
  • Hrel - Thursday, April 28, 2011 - link

    Display Port is stupid. I do NOT understand why everyone at Anandtech makes such a big deal about it. I want it to die. I've never seen it on any monitors and I'm pretty sure no HDTVs have it on them. Also, as far as I know it doesn't support sound or 3D. HDMI is all we need. EVERYTHING else needs to die away, yesterday! Improved tessellation matters WAY more than stupid DP. Reply
  • Hrel - Thursday, April 28, 2011 - link

    also, what's with the HD6790? Reply
  • Belard - Friday, April 29, 2011 - link

    No, it's not. You don't understand how the tech industry works.

    It takes YEARS for connectors and such to become standard. USB was available for 2-3 years before Apple started using it with their Macs... then it took off. Until then, it was a useless connector. OMG, the VGA connector is STILL alive and kicking - typically on bottom-end PCs and monitors, ie: $80 monitors don't have DVI.

    DVI is still not understood by many people today. It's not so bad though; today's LCD monitors do a pretty good job of auto-syncing to VGA. But I'll take DVI over it.

    Display Port is similar to HDMI, but Display Port is royalty free, while HDMI royalties run $10k~100k+ a year per manufacturer - and if I'm correct, the HDMI royalty is also per cable / device... such as the PC, the cable and the monitor.

    Display Port (DP) has almost double the bandwidth of HDMI.

    DP supports Audio and 3D.

    DP has multi-monitor support, such as 4 monitors at 1920x1080 through a single cable. HDMI does NOT do that.

    There are currently 7 PC monitors with DP, and Apple sells DP monitors. Apple's 27" display (the only one they have) does 2560-by-1440. Snif... no 2560-by-1600.
    Reply
  • marc1000 - Friday, April 29, 2011 - link

    DP is also the video protocol implemented in Intel's Thunderbolt. By the way, Thunderbolt uses DP cables and sends a PCIE link inside them.

    So it would be really nice to have only ONE connector on the entire computer, for everything from storage to displays, and even mouse/keyboard at some point.

    At some point the TVs will switch to DP too.

    DP has only needed 2 revisions until now, while HDMI has been through 4. And it is still faster than HDMI.
    Reply
  • von Krupp - Thursday, April 28, 2011 - link

    I'm not surprised, really. What would surprise me is if the 6770 costs more than $120, because it has been AMD's recent tradition to have a product in every pricing segment and thereby saturate the entire market. Shotgun approach to sales, really. Look at the chart above: AMD has only three open slots and Nvidia has five. What's more, almost all of AMD's lineup is made from HD 6000 cards, where Nvidia has 200, 400, and 500 series cards. Higher numbers appeal to base consumer decision making. Reply
  • Belard - Friday, April 29, 2011 - link

    Just thinking about it.

    What would have made more sense... is to name these 6750/6770 cards the "6650 and 6670". The 6790 is nothing like the 5770/6770...

    The problem with naming these 6750/6770 cards is that they don't have the HD 6000 feature set. Even though they are slightly upgraded from the 5700s, there is no improvement in performance or video output.

    The 6670 card is more advanced, even though it's a slower card. It would be a better choice for those who want some gaming abilities out of their HTPC systems.
    Reply
  • von Krupp - Friday, April 29, 2011 - link

    I agree. But at the end of the day the majority of consumers have no idea exactly what they are getting with each tier. All they see is 6770 > 6670, so they opt for the 6770 instead. They also see that 6770 > 460, so they opt for the AMD part. Nvidia does have more brand recognition than AMD/ATI, so that could level things out on that side.

    As for the 6790...that card just baffles me. It's the HD5830 all over again.
    Reply
  • slyck - Friday, April 29, 2011 - link

    Outright fraud. We've got filth companies like OCZ changing an SSD's parts to slower ones and selling it under the same name with the same specs. Also we've got crooks like Nvidia and AMD changing not a single thing on a video card and selling it as the next generation. It's lying no matter how anyone wants to sugarcoat it. Reply
  • von Krupp - Friday, April 29, 2011 - link

    Just like selling a Phenom II X2 is lying because it's actually a Phenom II X4 that couldn't make the cut, right?

    It's a solid business move. It does outperform the 6670 in games, so I have no problem with it. It only lacks the HD6000's value add-in features.
    Reply
  • Leyawiin - Monday, May 02, 2011 - link

    Make excuses when it's your pet company. Reply
  • 3R0lD - Friday, May 20, 2011 - link

    Why does AMD do this??? People loyal to them have always been so because they don't do sh.. like nvidia, which rebadges old GPUs. I am very disappointed now. At least there are a few changes. They're not 100% like nvidia. Reply
