
44 Comments


  • dagamer34 - Saturday, May 19, 2012 - link

    Gotta love the classic rebadging of old products!
  • StevoLincolnite - Sunday, May 20, 2012 - link

    Lots of reasons to do it though, like being supply-constrained at 28nm, 28nm being more expensive than the old, mature 40nm, etc.

    But the least they could have done was put faster memory (GDDR5?) on all the cards, or fit passive coolers.
  • DanNeely - Sunday, May 20, 2012 - link

    The margins on very low-end cards are too low to pay for GDDR5.
  • CeriseCogburn - Saturday, May 26, 2012 - link

    8 out of 10 systems cracked open have a single empty PCI-E card slot.

    Despite the constant moaning hatred, nVidia knows what it's doing, and these cheap cards are an upgrade boon for almost all of those 8-in-10 systems with their crud integrated graphics.

    So for very cheap, and soon cheaper than that (usually the only thing that polls well around here): $40, and many tens of millions would suddenly have a "gamable" system where formerly there was none.

    nVidia knows what it's doing, and it's a great thing to do, for millions.
    I hope their marketing is HUGE on these, so we can increase the PC gamer user base which will help all of us.

    Of course, others are more interested in destroying nVidia than welcoming gamers into the fold, dissing and denying instead, and calling them stupid.
    No wonder they say PC gaming is going down.
  • UltraTech79 - Monday, June 04, 2012 - link

    Gamers do not buy cards like this. Millions? Who are you kidding? When you can find a GTX 460 on eBay for around $60, these cards are for idiots only.
  • DerKaleun - Sunday, July 29, 2012 - link

    Idiots... Yeah, you see, since you seem to have a very narrow train of thought, let's ignore the obvious other facts. Not everyone can afford a new PSU either. Not everyone has a good CPU that could even support the games that could be played with it. Jesus, what does it take to get people to use their brains... "Gamers don't buy cards like this." Correction: people who don't feel like being total nerds and spending lots of money buy cards like this. "LOL BUT YOU CAN GET IT OFF EBAY!!!!!11" Sure, I suppose you could. I also assume it's OEM, right? No warranty, nothing like that. I don't know about other people, but I sure as hell don't trust something with such a major price difference, let alone coming from eBay. Even then, you will only ever find it that low on eBay after looking on Amazon, Newegg and Tiger Direct. I have won; any further arguments from you will be invalidated. One more win for me.
  • MySchizoBuddy - Sunday, May 20, 2012 - link

    What happens once 28nm becomes mature? Will the same cards all of a sudden come with Kepler, or will Nvidia launch an entire set of even more confusing names?
  • MrSpadge - Monday, May 21, 2012 - link

    They'll long be in the 7000 or 8000 generation by then, still featuring the same Kepler chips we're seeing today, plus (maybe) some new ones.
  • tipoo - Tuesday, May 22, 2012 - link

    The high-end cards would be on the next architecture by then, and the low-end cards would no doubt start using a cut-down Kepler.
  • CeriseCogburn - Saturday, May 26, 2012 - link

    What happens then is tens of millions more gamers are added to our ranks, so we can get moar and better PC games faster.

    Yes, it's just a terrible, terrible thing that is happening. It's awful.
  • UltraTech79 - Monday, June 04, 2012 - link

    Tens of millions. Again, you're delusional.
  • sucram03 - Saturday, May 19, 2012 - link

    And yet all of the pictures I've seen thus far (including the ones I saw on Zotac's website last night) seem to depict cards with fans.

    I just can't imagine the TDP is *that* high, yet we have seen only one OEM produce a fanless version of the 440. And even then, just looking at Newegg, it's only ASUS that makes one, and it's only a DDR3 variant.

    Here's hoping and waiting for a respectable CUDA card for my HTPC. Thankfully my fanless GT430 has held up well enough for decoding and there hasn't been any reason to upgrade yet.

    I haven't explored aftermarket VGA coolers for a number of years, so I'm unsure if there's one that fits the bill.
  • Taft12 - Friday, May 25, 2012 - link

    Low profile with a tiny fan is cheaper, and will fit into Dell, HP, Acer, etc. slim cases where a big passive cooler won't.

    The last thing Zotac and EVGA want is customers returning video cards because a mostly clueless customer bought a card at Tiger Direct that won't fit in their small Dell PC case.

    I don't like it either, but it's the way it has to be.
  • Patflute - Saturday, May 19, 2012 - link

    No gaming charts? : (

    Anyone know when the 660 and 660 Ti are coming out?

    Will the 660 Ti be $199? How fast will the 650 Ti be? Will it be better than the 7770?
  • Skott - Sunday, May 20, 2012 - link

    I'm curious when the Ti models are coming out as well.
  • Roland00Address - Sunday, May 20, 2012 - link

    So that is 14.4 GB/s (64-bit / 8 × 1800) for the memory.

    A Llano/Trinity processor in dual channel has:
    21.3 GB/s with 1333MHz memory,
    25.6 GB/s with 1600MHz memory,
    29.8 GB/s with 1866MHz memory.

    A Tegra 3 cellphone/tablet CPU+GPU can have up to 6.0 GB/s (32-bit / 8 × 1500), which is 41.6% of these GPUs. And even Tegra 3 is bandwidth-starved.
    An iPad 3 CPU+GPU (dual-core Cortex-A9 + A5X graphics) has 12.8 GB/s (4 memory controllers × 32-bit / 8 × 800), which is 88.8% of the memory bandwidth.

    It is going to be sad in the near future when a cell phone CPU has better graphics than a current new-model Nvidia graphics card.
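    All the bandwidth figures in this comment fall out of one formula: bus width in bytes times effective transfer rate. A quick sketch reproducing them (the bus widths and data rates are the ones quoted in the comment, not taken from official spec sheets):

```python
def bandwidth_gb_s(bus_width_bits, effective_mt_s):
    """Peak memory bandwidth in GB/s: (bus width in bytes) * (effective MT/s) / 1000."""
    return bus_width_bits / 8 * effective_mt_s / 1000

# GT 610/620/630 DDR3: 64-bit bus at 1800 MT/s effective
print(bandwidth_gb_s(64, 1800))   # 14.4 GB/s

# Llano/Trinity, dual-channel DDR3 (2 x 64-bit)
print(bandwidth_gb_s(128, 1333))  # ~21.3 GB/s
print(bandwidth_gb_s(128, 1600))  # ~25.6 GB/s
print(bandwidth_gb_s(128, 1866))  # ~29.9 GB/s

# Tegra 3: single 32-bit channel at 1500 MT/s
print(bandwidth_gb_s(32, 1500))   # 6.0 GB/s

# iPad 3 (A5X): 4 x 32-bit controllers at 800 MT/s
print(bandwidth_gb_s(128, 800))   # 12.8 GB/s
```

    The small differences from the comment's figures (29.8 vs 29.9) are just rounding.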
  • Orwell - Sunday, May 20, 2012 - link

    Heck, even an old 9600GT has 68GiB/s (256/8 * 1090 * 2) of bandwidth.

    At least pick a decent old card to compare with then! ;)

    Oh, and yes, partially thanks to consoles this card can still play nearly any game at high detail, provided you disable AA and stick with 1680x1050 or so. Also, they can be had for as little as €40, so it's probably cheaper than these three new chips. Moreover, they don't need to be oven-baked.
  • Stuka87 - Monday, May 21, 2012 - link

    Well, not any game, but a good deal, yes.

    These cards are not going to play something like BF3 at high detail at 1680x1050. They would struggle at low-medium detail settings at that resolution.
  • CeriseCogburn - Saturday, May 26, 2012 - link

    Maybe the AMD fans would agree that the DX11 upgrade cheapos surpass the sad integrated pre-AM3 and Socket 775 loser chips, and give the cu$tomer a gigantic bang for their buck?

    I mean, you should see how happy and proud they are to have fled the AMD X200, X1250, X2100 integrated and cranked their aging system to the DX11 sky for a cheap $35 and a tech notch on their belt.

    Of course, all the brilliant would-be honest underdog CEOs and BSA market analysts here have a different story to tell about how stupid the purchasing public is - not necessarily a PR win if their words got out.

    Maybe the Rage3D fans should think about all the crappy Intel integrated boards these low-cost DX11 cards beat by 500%?

    Maybe those buying them aren't so dumb after all - and instead the usual cheap, moaning, penny-pinching "can't afford a three-cent snot bubble" upgraders of AMD fan persuasion we have here are the clueless and completely thoughtless ones.
  • yannigr - Sunday, May 20, 2012 - link

    For $40 / €30 they are not bad. For old PCs without graphics, I mean. For new PCs with ultra-cheap graphics, the right way is Llano/Trinity/...Intel.
  • Scali - Sunday, May 20, 2012 - link

    Which may be exactly why they're not bothering to introduce new GPUs at the low end?
    I guess this level of performance will be phased out as it is now being surpassed by APUs. So why develop shiny new 28nm GPUs for that? They probably wouldn't sell enough to make it worthwhile.
  • Wwhat - Sunday, May 20, 2012 - link

    They should make mixed-mode wafers where the central part has the high-end, thus large, GPUs and the edge has the small low-end ones.
    I wonder if they ever did the calculation on that; what with the lousy yield of 28nm, it might be a real good idea.
  • Scali - Sunday, May 20, 2012 - link

    Not sure about that...
    Firstly, it would require extra effort to design and test these low-end parts before it would work.

    Secondly, as far as I know it is not possible, or at least not cost-effective, to build multiple chips on a single wafer. Normally you re-use a single mask for all chips on a wafer; when you mix them, the mask has to be replaced halfway through lithography.

    Lastly, the wafer has the fewest defects in the center and the most at the edges. The yields may be too low for it to be worthwhile anyway, so it's just as well that they go to waste.
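    The yield point can be made concrete with the classic Poisson die-yield model, where yield ≈ exp(-A·D) for die area A and defect density D. The numbers below are purely illustrative (neither TSMC nor NVIDIA publishes defect densities, and the 0.5 defects/cm² figure is an assumption); they just show why a large die suffers far more from a given defect density than a small one:

```python
import math

def poisson_yield(die_area_cm2, defects_per_cm2):
    """Expected fraction of defect-free dice under a simple Poisson defect model."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

# Illustrative only: a big ~294 mm^2 die (GK104-class) vs a small ~80 mm^2 die,
# at an assumed 0.5 defects/cm^2 for an immature process.
big   = poisson_yield(2.94, 0.5)   # ~0.23
small = poisson_yield(0.80, 0.5)   # ~0.67
print(f"big die: {big:.0%}, small die: {small:.0%}")
```

    Defect density also rises toward the wafer edge, so in this model the edge region would see an even higher effective D - which is the point above: the small dice parked there might still yield too poorly to bother with.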
  • Lonyo - Sunday, May 20, 2012 - link

    Except they already have low-end GPUs, which are being used in the OEM versions of these cards. So they have already developed shiny new 28nm GPUs; they just aren't using them in retail.
  • CeriseCogburn - Saturday, May 26, 2012 - link

    Why, thank you yannigr, for the first bit of sense so far.

    (Although you could be sporting a racist name, so anything you say is not worthy and should be ignored, and you lost all credibility; if you had a nicer name, maybe someone would listen to you or think you could possibly be telling the plain and obvious truth.)
    *rolls eyes*
    Oh sorry, I started fitting in and acting like the children here who taught me all too well.
  • vision33r - Sunday, May 20, 2012 - link

    It's not the enthusiast market that keeps Nvidia afloat; it's these low-end solutions. I order them by the hundreds for businesses. Maybe in a few more years Intel and AMD's internal GPUs will be powerful enough to replace them.
  • MrSpadge - Monday, May 21, 2012 - link

    Trinity is more than a match for these low-end cards.
  • CeriseCogburn - Saturday, May 26, 2012 - link

    Yes, it appears Trinity is a slight bit faster than a GTX 525, but so what?

    You know how many millions of Athlon II / Pentium D / Pentium 900 / Cedar Mill / Phenom / Phenom II systems are out there with an empty PCI-E slot in them - that could use a DX11 vid card where Trinity can't fit any of them at all, ever?

    I'm kind of sick of the mindset about this place, where the only thought that comes to mind is ripping away with the clueless singular notion that fps in a game, compared to whatever AMD fan-spew GPU of the moment, is what counts even when no comparison in any dream world is pertinent, and added to that the endless ranting about nVidia making money or deceiving everyone.

    It's just amazing how all of you are so much more intelligent than the rest of the world, that after an article you can spew out some AMD fan piece, then proclaim everyone else is fooled because you, the brightest bulbs on earth, know all the tricks...

    Man, what a downer you people are. Really, what a freakin' downer.
    None of you ever seem to have a system you could use a card like one of these in. I say you probably all have a Core 2 or lesser around that could, but you're too busy bashing and ripping to even think. Maybe with your vast experience you've never even opened your Walmart box and have zero clue it has a 16x PCI-E slot ready and waiting.

    Yes, the top-end Trinity is likely faster, so go buy the non-existent $800+ laptop and tell us how badly AMD is spanking a $40 DX11 card.

    Surprisingly, when it's cheap as dirt, the usual AMD fanboy reaction is that massive greed kicks in and they drool to yank one off the shelf somewhere, especially if some other release will drive the price down to pauper-pleasing level.

    Wow, guess I freaked out. Good job, AMD fan.
  • tipoo - Tuesday, May 22, 2012 - link

    They already are. Trinity will already be ahead of these for sure, and the HD 4000 is probably ahead of them too, at least the bottom two.
  • CeriseCogburn - Saturday, May 26, 2012 - link

    Not with DX11.
    AMD dropped support on the HD4000 too.
    Great job, AMD fan.
  • Taft12 - Friday, May 25, 2012 - link

    For volume yes, but the profit margin is tiny.

    It might take 100 sales of the GT 610 to equal the profit on a single GTX 680 sale.
  • shriganesh - Monday, May 21, 2012 - link

    That's just plain cheating! It can't be accepted! They can't argue the performance level is the same: TDP, new features (DirectX, OpenGL), 28nm, etc.
  • MrSpadge - Monday, May 21, 2012 - link

    +1
  • marc1000 - Monday, May 21, 2012 - link

    Nvidia sells marketing. They build the biggest/fastest chip to create hype and then sell whatever small chip they have in stock, because a lot of people think that a GT 630 card will run a game like the 680 card, since they are from the same company.

    I am still waiting for a US$200-250 card on 28nm with a 150W TDP from Nvidia, to choose between it and the AMD cards. Unfortunately, AMD has a 28nm GPU on the market at that price but Nvidia doesn't, so my choice is still AMD.

    The sad part is that AMD went for compute this generation while Nvidia made an unexpected move going for raw performance on the 680, and won the race too soon. So the best card of this generation is one that does not exist. :-\
  • wiyosaya - Monday, May 21, 2012 - link

    "Nvidia sells marketing."

    Agreed! It is marketing bling, and IMHO they are doing it too often. Like $2,500 Tesla cards so that you can get the double-precision compute performance that should be in the 680.
  • CeriseCogburn - Saturday, May 26, 2012 - link

    Well, there's a whole lot of jelly spewing from you two.

    Too bad AMD is a failing and bailing and firing hole of losses, unable to pay for driver teams, let alone fly 200 people all over the world 24/7/365 to gaming companies to make our PC gaming experiences better.

    AMD uses a telephone and email to "help gaming companies" - that's how pathetic and broke they are - ROFL.

    In fact, if they had a clue, they'd listen to you two: if even a tenth of your hate-filled diatribe, repeated as often as the sun rises against nVidia, is remotely true, it's sound advice for actions AMD needs to implement immediately, if not sooner.

    Instead, we have AMD failing, flailing, and driver teams bailing out of the sinking ship, always behind, always unable to get to stable. They're like a broken addict with a dysfunctional lifestyle.

    The reason nVidia can sell craploads of cards to OEMs and shelf markets is that they have a great reputation and their driver bundles are awesome - absolutely spectacular - and so simple to not screw up, so easy to fix if anything like a virus chews through them - that's a huge value for OEMs and end users.
    It's huge.
    It's huge for tech support costs - as in, costs very little.
    It's huge for entrepreneurial techies who can make a profitable upgrade for their customers: quick, easy, error-free and without future troubles.

    There are a LOT of REASONS beyond your spewing hatred, ones AMD should take note of and attempt in vain to duplicate.
  • medi01 - Monday, May 21, 2012 - link

    Logic they do have:
    let's sell moar, and who cares if we deceive the customer quite a bit.

    Most buyers wouldn't notice the difference anyway.
  • CeriseCogburn - Saturday, May 26, 2012 - link

    Most buyers will be greatly pleased when they stuff their $30-$40 PCI-E card in the slot and surpass their Intel or ATI integrated graphics by 600% in performance.

    Something I know you smart guys always think about when talking about the deceived idiot consumers, none of whom, of course, are ever you, here.

    ROFL - yes, you guys are so schmart!
  • ericore - Monday, May 21, 2012 - link

    Graphics chip manufacturers aren't making tons of money, and if you can get away with selling rebranded chips to OEMs and uneducated consumers, then by all means. Still, not everyone wants to shell out over a hundred for a graphics card; it sets an undesirable precedent, since it states you must pay over X amount for new technology. A more eloquent way of doing it is to have limited stock of next-gen low end (you'll want the more advanced tech for high-end cards anyway), and when you run out of stock of your new-gen low ends, then sell the rebranded version. And to sweeten the deal, as far as OEMs are concerned, just sell them rebrands, haha.
  • CeriseCogburn - Saturday, May 26, 2012 - link

    tipoo below has a tip for you:
    "Low end performance hardly changes generation to generation."

    I'm sure he's still enjoying his PCI X300, by the way.
  • tipoo - Tuesday, May 22, 2012 - link

    If they put this little effort into the low and mid range, I for one am glad Intel's greater emphasis on graphics is going to kill this market range. Low-end performance hardly changes generation to generation compared to high-end offerings; they have the technology to make it better, but I think they just choose to keep it this slow to sell more of their higher-end cards. Similar cards using a cut-down Kepler would obviously be better and probably no more expensive to produce (they could even use the older fabs for it, as long as it's on the new architecture).
  • CeriseCogburn - Saturday, May 26, 2012 - link

    Yes, because you bought fab space, and the wafers, and decided as CEO of nVidia that was the rollout you would do while 670 and 680 supply is short. Then you got fired by the board, and AMD asked you to run their company because no one else wanted to.

    Can I get your autograph?
  • Taft12 - Friday, May 25, 2012 - link

    Hey Ryan,

    The GT 620 specs you posted are equal to the GT 430's (CUDA cores, clock speed, TDP).

    I'm not familiar with Nvidia's internal model numbering (i.e. GF108), but isn't this a straight-up rebadge, as it is for the 630 and 610?
  • hatten - Friday, April 04, 2014 - link

    For all you sad little boys who live with mummy and daddy and do not go out into the real world: no, this card is not for you. But if you have a life and need your money to pay the bills and just want a boost to your old computer - e.g. take an Nvidia 9500GT, which mine was, and put this into your computer - speed has doubled. I only play WoW now and then, and convert vids of my kid to Blu-ray, as I have a life. This is a brill card at £30, not over £100! Spend that money on my boy, not a computer!
