44 Comments

  • nevertell - Tuesday, February 18, 2014 - link

    They should've made it white as white radiates heat better than black :P Reply
  • random_user - Tuesday, February 18, 2014 - link

    White absorbs less heat than black, but black radiates more efficiently than white. Reply
  • chaos215bar2 - Tuesday, February 18, 2014 - link

    I hope you just forgot the /s. White certainly absorbs less heat, but color has absolutely nothing to do with heat radiated. Reply
  • Death666Angel - Tuesday, February 18, 2014 - link

    Yes it does. Take some university classes. Or google "black body radiation" and "thermal radiation". Reply
  • Notmyusualid - Tuesday, February 18, 2014 - link

    Indeed, never saw a white air-cooled motorcycle engine either.... Reply
  • aakash_sin - Wednesday, February 19, 2014 - link

    :D +1 Reply
  • Guest_56 - Wednesday, February 19, 2014 - link

    (sigh)
    "Surface color is often mistaken as a significant factor in emissivity; it is not." - source: Emissivity article in wikipedia.
    You confuse emission of heat (by radiation) with absorption/reflection of visible light frequencies. Also, heat mostly emits in infrared frequencies.
    Reply
  • Death666Angel - Wednesday, February 19, 2014 - link

    Read: "significant". Never said it plays a big role, just that plays a role. Reply
  • Death666Angel - Wednesday, February 19, 2014 - link

    Wow, it states my point right after your quote, which you omitted:
    "it affects the spectral emissivity at visible wavelengths, which are often negligible when calculating the total emissivity."
    Reply
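
A rough worked number, since the thread keeps circling this point (assuming a heatsink surface around 60 C, i.e. roughly 333 K): by Wien's displacement law the thermal emission peaks at

    lambda_peak = b / T = (2898 um*K) / (333 K) = about 8.7 um

which is deep in the infrared, far from the visible band (0.4-0.7 um). So the emissivity at visible wavelengths, which is what we perceive as colour, contributes almost nothing to the total emissivity at heatsink temperatures.
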
  • svyper1 - Wednesday, February 26, 2014 - link

    I think people are confusing heat conduction with radiation...

    Light is an electromagnetic wave; the spectrum runs from gamma rays through X-rays, ultraviolet, visible and infrared light to microwaves and radio waves.

    Heat sinks aren't designed to absorb radiation. All objects absorb and emit thermal radiation if there is any in the system; however, a heat sink absorbs heat transferred through material, called conduction, which is then "taken away" by convection - the transfer of heat to a gas (air)...

    Conduction and convection transfer energy between atoms... This is the principal way heat sinks reduce the chip temperature. Being black or white affects radiation absorption and emission, but that is not what heat sinks are for, so being black is purely for aesthetics...
    Reply
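
To put a rough number on the conduction-plus-convection picture above, here is a minimal Python sketch of the usual lumped thermal-resistance model (the power, temperature and resistance values are illustrative assumptions, not figures for this card):

    # Die temperature from a lumped thermal-resistance chain:
    #   T_die = T_air + P * (R_interface + R_conduction + R_convection)
    # Every number here is an assumption, for illustration only.

    P_watts = 250.0        # assumed power dissipated by the GPU
    T_air_c = 30.0         # assumed case air temperature, deg C

    R_interface = 0.02     # die -> heatsink base through thermal paste, K/W (assumed)
    R_conduction = 0.05    # through the copper/aluminium base and fins, K/W (assumed)
    R_convection = 0.12    # fins -> moving air, set by fin area and airflow, K/W (assumed)

    T_die_c = T_air_c + P_watts * (R_interface + R_conduction + R_convection)
    print(f"estimated die temperature: {T_die_c:.1f} deg C")   # ~77.5 with these numbers

The convection term dominates the chain, which is why fin area and airflow matter far more to a heatsink than what colour its surface is.
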
  • ShieTar - Tuesday, February 18, 2014 - link

    1. Not remotely true, you can get both white and black high emissivity paints.
    2. All the fins are located next to more fins. The cooler would just radiate into itself, with little to no net cooling effect.
    Reply
  • nevertell - Tuesday, February 18, 2014 - link

    We both know that no paint would be better, if possible. But purely theoretically speaking, it doesn't matter if most of the fins would be radiating at each other; it's the net heat radiated away from the heatsink that matters. Look at it this way: black absorbs more than white, so why would you want to absorb more heat when the sole purpose of a heatsink is to remove heat from a particular source?
    And all of this in the end is meaningless, as we're talking fractions of a degree here.
    Reply
  • extide - Tuesday, February 18, 2014 - link

    Heh, I doubt it's painted at all, rather anodised. Reply
  • SlyNine - Tuesday, February 18, 2014 - link

    We're talking about transferring thermal energy. You're providing frames of reference where there shouldn't be any. It's simply transferring energy from one medium to another. Reply
  • TomFahey - Sunday, February 23, 2014 - link

    I may be able to help here as I've just wrapped up the Blackbody Radiation part of my Quantum Physics module at university.

    A "blackbody" is a perfect absorber of light (hence it's lack of colour). When you take thermal equilibrium into account, this implies that a blackbody must also be a perfect emitter. Therefore, we can qualitatively say that something black will cool down better than something white, assuming there is no more external light for the blackbody to absorb. It should be noted that both the absorption and emission mentioned here pertain only to photon i.e light absorption/emission. How the body absorbs/emits heat in terms of heat conduction/at the molecular level is a completely different story.
    Reply
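
Since the thread keeps arguing about how much radiation could actually matter here, a back-of-the-envelope Stefan-Boltzmann estimate in Python may help (the surface area, temperatures and emissivity values are assumptions for illustration, not measurements of this cooler):

    # Radiated power per the Stefan-Boltzmann law:
    #   P_rad = emissivity * sigma * area * (T_surface**4 - T_ambient**4)
    # All inputs are rough assumptions, not measurements of any specific card.

    SIGMA = 5.670e-8               # Stefan-Boltzmann constant, W / (m^2 * K^4)
    AREA_M2 = 0.05                 # assumed outward-facing surface area of the cooler
    T_SURFACE = 273.15 + 60.0      # assumed cooler surface temperature, K
    T_AMBIENT = 273.15 + 30.0      # assumed temperature of the surroundings, K

    for label, emissivity in [("anodised aluminium (black or clear), e ~ 0.85", 0.85),
                              ("bare polished aluminium, e ~ 0.05", 0.05)]:
        p_rad = emissivity * SIGMA * AREA_M2 * (T_SURFACE**4 - T_AMBIENT**4)
        print(f"{label}: about {p_rad:.1f} W radiated")

With these assumptions the anodised surface radiates on the order of 9 W and the bare metal about 0.5 W, which is small next to a roughly 250 W card, so convection does almost all of the work. Note that the emissivity that matters is in the infrared, which visible colour barely affects.
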
  • svyper1 - Thursday, February 27, 2014 - link

    The argument about black-body absorption is pointless, as it has nothing to do with this.

    Heat sinks are about heat transfer between materials, by conduction and convection.

    The colour of the heat sink is nothing more than a design choice.
    Reply
  • svyper1 - Thursday, February 27, 2014 - link

    Heat sinks are not about absorbing radiation. They are about transferring heat through material, by conduction and convection.

    Conduction happens between solids, and convection between a solid and a fluid (liquid or gas)...

    The colour does not matter, as it is not what the heat sink is designed around.

    The term "radiator" is a general term for something that transfers heat. It should not be read as absorbing and emitting radiation; these parts work by heat transfer through conduction and convection.
    Reply
  • npz - Tuesday, February 18, 2014 - link

    The Kepler architecture itself is already compromised, sacrificing good integer performance and general OpenCL / general-purpose compute. Kepler should probably be called an SPGPU, for specific-purpose compute, instead. Reply
  • eanazag - Tuesday, February 18, 2014 - link

    For me, the only value I would see in this is if it supported virtualization like the Grid K1 and K2. At $1,000 it still makes more sense to go with AMD 290Xs since they are cheaper. It looks like Anand may need to add a bitcoin bench to the GPU bench.

    Speaking of the bench, please ensure future product reviews appear in Bench, as it seems like some of the reviews have not been in line with it or submitted to it.
    Reply
  • vision33r - Tuesday, February 18, 2014 - link

    How can they measure a bitcoin bench when the difficulty skyrockets every couple of days or weeks? Reply
  • Death666Angel - Tuesday, February 18, 2014 - link

    That's network difficulty; the hash rate of your device stays the same and is independent of the network hash rate. Reply
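
A small Python sketch of why both points hold (the hashrate, difficulty and reward values are placeholders, and difficulty * 2^32 is the usual approximation for the average number of hashes per Bitcoin block):

    # The card's hashrate is a fixed, benchmarkable property; only the expected
    # payout falls as network difficulty rises. Placeholder numbers throughout.

    def expected_btc_per_day(hashrate_hs, difficulty, block_reward=25.0):
        hashes_per_block = difficulty * 2**32         # average network hashes per block
        blocks_per_day = hashrate_hs * 86400 / hashes_per_block
        return blocks_per_day * block_reward

    CARD_HASHRATE = 500e6                             # assumed 500 MH/s, constant for a given card
    for difficulty in (1e9, 2e9, 4e9):                # difficulty climbing over time
        btc = expected_btc_per_day(CARD_HASHRATE, difficulty)
        print(f"difficulty {difficulty:.0e}: about {btc:.6f} BTC/day")

So a GPU bench can report hashrate as a stable number; the earnings estimate is the only part that moves with difficulty.
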
  • MrSpadge - Tuesday, February 18, 2014 - link

    Although we'll probably never know how much of that OpenCL weakness is actually due to nVidia not wanting their good CUDA compiler to create binaries which work just as well on other hardware. Reply
  • TheJian - Wednesday, February 19, 2014 - link

    Umm, this is why it wins in every game. I don't call that a compromise when it is done to make a 780 Ti that outperforms the 290X by 10-20%.
    http://hardocp.com/article/2014/02/10/msi_geforce_...
    780ti OC card vs. 290x OC card. 780ti rules. Period, as most games were 20%+ for 780ti.

    OpenCL means nothing. You run Cuda for PRO apps. There is a reason NV owns 90% of the workstation market which is ONLY running on Cuda with 200+ apps supporting it. Mining crap and F@H makes you NOTHING today (and F@H is just a waste of electricity, you get nothing). You won't even make 1/2 your card back before Asics for LTC and all other coins (programmable to cover everything with the next round) come in 3-4 months. The first wave was specific for BTC, but the next wave is for everything supposedly. It will cost you most of the profits in electricity right now, and even worse with current AMD pricing being nowhere near the fake MSRP. I call it fake because they haven't been sold for that, so when was it real? They are not sold out anywhere now, and haven't been for a long while. PNY says there is a shortage, which is what makes sense based on the Quarterly report for AMD showing ZERO GPU profits that were NOT from consoles.

    IF you are selling out of your top two gpus (290/290x) you should make more than 10mil console socs x ~$12 (120mil - AMD says low double digits, so lower than 15% margins, I'm guessing 12 as those numbers make sense). But they didn't. 8mil sold during the quarter with another 2mil in the pipe being boxed up by sony/ms. So it is easy to see AMD made nothing on 290/290x, or, as I suspect (and PNY confirms), they are not selling many because they don't have many, which is REALLY why they are higher priced. They can't make enough that run at 1ghz without special cooling. Is AMD really this dumb at predictions (after 20yrs of making cards, ATI or AMD people should know this), or did they just blow 1ghz smoke while they had to know it couldn't be had on many chips? At or near equal pricing it is dumb to walk home with anything but your "compromised" kepler :)

    How many games use compute, OpenCL or "good integer"? If there were many, AMD would not be losing by 20% in most games, right? There is no point in NV supporting tech that they already have with Cuda in Pro stuff, so you won't see anything from them on this useless stuff for games until games actually start using it. Why waste silicon like AMD on this crap? You think shared crap will save AMD? Cuda 6 has that with maxwell too. By the time it's useful NV will be shipping maxwell (actually probably not even useful until Volta ship times, software needs to catch up). AMD has hardware but no software, NV has software (cuda 6) but no hardware yet...LOL. Neither side wins in a battle where neither side's ideas are being used in anything yet anyway.

    How many games can you play while mining or F@H? NONE. Ok then ;) If any of the things you mention mattered, Cuda wouldn't rule 90% of the workstation market, and 780TI wouldn't be owning AMD in games. Ref to ref AMD loses. OC'd to max on both AMD/NV again AMD loses. Special cooling on both, AMD loses. Drivers? AMD loses. Can you say phase 3? :( Is Freesync working in anything I can buy on a desktop? Again loses. Profits and balance sheets? AMD loses. Do you see the pattern yet? CPU's? AMD loses. They seem to have stopped totally bleeding, but the damage is seriously done already. Can't compete at the top end of GPU's or CPU's, and no ARM for tablets/phones for at least another year, and even then they have no modem so phones will be tougher for them than NV (which finally works on ATT now with a Nov certification). Intel can't get into many phones for the same reason until they get a modem fully integrated. So the largest growing market is out for AMD which is why they went server first (no modem means no phones pretty much). I don't believe seattle is IN HOUSE either (just ARM clone basically) so I'm guessing NV's Denver will be better just like apple's swift and Qcom's cpu cores.

    If Kepler is "compromised", then AMD is just junk that fails to dominate anything that matters? All of the things we see AMD win in, mean nothing to most people. Mining?...Don't make me laugh. F@H? 166K users have downloaded it, so out of 350mil-384mil PC's sold each year, basically NOBODY cares about it either. Easy to see why NV couldn't care less about your junk purposes right? I'd rather have R&D spent on GAMES and optimizing for those.

    The only game anandtech shows using compute is Civ5, which NV is tops in as shown in the 750TI article right? So let me know when your statements actually mean something. :)
    http://www.anandtech.com/show/7764/the-nvidia-gefo...
    750TI on top of civ5.

    Also they test compute on Sony Vegas, which is a known NV hated app ;) They refuse to test Adobe here because it would be using Cuda which would show the exact opposite of Sony Vegas which again is used by FAR fewer people than Adobe. So again those results mean little to me as I'd only buy Adobe for photo, video, AE etc when using NV hardware. You should always use CUDA when available and testing without it for anything that CAN use it just by switching apps is dumb.

    But Anandtech has an AMD portal so you shouldn't be surprised by the AMD a$$ kissing ;) All NV cards will do badly in Sony Vegas, you should use ADOBE on NV period. But that wouldn't be in Anandtech's AMD interests now would it? :) Your problem is, you're paying too much attention to Anandtech's OpenCL/AMD slant. Reality is you run other stuff using cuda which is why AMD owns less than 10% of the workstation market, as they all run CUDA and ADOBE. Vegas is at best a 2nd runner. Adobe's suite is #1.

    Watch for Anandtech to suddenly do an about face on Adobe when AMD finally gets OpenCL working right in it ;) ROFL. We've seen AMD advertising in slides they will be good in Adobe one day, but not software to benchmark yet. Why is anandtech so afraid to show CUDA vs. AMD? Why not run Adobe (NV) vs. Vegas (amd) or at least test with the #1 suite out instead of picking Sony Vegas to show AMD can win something? I don't know how much AMD pays for this favorable treatment, but it really hurts Anandtech's credibility. No NV portal hurts them too.
    Reply
  • jimjamjamie - Wednesday, February 19, 2014 - link

    Wow, you sure told us. Fight the good fight, champ. Reply
  • dirk_kuyt - Wednesday, February 19, 2014 - link

    Wow, what a waste of time it must have been typing this drivel! I wish I could say I've never seen someone try so hard to hate on something that affects them so little... but that's pretty much everyone. We all think we know everything about everything and that what we have is the best. All I know is I have an AMD HD 7970 that I paid $200 for and it plays games just fine. I've also never had a driver problem with it, even using beta drivers. Just chill out, man. Reply
  • TheElMoIsEviL - Friday, February 21, 2014 - link

    TheJian needs to lay off the crack pipe for a bit. Reply
  • 3DVagabond - Sunday, February 23, 2014 - link

    Without sources for any of your claims (and your one citation has nothing to do with the Titan Black), your post has zero cred. Reply
  • wetwareinterface - Sunday, February 23, 2014 - link

    First off, the R9 290 and 290X cards actually have MSRPs of ~$430 and $580 respectively. And yes, the litecoin miners are buying them up and driving up pricing. Trust me on this: the Asus custom-cooled R9 cards are sold to retailers below MSRP. Retailers are just capitalizing on the litecoin craze and jacking pricing up on their end. The reason the cards are available is that the price-to-performance ratio is so skewed now that the retail pricing of the 780 Ti looks more attractive. I just bought a 290 for $480 with my employee discount (which is 10% over cost) at work, so I know that the price hike is on the retailer side, not the card manufacturer side or AMD's side.

    At $480 my Asus R9 290 is a much better value than a $720 780 Ti.
    Reply
  • Krysto - Tuesday, February 18, 2014 - link

    This is not a Maxwell version, though, right? Reply
  • DigitalFreak - Tuesday, February 18, 2014 - link

    No, it's not. Reply
  • A5 - Tuesday, February 18, 2014 - link

    No. If GM110 exists, it won't be out until after TSMC 20nm is fully operational. Reply
  • TheJian - Wednesday, February 19, 2014 - link

    "With 7.1 billion transistors and an architecture that separated itself from high-end consumer GPUs, the Titan was worthy of its name. It took 9 months for NVIDIA to make a gaming focused version: the GeForce GTX 780 Ti."

    I don't understand this comment. It took ZERO months, as the gaming focused version of TITAN is called a GTX 780, as your chart shows, with the same 7.1B transistors. It took 9 months to get enough chips to field a FULLY enabled 15 SMX 780TI, and a few extra months to give us the PRO version called TITAN BLACK we have announced here. This is probably because it takes a little longer to get even MORE chips that have fully enabled SMX's AND with all the DP stuff working too (or they were just clearing old Titans? as I see none in stock with a quick check). I could be wrong on that, but I'm guessing today's FULLY FULLY enabled (ROFL) Titan Black takes longer than just a regular fully enabled gaming card with much of the DP units off (does NV just disable/hobble them period, or are they really doing it partially because something isn't working on at least some of them?). They were cherry picking the 780ti chips for 2-3 months before introducing the new card, and while doing that probably cherry cherry picked (again, ROFL) even more for Titan Black.
    Reply
  • TheJian - Wednesday, February 19, 2014 - link

    Well, I guess it took 3 months looking at the chart again...ROFL. But either way these two new cards are on a different chip, so again not related to the OLD Titan or 780. Also, while they were cherry picking for the 780ti, those failed chips ended up in some GTX 780s or Titans.

    I don't think they'll be all that limited if they sell like the first Titans did. They will up production as needed if they sell out fast, just as they did the first go around. The first 100K sold out in days, and I can already see many are out of stock at Newegg etc. Newegg has a superclocked model for $19 extra though. I'm surprised they are in stock while the others are not, but maybe they just got them in last. I'm guessing they'll be gone before Friday also. Very nice that they can turn on 3 SMX's and run them ~7% faster without raising watts. GK110 is pretty darn impressive in that regard and well worth the R&D it took, it seems.

    If AMD hadn't spent on consoles we'd have an answer next week. Also maybe Mantle wouldn't still be a BETA after 2yrs of dev, with its drivers still rolling out in phases, and maybe they'd have a decent CPU to compete with Intel. Bummer. Did I mention I hate consoles? ;) Almost forgot, maybe we'd have something marketable to compete with Gsync also, instead of Freesync, which isn't even a product (and really only works on a laptop so far - not free if you need a new monitor, right?). Consoles need to die. They give us crappy ports, rob from AMD R&D for drivers/new tech/cpus/gpus, and hold back gaming for nearly a decade with each rev. :(
    Reply
  • TheElMoIsEviL - Friday, February 21, 2014 - link

    Ok not loopy... rather obsessive. Reply
  • TheElMoIsEviL - Friday, February 21, 2014 - link

    He's a little loopy. Reply
  • beck2050 - Wednesday, February 19, 2014 - link

    Thank you! I'll take 4! Reply
  • YazX_ - Wednesday, February 19, 2014 - link

    So basically, this card is useless at this price point. There is not a single difference between it and the 780 Ti in gaming performance, and yet it costs 1K. I know that it's a lot better in compute performance, but you can SLI 780 Tis by adding $300 to the 1K and get double the gaming performance and the same compute performance as this one.

    So is it worth it for 1K? NO
    Reply
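
Taking the comment's own numbers at face value, a tiny Python sketch of the value comparison (the prices and the assumption of perfect 2x SLI scaling come from the comment above, not from benchmarks):

    # Gaming performance per dollar under the commenter's stated assumptions.
    options = {
        "Titan Black":       {"price_usd": 1000, "relative_gaming_perf": 1.0},
        "2x GTX 780 Ti SLI": {"price_usd": 1300, "relative_gaming_perf": 2.0},  # assumes perfect scaling
    }
    for name, o in options.items():
        print(f"{name}: {o['relative_gaming_perf'] / o['price_usd'] * 1000:.2f} perf per $1000")
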
  • chizow - Wednesday, February 19, 2014 - link

    I don't understand the omission of the GTX 780 in your comparison, Anand; that was clearly the first gaming focused GK110 card and it released only 3 short months after the original Titan.

    In any case, it is no surprise Titan Black is launching this time around with significantly less pomp and circumstance, as Nvidia is most likely expecting soft demand this time around as well. The original Titan launched under false pretenses and tricked many early adopters into buying it while uncertain whether or not Nvidia would launch another gaming-focused version of GK110. Those early adopters have since been burned not once but twice by Nvidia, with this being the 3rd time, all in less than a year.

    In any case, I am sure there will be some that must have the latest and greatest, but at least Nvidia won't be selling these under false pretenses as they did with the original Titan. Anyone buying one of these is going into it with eyes wide open.
    Reply
  • aggiechase37 - Wednesday, February 19, 2014 - link

    So my question would be how this card compares in professional applications like 3D modeling and rendering programs. I wish Nvidia would come out with a solution for people who like to game a little but also would like a professional class workstation. Reply
  • TheElMoIsEviL - Friday, February 21, 2014 - link

    Loses badly to an AMD R9 290x. That's how it compares in those professional applications overall. Reply
  • 3DVagabond - Sunday, February 23, 2014 - link

    It's a Geforce, so it still runs on drivers that are optimized for gaming. The DP compute has nothing to do with any 3D modeling apps. They don't make use of it. Reply
  • myhomeismycastle45138gr - Wednesday, February 19, 2014 - link

    Having something like this would be awesome! My GTX 760 isn't bad for now. Reply
  • justsean09 - Saturday, February 22, 2014 - link

    Another amazing card by Nvidia (except the price), but I'm happy with my EVGA 780 Ti SC Edition. I would rather buy a second card in a few months. Reply
