A noticeable trend in the current desktop ecosystem is towards smaller form factors, as evidenced by the Intel NUC and the success of mini-ITX products like the BitFenix Prodigy.  Users, gamers and enthusiasts all want something powerful in a physically small envelope, and while we have cases and motherboards to match this size, the GPU ecosystem has been slow to accommodate.  Sure, larger mITX cases like the Prodigy exist, and users can choose between a beefy GPU or hard drive bays, but what if you want both?  Enter the ASUS GTX 670 DirectCU Mini, debuted on the ROG Forums.

With the Mini, we have a GTX 670 on a mini-ITX sized (17cm) PCB, featuring a stunted version of the DirectCU cooler.  Instead of two 6-pin connectors we get a single 8-pin, but still get five outputs covering the range of analog and digital options (except mDP).

Apparently ASUS only has one of these back at HQ as they are testing the idea, and these pictures may not represent the final product.  But it does come under the heading of ‘things to look forward to’ and may generate a trend towards more products of a similar line from other manufacturers.

No word on release or pricing (or how much noise it may produce), but I would not be surprised if it launches priced just above the reference models in order to recoup some R&D.

Source: ROG Forums




  • gregounech - Thursday, March 07, 2013 - link

    Looking forward to having options of this type.

    I also noticed a little typo in your article Ian:
    "back at HQ as thet are testing the idea,"
  • IanCutress - Thursday, March 07, 2013 - link

    I would like to believe there is a market for 17cm video cards, as long as noise and heat generation isn't affected too much. It would all be down to the dissipation limits of the cooler. Something tells me overclocking abilities would be affected as well.

    Fixed the typo, thanks :)
  • scook9 - Thursday, March 07, 2013 - link

    People would not be getting these for the overclocking. It has my interest, just like the GTX 570 HD from EVGA did. For me that card was interesting for two reasons:

    -It was the most powerful GPU that was no longer than a mATX board (for use in my HTPC)
    -It had the power ports on the back of the card rather than the side, which lets it fit in a case that lacks side clearance for those plugs.

    Below is a picture of my setup, all in an Antec Veris Fusion case, which looks awesome as an HTPC.

    http://tinypic.com/m/flzhwz/3

    Cooling the GPUs is a pain right now, but working on that :) The CPU is an i5 2500k
  • JPForums - Thursday, March 07, 2013 - link

    Unfortunately, not everyone's enclosure has extra space in front of the cards for power cables like yours does. It would be really nice if they made a card with both front and top mounted connectors. It wouldn't be hard to design at all, and you could just cap the unused connector. This way, the card would be usable in both thin and shallow enclosures.
  • DanNeely - Thursday, March 07, 2013 - link

    The problem with that is you'd have a support problem from people who got confused and thought they needed to connect both sets of power connections. It's easier for the OEM to make two almost identical cards with plugs in different locations.
  • bobbozzo - Thursday, March 07, 2013 - link

    Maybe they could make a cover that slides around the corner, so that only one set of connectors could be used at a time.
  • Paul Tarnowski - Thursday, March 07, 2013 - link

    Easier to recess it and add two adapters, one that lets you change it to the side, and another that lets you do so to the "underside" of the PCB. That way you have an option to have it straight (no adapter), to the side, or to the one place where no one has anything in a single card configuration.

    Also it's cheaper.
  • ShieTar - Friday, March 08, 2013 - link

    Why would that even be a problem? There is no downside to connecting the card to the voltage source by two parallel wires instead of just one.
  • JPForums - Thursday, March 07, 2013 - link

    I don't really think cooling is the limiting factor. Better coolers that could fit on the smaller card already exist. I don't see any reason why a cleverly designed vapor chamber heatsink wouldn't work. Beyond that, vapor chambers and heatpipes have been combined to good effect. There looks to be space near the back of the card not currently occupied by heatsink area that could be piped into. Then there is the fact that hardly any manufacturer currently uses full copper heatsinks. Granted, large copper heatsinks are very heavy, but copper could be used to very good effect when trying to maximize cooling in a limited space.

    Of course, these options have the potential to be a bit loud seeing as you would still need to move an appreciable amount of air across smaller heatsinks with a smaller fan. Also, airflow in mITX cases isn't always the best, so despite the extra noise, a direct exhaust or partial direct exhaust system would probably work out better. Then again, a water block and radiator would solve both the cooling and noise problem.

    Which brings me to what I consider the biggest hurdle: cost. The cooling can be done and it can be done quietly, but can it be done for a price people will pay? Most likely, if this device comes to market, it will be a little bit hotter, noisier, and pricier than the base model. Some may scoff that the parts list is smaller and the card is less capable, but it is a much bigger challenge designing the product into such a small envelope. A small price premium could be easily justified for those the product targets. I'd love to see such a design come to market. I wouldn't use it personally, but there are plenty of systems I build for others that could use it.
  • CeriseCogburn - Tuesday, March 12, 2013 - link

    Too bad AMD can't do anything like this, as we know then the fanboys would be drooling and screaming it's better than sliced bread.
    Another reason why nVidia makes profits $$$$ and amd is dying.
