NVIDIA's GeForce GTX Titan, Part 1: Titan For Gaming, Titan For Compute
by Ryan Smith on February 19, 2013 9:01 AM EST

Origin’s Genesis: Titan on Water & More to Come
Wrapping up part 1 of our look at NVIDIA’s GeForce GTX Titan, we wanted to take a quick look at the tri-SLI system NVIDIA sampled to us for this article: Origin’s Genesis. Without the ability to publish performance data we can’t go into any detail and otherwise fully evaluate it, but what we can do is give you a sneak peek at what’s among the most unusual, and likely most powerful Titan systems on the market.
But first, as a bit of a preface, as we mentioned earlier in our article NVIDIA has been sampling reviewers with various SFF and tri-SLI systems to showcase their two boutique computer concepts. With the tri-SLI system it was not only intended to show off raw performance, but also to serve as a showcase of Titan’s build quality. You see, NVIDIA had told us that the acoustics on Titan were so good that a tri-SLI system could not only be a reasonable choice from a background noise perspective, but that it would be notably quieter than even a GTX 680 tri-SLI system, the latter being particularly hard to believe given GTX 680’s impressive acoustics and low power consumption.
Of course, things didn’t exactly go according to plan, and in a happy accident Origin went above and beyond NVIDIA’s original request. As the Genesis’ marquee feature is water-cooling, Origin went all-out in setting up our sample system for water-cooling, and not just on the CPU. Despite the fact that Titan was (and technically still is) an unreleased card, working alongside their waterblock supplier EKWaterBlocks they were able to get proper waterblocks for Titan in time to build our system. As a result our tri-SLI Genesis unexpectedly ended up being both completely water-cooled and factory overclocked.
The bad news, of course, is that because of the performance embargo we can’t tell you anything about the performance of the Genesis, other than to say that as fast as one Titan card is, three overclocked Titan cards running on water are even faster, sometimes by a massive margin. Compounding this is the fact that GPU Boost 2 was designed in part to better mesh with the superior capabilities of water-cooling, taking advantage of the fact that water-cooled GPUs rarely hit their temperature limits. As a result, what’s already a fast system can sustain even higher performance, since the cards hit their top boost bins more often.
But we’re getting ahead of ourselves here…
Origin Genesis Specifications
Chassis | Corsair 800D
Processor | Intel Core i7-3970X Extreme Edition, overclocked to 4.9GHz with ORIGIN CRYOGENIC custom liquid cooling (6 cores @ 4.9GHz, 32nm, 15MB L3, 150W)
Motherboard | Intel DX79SR
Memory | 16GB Corsair Vengeance DDR3-1866
Graphics | 3-way SLI NVIDIA GeForce GTX TITAN, ORIGIN CRYOGENIC liquid cooling and professional overclocking
Hard Drive(s) | 2x 120GB Corsair Neutron SSDs in RAID 0; 1TB Western Digital Caviar Black (SATA 6Gb/s, 7200RPM, 64MB cache)
Optical Drive(s) | 12X Blu-ray (BD) combo drive
Power Supply | Corsair 1.2 kilowatt PSU
Networking | Onboard Intel
Audio | Realtek ALC892 (speaker, line-in, mic, and surround jacks)
Front Side | Power button
Top Side | -
Operating System | Windows 7 Ultimate 64-bit
Dimensions | 16.2" x 4.6" x 16" (412mm x 117mm x 407mm)
Warranty | 1 year part replacement and 45 day free shipping warranty, with lifetime labor/24-7 support
Pricing | MSRP of review system: ~$7000
We’ll have more on Thursday, including performance data for what so far is turning out to be a ridiculously fast tri-SLI system. So until then, stay tuned.
157 Comments
tipoo - Tuesday, February 19, 2013 - link
It seems if you were targeting maximum performance, being able to decouple them would make sense, as the GPU would both have higher thermal headroom and run cooler on average with the fan working harder, thus letting it sustain higher boost clocks.

Ryan Smith - Tuesday, February 19, 2013 - link
You can always manually adjust the fan curve. NVIDIA is simply moving it with the temperature target by default.

Golgatha - Tuesday, February 19, 2013 - link
WTF nVidia!? Seriously, WTF!? $1000 for a video card. Are they out of their GD minds!?
imaheadcase - Tuesday, February 19, 2013 - link
No, read the article, you twat.

tipoo - Tuesday, February 19, 2013 - link
If they released a ten thousand dollar card, what difference would it make to you? This isn't exactly their offering for mainstream gamers.

jackstar7 - Tuesday, February 19, 2013 - link
I understand that my setup is a small minority, but I have to agree with the review about the port configuration. Not moving to multi-mDP on a card of this level just seems wasteful. As long as we're stuck with DVI, we're stuck with bandwidth limits that are going to stand in the way of 120Hz for higher resolutions (as seen on the Overlords and Catleap Extremes). Now I have to hope for some AIB to experiment with a $1000 card, or more likely wait for AMD to catch up to this.

akg102 - Tuesday, February 19, 2013 - link
I'm glad Ryan got to experience this Nvidia circle jerk 'first-hand.'

Arakageeta - Tuesday, February 19, 2013 - link
The Tesla- and Quadro-line GPUs have two DMA copy engines. This allows the GPU to simultaneously send and receive data on the full-duplex PCIe bus. However, the GeForce GPUs traditionally have only one DMA copy engine. Does the Titan have one or two copy engines? Since Titan has Tesla-class DP, I thought it might also have two copy engines.

You can run the "deviceQuery" command that is a part of the CUDA SDK to find out.
Ryan Smith - Tuesday, February 19, 2013 - link
1 copy engine. The full output of deviceQuery is below.

CUDA Device Query (Runtime API) version (CUDART static linking)
Detected 1 CUDA Capable device(s)
Device 0: "GeForce GTX TITAN"
CUDA Driver Version / Runtime Version 5.0 / 5.0
CUDA Capability Major/Minor version number: 3.5
Total amount of global memory: 6144 MBytes (6442123264 bytes)
(14) Multiprocessors x (192) CUDA Cores/MP: 2688 CUDA Cores
GPU Clock rate: 876 MHz (0.88 GHz)
Memory Clock rate: 3004 Mhz
Memory Bus Width: 384-bit
L2 Cache Size: 1572864 bytes
Max Texture Dimension Size (x,y,z) 1D=(65536), 2D=(65536,65536), 3D=(4096,4096,4096)
Max Layered Texture Size (dim) x layers 1D=(16384) x 2048, 2D=(16384,16384) x 2048
Total amount of constant memory: 65536 bytes
Total amount of shared memory per block: 49152 bytes
Total number of registers available per block: 65536
Warp size: 32
Maximum number of threads per multiprocessor: 2048
Maximum number of threads per block: 1024
Maximum sizes of each dimension of a block: 1024 x 1024 x 64
Maximum sizes of each dimension of a grid: 2147483647 x 65535 x 65535
Maximum memory pitch: 2147483647 bytes
Texture alignment: 512 bytes
Concurrent copy and kernel execution: Yes with 1 copy engine(s)
Run time limit on kernels: Yes
Integrated GPU sharing Host Memory: No
Support host page-locked memory mapping: Yes
Alignment requirement for Surfaces: Yes
Device has ECC support: Disabled
CUDA Device Driver Mode (TCC or WDDM): WDDM (Windows Display Driver Model)
Device supports Unified Addressing (UVA): Yes
Device PCI Bus ID / PCI location ID: 3 / 0
Compute Mode:
< Default (multiple host threads can use ::cudaSetDevice() with device simultaneously) >
deviceQuery, CUDA Driver = CUDART, CUDA Driver Version = 5.0, CUDA Runtime Version = 5.0, NumDevs = 1, Device0 = GeForce GTX TITAN
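For readers who just want the copy-engine count without wading through the full deviceQuery output, the same information is exposed programmatically as the asyncEngineCount field of cudaDeviceProp in the CUDA runtime API. A minimal sketch (requires a CUDA-capable GPU and the CUDA toolkit; compile with nvcc):

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable devices found\n");
        return 1;
    }
    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        // asyncEngineCount: 1 = a single DMA copy engine (typical GeForce),
        // 2 = separate engines for upload and download, letting transfers
        //     in both directions overlap on the full-duplex PCIe bus
        //     (typical Tesla/Quadro)
        std::printf("Device %d: %s, copy engines: %d\n",
                    dev, prop.name, prop.asyncEngineCount);
    }
    return 0;
}
```

With one copy engine, as reported for Titan above, a host-to-device and a device-to-host cudaMemcpyAsync issued in different streams will still serialize against each other, though either can overlap with kernel execution.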
tjhb - Tuesday, February 19, 2013 - link
Thank you! It seems to me NVIDIA is being incredibly generous to CUDA programmers with this card. I can hardly believe they've left FP64 capability at the full 1/3 rate. (The ability to switch between 1/24 at a high clock and 1/3 at reduced clock seems ideal.) And we get 14 of 15 SMXs (a nice round number).
Do you know whether the TCC driver can be installed for this card?