Awesome New NVIDIA GTX TITAN Gets Previewed Around the Web
Unfortunately, official benchmark reviews are still under NDA until the 21st, so hard performance numbers remain tantalisingly out of reach for now, leaving us all on tenterhooks for another two days. That hasn't stopped several websites from posting previews of this hot new card, however, which NVIDIA does allow. While we unfortunately don't have a GTX Titan to review for ourselves (please, NVIDIA...?), rest assured that we will bring you a roundup of all the best reviews of this fantastic card. In the meantime, the various previews around the web do reveal some hard data about the new card, confirming the latest leaks that we brought you.
First off, the price: you guessed it, it's going to cost precisely one arm and one leg. NVIDIA has officially priced it at $999 in their press release, which should translate to around £770-£850 in the UK. Interestingly, looking at the websites of various online retailers, I can see that the GTX 690 has already dropped to around £740 for the cheapest models, which doesn't really surprise me. This leaves the Titan as the graphics card for hardcore enthusiasts without deep pockets to forever lust after, unable to satisfy their cravings. That'll be me then... just check out its hot looks!
Next up, we have some specifications to pore over, courtesy of NVIDIA's official slide:
What's not shown here is the size of the card: it's 10.5 inches long and dual slot, shorter than a GTX 690 and only half an inch longer than the GTX 680, so fitting it won't be a problem in any half-decent PC chassis. The relatively frugal TDP of 250W is impressive too for a GPU containing 7.1 billion transistors, built by TSMC on a 28nm process.
Looking at the full specs on NVIDIA's website confirms that the version of the GK110 GPU fitted to the Titan has 2688 CUDA Cores, meaning that one SMX unit is disabled for a total of 14, as previously reported. A fully enabled GK110 has 15 SMX units and 2880 CUDA Cores, but nothing ships in that configuration yet; even the Tesla K20X HPC card uses the same 14-SMX, 2688-core setup, costs around $3,500 and can't be used as a graphics card, so it's no use for gaming, regardless of how rich and enthusiastic the gamer. Hopefully, in time, NVIDIA will release a version of the Titan with all SMX units enabled as manufacturing yields improve.
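For anyone wanting to sanity-check those figures, each GK110 SMX unit contains 192 CUDA Cores, so the core counts fall straight out of the SMX count. A quick sketch:

```python
# Each GK110 SMX unit contains 192 CUDA Cores, so the total
# core count follows directly from how many SMX units are enabled.
CORES_PER_SMX = 192

for smx_units in (13, 14, 15):
    print(f"{smx_units} SMX units -> {smx_units * CORES_PER_SMX} CUDA Cores")

# 13 SMX units -> 2496 CUDA Cores  (Tesla K20)
# 14 SMX units -> 2688 CUDA Cores  (GTX Titan, Tesla K20X)
# 15 SMX units -> 2880 CUDA Cores  (a fully enabled GK110)
```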
The Titan has a neat bling feature which allows the brightness of the green GEFORCE GTX logo to be controlled, as explained by Legit Reviews:
The GeForce GTX logo on the edge of the TITAN board is also LED backlit just like the one on the GeForce GTX 690! This LED acts as a power indicator, lighting up when the board is in use. The intensity of this LED can be manually adjusted using tools provided by select NVIDIA add-in card partners; you can even adjust the intensity based on GPU utilization, so the LED will shine brighter as GPU utilization increases.
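Just to make that utilisation-based mode concrete, here's a hypothetical sketch of such a mapping; the real logic lives in the partners' tuning utilities, and every value below is invented for illustration:

```python
def led_brightness(gpu_utilisation: float,
                   floor: float = 0.2, ceiling: float = 1.0) -> float:
    """Map GPU utilisation (0.0-1.0) to a logo LED intensity, so the
    GEFORCE GTX logo glows brighter as the card works harder.
    The floor/ceiling values are invented for illustration."""
    utilisation = max(0.0, min(1.0, gpu_utilisation))
    return floor + (ceiling - floor) * utilisation

print(led_brightness(0.0))  # 0.2 -> idle, dim glow
print(led_brightness(1.0))  # 1.0 -> full load, full brightness
```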
Next, the exact connectivity available is finally revealed. Display outputs consist of two dual-link DVI ports, one HDMI port and one DisplayPort 1.2 port. Unsurprisingly, the card supports 4K resolution (4096 x 2160), which may finally begin to justify that enormous 6GB of onboard RAM.
I've previously stated that shear graphics horsepower (sorry) counts for little if the card drives you batty with noise, and if the noise comparison chart from NVIDIA below is to be believed, the Titan performs very well here in a three-way SLI setup, especially when compared to similar setups with GTX 680 and AMD HD 7970 graphics cards.
Interestingly, while NVIDIA won't release framerate benchmarks at the moment, they're fine with giving us "hard" numbers for noise, showing the AMD setup assaulting our ears with around 62dBA, which would indeed be enough to drive many enthusiasts batty, or at least to replace those noisy stock coolers. The GTX 680 rig is somewhat better at around 53dBA, but the Titan does really well with just 48dBA's worth of annoyance with three of them running. Note that the decibel scale is logarithmic: every 3dB reduction halves the sound power, and a 10dB reduction is perceived as roughly half as loud. That makes the Titan setup way quieter than the AMD one, a very important factor when choosing a graphics card, in my book. In fact, NVIDIA describe it as "whisper quiet" on their Titan promo page.
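For the curious, here's a minimal sketch of the decibel arithmetic behind those figures, comparing the 62dBA AMD setup to the 48dBA three-way Titan setup:

```python
def power_ratio(delta_db: float) -> float:
    """Ratio of sound power implied by a dB difference (3 dB ~ 2x power)."""
    return 10 ** (delta_db / 10)

def loudness_ratio(delta_db: float) -> float:
    """Rough perceived-loudness ratio (a 10 dB rise sounds about twice as loud)."""
    return 2 ** (delta_db / 10)

delta = 62 - 48  # three HD 7970s vs. three-way Titan SLI, in dBA
print(f"Sound power ratio:        {power_ratio(delta):.1f}x")   # ~25.1x
print(f"Perceived loudness ratio: {loudness_ratio(delta):.1f}x")  # ~2.6x
```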
Note that I said "hard" numbers: noise measurement has a large subjective factor to it, especially regarding the quality of that noise, such as fan whine and any other sounds that stand out, so I'll wait for the reviews before proclaiming a solid NVIDIA win here, although I don't really doubt it judging from my experience of their cards. My GTX 285 and GTX 580 were pretty reasonable noise performers for the framerates they delivered in their day. The controversial GTX 480 was apparently a bit of a disappointment in this area, but I never had one, so can't say from personal experience.
Actually, NVIDIA have released some hard rendering performance numbers using 3DMark Vantage (Performance Preset) as can be seen below. The card was tested on an unspecified Intel i7 3.3GHz CPU, with 8GB system RAM, unspecified X79 motherboard and PhysX enabled.
This comparison gives us a good idea that it's damn fast (obviously) and beats a GTX 690 by a small margin. Interestingly, it also shows the Titan as only about twice as fast as a GTX 285 and GTX 480, which oddly both score almost the same here, suggesting the test may be CPU-limited at this preset. Again, we'll wait for the final reviews before passing judgement.
Next up, we have GPU Boost 2.0, which I talked about previously. We won't go into the details of this surprisingly complex feature here, but will instead let PC Per explain it all, in great detail. The bar charts below give an easy-to-understand idea of how much GPU Boost 2.0 improves performance, while the graph below them looks suitably complex and confusing unless one has read PC Per's preview. There are lots more graphs like that over there, too...
Here's an interesting caveat with GPU Boost 2.0, as PC Per explains:
This new version of GPU Boost definitely seems more in line with the original goals of the technology but there are some interesting caveats. First, you'll quickly find that the clock speeds of TITAN will start out higher on a "cold" GPU and then ramp down as the temperature of the die increases. This means that doing quick performance checks of the GPU using 3DMark or even quick game launches will result in performance measurements that are higher than they would be after 5-10 minutes of gaming. As a result, our testing of TITAN required us to "warm up" the GPU for a few minutes before every benchmark run.

Oh. Somehow, a graphics card that slows down as you use it doesn't sound so appealing, even if the technology does in fact squeeze the utmost performance from it. I can just see cryocooling becoming popular to prevent that performance drop...
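To illustrate what that warm-up means for anyone benchmarking one of these cards, here's a minimal sketch that polls the die temperature until it plateaus before a benchmark run starts. The nvidia-smi query flags are real; the polling interval and plateau threshold are illustrative guesses, and a GPU-heavy workload is assumed to already be running:

```python
import subprocess
import time

def gpu_temperature() -> int:
    """Read the current GPU die temperature in Celsius via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip().splitlines()[0])

def warm_up(poll_seconds: int = 10, stable_delta: int = 1) -> None:
    """Wait until the die temperature plateaus, so GPU Boost 2.0
    has settled on its sustained clocks before benchmarking."""
    previous = gpu_temperature()
    while True:
        time.sleep(poll_seconds)
        current = gpu_temperature()
        print(f"GPU temperature: {current} C")
        if abs(current - previous) <= stable_delta:
            break  # temperature has levelled off; clocks should be steady
        previous = current

# Start a GPU-heavy workload first, call warm_up(), then run the benchmark.
```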
Finally, we come to that odd-sounding "80Hz" adaptive vsync that I discussed yesterday. From the limited information available then, it just didn't make sense. Now that we have the official explanation, however, it makes perfect sense, even if it makes for surprising reading: what this feature amounts to is dynamic control of the refresh rate of the monitor! NVIDIA calls this "display overclocking" and describes it thus:
With GPU Boost 2.0, we’ve added a new feature that makes this possible: display overclocking. Using tools provided by our add-in card partners, you may be able to overclock the pixel clock of your display, allowing you to hit higher refresh rates.

Wow! What this amounts to is a variable temporal sampling rate (refresh rate), the same idea that's been used for MP3 sound recordings for years but has never been applied to displays, until now. By keeping the display refresh rate just under what the system can render, while maintaining vsync lock, smooth judder-free gameplay is maintained. How effective this is, and whether it will lead to odd dynamic visual artefacts, remains to be seen.
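To put rough numbers on what "overclocking the pixel clock" means: the pixel clock is simply the total scanned pixels per frame (active plus blanking) multiplied by the refresh rate. A minimal sketch, using the standard 2200 x 1125 total timing for 1080p:

```python
def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock needed to scan h_total x v_total pixels refresh_hz times a second."""
    return h_total * v_total * refresh_hz / 1e6

# Standard 1080p timing: 2200 x 1125 total pixels including blanking.
for hz in (60, 80):
    print(f"{hz} Hz -> {pixel_clock_mhz(2200, 1125, hz):.1f} MHz")

# 60 Hz -> 148.5 MHz  (the standard 1080p60 pixel clock)
# 80 Hz -> 198.0 MHz  (what "display overclocking" asks the panel to accept)
```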
Note that the monitor has to tolerate the higher pixel clock for this to work, but it appears that current monitors not designed with this in mind may yet be able to make use of it. The official reviews should clear this point up soon.
This is a feature that will be enabled through third-party utilities rather than the driver control panel, and EVGA's popular Precision X utility will be updated to support it. Judging by the looks of it though, it won't "overclock" to 120Hz refresh and beyond, so those with 120Hz-capable displays (or even 144Hz, in the case of the Asus VG278HE that I have) won't be any better off.
There's another crucially important point to consider too, which is that it very likely won't work with LightBoost enabled. As I recently discovered, using 2D mode and LightBoost in my games delivers perfectly motion blur-free gaming, something that has never been seen on an LCD monitor before and is frankly awesome. It makes an LCD as good as the previous gold standard, the CRT, and then some, since it delivers high resolution and perfect clarity along with 120Hz refresh, something that CRTs couldn't manage. In short, it delivers a superlative gaming experience and is a feature whose benefits I can't stress enough.
There is no word yet on what video driver the Titan will use: will it be covered by a new main release in a couple of days, like the one we had yesterday, or will it get its own special driver for now? We'll find out very soon.
All the raw data, photos and graphics have come from the websites below which previewed the card.
Article picture courtesy of Egypthardware.
Official NVIDIA GeForce GTX Titan promo page, product page and full specifications.
GK110 block diagram:
And finally, the naked GPU porno shot you've been waiting for throughout this whole preview: