NVIDIA's GeForce GTX Titan was an absolute beast when it launched. With 7.1 billion transistors and compute capabilities that set it apart from high-end consumer GPUs, the Titan was worthy of its name. It took NVIDIA 9 months to make a gaming-focused version: the GeForce GTX 780 Ti. Although the 780 Ti gave up double-precision floating point (FP64) performance and half the memory (3GB of GDDR5 instead of 6GB), it made up for the deficit by enabling all 15 SMXs and running its memory at a 16% higher frequency. The result was that the Titan was a better compute card, while the 780 Ti was better for gamers. You couldn't have both; you had to choose one or the other.

Today NVIDIA is letting its compute-at-home customers have their cake and eat it too with the GeForce GTX Titan Black. The Titan Black is a full GK110 implementation, just like the GTX 780 Ti, with all of the compute focus of the old GTX Titan. That means you get FP64 performance at a full 1/3 of the card's FP32 rate (compared to 1/24 on the 780 Ti). It also means that there's a full 6GB of GDDR5 on the card, up from 3GB on the 780 Ti.

|                       | GTX Titan Black | GTX 780 Ti | GTX Titan | GTX 780 |
|-----------------------|-----------------|------------|-----------|---------|
| Stream Processors     | 2880            | 2880       | 2688      | 2304    |
| Texture Units         | 240             | 240        | 224       | 192     |
| ROPs                  | 48              | 48         | 48        | 48      |
| Core Clock            | 889MHz          | 875MHz     | 837MHz    | 863MHz  |
| Boost Clock           | 980MHz          | 928MHz     | 876MHz    | 900MHz  |
| Memory Clock          | 7GHz GDDR5      | 7GHz GDDR5 | 6GHz GDDR5 | 6GHz GDDR5 |
| Memory Bus Width      | 384-bit         | 384-bit    | 384-bit   | 384-bit |
| FP64 Rate             | 1/3 FP32        | 1/24 FP32  | 1/3 FP32  | 1/24 FP32 |
| TDP                   | 250W            | 250W       | 250W      | 250W    |
| Transistor Count      | 7.1B            | 7.1B       | 7.1B      | 7.1B    |
| Manufacturing Process | TSMC 28nm       | TSMC 28nm  | TSMC 28nm | TSMC 28nm |
| Launch Date           | 02/18/14        | 11/07/13   | 02/21/13  | 05/23/13 |
| Launch Price          | $999            | $699       | $999      | $649    |
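The practical difference between the 1/3 and 1/24 FP64 rates is easiest to see as theoretical peak throughput. A quick sketch, assuming the usual peak-FLOPS approximation (CUDA cores × 2 ops per clock via fused multiply-add, at the boost clocks above):

```python
# Theoretical peak throughput: cores * 2 FLOPs/clock (FMA) * clock (GHz) -> GFLOPS.
# fp64_ratio is the FP64:FP32 rate (1/3 for Titan Black, 1/24 for 780 Ti).
def gflops(cores, clock_ghz, fp64_ratio):
    fp32 = cores * 2 * clock_ghz
    return fp32, fp32 * fp64_ratio

for name, clock, ratio in [("GTX Titan Black", 0.980, 1 / 3),
                           ("GTX 780 Ti", 0.928, 1 / 24)]:
    fp32, fp64 = gflops(2880, clock, ratio)
    print(f"{name}: {fp32:.0f} GFLOPS FP32, {fp64:.0f} GFLOPS FP64")
# GTX Titan Black: 5645 GFLOPS FP32, 1882 GFLOPS FP64
# GTX 780 Ti: 5345 GFLOPS FP32, 223 GFLOPS FP64
```

Real-world numbers will depend on boost behavior and workload, but the roughly 8x gap in double-precision throughput is the whole story of the Titan brand.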

Unlike the original Titan, the Titan Black makes no compromises on frequency. The memory runs at a full 7GHz data rate, just like the 780 Ti. The GK110 core and boost clocks are up by 1.6% and 5.6% compared to the 780 Ti, respectively. Compared to the original Titan, we're talking about anywhere from a 13.8% to a 19.9% increase in performance on compute bound workloads, or a 16.7% increase on memory bandwidth bound workloads.
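Those percentages fall straight out of the spec table. As a back-of-the-envelope check (treating compute throughput as proportional to CUDA cores × clock, and bandwidth as proportional to the memory data rate):

```python
# Compute-bound scaling tracks (CUDA cores x clock); bandwidth tracks memory data rate.
def pct_gain(new, old):
    return (new / old - 1) * 100

titan       = {"cores": 2688, "base": 837, "boost": 876, "mem_ghz": 6.0}
titan_black = {"cores": 2880, "base": 889, "boost": 980, "mem_ghz": 7.0}

base_gain  = pct_gain(titan_black["cores"] * titan_black["base"],
                      titan["cores"] * titan["base"])
boost_gain = pct_gain(titan_black["cores"] * titan_black["boost"],
                      titan["cores"] * titan["boost"])
mem_gain   = pct_gain(titan_black["mem_ghz"], titan["mem_ghz"])

print(f"base: +{base_gain:.1f}%, boost: +{boost_gain:.1f}%, memory: +{mem_gain:.1f}%")
# base: +13.8%, boost: +19.9%, memory: +16.7%
```

The low end of the compute range assumes the card sits at its base clock, the high end assumes it sustains its boost clock.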

Gaming performance should be effectively equal to the 780 Ti. NVIDIA doesn't expect a substantial advantage from the core/boost clock gains and thus didn't bother with a sampling program for the Titan Black.

The heatsink looks identical to the original Titan, just in black (like the 780 Ti). We've got dissection shots in the gallery below.

We've heard availability will be limited on the GeForce GTX Titan Black. Cards will retail for $999, just like the original Titan.

The Titan Black should be a no-compromises card that can deliver on both gaming and compute fronts. It's clear that NVIDIA wants to continue to invest in the Titan brand; the only question going forward is what NVIDIA will replace GK110 with, and when.


  • vision33r - Tuesday, February 18, 2014 - link

    How can they measure a bitcoin bench when the difficulty skyrockets every couple of days or weeks?
  • Death666Angel - Tuesday, February 18, 2014 - link

    That's network difficulty; the hash rate of your device stays the same, independent of the network hash rate.
  • MrSpadge - Tuesday, February 18, 2014 - link

    Although we'll probably never know how much of that OpenCL weakness is actually due to nVidia not wanting their good CUDA compiler to create binaries which work just as well on other hardware.
  • TheJian - Wednesday, February 19, 2014 - link

    Umm, this is why it wins in every game. I don't call that a compromise when it is done to make a 780 Ti that outperforms the 290x by 10-20%.
    780 Ti OC card vs. 290x OC card: the 780 Ti rules, period, as most games were 20%+ in the 780 Ti's favor.

    OpenCL means nothing. You run CUDA for PRO apps. There is a reason NV owns 90% of the workstation market, which is ONLY running on CUDA with 200+ apps supporting it. Mining crap and F@H makes you NOTHING today (and F@H is just a waste of electricity, you get nothing). You won't even make 1/2 your card back before ASICs for LTC and all other coins (programmable to cover everything with the next round) come in 3-4 months. The first wave was specific to BTC, but the next wave is supposedly for everything. It will cost you most of the profits in electricity right now, and even worse with current AMD pricing being nowhere near the fake MSRP. I call it fake because they haven't been sold for that, so when was it real? They are not sold out anywhere now, and haven't been for a long while. PNY says there is a shortage, which is what makes sense based on the quarterly report for AMD showing ZERO GPU profits that were NOT from consoles.

    IF you are selling out of your top two GPUs (290/290x) you should make more than 10mil console SoCs x ~$12 (120mil - AMD says low double digits, so lower than 15% margins; I'm guessing 12, as those numbers make sense). But they didn't. 8mil sold during the quarter with another 2mil in the pipe being boxed up by Sony/MS. So it is easy to see AMD made nothing on 290/290x, or as I suspect (and PNY confirms) they are not selling because they don't have many, which is REALLY why they are higher priced. They can't make enough that run at 1GHz without special cooling. Is AMD really this dumb at predictions (after 20yrs of making cards, ATI or AMD people should know this) or did they just blow 1GHz smoke while they had to know it couldn't be had on many chips? At or near equal pricing it is dumb to walk home with anything but your "compromised" Kepler :)

    How many games use compute, OpenCL or "good integer"? If there were many, AMD would not be losing by 20% in most games, right? There is no point in NV supporting tech that they already have with CUDA in Pro stuff, so you won't see anything from them on this useless stuff for games until games actually start using it. Why waste silicon like AMD on this crap? You think shared crap will save AMD? CUDA 6 has that with Maxwell too. By the time it's useful NV will be shipping Maxwell (actually probably not even useful until Volta ship times, software needs to catch up). AMD has hardware but no software, NV has software (CUDA 6) but no hardware yet...LOL. Neither side wins in a battle where neither side's ideas are being used in anything yet anyway.

    How many games can you play while mining or F@H? NONE. Ok then ;) If any of the things you mention mattered, CUDA wouldn't rule 90% of the workstation market, and the 780 Ti wouldn't be owning AMD in games. Ref to ref, AMD loses. OC'd to max on both AMD/NV, again AMD loses. Special cooling on both, AMD loses. Drivers? AMD loses. Can you say phase 3? :( Is FreeSync working in anything I can buy on a desktop? Again loses. Profits and balance sheets? AMD loses. Do you see the pattern yet? CPUs? AMD loses. They seem to have stopped totally bleeding, but the damage is seriously done already. Can't compete at the top end of GPUs or CPUs, and no ARM for tablets/phones for at least another year, and even then they have no modem, so phones will be tougher for them than NV (which finally works on AT&T now with a Nov certification). Intel can't get into many phones for the same reason until they get a modem fully integrated. So the largest growing market is out for AMD, which is why they went server first (no modem means no phones, pretty much). I don't believe Seattle is IN HOUSE either (just an ARM clone basically), so I'm guessing NV's Denver will be better, just like Apple's Swift and Qcom's CPU cores.

    If Kepler is "compromised", then AMD is just junk that fails to dominate anything that matters? All of the things we see AMD win in, mean nothing to most people. Mining?...Don't make me laugh. F@H? 166K users have downloaded it, so out of 350mil-384mil PC's sold each year, basically NOBODY cares about it either. Easy to see why NV couldn't care less about your junk purposes right? I'd rather have R&D spent on GAMES and optimizing for those.

    The only game Anandtech shows using compute is Civ5, which NV is tops in as shown in the 750 Ti article, right? So let me know when your statements actually mean something. :)
    750 Ti on top of Civ5.

    Also they test compute on Sony Vegas, which is a known NV hated app ;) They refuse to test Adobe here because it would be using Cuda which would show the exact opposite of Sony Vegas which again is used by FAR fewer people than Adobe. So again those results mean little to me as I'd only buy Adobe for photo, video, AE etc when using NV hardware. You should always use CUDA when available and testing without it for anything that CAN use it just by switching apps is dumb. But Anandtech has an AMD portal so you shouldn't be surprised by the AMD a$$ kissing ;) All NV cards will do badly in Sony Vegas, you should use ADOBE on NV period. But that wouldn't be in Anandtech's AMD interests now would it? :) Your problem is, you're paying too much attention to Anandtech's OpenCL/AMD slant. Reality is you run other stuff using cuda which is why AMD owns less than 10% of the workstation market, as they all run CUDA and ADOBE. Vegas is at best a 2nd runner. Adobe's suite is #1. Watch for Anandtech to suddenly do an about face on Adobe when AMD finally gets OpenCL working right in it ;) ROFL. We've seen AMD advertising in slides they will be good in Adobe one day, but not software to benchmark yet. Why is anandtech so afraid to show CUDA vs. AMD? Why not run Adobe (NV) vs. Vegas (amd) or at least test with the #1 suite out instead of picking Sony Vegas to show AMD can win something? I don't know how much AMD pays for this favorable treatment, but it really hurts Anandtech's credibility. No NV portal hurts them too.
  • jimjamjamie - Wednesday, February 19, 2014 - link

    Wow, you sure told us. Fight the good fight, champ.
  • dirk_kuyt - Wednesday, February 19, 2014 - link

    wow, what a waste of time typing this drivel must have been! I wish I could say I've never seen someone try so hard to hate on something that affects them so little... but that's pretty much everyone. we all think we know everything about everything. all I know is I have an AMD HD 7970 that I paid $200 for and it plays games just fine. I've also never had a driver problem with it, even using beta drivers. just chill out, man
  • TheElMoIsEviL - Friday, February 21, 2014 - link

    TheJian needs to lay off the crack pipe for a bit.
  • Ozminer - Thursday, October 2, 2014 - link

    you need to lay off the magic shrooms
  • 3DVagabond - Sunday, February 23, 2014 - link

    Without sources for any of your claims (and your one citation has nothing to do with the Titan Black), your post has zero cred.
  • wetwareinterface - Sunday, February 23, 2014 - link

    first off, the R9 290 and 290x cards are actually MSRP'd at ~$430 and $580 respectively. and yes, the litecoin miners are buying them up, driving up pricing. trust me on this: the Asus custom-cooled R9 cards are sold to retailers below MSRP. retailers are just capitalizing on the litecoin craze and jacking up pricing on their end. the reason the cards are available is that the price-to-performance ratio is so skewed now that the retail pricing of the 780 Ti looks more attractive. I just bought a 290 for $480 with my employee discount (which is 10% over cost) at work, so I know that the price hike is on the retailer side, not the card manufacturer's or AMD's side.

    at $480 my Asus R9 290 is a much better value than a $720 780 Ti.
