ATI's New Radeon X850 and X800 Lines: A Smorgasbord of GPUs
by Anand Lal Shimpi & Derek Wilson on December 1, 2004 9:41 AM EST - Posted in GPUs
ATI's Radeon X850 XT Platinum Edition
Compared to the X800 XT:
Compared to ATI's previous flagship, the X850 XT PE offers a 0 - 10% increase in performance, with the biggest gains coming in Battlefield and Doom 3. The performance improvements aren't negligible, but they're definitely no reason to upgrade from an X800 XT. If you're stuck choosing between the two, what's another $50 when you're already spending $500 on a video card?
Compared to NVIDIA's GeForce 6800 Ultra:
Next up we have the X850 XT PE compared to NVIDIA's flagship, the GeForce 6800 Ultra, which is currently available only through OEMs in a PCI Express version.
ATI has always done better in Battlefield than NVIDIA has, so it's no surprise to see the X850 XT PE with a huge advantage there. The rest of the games are basically a wash with the exception of Doom 3 and Half Life 2. Under Doom 3, the X850 XT PE is about 15% slower than the GeForce 6800 Ultra, but the tables are turned as soon as you look at Half Life 2, where the X850 XT PE is almost 17% faster than the GeForce 6800 Ultra. So which card do you pick? Well, both happen to run every single game out on the market just fine at the highest resolutions/detail settings so you can't really go wrong either way. The issue here is predicting whether more developers will use Valve's Source engine or id's Doom 3 engine for future games, and at this point that's a tough prediction to make.
The Radeon X850 XT Platinum Edition basically offers smoother playability at 1600 x 1200 in all of today's games (including Half Life 2 and Doom 3) than either of the previous reigning champions, the X800 XT and the GeForce 6800 Ultra. Now let's have a look at the rest of the X850 line...
69 Comments
Booty - Wednesday, December 1, 2004 - link
Yeah, this product naming is getting out of control - I don't even want to take the time to try to get them straight. I'll wait until everything's actually available, then try to see what the best option in each price range is. Right now, though, I have to go lie down - trying to remember what product is which gave me a headache.
bob661 - Wednesday, December 1, 2004 - link
#16 I would like to know too, since we use 6800GT's in a CAD environment and haven't had ANY problems with them.
BenSkywalker - Wednesday, December 1, 2004 - link
Any chance of seeing high res testing again?
Alphafox78 - Wednesday, December 1, 2004 - link
#9, how is the 6800GT "slightly unreliable"?? I've had one for months and have no "reliability issues."
D0rkIRL - Wednesday, December 1, 2004 - link
I'm looking into that X800 now, as a possible budget upgrade, so I don't have to switch over to nVidia and get the 6600GT AGP.
Entropy531 - Wednesday, December 1, 2004 - link
The X800 XL has some potential if it OCs well. The 6800GT is still the best option though, if you go PCI-E.
shabby - Wednesday, December 1, 2004 - link
Worst refresh release ever! A minor bump in core/mem does not equal 50 bucks more. I'm thinking that the AGP x800 cards are not going to fall in price at all; these PCIe-only cards are not competing against them, so it makes sense (for ATI) to keep prices high for both the x800 and x850 cards.
Now let's hope nvidia comes out with a 500/1200 6900 ultra :)
istari101 - Wednesday, December 1, 2004 - link
Considering the fact that both ATI and Nvidia are technological think tanks, you'd think they could do a less confusing job of naming their cards. :|
flexy - Wednesday, December 1, 2004 - link
Pricing for the "high end" cards is utterly ridiculous... the (so-called) high-end X850, which is only a refresh of current tech, is totally overpriced, has dual slot cooling, etc... And R520 is already taped out...
Who pays $520 for this stupid card, which doesn't even have SM3.0 and is only marginally better than previous versions? Retards?
jkostans - Wednesday, December 1, 2004 - link
I miss the days of $200-$300 video cards being the top of the line (Voodoo2, TNT2, GeForce 1-4). I dunno why anyone would want to spend any more than that on a video card unless they've got money to burn.