GeForce 6200 TurboCache: PCI Express Made Useful
by Derek Wilson on December 15, 2004 9:00 AM EST - Posted in GPUs
Half-Life 2 Performance
Here is the raw data we collected from our Half-Life 2 performance analysis. The data shows fairly consistent performance across all the levels we test.

Half-Life 2 1024x768 Performance (frames per second)

| Card | at_canals_08 | at_coast_05 | at_coast_12 | at_prison_05 | at_c17_12 |
|---|---|---|---|---|---|
| GeForce 6200 (128-bit) | 43.6 | 65.23 | 50.4 | 41.57 | 45.84 |
| GeForce 6200 (TC 64-bit) | 38.26 | 61.81 | 48.2 | 38.92 | 42.11 |
| Radeon X300 | 34.3 | 57.54 | 39.14 | 32.62 | 40.69 |
| GeForce 6200 (TC 32-bit) | 31.66 | 51.09 | 39.72 | 30.69 | 34.1 |
| Radeon X300 SE | 29.59 | 52.42 | 34.91 | 30.55 | 37.01 |
The X300 SE and the new 32-bit TurboCache card are very evenly matched here. The original 6200 leads every time, but the TurboCache versions hold their own fairly well. The regular X300 isn't quite able to keep up with the 64-bit 6200 TurboCache, especially in the more GPU-limited levels, which translates into a higher average for the TurboCache part at higher resolutions.
Unlike in Doom 3, the TurboCache parts are able to keep up with the 128-bit 6200 fairly well. This comes down to the amount of memory bandwidth required to process each pixel: Half-Life 2 is more evenly balanced between being GPU limited and memory bandwidth limited.
We can see from the resolution scaling chart that, at resolutions other than 1024x768, the competition between the X300 series and the 6200 TurboCache parts is a wash. It is impressive that all of these cards run Half-Life 2 at very playable frame rates in all of our tests.
43 Comments
manno - Wednesday, December 15, 2004 - link
Any chance we can see numbers for this thing on an Intel system with higher-bandwidth DDR2 memory? Maybe even overclocked DDR2?

R3MF - Wednesday, December 15, 2004 - link
"The next thing that we're waiting to see is a working implementation of virtual memory for the graphics subsystem. The entire graphics industry has been chomping at the bit for that one for years now."3DLabs VP10 GPU has a feature that allows system memory to be treated as virtual memory for the GPU, and it has been out 12 months or more.
faboloso112 - Wednesday, December 15, 2004 - link
I'm an ATI fanboi, but good job nvidia! Your products keep on getting better and better! (Though I don't plan on downgrading to any of these from my 9800 Pro... it's still a great budget card.) nvidia has certainly won this generation of the graphics card wars... let's see what happens when the next-gen lineup comes out (though I doubt that'll be anytime soon).