Nvidia GeForce GTX 200 vs. ATi Radeon HD 4800

Graphics pixels
Size isn't everything. The latest offerings from the two big players look very different, but the results are less clear cut

About three years ago, PC processors went through a technical – and somewhat philosophical – revolution.

The Intel Pentium D of 2005 was the first dual-core desktop CPU and it signalled the end of the quest for ever-higher clock speeds. Instead, the CPU industry shifted to a theoretically more efficient multi-core, parallel processing approach.

The first signs of an equivalent change in the graphics chip market appeared in 2007. First, Nvidia elected not to replace the GeForce 8800 GTX GPU with a large, significantly more powerful chip. Its alternative was a die-shrunk GeForce 8800, focusing on efficiency and cost savings rather than outright performance.

One step beyond

AMD's graphics subsidiary ATi took the same approach when it moved from Radeon HD 2900 to Radeon HD 3800 Series GPUs. In fact, it went one step further. ATi released the dual-chip Radeon HD 3870 X2 and announced that as far as it was concerned, the game was up for big graphics boards based on a single monolithic GPU die; the future would be multi-chip.

All of which sets up a rather intriguing backdrop for the introduction of a pair of new GPUs from ATi and Nvidia. Once again, ATi has focused on efficiency and affordability.

Nvidia, on the other hand, has gone old school and delivered a single-die graphics chip of truly mammoth proportions.

Nvidia's attempt

You might think that these new pixel pushers are not directly comparable, but the competition between ATi and Nvidia will be just as fierce. The difference now is that the contest is no longer a clean fight between two graphics chips. Instead, it's a battle between two distinct design philosophies and business models.

First out of the blocks is Nvidia's beastly new GeForce GTX 200 series. By any metric, it's a monumentally powerful – even intimidating – new graphics chip. The GPU at its heart contains an incredible 1.4 billion transistors. That's roughly double the number in GeForce 8800 series GPUs. Consequently, the GTX 200 series packs some seriously beefy specifications.

In terms of functional units, the shader count is up from 128 in the GeForce 8800 series to 240. Things get a little more complicated when it comes to comparing new and old texture processing and pixel outputs per clock. Suffice to say that with 32 render output units and 80 texture address and filter units, Nvidia has boosted the new GPU's functional heft in all parts of its architecture by at least 25 per cent – and usually much more.

Long memory

The final piece of the GTX 200 puzzle is memory technology. Here Nvidia has also taken the big iron approach by pairing established GDDR3 memory with a beefy 512-bit interface.

All told, the GTX 200 is a massive 576mm² chip. It's so big that no more than 100 can be squeezed onto each of the 65nm wafers used by Nvidia's production partner TSMC. To put that into context, Intel can cram approximately six times as many dual-core Penryn CPU dies or 25 times as many Atom processors into the same space. In other words, Nvidia's new GPU is an extremely expensive chip to manufacture.
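The wafer arithmetic above can be sketched with a standard gross-die approximation. A minimal sketch – the 300mm wafer size is TSMC's standard, but the Penryn and Atom die areas are rough assumptions for illustration, not figures from the article:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Rough gross-die estimate: wafer area divided by die area,
    minus a correction for partial dies lost around the circular edge."""
    radius = wafer_diameter_mm / 2
    wafer_area = math.pi * radius ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

print(dies_per_wafer(576))   # GTX 200 die: ~94, in line with "no more than 100"
print(dies_per_wafer(107))   # ~107mm² dual-core Penryn (assumed area)
print(dies_per_wafer(25))    # ~25mm² Atom (assumed area)
```

With those assumed die sizes the estimator lands close to the article's ratios: roughly six times as many Penryn dies and well over twenty times as many Atoms per wafer.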

Unsurprisingly, it's also extremely expensive to buy. At launch, two models are available, the GeForce GTX 280 and the GeForce GTX 260. The former is the full-fat offering with all of the abilities detailed above, core and shader clocks of 602MHz and 1,296MHz respectively and a memory frequency of 2.2GHz. It's yours for around £400.

The 260 model is comparatively cut down with 192 shaders, 64 texture address and filter units, and 28 render outputs. Operating frequencies are likewise somewhat suppressed at 576MHz, 1,242MHz and 2GHz for core, shader and memory respectively. The 260 must also make do with a 448-bit memory bus. The starting price for this slightly more modest GTX 200 is £250.
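The bus widths and memory clocks quoted above translate directly into peak bandwidth. A quick sketch, assuming the GHz figures in the article are effective (double-data-rate) GDDR3 clocks:

```python
def bandwidth_gbs(bus_width_bits, effective_clock_ghz):
    """Peak memory bandwidth in GB/s: bus width in bytes
    multiplied by the effective data rate."""
    return (bus_width_bits / 8) * effective_clock_ghz

print(bandwidth_gbs(512, 2.2))   # GTX 280: 140.8 GB/s
print(bandwidth_gbs(448, 2.0))   # GTX 260: 112.0 GB/s
```

So the 260's narrower 448-bit bus and slower memory cost it around a fifth of the 280's peak bandwidth.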

ATi's option
