Just over a year ago in 2004, ATI and NVIDIA released their then next-generation graphics parts, and for a good half-year following the launch, both vendors were merely replacing their aging lineups with scaled-down variations of the R420 and NV40 respectively in the mid- and low-end segments. Neither made a serious attempt to dethrone the other in the high-end for quite a while, until ATI refreshed their RADEON X800 series with the much-needed RADEON X850 series, which brought more refined silicon and even faster performance. That allowed ATI to ramp up high-end graphics card production, which had been greatly held back during the days of the RADEON X800 XT series, a line plagued with yield and production volume issues.
This new high-end series from ATI was launched just before the turn of 2005 and reached steady retail volume by the end of January. That posed a real challenge to NVIDIA's GeForce 6800 Ultra, which was already trailing the RADEON X800 XT Platinum Edition, let alone the even speedier RADEON X850 XT Platinum Edition. This was only an issue for single-board performance, however, as NVIDIA already had SLI-capable platforms and graphics cards by then. With dual GeForce 6800 GT or Ultra cards, performance scaled really well in games that were previously pixel shading and fill rate limited (basically, GPU limited rather than local memory bandwidth limited or CPU-bound). Hence, NVIDIA's option for those wanting even faster performance than a GeForce 6800 Ultra was to adopt the SLI route, which became the ultimate performance king if one had US$1,000 to spend (a relatively cheaper and more sought-after option being dual GeForce 6800 GT cards). Though the returns were rarely twice that of a single card (even where SLI worked perfectly), an SLI setup was still more than 50% faster than any single-card solution, which was justifiable for many hardcore gamers. After all, there wasn't anything to challenge the dual graphics card solution until the very recently announced ATI CrossFire series, which would likely only make it to retail sometime in July.
Back in the single graphics card arena, tensions ran high as both ATI and NVIDIA tried to outclass one another in quick succession around CeBIT Hannover 2005, held in March. Just days before the show opened, ATI quietly unleashed the 512MB version of the RADEON X850 XT graphics card to limited online media, and NVIDIA countered with an official release during CeBIT itself of a huge 512MB variant of the GeForce 6800 Ultra, which was running in their booth. ATI's RADEON X850 XT 512MB never made it to market officially and remained only an option for their partners, but they did officially seed the RADEON X800 XL 512MB into retail at the turn of May 2005. As we have shown our readers in a string of 512MB graphics card reviews, the returns were just not there for gamers to invest in them. In fact, when these monster half-a-gigabyte graphics cards debuted, most consumers were actually anticipating the launch of the next-generation parts instead. In our view, this spate of salvos from both visualization leaders was a build-up of momentum toward launching their next-generation solutions, and indeed it was, with codenames such as "G70" and "R520" plastered all over online debates and news tidbits. Finally, one of them has come to pass: 22nd June marks the day NVIDIA not only releases the G70, but also makes it available to end-users on that very day. Today, we welcome the newly christened GeForce 7800 GTX, which utilizes the G70 graphics architecture.
NVIDIA's previous-generation GeForce 6800 Ultra in SLI set new performance levels and helped developers and consumers alike get a feel for what next-generation graphics processors would be capable of. In fact, SLI setups have helped game houses tremendously in modeling their next-generation game engines as well as pushing the limits of game realism. With NVIDIA hyping that a single GeForce 7800 GTX would be equivalent to its previous generation's SLI setup, it's time we took a look under the hood to see how it has evolved from its predecessor and, of course, bring our readers the full gamut of results for you to judge NVIDIA's new heir to the throne.