
Voodoo Beginnings - 10 Years of GPU Development

By Kenny Yeo - 15 Jan 2009

Timeline: 2001 and 2002

2001


  • This year, Microsoft introduced DirectX 8.0, which implemented Programmable Shading. This allowed custom, non-standard shading routines to be applied at the pixel level for interesting effects (going beyond the fixed T&L functions of old), resulting in more realistic graphics. A toy sketch at the end of this year's entries illustrates the idea.

The follow-up to the immensely successful GeForce 2 - the GeForce 3. It was also the first card to fully support DirectX 8.0, and hence Programmable Shading.

  • In response to this move by Microsoft, NVIDIA launched the GeForce 3, the first card to fully support DirectX 8.0. It wasn't groundbreaking in the way NVIDIA's previous cards were, and in some cases it could even be outperformed by the older GeForce 2 Ultra.
  • ATI, on the other hand, unleashed the Radeon 8500. Unfortunately, its launch was marred by driver problems. Once those were sorted out, however, it proved to be a competitive card.
  • S3, now known as SONICblue, sold its graphics business to VIA, a chipset provider, for over US$300 million, choosing instead to concentrate on digital media. Although the technology VIA inherited from S3 was not powerful enough to compete with contemporary GPUs, its low cost made it an ideal integrated solution.
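
To illustrate what Programmable Shading made possible, here is a toy sketch in Python. It is not Direct3D 8 code; the routine names and values are purely illustrative, contrasting a hard-wired fixed-function stage with an arbitrary, developer-supplied per-pixel routine.

    # Toy illustration of fixed-function vs. programmable pixel shading.
    # This is NOT Direct3D 8 code; it is a hypothetical software sketch
    # of the concept only.

    def fixed_function(texel, diffuse):
        # Pre-DX8 pipelines offered a handful of hard-wired per-pixel
        # operations, e.g. modulating a texel with vertex lighting.
        return tuple(t * d for t, d in zip(texel, diffuse))

    def custom_pixel_shader(texel, diffuse):
        # DX8-class hardware let developers supply their own per-pixel
        # routine, e.g. a "negative film" effect that no fixed-function
        # stage offered.
        lit = tuple(t * d for t, d in zip(texel, diffuse))
        return tuple(1.0 - c for c in lit)

    texel, diffuse = (0.5, 0.25, 1.0), (1.0, 0.5, 0.5)
    print(fixed_function(texel, diffuse))       # (0.5, 0.125, 0.5)
    print(custom_pixel_shader(texel, diffuse))  # (0.5, 0.875, 0.5)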


2002

  • Early in the year, NVIDIA became the undisputed speed king once more with the launch of its GeForce4 Titanium series of cards. It was architecturally similar to the earlier GeForce 3 GPU, but added a few enhancements, such as higher core and memory clock rates, an improved memory controller and the nfiniteFX II Engine.

    We had an NVIDIA GeForce4 Ti 4600 128MB DDR in our labs and were surprised at how fast it was. Although it offered top-notch performance, it was also very expensive.

NVIDIA reclaimed the honor of having the fastest card with the Ti 4600. It was very fast, but also very expensive.

  • To stay competitive, NVIDIA would later expand the lineup by introducing the cheaper Ti 4200 GPU. This GPU was extremely popular, and we managed to secure a Leadtek WinFast A250 LE TD 64MB for testing. Although it was considerably cheaper, it still provided great performance, and this particular model was extremely overclockable.

The Ti4200 was the card of choice amongst most mainstream gamers. It was considerably cheaper than the top-of-the-range Ti4600, yet offered good enough performance.


  • By now, AGP was the de facto interface for graphics cards, and this year saw the introduction of AGP 8X.

    Theoretically, AGP 8X would double the rate of data transfer from 1.06GB/s to 2.1GB/s, and so we sought to find out how much performance you would get from this increase in bandwidth. You can read the full test here, which was then a world exclusive. (A quick back-of-the-envelope check of these theoretical figures follows this item.)

    In our tests, we took a SiS648 reference board as our Universal AGP 3.0 platform, paired with an SiS Xabre400 graphics card, which supported both AGP 4X and AGP 8X transfers. Since the reference board did not allow us to set the AGP transfer speed in the BIOS, we had to modify the graphics card to force it to operate in AGP 4X (AGP 2.0 mode). This allowed us to compare the performance of the same graphics card at different AGP transfer speeds.

    Despite using only an SiS Xabre400 graphics card, we found AGP 8X to give us about a 4.7% boost in frame rates, and we believed that we would see greater gains with a higher-end graphics card.

    However, in a follow-up test with a higher-end GeForce4 Ti 4200 8X card, we were surprised to find no significant improvement between AGP 4X and AGP 8X. We attributed this to the fact that AGP 8X was still in the early stages of implementation and that higher-end cards remained untested.

    In a third test, however (this time with a Radeon 9700), we once again found no significant gain in performance. Clearly, AGP 8X provided negligible real-world benefits. We concluded that this could be because both the Radeon 9700 and Ti 4200 had larger frame buffers, reducing traffic over the AGP bus, and because these cards were more than capable of handling the games of the day.
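
    As a sanity check, the theoretical figures quoted above fall out of simple arithmetic. Here is a minimal sketch in Python, assuming the standard AGP parameters (a 32-bit bus at a 66MHz base clock, with the 4X and 8X modes transferring four or eight times per clock); the constants are our assumptions about the spec, not taken from the original test:

        # Back-of-the-envelope check of the theoretical AGP bandwidth
        # figures quoted above, assuming standard AGP parameters.
        BASE_CLOCK_HZ = 66_000_000  # 66MHz AGP base clock (assumed)
        BUS_WIDTH_BYTES = 4         # 32-bit bus (assumed)

        for mode in (4, 8):
            gb_per_s = BASE_CLOCK_HZ * BUS_WIDTH_BYTES * mode / 1e9
            print(f"AGP {mode}X: ~{gb_per_s:.2f} GB/s")

        # Prints:
        # AGP 4X: ~1.06 GB/s
        # AGP 8X: ~2.11 GB/s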

  • In mid-2002, Creative Technology acquired 3Dlabs, the creator of the Permedia chipsets, thereby signaling its intention to be a major player in the graphics card market. Sadly, that was not to be, and 3Dlabs was left languishing under Creative's ownership.
  • The graphics card market was becoming increasingly crowded and competitive. To stand out, card manufacturers such as Sparkle started looking at packaging their products differently to attract buyers. Sparkle's Platinum GeForce4 Ti 4600 was one of those cards. A card in a tin can? Who would have thought of that?

If we gave out awards for most innovative packaging, this card would have won hands down. Never again have we seen cards coming in tin cans.

  • Finally, ATI released the Radeon 9700, which would later go on to achieve legendary, almost godly status. It was so powerful that it trumped the previous fastest card, the GeForce4 Ti 4600, by a 20% margin. With anti-aliasing and anisotropic filtering turned on, it would beat it by anywhere from 40% to 100%! In fact, the Radeon 9700 was so powerful that gamers could still achieve playable performance on the latest games three years after its launch. We reviewed the Gigabyte MAYA II GV-R9700 PRO and proclaimed that, in its time, nothing in the market even came close to matching it for sheer performance.

    Industry experts would later declare the Radeon 9700 to be one of the most important breakthrough graphics cards in history, alongside NVIDIA's GeForce 256 and 3dfx's Voodoo.

The sight of the Radeon 9700 caused our eyes to well up with tears. This was truly a monstrous card. Absolutely nothing could stand up to it.
