GeCube Gemini 2 (Dual Radeon X1650 XT)

Shown previously at CeBIT 2007, GeCube's dual GPU graphics card is finally ready for the market. Featuring two Radeon X1650 XT processors in CrossFire, this is not your ordinary mainstream card. Read on as we check out its build and performance.

Introduction

It's been a while since we saw the last of those giant dual graphics cards that integrated two graphics cores on a single PCB. Usually, these custom cards are created as a demonstration of a manufacturer's innovation (or hubris) and more often than not displayed exclusively at conventions or trade shows. Few of these cards reached retail, and those that did were hideously expensive - NVIDIA's official dual-GPU card, the GeForce 7950 GX2, being the rare and successful exception.

By our reckoning, the last two we experienced firsthand were the dual GeForce 6800 GT cards from ASUS and Gigabyte. After the initial excitement generated by these behemoths dwindled, they quickly faded from the limelight. Nevertheless, that has not stopped vendors from trying. GeCube has been one of those who have attempted such a card with the Gemini, which featured dual Radeon X1600 XT processors. First seen at last year's CeBIT, the prototype was never released commercially. However, it seems that GeCube has not given up on the idea, and the introduction of ATI's Radeon X1650 XT has revived the Gemini.

GeCube's attempt to combine two Radeon GPUs onto a single PCB - the Gemini 2, with two Radeon X1650 XT cores integrated

Dubbed the Gemini 2, this new dual GPU graphics card has two Radeon X1650 XT cores on a single PCB. Remarkably, GeCube has managed to keep the cooler to a single slot despite the additional complexity of cooling two cores. Outfitted with a heavy, full copper heatsink and capable of powering up to four monitors through custom connectors, this graphics card has been modified to run ATI's Native CrossFire through a PLX bridge chip that links each GPU over an 8-lane bus while communicating with the PCIe bus over 16 lanes. While we have seen examples of dual GPUs using SLI, this is probably the first time we have seen a CrossFire version. And unlike the last Gemini, the Gemini 2 will be available for enthusiasts.

The GeCube Gemini 2

Our first impression of the Gemini 2 was spoilt by the noise generated by the cooler. Despite using a full copper heatsink that adds greatly to the weight of the card, the fan spins at quite a fast speed, making this card one of the noisiest we have experienced in a long time. It brings back unpleasant memories of the older Radeon X1800 series and its loud fan. Perhaps a larger but slower fan would have helped. They could have also opted for a two-slot cooler, but GeCube was clearly stressing the point of a single-slot dual GPU design, which is a noteworthy achievement.

It's quite a remarkable feat to get two GPUs on the same PCB and with a single-slot cooler too.

It may not look like much, but the full copper heatsink used in the Gemini 2 makes it a rather hefty card. Unfortunately, it is not the most ideal of coolers, and its noise output was significantly above that of most mainstream and even high-end graphics cards.

What lies beneath.

Underneath the heatsink, we found the two GPU cores and their respective memory chips (Samsung GDDR3 rated at 1.2ns). There is also a special PLX bridge that handles communication between the two GPUs and with the rest of the system through the PCIe bus. GeCube advertises 32 PCIe lanes of switching in all, but that is just the sum total: 16 lanes run between the bridge chip and the PCIe slot, while the remaining lanes are split eight apiece between each GPU and the bridge. The GPUs themselves are clocked at the standard speed of 575MHz, with the memory at the standard 1350MHz DDR. Like a few other vendors, GeCube opts for the greater durability of solid capacitors, and unlike a single Radeon X1650 XT card, a separate power connector is required here, understandable given the presence of two GPUs.
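For reference, the theoretical peak memory bandwidth per GPU follows directly from the clocks above. A minimal sketch; note the 128-bit memory bus width is the Radeon X1650 XT's published spec and an assumption here, since the article does not state it:

```python
# Peak memory bandwidth per GPU from the figures quoted above.
BUS_WIDTH_BITS = 128        # assumption: X1650 XT memory bus width
EFFECTIVE_CLOCK_MHZ = 1350  # 1350MHz effective DDR rate, as quoted

bytes_per_transfer = BUS_WIDTH_BITS / 8  # bytes moved per effective clock
bandwidth_gb_s = EFFECTIVE_CLOCK_MHZ * 1e6 * bytes_per_transfer / 1e9

print(f"{bandwidth_gb_s:.1f} GB/s per GPU")  # 21.6 GB/s per GPU
```

That works out to roughly 21.6 GB/s per GPU, which helps explain the bandwidth-limited results seen later in the benchmarks.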

In order to preserve the two sets of dual, dual-link DVI connections on this card, GeCube has used these DMS59 connectors and cables, each of which can support two DVI outputs.

In case you think there is any missing functionality due to its unique design, this card is HDCP ready with built-in EEPROM keys. Video output goes through two DMS59 connections (basically high density DVI connectors), each of which is capable of supporting two DVI outputs. GeCube has provided two DMS59 cables, each splitting into two DVI connectors, giving up to four DVI outputs, just as two separate graphics cards would. Since the Gemini 2 is still a pair of Radeon X1650 XT GPUs running in CrossFire, albeit now sharing the same PCB, only the top DMS59 connector, which is linked to the so-called 'master' GPU, is capable of booting up the system. Connecting your monitor to the other DMS59 connector will not start up your display.

This is one of the two included DMS59 cables in the package. Notice how it splits into two DVI connectors?

Besides the cables, two additional DVI adaptors, a manual and drivers, we found no other applications or games included. This is typical of GeCube, as long-time followers know, and should come as no surprise. The contents of the bundle are as follows:

2 x DMS59 cables

2 x DVI-to-VGA adaptors

User Manual

Driver CD

Test Setup

An Intel Core 2 Duo E6700 (2.66GHz) was used for our benchmarking, along with an Intel D975XBX 'Bad Axe' motherboard. We equipped this setup with 2GB of low latency DDR2-800 memory from Kingston, running in dual channel mode. We also used an 80GB Seagate 7200.7 SATA hard drive and installed Microsoft's Windows XP Professional, updated with Service Pack 2 and DirectX 9.0c.

The GeCube Gemini 2 was running Catalyst 7.4, but the other ATI cards in our comparison, the Radeon X1950 PRO and the Radeon X1650 XT, were both on Catalyst 7.2. We also threw in an NVIDIA GeForce 7900 GS 256MB on ForceWare 93.71. The Gemini 2 is no doubt a dual GPU graphics card running CrossFire, but we decided to test it against single GPU cards since its price is likely to be prohibitive and, well, it certainly fits into a single slot like any of the other cards compared. The following benchmarks were used:

  • Futuremark 3DMark05 Pro (version 1.2.0)
  • Futuremark 3DMark06 Pro (version 1.0.2)
  • Tom Clancy's Splinter Cell 3: Chaos Theory (version 1.3)
  • F.E.A.R
  • Company of Heroes 1.3
  • Quake 4

Results - 3DMark05 & 3DMark06

The CrossFire factor helped push the Gemini 2 to an impressive start, with 3DMark scores that easily topped those of the Radeon X1950 PRO and the GeForce 7900 GS. While a perfect return of twice the performance of a single GPU is highly unlikely for such a dual graphics card, the Gemini 2's performance scaled rather well, coming in at least 80% faster than a single Radeon X1650 XT, as seen in 3DMark06.
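The scaling figures quoted throughout this review reduce to a simple percentage calculation. A quick sketch; the scores below are illustrative placeholders, not the actual benchmark results, which are charted rather than listed here:

```python
def crossfire_scaling(dual_score: float, single_score: float) -> float:
    """Percentage gain of a dual-GPU result over a single-GPU result."""
    return (dual_score / single_score - 1.0) * 100.0

# Hypothetical scores: the article quotes "at least 80%" scaling in
# 3DMark06 but does not list the raw numbers, so these are made up.
gain = crossfire_scaling(dual_score=4500, single_score=2500)
print(f"{gain:.0f}% faster than a single GPU")  # 80% faster than a single GPU
```

A gain of 100% would be perfect doubling; real multi-GPU setups fall short of that due to driver overhead and inter-GPU communication.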

Results - Splinter Cell: Chaos Theory & F.E.A.R (DirectX 9 Benchmarks)

One of the disadvantages of going the dual graphics route has to be the uncertainty involved. Performance gains from having dual processors vary from application to application, and it certainly did not seem to be working well in Splinter Cell. The Gemini 2 was only about 16% (5 to 6 frames) faster than the reference Radeon X1650 XT, putting it a distant third in our comparison. CrossFire also seemed to have issues with anti-aliasing and HDR, as our benchmark scores were similar regardless of whether anti-aliasing was enabled; hence, those scores were not included. In F.E.A.R, the increase in performance expected from the Gemini 2 fell rather short, as CrossFire only gave it a modest boost over the single GPU version. More memory bandwidth would have helped, but that is a shortcoming of all midrange graphics cards.

Results - Company of Heroes & Quake 4 (SM2.0+ Benchmarks)

Company of Heroes was one of those games where having dual graphics cards may, ironically, lower performance. In this case, the Gemini 2 ran slower than a single Radeon X1650 XT. The reasons probably lie with the drivers and with the game itself, which may not be fully compatible with CrossFire in its current version; we had encountered similar problems with SLI in this game in the past, and CrossFire compatibility appears to remain broken. Such problems did not plague Quake 4, where the Gemini 2 provided good performance scaling. However, the weakness of the chipset is all too evident at very high resolutions, with the Radeon X1950 PRO taking over the lead. Once again, memory bandwidth plays a big part here.

Temperature Testing & Overclocking

GeCube managed to keep the two GPUs on the Gemini 2 relatively cool, with temperatures that stacked up well against the single GPU version. But like we said earlier, the noise output was quite high, so it seems GeCube only managed to successfully tackle one of the two main problems afflicting graphics cards. Overclocking was also a failure. ATI's Overdrive was not found in the Catalyst Control Center, so we tried raising the clocks through ATITool instead. Unfortunately, the card was prone to crashing halfway through our tests, even with a minor clock increase, so we decided to scrap plans to overclock it altogether.

Conclusion

There's a simple reason why, despite all the media attention on dual graphics cards, their eventual sales figures are usually not significant. Enthusiasts may go on and on about the performance they achieve with a pair of overclocked graphics cards running in SLI or CrossFire, but the majority of consumers are content with just one. With heat, noise and power becoming increasingly important concerns as GPU transistor counts soar, the attractiveness of dual GPUs takes a turn for the worse. More so because single GPU variants are plenty fast for the vast majority of users, leaving only a small group pursuing the dual graphics route.

Cost and noise will probably be the downfall of this unique card. GeCube has tried too hard to squeeze everything into a single-slot solution, compromising the noise quotient as a result.

The GeCube Gemini 2 puts two GPUs on a single graphics card, taking away worries that dual GPUs will crowd the enclosure or take up valuable expansion slot space. This is enhanced further by the decision to use a full copper, single-slot heatsink. We must say that we were expecting a larger cooler, but the mainstream nature of the Radeon X1650 XT chipset probably made this manageable heatsink size possible. However, the drawback is the noisy fan on the cooler, which makes a din that immediately overwhelms your other system or CPU fans. This is not a cooler we can live with, even in a solid enclosure with noise-dampening panels. Temperatures, on the other hand, were quite decent, so at least GeCube got half of that equation right.

Performance could be another bugbear, though this varies with the games and applications involved. Most of the problems can probably be attributed to ATI's CrossFire compatibility, and games that have issues with it will naturally run slower on the Gemini 2. Even at its best, do not expect twice the performance of a single Radeon X1650 XT in real games, though the card does beat higher end graphics cards in benchmarks like Quake 4 that scale well with dual GPUs. In short, if you want consistent performance, you would be better off purchasing a single GPU graphics card based on a more powerful chipset than opting for the Gemini 2. You may get the occasional high score with this unique card, but the lows unfortunately come more often. Then there is also the higher power draw, noise and heat that result from having two GPUs in close proximity.

Of course, whether one can get the GeCube Gemini 2 at retail is another question. With previous attempts at such cards failing to make any mark in retail, it is debatable if GeCube can succeed. One major sticking point is likely to be its price, since the cost of development is bound to be higher than that of a typical graphics card. While we don't have the exact retail price yet, a figure we have seen bandied about online puts it at around US$300, which would make it quite the expensive novelty.
