Feature Articles

15 in 15 - The Most Important Gadgets of the Last 15 Years

By Team HardwareZone - 31 Dec 2013

15 in 15 - 1998 to 2005

 

As 2013 comes to a close, HardwareZone has celebrated its 15th anniversary with its biggest-ever event, packed with fun-filled activities to thank our readers and followers for all their support. The team has come a long way from its early days in 1998. Since then, HardwareZone has expanded tremendously in experience, size and achievements: it spawned HWM, the only locally produced print tech publication; established itself across South East Asia in both print and online; and executed many of the industry's first-of-a-kind events, such as the Sim Lim Square Midnight Mayhem with Unreal Tournament 2004, local overclocking competitions, the world's most comprehensive geek competition in HardwareZone's Iron Tech event, and the many ongoing PlayTest event series to educate, entertain and engage our readers. Content-wise, we've evolved from focusing only on the PC/IT segment to the entire scope of Computers, Communications and Consumer Electronics, while also diversifying how we deliver content, be it in the form of news, reviews, live coverage, videos or social media.

There's still more to come from the team as we have much lined up in the pipeline, but before we bring you more forward-looking updates, we thought it would be a neat idea to look back at the last 15 years and pay homage to the 15 most important, world-changing, game-busting tech gadgets to have been released by the hand of man. Love them or hate them, discover which ones made the cut right here through the lens of the HardwareZone team. Take note that not every year saw a breakthrough worthy of mention.

 

1998 - NVIDIA Riva TNT

Behold, the best NVIDIA RIVA TNT card in its heyday, the Canopus Spectra 2500. Check out our old review at https://assets.hardwarezone.com/2009/reviews/video/spectra2500/Spectra2500.html. Image source: tdfx.de.

Before NVIDIA and ATI rose to prominence, 3dfx was arguably the foremost GPU company; its Voodoo graphics cards and Glide API dominated the early years of 3D gaming. The Voodoo 2 was probably the company's finest ever offering, introducing the ability to draw two textures in a single pass and thus improving performance tremendously, especially in games that used lots of textures or employed multi-texturing. It was also the earliest card to bring SLI (Scan-Line Interleave) technology to the consumer market, where two Voodoo 2 cards could work in tandem to improve performance.

However, while the Voodoo 2 enjoyed much success, NVIDIA was plotting in the shadows, laying the groundwork for its new RIVA TNT. Like the Voodoo 2, the RIVA TNT could process two textures in a single pass, and it was, at the time, the only card capable of challenging the Voodoo 2 in terms of outright speed. However, the RIVA TNT also offered a couple of features missing from the Voodoo 2, namely 2D processing and support for 32-bit color rendering. This meant that unlike Voodoo 2 owners, users of the RIVA TNT didn't need a separate card just for 2D processing. Support for 32-bit color would also later prove decisive as gamers clamored for not just faster graphics, but better graphics quality. So even though the earlier NVIDIA Riva 128 was the first proper 2D/3D accelerator, it wasn't until the RIVA TNT came on the scene that things really started shaping up for PC 3D graphics from all angles: image quality, performance, availability, value and convenience. As such, the NVIDIA RIVA TNT gets our vote as one of the most important game changers, literally.

 

1999 - Onboard Audio

Creative Technology is widely considered to be the father of computer audio. In the early 90s, the company enjoyed great success with its Sound Blaster sound cards. Sadly, Creative Technology today is a shadow of its former self, and the reason for this has largely to do with integrated onboard audio chips.

Thanks to advances in technology, audio in modern computer systems is now mostly handled by a single chip integrated onto the motherboard. More accurately, these are audio codec chips that rely on the raw power of the system's CPU for audio processing. This has numerous advantages over the sound cards of old, primarily because it's a simpler and more elegant implementation that enables audio on the cheap for just about any system. Sound cards of old were add-on cards that took up a PCI slot and cost more for OEMs to include. Granted, those add-on cards had their own DSP chips and could arguably produce better audio, but the aforementioned advantages easily won out in the mass market.
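For the technically curious, here's a toy Python sketch (our own illustration, not from the original article) of what "host-based" audio processing means: work that a Sound Blaster's DSP once did in dedicated hardware, such as mixing two streams, simply runs on the CPU in a codec-only design.

    # Toy illustration of host-based audio processing: mixing two tones on
    # the CPU, the kind of work an onboard codec leaves to the processor.
    # All names and values here are our own illustrative choices.
    import numpy as np

    SAMPLE_RATE = 44100  # CD-quality sample rate, in Hz

    def tone(freq_hz, seconds=1.0):
        """Generate a sine wave at the given frequency."""
        t = np.linspace(0, seconds, int(SAMPLE_RATE * seconds), endpoint=False)
        return np.sin(2 * np.pi * freq_hz * t)

    # Mix two voices in software and clamp to the valid [-1, 1] range.
    mixed = np.clip(0.5 * tone(440) + 0.5 * tone(660), -1.0, 1.0)
    print(f"Mixed {mixed.size} samples entirely on the CPU")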

This change came about with the introduction of Intel's i810 chipset in 1999. In a bid to integrate as much functionality as possible into the motherboard, Intel introduced support for audio processing in its I/O controller hub, using the AC'97 audio codec standard that had been developed a few years earlier.

This meant that every motherboard could support and process audio without the need for any bulky and expensive add-on cards. Whether or not audio quality suffered was really beside the point; what mattered was that many users believed the onboard audio chip to be sufficient for their listening needs, leading to the demise of sound cards as we knew them. Remember that it was around this point in time that MP3 audio became massively popular, and at the kind of bitrates people encoded their audio in those days, audio quality or performance wasn't really a concern. It was good enough for the average Joe. On top of that, advances in digital audio brought USB audio peripherals and direct digital audio output via coaxial and optical S/PDIF connections, for which the humble audio codec is sufficient, since the burden of audio quality shifted to the output device that converts the digital feed into the analog audio you hear.

 

2000 - The Thumb Drive

Humans in general have a bad habit of taking the small things for granted. Take the lowly and humble 'thumb drive', for example. Everyone owns more than a couple, and probably has one on them at all times. Yet no one ever thinks twice about the impact the thumb drive has made on the tech landscape.

Commercially introduced (at least to the US market) by IBM almost a year after the turn of the millennium, the first USB flash drives had a storage capacity of 8MB. While that may not seem like much today, 8MB was then over five times the capacity of a floppy disk. Local Singapore company Trek 2000, however, showcased the first "Thumbdrive" as early as March 2000 at the CeBIT trade fair, and by May, HardwareZone had reviewed a commercially available 32MB Trek Thumbdrive in this part of the world. We all owe the thumb drive a debt of gratitude for making the older disk- and disc-based storage mediums obsolete.

The thumb drive is essentially NAND flash storage with a USB connector. As both NAND flash and the USB interface have improved, USB flash drives have increased in capacity and speed. Kingston unveiled a thumb drive with a storage capacity of 1TB at this year's Consumer Electronics Show, which goes to show how far thumb drives have progressed from their humble 8MB beginnings.
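To put a number on that progress, here's a quick bit of Python arithmetic using the figures quoted above (the binary-unit conversion is our own assumption):

    # Capacity growth from the 8MB IBM drives of 2000 to Kingston's 1TB
    # drive shown at CES 2013, using binary units (1TB = 1,048,576MB).
    first_drive_mb = 8
    one_tb_in_mb = 1024 * 1024

    growth = one_tb_in_mb / first_drive_mb
    print(f"Capacity growth: {growth:,.0f}x in about 13 years")  # 131,072x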

Portable and universally compatible with just about any platform, thumb drives became popular as a quick way to share data between computers and amongst friends. However, as cloud-based services become more prevalent, USB flash drives are losing some of their relevance. Still, nothing on the horizon seems ready to knock thumb drives from their throne as king of offline, portable storage. In fact, new variants have recently been populating the saturated thumb drive market, with USB 3.0 and OTG-ready drives that serve not only traditional computing hardware, but also mobile products like smartphones and tablets.

 

2001 - Microsoft Xbox

Microsoft joined the console gaming hardware market on 15th November 2001 with the release of the Xbox, a console with a rather blocky, unappealing appearance, made so much worse by the oversized X-shape across its top. The name was derived from DirectX, Microsoft’s graphics API. For a nostalgic ride, check out our local launch coverage of Microsoft's Xbox.

While Microsoft lacked the design acumen of rival Sony and the longtime fanbase of Nintendo, the Xbox pushed hardware capabilities with a powerful Intel Pentium III-based processor, onboard HDD storage and NVIDIA co-developed graphics. But what really changed the game was its integrated 100BASE-TX wired Ethernet port.

A year after the Xbox's launch, Microsoft introduced its Xbox Live online gaming service, for the first time allowing gamers to play console games online with other subscribers from around the world, as well as download content directly to the system's hard drive. By giving gamers the ability and means to find and play games with complete strangers, Microsoft created an entire genre of online console gaming, with its ubiquitous flagship title Halo reaping the rewards and becoming one of the most popular game franchises of the 2000s. The success of Xbox Live pushed Sony to integrate an Ethernet port into the PlayStation 2 Slim, and it eventually launched its own online service, PlayStation Network, in 2006.

Today, more than 20 million users subscribe to Xbox Live and online gaming has become one of the most talked about features of both Microsoft and Sony’s next-generation consoles.

 

2001 - Apple iPod

The original iPod of October 2001 came with a 5GB hard drive, and it was - in typical Apple marketing speak - a digital music player that put '1,000 songs in your pocket'. There were other MP3 players at that time, but they were either too big (the hard drive-based ones) or too low in capacity (the flash-based ones). With its 1.8-inch hard drive, the iPod found the perfect balance. The music player also had a FireWire port, 10 hours of battery life, a 1.5 x 1.5-inch screen, a scroll wheel with clickable buttons, and - to many analysts' dismay - a high US$399 price tag. For a more detailed look at the gadget that propelled Apple to new heights, check out our review.

At first, the iPod was Mac-only, and this limited sales somewhat. But PC support arrived in August 2002, as did USB 2.0 shortly after. In retrospect, if the iPod gave consumers a device to store their 200-song digital music library, it was the iTunes Music Store that gave consumers the excuse to find the other 800. By October 2003, both Mac and Windows users were busy filling up their iPods with US$0.99 songs.

Looking back, there's no doubt that the iPod is one of the most iconic and important tech products the world has ever seen. It brought us the equally recognizable white earbuds, popularized podcasting, changed the music industry, ignited the concept of the connected consumer, taught its competitors a lesson in design, and of course, did a few of them in along the way.

350 million units later, Apple now sells four versions of the iPod: the super-small iPod Shuffle, the compact iPod Nano, the touchscreen iPod Touch, and the hard drive-based iPod Classic. While the iPod's heyday may be over, its DNA lives on in many other Apple products, most notably the iPhone.

 

2002 - Intel Pentium 4 with Hyper-Threading Technology

In 2002, the Intel Pentium 4 processor not only broke clock speed records at 3.06GHz, it also introduced the new Hyper-Threading feature. This technology makes the operating system think there are two CPU cores, thereby allowing the OS to schedule threads, or sets of instructions, to run in parallel on the pair of logical cores. The technology was previously found only in high-end Intel Xeon processors. With Hyper-Threading (HT) trickling down to consumers, users could put their new Pentium 4-based PCs to work more productively, especially those who liked to multitask by running a number of applications at the same time. For more details and its performance advantages in real-world testing, tune in to our article.
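For a sense of what Hyper-Threading looks like from the software side, here's a minimal Python sketch (our own example, assuming the third-party psutil package is installed): the OS simply sees more logical cores than there are physical ones.

    # Minimal sketch: on an HT/SMT-enabled chip, the OS sees more logical
    # cores than physical ones. Counts will vary by machine.
    import psutil

    physical = psutil.cpu_count(logical=False)  # actual hardware cores
    logical = psutil.cpu_count(logical=True)    # what the OS schedules onto

    print(f"Physical cores: {physical}")
    print(f"Logical cores:  {logical}")
    if physical and logical and logical > physical:
        print("Hyper-Threading/SMT is exposing extra hardware threads")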

With the introduction of dual- and quad-core processors, the relevance of Hyper-Threading was called into question. But Intel continued to enhance HT, and it is still in use today in the fourth-generation Intel Core chips, as well as Xeon processors. The last single-core Pentium 4 processor was released in late 2006; in the following year, the quad-core Intel Core 2 Quad Q6600 chip was launched, relegating the older chip to the history books permanently.

 

2003 - Wi-Fi with Intel Centrino Technology

The Intel Centrino technology essentially integrated wireless capabilities into a mobile platform, allowing a laptop to connect to 802.11 Wi-Fi networks without the user needing to purchase additional networking hardware. Back in 2003, the Centrino platform represented Intel's grand plans to stay ahead in the mobile computing segment. The three core components of the platform, namely the processor, chipset and Wi-Fi controller, were all supplied by Intel. Such tight control ensured the components were designed to work together seamlessly, through rigorous adherence to high manufacturing and testing standards.

Centrino-powered notebook users were able to work without being tethered to their desks, as they no longer had to stay within reach of a physical network point. At the same time, Centrino also touted power-saving features from the new processors and Intel's integrated ecosystem to extend battery life, letting users go longer between charge cycles.

The popularity of the Centrino mobile computing platform was also boosted by Intel's excellent marketing campaign. With an easy-to-recognize logo, customers only had to look for the Centrino badge on a notebook to be assured of good battery life, performance and mobility. It also helped that Intel's OEM partners were subsidized for marketing Centrino notebooks. In short, Intel created a win-win situation for both end users and OEM partners with its introduction of the Centrino mobile computing platform, spawning the Age of Wireless Computing.

For more reading material, we've covered Intel's Centrino Mobile Technology in great detail: (1), (2), (3), (4) and (5).

 

2005 - Perpendicular Magnetic Recording (PMR) HDDs

The 3.5-inch and 2.5-inch form factor hard disk drives have been with us for a long time. Have you ever wondered how hard disk drive manufacturers are able to increase the capacity of drives without increasing their size?

Along with increasing the number of platters in a drive, one way they do so is to increase the storage density of the platters, that is, to make each platter store more bits of information. In the early 2000s, hard disk drives used a technology called longitudinal magnetic recording (LMR) to store bits on a platter. This technology was found to have a limit of around 200 gigabits per square inch, and with users' ever-growing demand for more storage, that soon proved insufficient.

The breakthrough came in a technology called perpendicular magnetic recording (PMR), which can increase storage density by up to three times. It is estimated that PMR will allow for storage densities of up to 1 terabit per square inch.

To understand how it works, imagine you are trying to fit as many books as possible into a space, with the one rule being that no book can be stacked on top of another. It makes more sense to stand the books upright on their spines than to lay them flat. This is essentially how perpendicular magnetic recording increases the storage density of platters: each bit's magnetization is oriented perpendicular to the platter surface instead of along it, so more bits fit in the same area. For more information, check our previous storage evolution article.
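To put rough numbers on that analogy, here's a back-of-the-envelope Python sketch. The platter radii are our own illustrative assumptions for a 3.5-inch drive; the density figures come from the paragraphs above (LMR's roughly 200 gigabit per square inch ceiling, and PMR at up to three times that).

    # Back-of-the-envelope platter capacity from areal density.
    import math

    def capacity_gb(density_gbit_per_sq_in, r_outer=1.84, r_inner=0.75):
        """Capacity of one platter surface, in GB, for a given density."""
        area = math.pi * (r_outer**2 - r_inner**2)  # recordable area, sq in
        bits = density_gbit_per_sq_in * 1e9 * area
        return bits / 8 / 1e9

    print(f"LMR @ 200 Gbit/sq in: ~{capacity_gb(200):.0f} GB per surface")
    print(f"PMR @ 600 Gbit/sq in: ~{capacity_gb(600):.0f} GB per surface")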
