Many of the components typically found on desktop computers today were once technologies reserved for the server and high-end workstation market. Numerous examples can be given; let's take RAM size. As recently as three years ago, any amount of RAM over 64MB was considered overkill, to say the least. It was really only in the server market that we saw 'outrageous' RAM configurations such as 256MB. Now high-end desktop computers typically ship with no less than this amount, and some manufacturers sell systems with RAM sizes stretching into the 384MB-plus range. In similar fashion, hard drive sizes once found only on servers have made their way into desktop systems, with the majority of major OEMs pushing high-end home systems with 30 or more gigabytes of space; sizes previously reserved exclusively for servers.
The truth of the matter is that it is only a matter of time before more or less all of the components found in servers find their way into the less expensive home computing market. This trickle-down effect is caused both by the decreasing cost of the technology and by the advent of newer, more effective technologies. It is the combination of these two effects that allows previously costly technologies to be used in the enthusiast market.
In addition, as software evolves and its demands grow, there is a greater need for higher-end technology in the desktop and low-end workstation market. Operating systems provide a perfect example of the need for more advanced technology in the home office. Prior to the release of Windows 2000, any amount of memory in excess of 128MB was, in many cases, wasted. Windows 2000, a true 32-bit operating system, is much more apt to take advantage of as much memory as it can get its hands on. With a new operating system released nearly every two years and applications becoming increasingly demanding, the longevity of a computer often depends on which previously high-end technologies the system incorporates.
The most recent case of server technology proliferating into the mainstream market comes with RAID. Standing for Redundant Array of Independent (or Inexpensive) Disks, RAID technology was originally developed in 1987 but, until very recently, was utilized almost exclusively in the server market. Now a buzzword among hardware enthusiasts, RAID technology is quickly finding its way into many home and professional systems. Promising increased speed, increased reliability, increased space, and combinations of these features, it is little wonder that RAID technology is powering its way into users' systems and hearts.
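To make the speed/reliability trade-off concrete, here is a minimal sketch (in Python, purely for illustration) of the two simplest RAID modes: striping spreads data across disks for speed and combined space, while mirroring duplicates it for reliability. The function names, the toy 4-byte stripe size, and the use of byte arrays as stand-ins for disks are assumptions for this example, not the behavior of any real controller.

```python
# Illustrative sketch of RAID 0 (striping) vs. RAID 1 (mirroring).
# Disks are modeled as bytearrays; stripe size is tiny for readability.

STRIPE_SIZE = 4  # bytes per stripe (real arrays use kilobyte-range stripes)

def raid0_write(data: bytes, disks: list) -> None:
    """RAID 0: split data into stripes and alternate them across disks.
    Reads/writes can hit all disks in parallel, and capacity adds up."""
    for i in range(0, len(data), STRIPE_SIZE):
        stripe = data[i:i + STRIPE_SIZE]
        disks[(i // STRIPE_SIZE) % len(disks)].extend(stripe)

def raid1_write(data: bytes, disks: list) -> None:
    """RAID 1: write a full copy of the data to every disk.
    Any single disk can fail without losing data."""
    for disk in disks:
        disk.extend(data)

disks0 = [bytearray(), bytearray()]
raid0_write(b"ABCDEFGH", disks0)
# Stripes alternate: disk 0 holds b"ABCD", disk 1 holds b"EFGH"

disks1 = [bytearray(), bytearray()]
raid1_write(b"ABCDEFGH", disks1)
# Both disks hold the full copy b"ABCDEFGH"
```

Note that in RAID 0 neither disk alone contains usable data, which is why striping trades away reliability for the speed and space it gains.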
Due to its recent adoption in the mainstream market, more than a few questions surround the desktop RAID market of today. What are the different RAID modes, and which one should I choose? What stripe size should I build my array with? And, perhaps most importantly, which IDE RAID controller is best? In order to both remove the mystery associated with RAID technology and help you, the potential RAID owner, choose the best configuration and RAID controller, today AnandTech takes an in-depth look at the RAID solutions available now in an approach that has never been attempted before.