DESPITE ALL THE criticism Apple gets, the company has done one thing well, in fact so well that its computers are playing an entirely different game than the humble beige box we all know as the PC. When Apple moved to x86 it dropped hardware legacy support and started with a clean slate, and we're now at a stage where the PC is desperately in need of the same treatment. You may ask why; well, read on and I'll try to explain.
The PC as we know it today started back in 1981 thanks to IBM, Intel and Microsoft. Things have moved on quite a bit since then, yet every PC today still offers legacy support that hails back to the original IBM PC. Despite getting faster and gaining more advanced features, there are still certain basic features holding the PC back from moving to the next stage, so to speak. Some of you might disagree with me here, but I think it's high time we lose this "baggage" and move on.
Take for example parallel and serial ports: many modern motherboards still feature them, either as part of the rear I/O or via headers on the board. I'm well aware that some programmers still use these ports and that certain networking devices still require a serial port, but for most of us these ports simply aren't needed anymore. Yet they're still taking up space on the motherboard, and in some cases they're also using system resources.
The DVI port was introduced back in 1999, yet there are very few monitors that don't offer D-sub connectivity, and many graphics cards still ship with D-sub connectors. I thought we were supposed to be living in the "digital age", yet we're using an analogue interface to connect our displays to our computers. Considering that the graphics card, as well as the control circuitry inside the LCD screens we're using, is all digital, it seems odd that we willingly have this signal converted first to analogue and then back to digital again inside the display. This seems like a pointless exercise to me, but maybe I'm wrong? Let's not even start talking about DisplayPort, as with the exception of a few high-end displays this new and supposedly improved digital display interface is nowhere to be seen.
However, one of the biggest problems with the PC as it is today is the BIOS, or Basic Input/Output System. Every PC has one, and without it our computers wouldn't work. Yet it too dates back to the early days of the PC, and it's one of the things that is starting to cause some serious problems. The BIOS has been tweaked and tweaked again to keep up with hardware and software development, but we're getting to a stage where it can no longer cope with the changes.
The BIOS basically tells the operating system what bits are inside the PC, and as we're moving from 32-bit to 64-bit computing the BIOS is starting to show its age. Not only that, but drives larger than about 2.2TB can't be fully addressed, as the MBR partitioning scheme the BIOS boots from stores sector counts as 32-bit values, and unless another "fix" is somehow implemented they never will be. If you start delving into things you'll also find that there's plenty of redundant code in most motherboard BIOSes, but it's not removed, simply out of fear that it'll break something that works. Add to that the human factor and you'll quickly see why motherboard manufacturers keep issuing BIOS upgrades for their products.
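If you're wondering where that 2.2TB ceiling comes from, a quick back-of-the-envelope check makes it obvious. A minimal Python sketch, assuming the traditional 512-byte sector size:

```python
# The MBR partition table stores partition start and length as 32-bit
# sector counts, so the largest fully addressable disk is 2^32 sectors.
max_sectors = 2**32   # ceiling of a 32-bit LBA field
sector_size = 512     # bytes per sector on a traditional drive

limit_bytes = max_sectors * sector_size
print(f"{limit_bytes:,} bytes")            # 2,199,023,255,552 bytes
print(f"{limit_bytes / 1000**4:.2f} TB")   # ~2.20 TB, which is why 3TB drives hit the wall
```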
With Sandy Bridge Intel will say farewell to the PCI bus, yet many much older technologies will still hang around. Besides the serial and parallel ports and the D-sub connector, most motherboards still feature PS/2 ports, as these are needed if you want to install legacy operating systems that don't support USB keyboards and mice natively. Then there's the floppy drive connector, which again some of you will disagree with me on, but yet again we're back to legacy support here, and it's a feature very few people still use. Many motherboards also feature an IDE port; SATA took over some time ago, yet IDE devices are amazingly still on sale. On top of that, it took quite some time for SATA optical drives to replace IDE drives as the mainstream choice, which is the biggest reason why we still have IDE connectors on our motherboards.
Many of the old features are kept alive with the help of what is known as a Super I/O chip, which houses all of the old legacy interfaces along with features such as temperature sensors and fan speed monitoring. These are usually quite large chips that take up a fair bit of space on the motherboard, a consequence of their low-cost design and manufacturing. They communicate with the CPU over a bus known as LPC, or Low Pin Count, which was made as a replacement for the old ISA bus. This again shows the age of the technology used inside a PC.
If I wanted to be really picky I'd say it's time for an entire redesign of the PC, as in a new and improved motherboard design as well as power delivery system. Most modern PCs rely on 12V power, yet the power supply delivers 12, 5 and 3.3V, which makes it inefficient and complicated to manufacture. Most modern power supplies are now designed to convert the AC to 12V DC and then convert that in a secondary step to 5 and 3.3V. A much more cost-efficient way would be to have 12V-only power supplies, with the motherboard doing the rest of the power conversion where needed. The only problem this scenario would create is that 2.5-inch drives and SSDs operate at 5V rather than 12V, but this could be solved by using backplanes for the hard drives, yet another feature that should be commonplace, but isn't.
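To put a rough number on the double-conversion penalty, here's a toy calculation. The efficiency figures are illustrative assumptions picked for the sake of the example, not measurements of any real power supply:

```python
# Toy model: power delivered on the 5V rail passes through two conversion
# stages, so the losses multiply. All figures are assumptions for illustration.
eff_ac_to_12v = 0.90    # assumed efficiency of the AC -> 12V DC stage
eff_12v_to_5v = 0.92    # assumed efficiency of the 12V -> 5V DC-DC stage

combined = eff_ac_to_12v * eff_12v_to_5v
print(f"Effective 5V rail efficiency: {combined:.0%}")   # ~83%

# A 12V-only supply doesn't eliminate the second stage, but it moves it
# onto the motherboard next to the load, simplifying the PSU itself.
```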
Add to this a motherboard layout that hasn't changed much since the ATX form factor was introduced in 1995. Intel did try to launch the BTX standard and failed miserably, mostly due to the oversized and extremely heavy CPU cooler that was implemented as part of the standard. This was of course down to Intel's hot Pentium 4 processors before the launch of the Core architecture, and once Intel moved over to the latter there was no longer any need for large, bulky and heavy CPU coolers in a standard PC. Today the ATX form factor is starting to show its age, not least by the fact that your typical graphics card takes up two slots, which means that power users are often stuck with very few – if any – slots for other expansion cards.
The other major problem with the ATX form factor is cooling. Case manufacturers are trying to alleviate this problem as much as they can by adding more fans, but with graphics cards pulling close to 300W at the high end and CPUs with a TDP of 130W, the heat has to go somewhere. In a poorly designed chassis this means a lot of hot air gets circulated around inside the case. The BTX form factor actually got one thing right: the add-in cards faced the opposite way compared to how they're installed in an ATX motherboard, which would've helped with cooling, as all the hot components would've ended up on the top of the card rather than the bottom. This is one problem where the entire industry is going to have to come together and work out a new standard, something that sadly isn't very likely to happen any time soon.
What makes the PC what it is, is also its biggest curse. There are far too many standards that aren't open enough to deserve the name, something which causes a lot of problems. Half of the time it doesn't seem like the engineers who develop these so-called standards think about how well one version will integrate with the next. The PCI Express interface is a good example of a standard that works both ways with relatively few issues: you can use new cards in a slot built to the previous generation of the standard, albeit at reduced performance, and you can use old cards in a slot built to the new, faster revision. Sadly this isn't the norm when it comes to computer interfaces.
Take USB 3.0 for example: it's a hacked-together "standard" when it comes to the connectors. The ports on the back of your PC might look the same, but the truth is that if you're plugging in an older USB 2.0 cable you're only using some of the pins inside the receptacle. The micro USB 3.0 connector on the cable end is by no means compatible with devices that have a micro USB 2.0 port, as it's too wide. This is of course a way for the industry to sell more cables, and in all fairness the USB standard was developed some 14 years ago and was never intended for the kind of speeds we're seeing from USB 3.0. On the other hand, the micro USB connector was only ratified in 2007, and you would think that an organization like the USB-IF would have been aware that a slightly more advanced pin-out would be needed for its next-generation standard.
USB is also a good example of how poorly an interface can be thought out: with a mere 5V at 500mA of power, it seems like not much thought was put into what the interface could potentially be used for in the future. Sadly, as these things go, it's not always the superior standard that wins, as it's mostly about cost and ease of mass production. Those seem to be the two biggest driving factors when it comes to new technology, rather than usability and future upgradeability.
How important is backwards compatibility really? Sure, I still have one or two old bits of hardware in a drawer somewhere, like a PCI graphics card that I don't even know still works, kept "just in case" as a spare should something go wrong with my system. I'm most likely never going to use it, and it's not something that bothers me. I tend to pass on most of my old hardware when I do a major upgrade of my system, and I'm still one of those who believe it's better to build your own machine than to buy something off the shelf. It's so much easier to get a system up and running these days, especially with the added convenience of being able to install the OS from a USB drive rather than a sluggish optical drive.
There are of course some very valid reasons why backwards compatibility is a good thing, such as being able to access older data. However, with the exception of the actual storage medium the data was placed on, the big problem here is software. If you've got files that are over a certain age, it's likely that they're no longer compatible with modern software. Keeping old software around isn't always an option either, something far too many businesses are aware of. That's also part of the reason why so many companies are slow to upgrade their hardware and software. Why spend money on something new when the old works just fine, and when moving to something new renders all the old data more or less unusable?
Despite all of this, there is some hope on the horizon, as we should see a move towards something called EFI, or the Extensible Firmware Interface, which should hopefully replace the BIOS sooner rather than later. Apple has been using EFI across its entire product range since 2006, and MSI was out early with a product or two, but it never really took off. Intel is also offering EFI support on some of its latest motherboards, with mixed results and a fair few known issues that need to be attended to before it's ready for mainstream introduction.
The problem with EFI is that your operating system needs to support it, something that isn't a problem if you're a Linux user, but when it comes to Microsoft you pretty much need a 64-bit version of Windows 7 for it to work properly on a desktop system. Windows Vista requires SP1 to be installed, and older versions of Microsoft's desktop operating systems lack support entirely. So as much as the technologies are available, the big players are holding the development of the PC back by not supporting the new features that are required to fix the basic problems the PC suffers from.
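As an aside, Linux users can already tell which way their machine booted: the kernel exposes /sys/firmware/efi only when the system was started through EFI firmware. A minimal sketch:

```python
# Check whether this Linux system booted via EFI or the legacy BIOS path.
# The kernel creates /sys/firmware/efi only on EFI-booted systems.
import os

if os.path.isdir("/sys/firmware/efi"):
    print("Booted via EFI")
else:
    print("Booted via legacy BIOS (or EFI compatibility mode)")
```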
Come to think of it, it's actually quite amazing that any of this stuff works, as it's such a mishmash of bits from various eras. The PC could really do with starting over from scratch and dropping all of its baggage in favour of more modern technology. However, this is unlikely to happen within the next couple of years, as none of the big industry players are pushing for it. The only change on the horizon is the move towards replacing the BIOS with EFI, but this is more of a necessity than a voluntary change. I'm sure some of you feel that the heritage of the PC is also part of its charm, but I have to disagree. There just isn't any need for outdated technology that I'd say 90 percent of users aren't even aware is there, and even fewer are using. It's time to bring the PC into the 21st century, and now is as good a time as any.S|A
Lars-Göran Nilsson