Tuesday 1 December 2015

Personal Computing: 1985-2015


James Hunt examines the amazing advancements that have occurred over the past 30 years...

If you've come to computing in the years since 1985 - or if you're too young to remember that far back - we've put together this comprehensive history of the PC. Whether you're nostalgic for the good (and sometimes quite bad) old days or simply interested in what you missed, we hope you enjoy reading it.


1985-1995: The Dawn Of The IBM-PC


In November 1985, the world of personal computing was still in its infancy. Home computers had first become commercially viable in the late 70s, but it wasn't until the 80s - and an explosion of new systems and hardware - that they really took hold.

The one that really started it all was the IBM Personal Computer - the indisputable progenitor of the modern PC market. This system, also known as the IBM Model 5150, is the reason there's a distinction between PCs and Macs (even though they're both 'personal computers') and the reason that, for the first couple of decades at least, your Windows PC was sold as being 'IBM-compatible'. Being compatible with the Model 5150 was the whole point.

Utilising a 4.77MHz Intel 8088 CPU and running PC-DOS version 1.0, it was undeniably a simple piece of hardware. It had an optional cassette interface, and buyers could specify up to two 5.25" floppy disk drives. If you squint a bit, the IBM 5150 is still broadly recognisable as a modern PC. Paired with its IBM 5151 monochrome monitor, it has some retro-future charm, like something you'd see in an old sci-fi movie. But in real terms, it wasn't a huge hit. Most of you will never have used one, let alone owned one.

However, IBM's open architecture and canny business-focused marketing led to the IBM-PC becoming the basis for powerful, adaptable home computing. Throughout the 80s, IBM-compatible systems became the PC of choice for non-hobbyists who wanted a computer that could be used for work and play, whereas other machines - the likes of the Commodore 64 and ZX Spectrum - were primarily associated with home gaming. Only Apple, which had rejected gaming outright (to the point where cheaper Macs didn't have joystick ports at all), produced anything that looked like a business machine - and in the end its machines were too narrowly business-focused. The future of computing wasn't specialised but generalised.

Although several models came and went in the months and years after the release of the IBM 5150 - all of which were, of course, backwards compatible - the next big milestone arrived barely a year after the point where our story begins. The IBM XT 286, released in September 1986, came with a basic 20MB hard disk, a 6MHz 80286 processor and a new innovation: 640KB of zero wait-state RAM (128KB on the motherboard plus two 256KB SIMMs), which the processor could access almost instantly.

This new feature allowed it to run programs faster than even some more expensive models with faster 8MHz CPUs and created what is, for all intents and purposes, the personal computing model we still use today: a combination of RAM, motherboard and CPU with access to both permanent and removable storage.

By the mid-80s, IBM was already making as much money from PCs as the rest of the industry combined. More than half of companies used IBM systems, while the second biggest slice of the market - Apple's - was just 16%. Aggressive pricing kept IBM systems popular even as 'lookalike' systems (which used reverse-engineered, non-infringing BIOSes) grew in popularity, expanding the market for IBM-compatible software and hardware.

By the end of 1986, an IBM system that had cost $1,600 could be bought in compatible format for as little as $600, which was cheaper than an Apple II. So it was that IBM systems running MS-DOS became popular in homes as well as the workplace. In 1986, games company Electronic Arts began to build games specifically for the PC, instead of simply porting them from other systems. In 1987, Commodore and Atari, once titans of home computing, announced IBM-compatible systems to try to take on some of the market share.

Although the prevalence of IBM-compatible clones made the platform a huge hit, it did so at the expense of IBM's control over the market. Its 'official' hardware was seen as expensive, and while it was popular with companies that wanted full solutions, the home market was overrun with much cheaper compatible systems - to the point where the first PC based on the new 80386 platform was the Compaq Deskpro 386, which wasn't made by IBM at all. It was the first time a change of processor architecture for the PC-compatible platform hadn't been led by IBM, and Compaq's machine was offered with Windows/386, a version of Windows 2.1 adapted for the new CPU.

By 1990, the PC market had become so popular that software companies preferred to release the PC versions of their products first, even though the Amiga and Atari platforms had notionally better graphics and sound capabilities. Technical capabilities were no match for the superior userbase the PC offered. IBM-compatibles made up 65% of the home computer market, with the Amiga at 10% and every other platform, including Atari and Apple, somewhere below that.

It can be argued that the IBM-compatible struck the killing blow to competing platforms with the release of Windows 3.0 in 1990. As far back as 1985, Amiga and Atari ST systems had showcased impressive sound and graphical capabilities, while the IBM had survived in spite of, rather than because of, its quality. Windows 3.0 gave PC-compatibles similar abilities, allowing them to play video and complex audio.

Windows 3.0 was also the point where the tide began to turn against the command line-based MS-DOS. It was a significant rewrite of earlier Windows versions, with a revamped user interface, superior memory management, the ability to run text-mode MS-DOS software natively and basic multitasking. Windows 3.0 was far from the first operating system to make these leaps, but it was the first to make them on the IBM-compatible system. It also included games such as Reversi and Solitaire alongside now-familiar programs like Paintbrush (later MS Paint), Notepad and File Manager.

By supporting a 256-colour VGA mode, Windows 3.0 allowed IBM-compatible systems to look better than ever. In 1991, an updated version added 'multimedia extensions' with support for sound cards like the Sound Blaster Pro, as well as for CD-ROM drives. Multimedia computing was now accessible in the home, and the practice of selling Windows pre-installed on PCs began a tradition that we take for granted today: that your operating system is installed and ready to use the moment you first turn the system on.

Although DOS remained popular (particularly for gaming) long after the release of Windows 3.0, this was the point where things began to change. In April 1992, Windows 3.1 added TrueType fonts and made the multimedia extensions available to all customers rather than just OEM installations. Just months later, in October 1992, Windows for Workgroups 3.1 added network support. By December 1992, 82% of the gaming market was focused on PCs, with Apple taking 8% and the Amiga just 5%.

In 1993 and 1994, Commodore released the last new Amiga machines: the console-style CD32 and the desktop-style A4000T. Atari had already discontinued its own final computer, the Atari Falcon, in 1993. Only Apple's Power Macs remained as a serious competitor to the IBM-compatible Windows PC, and even then it was barely putting up a fight. For all intents and purposes, the PC had won the home computer market. And it wasn't even finished.

1995-2005: Multimedia Explosion


If you had to pick the most significant date in the history of IBM-compatible computing, 24th August 1995 would probably be a solid contender. It was, of course, the release date of Windows 95, and it did what no other operating system had managed to do before: it made PCs attractive.

Sure, by modern standards it's clunky and businesslike, but compared to its contemporaries, Windows 95 looked and sounded like the future. The 16-bit architecture that had ruled for years was replaced with a 32-bit architecture that could pre-emptively multitask, while plug-and-play features did away with the complex technical configuration that kept novice users away. Some basic restrictions that had existed in DOS-based systems for over a decade (such as the horribly misconceived 8.3 filename format) were finally swept away. It was, in many ways, a new dawn for the PC.

Clearly, Microsoft thought so. To accompany the release of Windows 95, it conducted an aggressive marketing campaign that was unprecedented in the history of the personal computer. Jay Leno was brought in to host the launch alongside Bill Gates, and the Rolling Stones' 'Start Me Up' was licensed to make computing look cool and accessible. Every time a tech CEO wheels out a big-name star just long enough to get a photo with their arm around them, it can be traced back to the Windows 95 launch.

It was, of course, a total success. Even people who hadn't so much as clicked a mouse button knew that Windows 95 was coming. It made people who didn't own a PC think they could use one. More than that, it convinced them they needed one. Windows 95 took full advantage of the multimedia capabilities the average system now had. On the CD version, you could find 3D games, music and videos. PCs weren't just for work and gaming anymore. They were for everything.

At this point, the PC required to run Windows had an average of 4MB of RAM (8MB was better), anywhere up to 500MB of hard drive space and ideally an 80486 processor with a clock speed in the region of 50MHz. CD-ROM drives were becoming standard, and 5.25" disk drives had long been replaced by the more compact 3.5" floppy. Indeed, it was possible to buy Windows 95 on floppy disk: 13 disks in total, formatted in a special distribution format so that each held 1.68MB instead of the usual 1.44MB.

Perhaps astonishingly, Windows 95 didn't ship with Internet Explorer. Although the web had been invented at CERN in 1991, it was still pretty niche. In fact, Windows 95 didn't even install TCP/IP networking by default. Internet Explorer 1.0 was added to Windows 95 by the Microsoft Plus! for Windows 95 pack, and it wasn't until the first service pack release that Internet Explorer came as standard with Windows - in this case, version 2.0.

While Windows 95 could even run on 386 CPUs, for many it provided the impetus to upgrade to the latest processors on the market. These fifth-generation x86 chips were named Pentiums by their manufacturer, Intel, and they brought the things that made them much faster than the 386 and 486 chips before them: a superscalar design and a far quicker floating point unit.

The first Pentium chips were released in 1993, with an architecture that could double the performance of a 486 chip. In 1994, Intel was embarrassed when Professor Thomas Nicely at Lynchburg College in Virginia discovered the famous FDIV bug, in which a simple calculation could be shown to give the wrong result if computed on a system with a first-generation Pentium CPU. Despite this, Pentium and Pentium-compatible chips were a success, and their superior mathematical performance helped usher in the era of 3D gaming.

While Windows 95 was a huge step forward in making it easy for home users to add and upgrade hardware, another huge leap was made in 1996, with the release of USB 1.0. USB would eventually replace a number of different standards: SCSI, serial, parallel and PS/2 interfaces all gradually relinquished their positions to USB, which could carry both power and data and offered far better device recognition. Admittedly, it wasn't until USB 1.1 arrived in 1998, ironing out problems with the original specification, that the format truly led to so-called legacy-free PCs, but the groundwork was laid with the initial release of the standard.

The high storage capacity and fast data transfer rates of CDs saw them rapidly replacing floppy disks in the commercial market, but the format truly came into its own in 1997, when CD-RW drives started to become affordable (though that still meant in the region of a couple of hundred pounds for the drive and several pounds per disc).

Other formats would come and go, including proprietary high-capacity disks like Iomega's Zip Disks, but the plunging price of recordable CDs (and later, the format's interoperability with DVD drives) would ensure that optical discs remained the dominant PC standard for home storage until USB flash drives became truly affordable around a decade later.

As the internet became an increasingly important aspect of Windows and its software, computing started to move online. The mid-90s saw the establishment of the first major web companies. Amazon was founded in July 1994, Yahoo! was incorporated in March 1995, and eBay was established in September 1995. Perhaps surprisingly, Netflix began in 1997 as an online DVD rental service that delivered by post. And in 1998, a new search engine called Google launched.

Perhaps the most contentious launch of the late 90s was Napster, which arrived in 1999. Created by Shawn Fanning and Sean Parker, Napster used the fledgling MP3 file format to enable music piracy on a previously unheard-of scale, allowing users to upload and download music to and from one another free of charge. At the time, the commercial sale of MP3s was almost non-existent, but digital piracy was quick to fill the demand. It was already common for computer users to 'rip' music into the format (the first MP3 players were released in 1997), but it's fair to say that Napster took the music industry by surprise.

Despite its notoriety, Napster only existed in its original form for a couple of years before it was crippled by lawsuits and restrictions that prevented the trading of copyrighted songs. Nonetheless, it was a huge signal of what was to come, in terms of file sharing, lax attitudes to copyright law and user-to-user connectivity. Sean Parker would later become the first president of Facebook, in 2004.

Although the dot-com bubble threatened to snuff out the internet as a business medium, it ultimately weathered the storm. The early 00s saw increasing bandwidth and new data technologies transform the internet into something that would be dubbed 'Web 2.0'. In practice, this meant an increased focus on user-to-user interaction, including blogs, article comments and social media. Sites like Wikipedia (2001), MySpace (2003) and YouTube (2005) arose during this period, taking advantage of the drive to have users generate a site's content for its owners, rather than vice versa.

Perhaps the ultimate expression of this user-to-user ethos came in 2001, when programmer Bram Cohen invented BitTorrent. Previous filesharing services like Kazaa and Napster had relied on centralised servers, which became the target of lawsuits and criminal investigations. BitTorrent had no such requirement. Nearly 15 years after its creation, it remains the dominant technology for both legal and illegal filesharing and seems unlikely to disappear or be replaced any time soon.

The advent of illegal filesharing did have at least one positive effect, though, and that was in forcing companies to admit that there was a market that was going unfulfilled. In April 2003, Apple launched the iTunes Store as a legal alternative to music filesharing, and it was the first service to offer a catalogue containing content from all five major music labels. Although initially only compatible with Mac systems and the iPod, it later expanded to service Windows (in October 2003) and non-Apple devices, and it provided the template that many other digital retailers would follow.

As for the desktop market, Windows 95 was updated several times, most notably into Windows 98 and Windows ME, but the future of the operating system lay with Windows XP, released in 2001 and built on the business-oriented Windows NT platform. Although it had its critics, Windows XP probably counts as Microsoft's most successful operating system. Even today, in 2015, it runs on an estimated 12% of computers - more than any operating system other than Windows 7.

By now, an average PC had hard drive storage of around 100GB, DVD drives with CD-RW support built in were ubiquitous, and many PCs even had DVD-RW capabilities. CPUs had reached multi-GHz speeds, and systems routinely had as much as 1GB of RAM. 64-bit processors were also appearing, though uptake was slow: few users were sure what they offered over 32-bit chips beyond the ability to address more than 4GB of RAM.

Similarly, laptops - a rare luxury item in the 90s - became big business in the 00s as prices dropped to affordable levels. Desktops were still the must-have item, but that would soon change in more ways than one.

2005-2015: The Information Age


By 2005, the big changes in the world of computing were largely happening online. Web 2.0 was replaced by (or arguably evolved into) social media sites. Companies no longer wanted your content - they wanted you. MySpace may have launched in 2003, but its popularity waned as quickly as it had risen, through a combination of mismanagement and increasing competition. When users began to desert the service, they were heading to something that's still familiar today: Facebook.

Facebook actually launched in 2004, but for years its membership was restricted, first to a specific US college, then to US colleges, then to universities and colleges in any country. In 2006, when Facebook opened to the general public, it was already thriving. The strange part was that unlike sites like Flickr, which wanted you to share your photos, or Blogger, which wanted your writing, Facebook was just a place to hang out and do everything. Or nothing. It didn't mind, as long as you did it on Facebook.

The advent of always-on broadband connections meant that PCs were now giving more and more of their capabilities over to online activity. This would reach its ultimate expression when 'the cloud' was realised. Dropbox and Spotify both launched in 2008, encouraging users to stop thinking of their PC as a hub for their files and content and more as a portal to the place they were stored online.

That same year, sales of laptops finally overtook sales of desktops, and the iPhone 3G gave users a relatively cheap, reliable, data-focused smartphone that could browse the internet almost as well as most PCs (unless you wanted a site built with Flash, of course). Mobile computing had arrived, and it represented the greatest threat to the desktop system for decades - especially when tablets arrived with the launch of the iPad in 2010, further fragmenting an increasingly divided market. A small mercy, then, that netbooks came and went.

While desktop sales have slowed down thanks to greater consumer choice, the quality of hardware is now better than ever. SSDs came of age in 2005, and after representing a premium choice for some time, they're now cheap enough to genuinely replace hard drives on even a standard desktop system.

Despite this revolutionary change in storage, the biggest hardware leap of the last ten years probably came in 2005-2006, when the first dual-core desktop CPUs arrived, followed by chips like Intel's Core 2 Duo. As ever-higher clock speeds ran into the practical limits of heat and power, manufacturers looked for new ways to speed up processors, and parallelism was the solution.
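
If you want a concrete sense of what that parallelism buys you, here's a rough Python sketch (our own illustration, not tied to any particular chip or product): the same CPU-bound job is run once on a single core and once split across four worker processes, and on a multi-core machine the second run finishes in a fraction of the time.

    # A rough illustration of multi-core parallelism: the same CPU-bound work,
    # run first on one core, then split across four worker processes.
    import time
    from concurrent.futures import ProcessPoolExecutor

    def count_primes(limit):
        """Deliberately slow, CPU-bound work: count the primes below `limit`."""
        count = 0
        for n in range(2, limit):
            if all(n % d for d in range(2, int(n ** 0.5) + 1)):
                count += 1
        return count

    if __name__ == "__main__":
        chunks = [50_000] * 4  # four equal pieces of work

        start = time.time()
        serial = [count_primes(c) for c in chunks]  # one core does everything
        print(f"serial:   {time.time() - start:.2f}s")

        start = time.time()
        with ProcessPoolExecutor(max_workers=4) as pool:  # one worker per core
            parallel = list(pool.map(count_primes, chunks))
        print(f"parallel: {time.time() - start:.2f}s")

        assert serial == parallel  # same answer, less wall-clock time

The trick, of course, is that not every program divides into neat, independent chunks like this - which is why extra cores never delivered quite the automatic speed-up that rising clock speeds once did.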

These days, a single household PC is probably one of the most powerful computing devices an ordinary person can buy. The average system has computing power many times that of the earliest PCs. Four CPU cores, each running up to a thousand times the clock speed of the chips in the earliest IBM-PCs. Maybe several thousand times more RAM. Quite possibly more storage on a single 4TB disk than there was in the country in 1980, when a gigabyte of hard drive storage cost $193,000.

With the right software and hardware, they allow you to do almost anything: assemble ultraHD videos with movie-quality effects, record music as cleanly as any studio, write software that'll run on any device and access virtually all forms of entertainment. A well-built PC can outperform the most powerful games consoles on the market. When you sit down at a PC every day to go and check your email, it's easy to forget the incredible technological path that got us to this point, and the amount of raw power you're taking for granted every time you switch on your machine. For 30 years, it has helped transform our lives. We can't wait to see where the next 30 take it.


4 People Who Changed Computing


1. Bill Gates
It's popular to bash Bill Gates and his monolithic software company, Microsoft, but it's fair to say that without them the world of home computing might look very different. Microsoft's operating systems power more than 90% of all desktop systems, and Gates's personal success is almost incalculable: he has topped the world's rich list for most of the past two decades. Gates's contribution wasn't necessarily Windows itself but the creation of a standardised platform for hardware and software that made the market easy and profitable for developers to enter. Whether you love or loathe Microsoft, it's true that without its imperial qualities, there'd be far fewer PCs around, and we'd all have much less to do on them.

2. Steve Jobs
As the co-founder of Apple, Steve Jobs helped create the first commercially successful personal computer with a GUI (the original Macintosh) before being forced out of the company. In the years that followed he founded NeXT and bought and built up Pixar, before NeXT itself was bought by Apple. Jobs was re-installed as CEO to return the company to profitability, and the decade that followed saw the introduction of hardware like the iMac, iPod and iPhone, as well as software like the App Store and Mac OS X. When Jobs died of cancer in 2011, the computing industry lost a powerful innovator, and again - even if you dislike Apple and its products, you can't argue that the industry wasn't stronger with him in it.

3. Michael Dell
The founder of Dell, Inc. (of course), Michael Dell has been selling personal computers since the 1980s. His big innovation was to bet capital on the idea that manufacturer-sold PCs could undercut a retailer-led model. He was right. He started his company out of a condo in 1984 using $1,000 of investment, and by 2001 it had officially become the world's largest PC maker. Dell's size and influence mean it has, to a large extent, controlled the pace and development of desktop systems for over 20 years. He was even the first to sell them online, in 1996. The computing industry might still exist without Dell, but it'd probably look a lot less healthy.

4. Tim Berners-Lee
If one man can be said to have invented the future, Tim Berners-Lee might be a strong candidate. In the early 90s, while working at CERN, Berners-Lee invented both HTML and the HTTP protocol that power the web to this day. He didn't invent the internet, but before he created the web there wasn't any part of it that was accessible to the lay-person. The first web page went live on 6th August 1991. In 2014, the one billionth website was announced. Maybe he was just in the right place at the right time, but whether it was genius or circumstance that made Tim Berners-Lee famous, it's indisputable that he came up with the thing everyone needed but no one really knew they wanted. That doesn't happen often.

Computing's 4 Biggest Flops


Not every new product and idea is going to be a big success. Indeed, sometimes the opposite is true. While looking at the history of computing may give you the impression that it's a relentless climb towards the stars, it's always worth remembering the times the industry got things wrong in the last 30 years as well.

1. OS/2
In the early 1990s, IBM attempted to claw back some of the PC market with its own operating system. The idea was sound enough, but the baffling decision was taken to market OS/2 in association with the PowerPC chip, which was incompatible with Windows and its existing applications. Users didn't want the PowerPC CPU, and that meant they weren't interested in OS/2 either. What little foothold it had was all but obliterated when Windows 95 came out.

2. Windows Vista
If IBM can take heart from anything, it's that even the king of operating systems can get things wrong. Perhaps Vista was genuinely rubbish; perhaps its release delays meant people had got too comfortable with Windows XP. Either way, this iteration of Windows flopped in a way matched only by the stopgap Windows ME some years earlier. Vista was so thoroughly rejected by customers (both at home and in business) that Microsoft rushed out its (vastly improved) successor, Windows 7, and effectively abandoned the platform early.

3. MySpace
Not the first social network but certainly the first that even people who don't use PCs would have heard of. MySpace had it all: a huge userbase, global recognition, and brands and companies beating down the door to get on the network, yet somehow it squandered it all. Maybe the site's notorious privacy missteps doomed it, or maybe the launch of Facebook was always going to kill MySpace. All we know is that in 2005 it was bought by Rupert Murdoch's News Corporation for over $500 million, and by 2011 it was sold again for around $35 million, shedding 1,400 employees in the process. There are collapses, and then there are collapses. This was definitely the latter.

4. The Internet
It's fair to say that it's bounced back quite convincingly of late, but the early 00s were a dark time for anyone who had invested money in this fledgling medium. When the dot-com bubble burst in March 2000, it went on to wipe around $5 trillion off the stock markets by October 2002. When AOL merged with Time-Warner in January 2000, it was hailed as the coming-of-age for the business of the internet. Three years later, the men who forged the deal were history, and AOL's brand was quietly dropped from the company's name. It took another 15 years for the markets to come close to the peaks they'd enjoyed in 2000 - and had things gone even slightly differently, the internet you use today might look very different too.