Wednesday, 8 July 2015

15 Years of Linux

Take a walk down memory lane as Jonni Bidwell examines how Linux has changed over the magazine’s lifespan.


It was a cold grey morning in May 2000. Winter should have departed but that doesn’t happen in Britain. So Reader Zero, seeking respite from the icy rain and miserable population, stumbled into their local newsagent. Zero was hoping for some stimulating and edifying reading material, but was mostly resigned to the notion that the shelves would be populated with the usual feuilletons, corrupt gaming magazines and various ‘zines pandering to interests Zero did not possess. And then they saw it, fluorescent orange, a light in the darkness: “Join the revolution!” the coverline told our enraptured reader. Amazed that frustrated tinkerings at the terminal, considered by their peers an affectation rather than a hobby, could be part of something as exciting and dynamic as a ‘revolution’, Zero was powerless to resist. There was a free disc too, with a whole Linux distribution (Definite Linux) on it! That would have taken about a month to download over dial-up. And there would be another one in four weeks, and eventually there would be not just a CD but a DVD. Zero’s life was changed, and while Definite Linux definitely didn’t last long, and the magazine would change hands many times over the next 15 years, it remained a bastion of quality publishing that would inform, entertain and delight.

Back when Zero was having their cathartic moment in the newsagent, Linux was already about nine years old. Some distributions (distros) had already established themselves, and one of the earliest was Softlanding Linux System (SLS), which appeared in May 1992. Unlike its contemporaries, SLS provided more than just the kernel and some GNU tools for preparing filesystems: it also shipped with a networking stack and the X display server. This was considered ambitious but buggy, and efforts to fix it culminated in Slackware’s release in 1993. Also that year, and again in response to frustration with SLS, Debian came into being. Red Hat Commercial Linux appeared the following year, and it would engender many popular distros of the late 90s, including Mandrake, Yellow Dog and Definite Linux. KDE was released in 1998, with Gnome following in 1999; Gnome was created in part because of KDE’s reliance on the then non-freely licensed Qt toolkit. By May 2000, the most popular distributions were Debian 2.1, Red Hat 6.1, Linux-Mandrake 7.0 (as it styled itself back then), Slackware 7.0 and SUSE Linux 6.3. Some of these featured in the very first LXF Roundup, and you can read all about them in the exclusively digitised issue LXF1 on the LXFDVD this month.

What’s user experience?


If you’re a recent Linux convert who’s had to engage in combat with rogue configuration files, misbehaving drivers or other baffling failures, then spare a thought for those early converts whose bug reports and invective utterances blazed the trail for contemporary desktop Linux. Up until comparatively recently, it was entirely possible to destroy your monitor by feeding X invalid timing information. Ever had problems with Grub? Try fighting it out with an early version of Lilo.

In the early days, even getting a mouse to work was non-trivial, requiring the user to do all kinds of manual calibration. Red Hat released a tool called Xconfigurator, which provided a text-mode, menu-driven interface for setting up the X server. It was considered a godsend, even though all it did was generate an XF86Config file that you’d otherwise have to write yourself. So while Windows users whined about Windows ME being slow and disabling real-mode DOS, your average Linux user would jump for joy if their installation process completed. Even if you got to that stage, it would be foolishly optimistic to suppose the OS would boot successfully. Hardware detection was virtually non-existent, and of the few drivers that had been written for Linux, most weren’t production quality. Yet somehow the pioneers persisted; many were of the mindset that preferred the DOS way of working, which began to be sidelined as the millennium approached. Windows users were having their files abstracted away from them: ‘My Computer’ epitomises this movement.
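To give a flavour of what that meant, here’s a minimal sketch of the kind of Monitor section a hand-written XF86Config contained (the sync ranges and modeline below are illustrative values, not ones to copy anywhere):

    Section "Monitor"
        Identifier   "Generic CRT"
        HorizSync    30-70      # horizontal sync range in kHz, copied from the monitor's manual
        VertRefresh  50-120     # vertical refresh range in Hz
        # A hand-typed modeline; get these timings wrong on an old fixed-frequency
        # monitor and you really could damage it.
        Modeline "1024x768" 65.0  1024 1048 1184 1344  768 771 777 806
    EndSection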

In January 2001 Kernel 2.4 was released, and with it came support for USB and the exciting new Pentium 4 processors, among other things. It was of particular importance to desktop users thanks to its unified treatment of PCI, ISA, PC Card and PnP devices, as well as ACPI support. The dot-com bubble was just about to burst, but all the excitement and speculation around it meant that many computer enthusiasts had a broadband connection at home, and some even enjoyed the luxury of owning more than one computer. This solved some major entry barriers to Linux: people could now download it much more easily; up-to-date documentation was easily accessible; and when Linux saw fit to disappear one’s internet connection (or render the system unbootable), the other machine could be used to seek guidance. But the user experience was still, on the whole, woefully inhospitable. While some installers had evolved graphical capabilities, these were more often than not more trouble than they were worth. Users were expected to understand the ins and outs of disk partitioning, and to be able to discern which packages they required from often terse descriptions.

Windows XP was released in October 2001, and while it was seen as a vast improvement over its predecessor, many users found that their machines weren’t up to running it. After all, it required 64MB of RAM and a whopping 1.5GB of disk space. Remember that BIOSes had only recently gained the ability to address large drives (there were various limits depending on the BIOS: 2.1, 4.2 and 8.4GB were common barriers). So many people couldn’t install it on their hardware, and many that met the minimum specs found the performance rapidly degraded once the usual pantheon of office suites and runtime libraries was installed. This provided the motivation for another minor exodus to Linux, and the retro-hardware contingent continues to make up an important part of the Linux userbase (and to berate us for not including 32-bit distros). Before 2006 all Macs had PowerPC processors, and many of these (as well as early Intel Macs), long bereft of software updates from Apple, now run Linux too.

The Gnome 2 desktop environment was released in 2002, and this would become a desktop so influential that some still seek (whether out of nostalgia, atavism or curmudgeonly dislike of modern alternatives) to reproduce it. It aimed to be simple, tweakable and intuitive, and it’s hard to argue that it didn’t achieve all three.

Oh, we’re so pretty


One of the major enablers was its strict adherence to the Gnome Human Interface Guidelines, which set out some key principles for application designers. This meant the desktop was consistent not just internally, but with respect to all the GTK apps that people would go on to write for it.

Also released in 2002 was KDE 3, which vaguely resembled Windows, in that it was cosmetically similar and slightly more resource-demanding than Gnome. People and distributions sided with one or the other. SUSE Linux (predecessor of openSUSE) always aimed to be desktop-agnostic, but most of its users preferred KDE. Heeding this (though not until 2009), it changed position, and today openSUSE is the leading KDE-based distro.

In late 2002, ‘DVD’ Jon Johansen was charged over the 1999 release of the DeCSS software for circumventing the Content Scrambling System (CSS) used on commercial DVDs. This software enabled Linux users to play DVDs, something they had hitherto been unable to do, since DVD playback software required a licence key from the DVD Copy Control Association, one of the plaintiffs in the suit. It later emerged that CSS could be broken much more trivially, and Johansen was eventually acquitted. By this time iPods and piracy meant that MP3 files were commonplace. These were, and still are, dogged by patent issues, with a number of bodies asserting ownership of various parts of the underlying algorithm. As a result, many distros shipped without patent-encumbered multimedia codecs. The law is murky, though, and rights holders have shown restraint in filing suit against FOSS implementations of these codecs. Most distros are prudent and leave it up to the user to install these, although Ubuntu offers users the licensed (but proprietary) Fluendo codecs on install. Fortunately, many of the MP3 patents have expired and many more will have done so by 2017; in any case it doesn’t really matter, since we have plenty of open formats and codecs now (Ogg Vorbis, FLAC, VPx and x264). It’s still technically a DMCA violation to use libdvdcss (a modern and much more efficient way of cracking CSS, used by the majority of media players on Linux) to watch a DVD, but that only applies in some countries and, to date, no one has challenged its use.

The city of Munich announced in 2003 that it was to migrate all of its infrastructure from Windows NT to Linux. As well as saving costs, the Bavarians claimed the main impetus for the move was freeing themselves from vendor lock-in. Steve Ballmer visited the mayor personally, but even his charm and eloquence (and, presumably, offers of hefty discounts) weren’t enough to convince the revolutionaries. The project was completed ten years later, with some 15,000 machines migrated to the custom ‘LiMux’ distro. A scare story emerged in 2014 that the city was to revert to Windows, but it turned out to be false. It’s estimated that the move saved Munich some 11 million euros.

O kernel! My kernel!


After two years in development, Kernel 2.6 was released in 2003. This was a vastly different beast to 2.4, featuring scheduler enhancements, improved support for multiprocessor systems (including hyperthreading, NPTL and NUMA support), faster I/O and a huge amount of extra hardware support. We also saw Physical Address Extension (PAE), so that machines could address up to 64GB of RAM even on 32-bit architectures. Also introduced was the venerable Advanced Linux Sound Architecture (ALSA) subsystem, which enabled (almost) out-of-the-box functionality for popular sound cards, as well as support for multiple devices, hardware mixing, full-duplex operation and MIDI. The most far-reaching new feature was the old device management subsystem, devfs, being superseded by udev. This didn’t arrive in earnest until 2.6.13 (released in August 2005), at which point the /dev directory ceased to be a list of (many, many) static nodes and became a dynamic reflection of the devices actually connected to the system. udev also handled firmware loading and userspace events, and contributed to a much more convenient experience for desktop users, although you still relied on such arcana as HAL and ivman to automount a USB stick with the correct permissions.
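To illustrate the sort of thing udev’s dynamic approach makes possible, here’s a hypothetical rule (the file name and the vendor and product IDs below are placeholder examples) that gives a particular USB stick a persistent, friendly device node the moment it’s plugged in:

    # /etc/udev/rules.d/99-usb-stick.rules  (hypothetical example)
    # Match a block device by its USB vendor and product IDs (placeholder values)
    # and add a stable symlink alongside whichever /dev/sdX node the kernel assigns.
    SUBSYSTEM=="block", ATTRS{idVendor}=="abcd", ATTRS{idProduct}=="1234", SYMLINK+="my_stick"

Rules like this, together with HAL and friends at the time, are what eventually let distros handle removable media automatically instead of leaving users to edit /etc/fstab by hand.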

Linux (having already been ported to non-x86 64-bit processors) supported the Itanium’s IA-64 architecture when it was released in 2001. That architecture was doomed to fail, though, and Intel eventually moved to the more conservative AMD64 (or x86-64) architecture, which (we delight in reminding our readers) has been around since 2003. Thanks to open source software, Linux users were running 64-bit desktops right away, while Windows users would have to wait until 2005 for the x64 release of XP. Various proprietary applications (notably Steam and its games) run in 32-bit mode, which provides some motivation for distributions to maintain 32-bit releases, but the day will come when these are no longer tenable to maintain, and eventually they will go the way of the 386, which hasn’t been supported by the kernel since 2013.

Enter the archetype


The 2004 release of Ubuntu 4.10 (‘Warty Warthog’) was, without a doubt, a major boon for Linux on the desktop. Using the megabucks he’d amassed from creating and selling Thawte, Mark Shuttleworth formed Canonical. The goal was to sell server products and support, and at the same time make a desktop Linux “for human beings”. Using Debian (which had proven itself by this point) as a base, Canonical added driver tweaks, a very brown Gnome 2 theme and an ambitious six-month release cycle. We also saw the launch of http://ubuntuforums.org, where well-meaning but ill-informed members of the community would post ‘solutions’ to various Ubuntu problems.

In 2004, a sound server called Polypaudio was released by a hitherto unknown developer called Lennart Poettering, along with some others. At the time, desktop environments relied on sound servers to overcome shortcomings in ALSA’s dmix system: Gnome was using the Enlightened Sound Daemon (ESD) and KDE was using the analog Real-time synthesizer (aRts). Polypaudio was designed to be a drop-in replacement for ESD, providing much more advanced features, such as per-application volume control and network transparency. In 2006 the project, citing criticism that nobody wants polyps, renamed itself PulseAudio (it was in fact named after the sea-dwelling creature, not the medical condition).

With its new name, and increased demand for a sound system comparable with that of OS X or the newly released (and much maligned) Windows Vista, PulseAudio enjoyed substantial development and began to be considered for inclusion in many distros. As is traditional, Fedora was the first to adopt it, incorporating it as the default in version 8, released in late 2007. Ubuntu followed suit in 8.04, although its implementation attracted much criticism and resulted in a great deal of anti-Pulse vitriol. Poettering at one stage even described his brainchild as “the software that currently breaks your audio”. It took some time, but eventually Ubuntu (and other distros) sorted out the implementation issues, and it now mostly works out of the box.

Before tablets, and smartphones that people could afford, netbooks were the pinnacle of portable computing. The first one was the Asus Eee PC 701. Due to its low hardware spec (it had a 700MHz processor, an 800x480 display and 512MB of RAM), running Windows on it was not an option. Instead it came with a customised version of Xandros Linux, which was functional but lacking in polish. On the whole most people were unhappy with it, but netbooks still proved great platforms for more experienced Linux users. As newer netbooks were released (many based around the more suitable Intel Atom chips) they started to ship with Windows XP (some seven years after its initial release) and then the crippled Windows 7 Starter Edition. Asus later backpedalled on its Linux enthusiasm: teaming up with Microsoft, it even launched an ‘It’s better with Windows’ campaign, designed to deter people from purchasing Linux-based laptops. This smear campaign used phrases like ‘major compatibility issues’ and ‘unfamiliar environment’ to scare people away.

The cost of progress


The year 2010 may be remembered by some as the one in which Ubuntu started to lose the plot. Up until then, the distro had been going from strength to strength, gaining more users and more stability. It was the poster child for the (dead or irrelevant, depending on who you ask) dream of Linux on the desktop. But things started to go awry with the 10.10 release. Its Ubuntu Software Center now included paid-for apps (the first one was Fluendo’s licensed DVD player) and the Netbook Edition used a new desktop environment called Unity. In the 11.04 release, though, this became the new shell for the main release too. Ubuntu had long taken issue with the new Gnome 3 desktop, which at the time of the Ubuntu feature freeze was not considered stable enough to include in the release anyway, and Gnome 2 was already a relic. So in a sense Ubuntu had no choice, but no one likes change, and users were quick to bemoan the new desktops. Ubuntu has persisted with Unity and it’s much improved today, but a low point came with the 12.10 release, when users noticed ‘suggestions’ from Amazon as they typed queries into the search lens.

Gnome 3 was not without controversy either; the criticisms it attracted were threefold. First, many preferred the old Gnome 2 way of doing things, and this clearly was not that. Second, all the fancy desktop effects required a reasonable graphics card (and working drivers); there was a fallback mode, but it severely crippled desktop usability. Finally, this appeared to be something designed for use on mobiles or tablets, yet even today mobile Linux (not counting Android) has never taken off, so why should users be forced into this mode of thinking? Many found, though, that once some old habits were unlearned and some sneaky keyboard shortcuts learned (and Gnome Tweak Tool installed), the Gnome 3 way of working could be just as efficient as its predecessor’s, if not more so. KDE users looked on smugly, having already gone through all the rigmarole of desktop modernisation (albeit less drastic than Gnome’s) when KDE 4 was released in 2008. Around this point we ought to mention systemd as well, but there’s not much to say that hasn’t been said elsewhere: the old init system was creaking at the seams, a new and better one came along, it wasn’t everyone’s cup of tea, but we use it anyway, and the internet slanders Lennart Poettering.

There has always been a niche interest in gaming on Linux, but this was mostly satisfied through Wine, which has been around since the mid-90s. Things changed when Valve released its Steam for Linux client in 2013. Today there are over 1,000 games available for Linux, with more being ported all the time. Granted, many of the high-profile ports incorporate either a Wine layer or a wrapper such as eOn, but we are also seeing a good proportion of indie releases running natively. Valve even made an OpenGL version of zombie splatterfest Left 4 Dead 2, which outperformed the DirectX/Windows release. Linux users make up about 1% of the Steam userbase at present, but this may change if Valve’s plan to conquer the living room through Steam Machines, running the Debian-based SteamOS, comes to fruition.

The last couple of years have been full of Linux developments and dramas too, including the Heartbleed bug, a partial resolution to the long-running SCO-IBM lawsuit and a much less adversarial stance from Microsoft. But there just isn’t enough space, alas.