Thursday, 15 December 2016

Is The X86 Era Coming To An End?

Mark Pickavance looks at the shifting relevance of Intel's X86 technology, and asks whether it will soon be a thing of the past

The origins of X86 go back to 1978, when Intel released the 8086 processor, a 16-bit successor to its earlier 8-bit 8080 design. It wasn't an overnight success, because there were lots of chip makers then, and many of them enjoyed more popular support than Intel.

Early computer makers preferred chips like the Z80 from Zilog, the 6501 and 6502 from MOS Technology, and the 6800 series from Motorola, among others. What propelled Intel's technology was IBM's choice of the 8088 (a cheaper variant of the 8086 with an 8-bit external bus) for its IBM PC in 1981, and the subsequent success of that platform in standardising the market around Intel's instruction set and, initially, the MS-DOS operating system.

There's a core instruction set in every modern Intel or AMD CPU that originated in the 8086, even if the way these systems are coded today is radically different and their internal architecture bears few similarities.

Having been around for nearly 40 years, is it time for Intel's X86 to retire from the world of computing, or is there still life in this legacy platform? To answer that, you need to really understand why X86 was so successful, and why changing conditions might eventually end its reign.

Bad, But Not Terminal, Yet

It's worth accepting from the outset that the PC as a platform has been in decline for some time. Web-based applications, cloud storage, and consumer devices like tablets and smartphones have each stuck a blade into the computing equivalent of Julius Caesar over the past few years.

Under these circumstances, Intel could usually rely on Microsoft to come riding to the rescue of its quarterly figures with a new version of Windows, but even that stalwart has failed to protect Intel.

Since the inception of the IBM PC, Microsoft and Intel have been the dynamic duo. When Microsoft launched a new OS, it usually needed more computing power, helping Intel. And when more powerful chips came to market, Microsoft was quick to find new things to do with that performance.

However cosy that all became, it couldn't shield either side from the economics of a changing computer market, one that is now radically different from what existed in the 1990s or even the early 2000s.

In Q4 of 2015, worldwide PC sales declined 10%, and they'd been falling year on year for six straight quarters, over a period in which Microsoft launched Windows 10. That brought PC sales back to where they were in 2007.

Oddly, lots of stock pundits then declared this a turning point, with the PC set to recover from the doldrums in 2016. It didn't. In the first three months of 2016, sales fell another 9.5-11.5%, depending on which analysis you wished to believe.

Fearing a mass exodus of shareholders, Intel quickly announced it would be cutting 11% of its global workforce and diversifying into other markets. Looking at the latest briefings Intel has generated, it's now talking about IoT and cloud computing platforms making up the shortfall created by the PC decline. This comes despite it cancelling its Medfield phone chip, after a reported $10bn went into that hole, and failing to gain any traction in the tablet market with its Atom processors.

While the PC market is still big, surely this is a train Intel wants to alight from before it really hits the buffers?

Well, it's tried to engineer an untimely end for its X86 architecture in the past and failed rather miserably.

Attempted Fratricide

The popularity of Windows revealed to Intel's chip engineers the numerous flaws of the X86 design, especially when it came to handling multitasking and finding a clean path to 64-bit addressing.

X86 wasn't designed to be as flexible as the PC became, and on numerous occasions the cracks showed as it was bent to fit new paradigms. At each chip iteration, Intel layered more modern thinking on top of old structures, in a curious parallel with the problems Microsoft ran into improving Windows while retaining backwards compatibility.

Many of those working on these projects just wanted to ditch X86 and the old thinking that it represented, and start on a totally new chip with none of that legacy baggage dragging along behind. Eventually, senior people in Intel began to listen, and a whole new chip architecture was envisioned that would be so massively superior that it would sweep X86 away and usher in a whole new generation of personal computing.

Inspired by the work that HP had done on Explicitly Parallel Instruction Computing (EPIC), Intel started work on a chip that would do more in a single clock cycle, using a whole new architecture called IA-64. In 1999, it officially named the chip Itanium and forecast that it would be selling $38bn a year of these chips by 2001.

That never happened, partly because Intel didn't actually release the first chips until June 2001, but it was a target that this chip would never achieve, even if technically Intel still sells variants of them to this day.

Because the Itanium couldn't natively process X86 instructions, it was initially seen as an alternative to the UNIX RISC workstation processors like those made by Sun (SPARC), MIPS and DEC (Alpha).

Due to the size and power of Intel, many initially took the Itanium very seriously and set about compiling a version of their operating systems for the new architecture. These included Windows, OpenVMS, Linux and a selection of UNIX derivatives. Some got cancelled before they were finished, and those that did arrive only highlighted the problems of usurping X86 in the general use computer market.

Having Windows for Itanium was fine, but what you really need is the applications, and without X86 compatibility that meant specific IA-64 versions built to run on those recompiled Windows releases. Oddly enough, this was exactly the same brick wall that Microsoft would run blindly into with the ARM-based Surface machines years later, because who in their right mind wants a Windows computer that can't run Windows applications?

Intel tried to patch over these limitations with X86 emulation code, though the processor was horribly slow at doing this, and it wasn't that quick running native code either. Itanium soon became something of a joke, where Intel would avoid saying how many it had sold and would hide its commercial impact among other more profitable parts in its business.

In the end, only HP continued to sell systems based on this line, and leaked documents revealed it was forced to pay Intel to keep developing and producing the chips to meet its specific needs.

In retrospect, while retrofitting X86 with the functionality it needed for modern computing tasks was challenging, it allowed the kind of transition the market could accept, rather than the drop-dead move that Itanium represented. Perhaps if Itanium had been a better design from the outset, things might have been different, but it never performed well enough against the X86 chips that Intel itself and AMD (with the Opteron) were building to make the switch justifiable.

Death By Irrelevance

While Intel's own Machiavellian attempts to do away with X86 failed, other technologies have been stalking this architecture for some time. Probably the one that most people are aware of is ARM, a chip technology that has all but dominated the mobile phone and tablet markets.

What's radically different about ARM compared with Intel is that ARM doesn't make chips at all. Instead, it licenses its technology and lets others pick-n-mix a SoC (system on chip) to their own recipe. By doing this, it controls the core functionality and architecture, but others can customise their package to deliver exactly the power and performance their application needs.

That's the polar opposite of Intel's approach, where Intel decides everything, and you only get to choose, in broad strokes of clock speed and cache, what level you want.

ARM has also spent most of the last 30 years focused on power efficiency - something Intel almost entirely ignored for many years. Having missed the explosion of smartphone use and the multi-billion-dollar chip sales it represented, Intel tried in vain to catch up with its own power-efficient X86 phone chip, a project it has since abandoned.

Where Itanium failed to oust X86 because of the transition issue, ARM has arrived from an entirely different direction, where the computers that it's in are effectively appliances and not personal computers in a conventional sense.

Because of this, they haven't been in direct competition, with the possible exception of the failed ARM-based Microsoft Surface machines, and therefore people don't expect these devices to be a PC and run X86-based code.

Now we're seeing systems, like those that run Android (a Linux derivative) and Chrome OS, where both ARM and Intel processors are used, with few customers noticing - or caring - which chip they actually have.
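That chip-agnosticism is easy to demonstrate. As a minimal sketch (not tied to any particular Android or Chrome OS internals), the same Python source runs unchanged on an X86 or an ARM machine, and a program has to go out of its way even to discover which instruction set it's sitting on:

```python
import platform

# The same script runs unmodified on Intel/AMD or ARM hardware;
# only an explicit query reveals the underlying instruction set.
arch = platform.machine()  # e.g. 'x86_64'/'AMD64' on X86, 'aarch64'/'arm64' on ARM

a = arch.lower()
if a in ('x86_64', 'amd64', 'i386', 'i686'):
    family = 'X86'
elif a in ('aarch64', 'arm64') or a.startswith('arm'):
    family = 'ARM'
else:
    family = 'other'

print(f'Running on {arch} ({family})')
```

The point is that nothing in an ordinary high-level program depends on the answer; the instruction set has become an implementation detail the user never sees.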

Cloud-based computing and the web interface are achieving what Intel's own chip designers failed to do: making X86 largely irrelevant to the future of computing. That's not to say it won't be with us for a long time yet. Windows is still popular, though the market for PC technology is in a gradual decline from which it will probably never recover. Whereas 20 years ago, having a computer with lots of useful applications invariably meant X86 and Microsoft Windows, those things are no longer mandatory.

The continuation of X86 at Intel may be more about the cost of designing something new versus the ease of delivering technology it fully understands, one that can serve both legacy Windows customers and those who want a more powerful platform for Android or Chrome OS, or their inevitable love child.

Final Thoughts

Before we get ahead of ourselves, it's worth noting that Intel still sells hundreds of millions of chips every year, and AMD flogs a decent number too. It would be crazy for them to stop, as they still make a considerable amount of money out of doing this and will do for some considerable time to come.

But what made X86 successful wasn't Intel alone; it was how, in coordination with Microsoft, it managed to appear more relevant than it actually was. That relationship is breaking down, because Microsoft isn't as relevant as it once was, and neither can prop up the other through these difficult times.

I should note that Microsoft has supported technologies other than X86 in the past, though not successfully, and today it only really supports ARM on the Raspberry Pi and the few mobile phones it managed to ship Windows 10 onto before it entirely lost interest in phones.

It's clear from its Universal App platform that Microsoft really wanted to migrate away from X86 and all its associated legacy baggage, though neither the ARM Surface nor its phones provided the impetus. That leaves Microsoft in a difficult place: it doesn't want to remain indefinitely tied to the X86 architecture, but it has neither a practical alternative nor a working plan to wean its customer base off X86-dependent code. Intel would prefer that it stayed, but it doesn't really have a say, or much influence with Microsoft, these days.

As market forces have affected both companies, they've realised that, however strong they were together for so long, each now needs to find a way out of this hole that isn't dependent on the other.

My motivation in writing about X86 was the realisation of how little we've covered developments at Intel in the magazine over recent years. If asked what the fastest Intel PC chip available was, I'd hazard a guess that it was an LGA 1366-V3 Xeon, but I've no idea what clock speed or cache size it features.

Going back a few years, I knew the range of chips that Intel made intimately, and I was always eager to review significant new releases. These days, I'm not sure people really care, other than to mentally segment Celeron, i3, i5 and i7 performance levels in very general terms.

While this disinterest is understandable for the general public, I'd consider myself and those who read this publication to be computing enthusiasts. If we can't summon up any great excitement for what Intel does with its chip technology, then that's pretty indicative of the current situation in regard to X86's longer-term prospects.

This isn't exclusively an Intel problem either, because video card development also seems much less interesting than it once was for many.

Reflecting on those times of edge-of-the-seat product launches, the PC was a compelling narrative, and we all wanted to see what happened next. Now it has become a wholly predictable exercise, where clock speeds shift up in glacial increments and simple upgrades are thwarted by arbitrary socket changes or exotic new memory types.

Where we once looked to each new release to finally deliver the performance a favourite game craved, these days most titles are playable on the most modest hardware, and running every setting at maximum in 4K is purely a matter of expenditure. The PC has become pay-to-win, and once you realise that, it's dramatically less interesting.

X86 can't complain. It had a good run and saw many of the chip families it started out alongside bite the dust along the way. The technology that replaces it will probably have codes and notation that few people will know or care about, because it's now all about what a thing does rather than how it does it.

X86 was the glue that made Intel and Microsoft work in combination, and without it in place, we will likely return to an era very much like the one that we last experienced in the early 80s. That was an exciting time for computer enthusiasts, and maybe like then, computer development is about to get a lot less predictable.