/ 2 September 2010

Hardware? Software? Who cares?

Almost every industry on the planet goes through alternating cycles of consolidation and fragmentation. In some industries — like energy — these cycles are long and slow. In others — like advertising — they are short and quick. Right now a tectonic shift is under way in one of the world’s most important industries: information technology.

No, I don’t mean your chubby IT guy who does desktop support on Tuesdays. I mean computing in every form, from desktop to mobile to cloud, as well as software, networking, storage and everything in between.

In the early days of the IT industry, giants like IBM controlled everything. They designed and manufactured the hardware. They wrote the software that ran on it. They invented and then implemented the networks that connected their giant mainframes.

This wasn’t just about reaping monopoly profits — though that was nice — it was about being big enough and having enough concentration of talent to take big bets and try new things. When research and development ate half your cash flow you needed to be big — at least in the 1960s and 1970s.

Then along came upstarts like Microsoft and Intel, who proved you could take one part of the IT pie, dominate it completely and then grow it into a global business. Microsoft bet on software, which IBM thought a costly nuisance, and became a company of multimillionaires. Intel bet on microprocessors — the chips at the heart of computers — and became a household name with more than 90% of the market.

And so for the last 30 or so years specialisation has been the dominant trend in IT — and the key to making huge profits. When every component in a system can be made cheaper and better by a specialist, then a generalist has less and less reason to exist. This trend nearly killed IBM, until it was reborn as an IT services company — the last refuge of the generalists for the past three decades.

But since the turn of the century a shift has been slowly building momentum. It started with Google and its incredible rise to prominence. Google might appear to be the archetypal software company, but it owns dozens of hardware patents for things like making servers cheaper and easier to maintain.

Since Google owns and maintains (literally) millions of servers, these sorts of patents make it extremely difficult to compete with. If each search costs it, say, half as much as it costs its competitors, then it should dominate the market — which it does.

And Google is also increasingly a player in the networking space. In 2008, when the US government auctioned off the sought-after 700MHz band of radio spectrum, Google was one of the bidders. Why? Because that band is perfect for providing wireless broadband at low cost.

Playing outside its sandpit
And Google isn’t the only one playing outside of its sandpit. Two weeks ago a rumour circulated that Facebook was switching its servers from Intel’s dominant x86 architecture to ARM — the energy-efficient chips often used in cellphones.

The rumour turned out to be false, but it illustrates the influence that a single online social network could have on the entire microprocessor market. Facebook has well over 60 000 servers and, considering it just signed up its 500-millionth user, we can assume it will need a lot more in the future. A move from x86 to ARM would have cost Intel literally billions in revenue.

In Facebook’s case this influence was relatively indirect, but more and more software companies are buying into hardware, or vice versa. In January this year Sun Microsystems, a venerable but ailing hardware company, was acquired by Oracle, a database software giant. In August Intel agreed to buy McAfee, a leading anti-virus software company.

Why the shift? Much of it is about the need for control. It’s becoming increasingly important for hardware, software and networking to be tightly integrated. This means a move away from specialisation and back to the vertically integrated generalists of the 1970s.

The ultimate example of a generalist made good? Apple. It struggled through decades of competition with nimbler specialists but is now reaping the benefits of its integrated approach.

Essentially the neat divide between hardware and software is blurring. The internet is quickly making utility computing a reality, one that is likely to dominate for at least the next three decades.

In that reality the distinction between hardware and software is irrelevant to consumers — like the distinction between alternating and direct current. They just want the technology to work — they don’t care how.

A decade from now questions like “Which operating system do you use?” may be met with blank looks. That seems sad to geeks like me, but it’s a sign that our industry is maturing — a natural and positive progression. After all, I have no idea how my car works, but I still enjoy driving it.