Oct 31, 2009 (03:10 AM EDT)
Is The Desktop PC Doomed?
Read the Original Article at InformationWeek
Well, is it? If all the buzzwords about cloud computing, Web apps, software as a service (SaaS), the free desktop, and the rise of netbooks and tablet PCs are to be believed, it may well be.
The truth, I suspect, is more modest. What looms for the desktop as we know it is not the end, but rather death and then transfiguration. The desktop PC is set to go through a grand mutation of the form that will expand its reach rather than kill its current incarnation(s) outright. It may be the end of the desktop as we know it, but also the birth of many more desktops. As Roger Ebert said about the James Bond archetype during the post-Soviet era, the sun hasn't set on England yet -- but it's getting mighty dark out.
This will be a survey of the desktop as we have come to know it lately, and as it changes, evolves -- and, quite possibly, dies and reincarnates in new and unexpected forms.
The Balkanization Of The Desktop
In his blog post about Windows 7 RTM, Alexander Wolfe made a comment that can be seen as either a lament or exultation:
"Face it, we're at the end of the beginning of the Internet era. And old-fashioned, desktop OSes won't be all that relevant in a world of Web-hosted applications (aka SaaS) running on Netbooks or Apple iTouch Tablets."
The question is: How much of what we all do will be like that? It's easy to forget that there are still plenty of people whose work is grounded in the non-Web world, who don't use netbooks, and who still think of fruit when they hear the word Macintosh. So we have to ask: whose last great desktop is this?
We can draw a distinction, or, rather, trace a divide. On one side we have users who are and will remain desktop people. Most of them could be described as the casual Microsoft Office crowd, those who use the desktop because it's what they're used to, and what they get the best results from. A fair number are people who cannot get their work done any other way except on the desktop -- e.g., graphic designers, engineers, folks who live in Photoshop, AutoCAD, Quark Xpress or Maya. Those high-end desktop folks aren't ditching those apps for the Web anytime soon, if ever.
On the other side are people who can, or have, made the jump to the "new desktop" of the Web or the mini-device. Some of them are people who work in managed environments -- i.e., they have had this sea change in the way they work imposed on them. Others got their feet wet in such a world; they see the "heavy desktop" as being an option, not a necessity.
What's emerging, then, is not so much a different kind of desktop, but different classes of user -- and different ways to satisfy those various classes of user. Those of us who need a full-blown desktop will have one; those of us who don't now have a range of choices.
Windows 7: The Last Great Microsoft Desktop?
In his blog post, Wolfe went on to say this about Windows 7: "I believe this will be the last great client-side OS launch we'll ever see."
Client-side is a slightly broader term than desktop, but I think Alex chose that word for a reason: all desktops are clients, but not all clients are desktops.
Most of the discussion about Windows 7 should be familiar by now, especially its lower system-resource requirements. Because Windows 7 was designed to run on a slightly broader range of devices (mini-book to maxi-desktop), it's that much less susceptible to being summarily ignored by the New Desktop folks. This applies even if we ignore the fact that what constitutes a low-end machine today is far more powerful than anything you could get four years ago.
As a result, Windows 7 should be adopted by many New Desktop people, even if only provisionally. The larger question of whether they'll stay with it -- either because of the long-term demise of Windows as a client OS or because of their own needs -- has no answer yet, and deserves close scrutiny.
If a good percentage of them stay with it, up until and through the acquisition of their next device -- be it netbook, notebook, full PC, tablet PC, super smartphone, or what have you -- then Windows may well have bought itself at least another generation of life.
Linux: The Contender?
It's impossible to read a tech-related Web site today (InformationWeek included) without tripping over a piece about Linux being the future of the desktop. A lot of the noise is justified. Linux is a powerful operating system with a huge crop of quality software, a good deal of which may already be familiar to Windows users, and without the security or cost issues that have long plagued Windows.
Truth is, Linux simply hasn't commanded the kind of desktop share that would make it a contender to either the Mac or Windows. Any number of possible reasons have been tossed around. For instance, it's been claimed that PC manufacturers don't offer Linux as an option. Many do; but most people still choose Windows because it's familiar and more immediately useful to them.
Perhaps the biggest reason Linux hasn't eclipsed Windows as a desktop of choice is simple: free and secure, amazingly, isn't enough of an enticement. Most people buy Windows with a PC, so it's effectively free to them. Vista and Windows 7 have made massive improvements over XP's security. And the improvements continue.
Evangelizing for Linux doesn't help much either. In short, the benefits of Linux as explained by its advocates just aren't tangible to most people.
The same could be said of most other free OS offerings currently in the running. Example: Haiku OS, which shows a great deal of promise despite being in its very early stages. But, again, unless a platform offers something substantially better than Windows (the way Mac OS X does) it may also end up as little more than a curiosity with a fan club.
So where will Linux make its inroads? There are several places it already has. Its small but fiercely devoted circle of users continues to keep it alive as a desktop system, and it has found a place in managed environments where Windows / Mac compatibility isn't essential.
Mac OS X: Another Destination?
Why is it that most talk of "the death of the desktop" leaves the Mac out of the picture? It's because the Mac, as far as the desktop goes, is its own animal. It's not just a desktop, it's a whole ecosystem -- you get the hardware, the OS, the whole deal, all designed from stem to stern by Apple to "just work." In that sense it isn't wholly different from, say, a server OS that's only bundled with specific hardware configurations to ensure maximum compatibility and uptime.
The hardware, in particular, is what makes the Mac shine -- and not just metaphorically, either; those brushed-aluminum cases are real eye-catchers. But you have to buy a Mac to run OS X -- unless you use one of several Mac hacks that violate the Apple license agreement.
Those who buy into the Mac ecosystem, however, see it as a way to ensure a degree of quality: they're getting an excellent OS (and OS X is an outstanding piece of work), top-notch hardware, relative freedom from security issues, and world-class support. What's not to like?
The price, for one. Switching to the Mac commands a cost premium -- not just for buying the hardware, but also for replacing all of one's existing software with Mac-native versions. The other option is to buy Parallels Desktop and a Windows license, and run Windows applications in a virtual machine (which many Mac users in fact do). The price then encompasses more than just buying the OS, the hardware, or the applications alone. It's all of those things.
Macs do have a presence in the workplace, but generally only in environments where the existing workforce demands it -- e.g., creative teams who work with graphics and multimedia files. This is probably not due to a lack of deployment/central-management tools for the Mac, such as OS X Server's own remote-installation suite, or third-party tools such as DeployStudio or Radmind. Most IT departments are still PC-centric by fiat from the top down, with Mac support being the exception rather than the rule. That, plus the price premium for Macs, means they are generally acquired only for the people who require (or demand) them.
Apple is not likely to produce hardware that eats into the low-end computing market the way netbooks have. It was only one year ago that CEO Steve Jobs proclaimed in a teleconference with analysts, "We don't know how to make a $500 computer that's not a piece of junk."
In theory, the iPhone could count as such a device -- it's not as expensive as a full-blown system, and it makes a fair degree of productivity possible on its own. But you can't use the iPhone without its data plan, which right now comes courtesy of only one provider in the United States: AT&T.
Even if for most people Apple remains the iPod and iPhone company, programmers and Web coders are now gravitating towards the Mac as a hassle-free work environment. The former group especially like it thanks to its Unix/BSD underpinnings, which they can expose or hide with equal ease. If those folks -- who typically serve as platform evangelists to their less technical friends -- were given the freedom to introduce Mac OS X to users on their own PCs, the Mac might experience an even greater explosion of market share.
But this rests entirely with Apple, which has shown itself to be fiercely protective of its ecosystem. So much so, that Apple could lose out to the Next Big Thing: the Web as the desktop.
The Web: The New Desktop Killer?
The Web -- and the web browser -- have been slowly eating into and replacing some of the basic functions of the desktop. Why run a desktop note-taking application when there's Remember The Milk? Why use a word processor when there's Google Docs, which has document sharing and collaborative editing built right in? Heck, there are full office suites on the Web, vying to compete with standard desktop-bound apps.
But that incursion can only go so far. The frameworks, the languages, the APIs, the metaphors of the Web -- all of those can reproduce some of the behaviors of the desktop, but with significant performance costs piled on top, and with many corner cases and side complications.
The problem with using the browser as an application framework is the browser. No two browsers implement the same standards the same way -- and that's by design, since those variations are part of how they compete with each other in the first place.
There's also the issue of what can be defined as an "application," and what applications are used for. As relevant and even powerful as something like Facebook is, it's as far removed from the capacity of a desktop application as a bicycle is from a moving van. Granted, its needs are not in the same league -- but the more Web apps are pushed to be as powerful, as rich, and as behaviorally complex as desktop apps, the more difficult it becomes to sustain such a level of work with the current toolset.
The browser-as-word-processor is a classic example: applications such as Google Docs are at the mercy of how the browser implements elements, such as editing in rich text fields. It's a little like driving a car that turns out to be powered by a wound-up rubber band.
The only thing that would come remotely close to addressing the problem is a total top-down redesign of the Web's front-end. HTML 5 is not going to fix this; it's a Band-Aid for a broken leg. A genuinely stateful network protocol would help. Ditto a new way to render information in the browser, much as Display PostScript or NeWS were posited as replacements for the X Window System.
Such goals may not even be possible to achieve communally -- as with the original Web, they might only be possible when someone creates a prototype, releases the implementation as an open spec, and has everyone else follow suit.
The Web can replace some desktop applications and create new kinds of apps that weren't there before. No one disputes this. What's debatable, and should be debated, is the idea that the Web can replace the desktop. Individual apps that are easy enough to implement in a browser are one thing. Reproducing the desktop in all of its statefulness and context sensitivity -- the real ambition of the web-as-desktop -- that's another mission entirely.
Virtualization: The Desktop On Demand?
If the Web isn't turning into the desktop, then the desktop itself may simply become a plastic commodity -- something to be invoked, destroyed, recreated and cloned on demand through a thin client. This is not theory; it's been happening for a long time, and is one of the key ways the desktop is being commoditized in workplaces. Citrix XenDesktop is one such approach. Also worth mentioning is the nComputing approach, which allows one physical device to virtualize its computing resources for a whole slew of thin clients at potentially great savings.
Most of how virtualization has affected the desktop remains confined to managed environments -- a classroom, an office, a hospital. End users don't feel it much. But a few things have appeared that may change this: the presence of in-home backup/media servers, for instance. With relatively little work it becomes possible to turn such a machine into a system-virtualization server, so those who don't need a brand-new, beefy PC can virtualize a desktop from the home server onto existing hardware.
Mobile Devices: The Other, Other Desktop?
What's really interesting about how the desktop is changing is how the hardware that appears in the mobile space is becoming remarkably powerful. The gain in power is so great that the key difference between mobile devices and low-end desktops isn't computing power or storage, but rather form factors and ease of use.
The latter and the former are interrelated: a keyboard is only useful when it's large enough to type on comfortably. That's why most of us don't use a cellphone-sized QWERTY keyboard to type reports. We use them instead to fire off short e-mails, SMS blasts, tweets and IMs. Combined, all those forms may well constitute a good chunk of people's productivity with a keyboard these days.
To that end, the mobile device is becoming a desktop of its own -- one where a significant subset of people's daily work can be accomplished. But it usually also works as an adjunct to a main desktop -- either a full-blown PC, or a Web "desktop" like a Gmail account. Rather than a standalone desktop, then, it might be better to think of it as a piece of the desktop that can be broken off, reattached at will, and combined with Web functionality.
How this will develop depends on how effectively useful interfaces can be packed into small devices. A full-blown desktop interface on a device barely bigger than one's hand doesn't make much sense, which is why Intel and Google have both been putting a good deal of research into their Moblin and Android OS interfaces, respectively. Of the two, Moblin shows the more radical UI work and hints very strongly at how the future desktop can be not only portable but truly useful on the go, and not just a glorified task-list or Twitter client.
What looks like the end of the desktop is only the end of one phase of the desktop, and the beginning of a whole slew of new desktops. What's also become clear is that "the desktop" was never just "the desktop" before, either -- it was always many things serving many different people's needs. The new desktops, plural, are preparing to satisfy those needs in ways the old, singular desktop couldn't.