Read the Original Article at http://www.informationweek.com/news/showArticle.jhtml?articleID=232901077
Apple's OS X operating system has long been considered the "safer" approach to desktop computing. Fuzzy math is sometimes applied to imply that Windows is less secure only because its larger market share makes it a bigger target, but architectural considerations have long made OS X and Unix/Linux less vulnerable to attack.
But as every security-savvy IT pro knows, "safer" isn't "totally safe." So it's not surprising that anti-virus vendors are starting to coddle us with "total security" for the Mac. And as Apple moves forward with App Store plans for the Mac, it won't be surprising to see Apple cite Flashback malware as a reason for exerting strong-arm control over Mac apps.
As we all know, security typically comes with a tradeoff. You're giving up something, whether it's convenience or money. Neither of the approaches cited above will eliminate risk, so the question is, are their tradeoffs worth the gains?
Virus protection is debatable, but in the case of the walled garden, I say "No" for three reasons: It won't make us safer, it takes away our organizations' options, and it will cost us more. These are the same reasons you don't wear a bulletproof vest if you aren't in law enforcement or a war zone: For the most part, it won't make you safer, it takes away some wardrobe options, and it will cost you more.
Mac users don't live in a war zone and, eventually, Windows users won't either, because of improvements in architecture that emulate Linux and Mac.
Flashback is clearly the biggest Mac botnet of all time, having infected somewhere in the neighborhood of 500,000 desktops. Yet that's still a small fraction of comparable PC botnets: Conficker, for example, is estimated to have infected about 12 million PCs, more than 20 times as many. PC virus infections also dwarf Mac ones.
Why? The popular answer is "market share." That's nonsense. It's all about computing paradigm. Windows programs ran for so long with administrator rights by default that writing viruses for them was extremely easy. OS X and Linux have always used a paradigm whereby programs do NOT get administrative access by default; exceptions are made only through explicit authorization.
Zero-day hacks that take advantage of browser bugs exist on all operating systems, but that administrator-by-default model is a major, major problem. Current versions of Windows have something called User Account Control (UAC) that finally does what Linux and OS X have been doing all along, but the caveat is that some enterprise administrators, to support poorly written Windows programs, have had to turn UAC off.
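The difference between the two paradigms can be sketched as a toy model. This is illustrative Python with made-up names, not any real OS API: under admin-by-default, a system-level action always succeeds; under least privilege, it fails until the user explicitly authorizes elevation (the sudo or UAC prompt).

```python
class Session:
    """Toy model of an OS login session (illustrative only, not a real API)."""

    def __init__(self, admin_by_default):
        # Old Windows model: the session starts with admin rights.
        # OS X / Linux (and UAC) model: it starts unprivileged.
        self.elevated = admin_by_default

    def authorize(self):
        """Explicit elevation -- the sudo/UAC prompt step."""
        self.elevated = True

    def install_system_software(self):
        """A system-level action: succeeds only in an elevated session."""
        if not self.elevated:
            raise PermissionError("system change refused: not authorized")
        return "installed"


legacy = Session(admin_by_default=True)
print(legacy.install_system_software())   # malware needs no prompt at all

modern = Session(admin_by_default=False)
try:
    modern.install_system_software()
except PermissionError as err:
    print(err)                            # blocked until the user consents

modern.authorize()                        # the user types a password...
print(modern.install_system_software())   # ...and anything goes
```

Note the last two lines: once the user explicitly authorizes elevation, the model protects nothing, which is why social engineering like fake anti-virus works on any platform.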
Point is, as the Mac increases in market share, it will indeed be the subject of more attacks, but the basic architecture is the right one. Windows is there in theory, but until app developers catch up, the "everyone's an admin" problem will remain.
Better browser and OS design are what's needed to protect against zero-day attacks, NOT more endpoint protection that slows down your machine and doesn't work anyway--or at least, not when it really matters. AV vendors don't and can't react quickly enough to the latest exploits, and that sandbox stuff that they do ought to be in the OS and browser.
There's never going to be 100% security. And the big deal nowadays is really computer user gullibility and a lack of corporate security awareness programs. Fake anti-virus, anyone? I don't care if you have UAC turned on--if you agree to install fake AV and type in your password to allow it to do system-level things to your machine, you're in deep guacamole no matter what platform you're on: OS X, Windows, or Linux.
And whether you hope to foil intruders via browser or OS design or a lockdown of the OS with the introduction of a walled garden and trusted apps, writing bug-free code is just about impossible. (Note that Chrome's sandbox protection keeps showing that it's difficult, but not impossible, for intruders to exploit bugs and gain the type of access required for malware.)
So when Apple and other manufacturers ask us to accept a walled garden on our desktop computers, know that it won't make us safer.
As far as taking away our options, I've known organizations that simply could not write iOS apps because their legal departments wouldn't agree to Apple's terms and conditions. The notion of having to jailbreak a computer that your company buys in order to write custom apps is simply ludicrous. Think Microsoft won't go there? Fat chance. If Apple leads, and makes money doing it, Microsoft will follow.
One way that scenario will cost us more is the inability to repurpose our desktop computers using other lightweight operating systems as business needs change (for example, using a thin client OS once a desktop can't meet modern requirements). If the walled garden takes hold on Microsoft's and Apple's platforms, we can expect "trusted boot" environments that won't let us repurpose old hardware. We'll have to buy new.
We now take for granted our ability to do what we want with the hardware we buy. I worry that the walled garden paradigm will take away our control over hardware and operating systems and put it in the hands of a third party. We need to let desktop computer makers and their sales reps know that if there's a walled garden in our future, there must be a door to that garden that we control, not them.