Nov 13, 2013 (03:11 AM EST)
Server Innovation Moves Slowly
Download the entire November 2013 InformationWeek special issue, distributed in an all-digital format (registration required).
A reading of the headlines of the day, once you get past the iPhone 5s and iPad Air stories, will leave you certain that IT shops everywhere are storing and processing big data. They're learning the intricacies of their customers and how they relate to products and services by analyzing everything from clickstream data to Twitter feeds and matching it against sales and inventory data. If they aren't doing that, then surely they're instrumenting every machine they have in a bid to capitalize on emerging Internet of things technology to improve manufacturing or predict part failures before they happen, right?
If not big data analytics or the Internet of things, then surely the business du jour is employing complex computational models for financial trading, analyzing drug interactions, or understanding at a microscopic level the forces on a bridge or on the carbon fiber components of a new aircraft.
Yes, it's one of those rare times in history when new technology meets new applications in ways that promise to reshape the very nature of business, for the benefit of customer and bottom line alike.
And then there's the reality we see in our data about servers, the workhorses that would have to be doing all this exotic computing.
The data paints a picture as boring as a Wagnerian opera, and it takes just about as long to get to the point. Year after year, it's a struggle to find a trend around server buying or use that amounts to anything other than doing just a bit more than last year -- or often exactly the same as last year. To look for longer-term trends, therefore, we've examined our statistics over a four-year period to get a better idea of where the market is going. More than that, we're going to try to square the darlings of the day -- big data, the Internet of things, and advanced modeling and analytics -- with the reality of companies' IT technology adoption practices.
The Big Server Shrink Winds Down
Our four-year data shows a steady increase in the number of companies simply buying servers on a standard replacement cycle -- neither looking to lower server counts through consolidation nor ramping up server capacity to tackle new technologies or meet new business initiatives.
In 2010, 42 percent of companies were cutting servers via consolidation while just 26 percent were holding server counts constant. Three years later, just 29 percent are consolidating and 35 percent are holding steady -- indicating that companies have largely put server virtualization in place and realized its gains. Meanwhile, about 10 percent to 15 percent of respondents are looking to drive server counts up. These, we assume, are the early adopters of technologies like big data, advanced analytics and the Internet of things -- technologies that will absolutely require more servers.
One reason most IT organizations don't need to add new servers is that core counts -- and, more importantly, typical memory configurations -- have gone up steadily since 2010. Particularly as CPUs have become more capable by adding cores and supporting virtualization in hardware, server buyers have told us that memory configuration is of primary importance, with more always being better.