TechWeb

Gartner Cites Six Best Practices In Virtualizing Servers

Apr 23, 2008 (03:04 PM EDT)

Read the Original Article at http://www.informationweek.com/news/showArticle.jhtml?articleID=207401685


The biggest change underway in the data center is virtualization, said two Gartner IT analysts this week.

Gartner analysts have had "a thousand conversations" with clients on the subject, and as a result, they have come up with a list of six best practices for server virtualization.

  1. Consultants may recommend large-scale server virtualization. Gartner said to start small and achieve the concrete server consolidation you're seeking in a first phase. "The second phase is more strategically important, more complex to implement and provides far more value for the customer," say Thomas Bittman and John Enck in comments accompanying the list. The second phase concentrates on the flexible allocation of resources to respond to business demand, such as starting up more virtual machines or assigning more resources to a virtual machine with a priority task. "In this phase, the focus shifts to delivering new services or improving the quality and speed of service," they wrote. In other words, start small, but think big, they say.

  2. Their second point is "Require a rapid return on investment." The IT organization "needs to build a business case with a rapid return on investment" because the virtualization market is evolving rapidly, Bittman and Enck write. They recommend aiming for an ROI within six months or less, though they don't say whether they expect prices in that fast-moving market to rise or fall. Their benchmark: deploying a minimum of 50 virtual machines within a year will recoup the investment in virtualization software. (A rough break-even sketch along these lines appears after this list.)

  3. "Virtualize the right applications," they urged. Existing applications that utilize most of the hardware resources on which they sit "are not going to generate savings" by migrating them into virtual machines, they warned. Applications with high input/output traffic may become inefficient on virtual machines, because in most cases multiple virtual machines are sharing the limited I/O capacity of one piece of hardware. "Older, smaller packaged applications" make the best target for initial virtualization. The majority of such applications being moved into virtual machines are none the less deployed in production "in mission critical roles."

  4. Define a storage strategy to go with your server virtualization plans, they noted. The images of the virtual machines on which your organization depends need to be accessible at all times. If they are stored on disks directly attached to one server, the failure of that server or its storage will make the images inaccessible. Putting them on a central storage system gives the enterprise "the flexibility to access virtual images from any server connected to the storage system," they pointed out. (The placement sketch after this list illustrates the difference.)

  5. "Understand software licensing issues," they advised. Virtualization has appeared on the scene before independent software vendors have thought through how they want to charge for their products, once they're used in virtual machines. Keeping prices low would encourage usage in the midst of a rush top virtualize. But it's tempting also to just collect the added license fees as what used to be one copy of the product becomes three or four or more in virtual machines. "Gartner predicts that software pricing and licensing will remain problematic for the near future," said Bittman and Enck. That means don't expect a price break just because you want to increase usage of an ISV's product. But virtualization remains the "the most important trend through 2012." As competition sharpens, ISVs may adjust their pricing to emerge as the big benefactors of the trend. "Until new pricing models are found, users should seek to understand ISV's pricing and licensing policies in as much detail as possible."

  6. "Combine virtual machines effectively." Server administrators will try to balance an application that has high I/O during the workday with one that utilizes I/O channels for backup at night. But Bittman and Enck recommend trying to construct a dynamic way to balance workloads rather than devising "a perfect static consolidation mapping." The workloads are inevitably going to go up and down and may not be a perfect match at all times, if virtual machines achieve the 75-80% server utilization rate that many data center managers now consider optimum. "Being able to deal with these changes dynamically is a key goal, particularly in the early stages of virtualization," they wrote. That means buying into the management tools that VMware, Virtual Iron, XenSource and other third parties, such as Akorri and Veeam, now offer.

The market for virtualization software will mature over the next five years, but "most enterprises can't afford to wait--server sprawl, data center space and power problems are here now," wrote Enck. Following these guidelines, many organizations will sidestep these problems and proceed to virtualize their servers, he predicted.

"Many of these problems [covered in the six guidelines] can be avoided if enterprises make the proper assessments before they virtualize their machines," he wrote.