What's Holding Up Utility Computing?
While vendors and integrators have long touted the virtues of utility computing, CIOs and end-user companies have taken a cautious approach to a full-scale IT utility. Some have rolled out streamlined computing operations through server consolidation and virtualization; others have relegated utility computing to the back burner.
Some progress can be found, however. In our November 2005 survey of more than 200 IT and business executives at large and midsize companies, 59% said they are planning for some type of utility environment this year, with proof-of-concept projects to be completed in 2007. A majority expects to at least partially complete utility deployment by 2010. (See chart, below.)
For the most part, these are small, incremental projects that won't require a lot of human and financial resources. This doesn't discount the potential impact of small-scale deployments, which can result in immediate savings and help users gain experience.
For example, through consolidation, virtualization, and workload management, a user with 50 Microsoft servers averaging 20% utilization could drop to just five servers while reducing systems-management and operational staffing.
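The arithmetic behind a consolidation like this can be sketched as follows. The 2.5× capacity ratio and 80% utilization target for the new hosts are illustrative assumptions, not figures from the survey:

```python
import math

# 50 existing servers averaging 20% utilization carry the work of
# 50 * 0.20 = 10 fully utilized servers of the same class.
old_servers, old_util = 50, 0.20
load = old_servers * old_util  # server-equivalents of actual work

# Assumption: each consolidated host has 2.5x the capacity of the old
# machines and is run at an 80% utilization target.
capacity_ratio, target_util = 2.5, 0.80

new_servers = math.ceil(load / (capacity_ratio * target_util))
print(new_servers)  # 5
```

The point of the exercise is that the server count is driven by actual load (10 server-equivalents here), not by the number of boxes originally provisioned for peak demand.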
Two important considerations are behind survey participants' move to utility computing. First, nearly all regard its development and deployment as a strategic advantage. Second, they plan to implement utility environments through a series of targeted IT or business services. For example, a $200 billion financial-services firm is creating a utility environment for all its E-mail resources, services, and systems. We also found other financial-services companies, as well as telecommunications and health-care providers, developing IT utility environments for customer-facing print-service resources. The goal is to streamline development, production, privacy, and security for account information in printed customer communications. Such services tend to be managed centrally with a homogeneous set of resources—Windows on Intel server platforms—making them likely candidates for a utility environment.
Although small-scale deployments can deliver immediate short-term business benefits, they can also inflict additional management headaches and costs over the long haul. The best way to avoid these unintended consequences is to first identify inhibitors in the organization, then deploy services incrementally.
We've identified eight sets of inhibitors to utility-computing deployments in most large and midsize organizations. The first is the need to control resources. All business units and IT organizations must understand that utility-computing applications will operate in a shared-resource environment. Departments have to be willing to cede control and management to the utility-operations group. But how do you get them to cooperate?
Creating an internal selling and PR program is a simple and effective way to get executives on board with the plan, according to the corporate CFO for IT at one of the largest electronics component manufacturers in the world, with more than 20 IT resource and data centers. This may include an analysis of the redundancies and costs that could be eliminated or minimized by sharing resources. Presentations and other promotional materials should help sell the utility-computing concept.
A second inhibitor is the need for up-front planning. Deploying utility computing is a strategic investment. But many executives, faced with budget pressures, focus primarily on tactical business and IT decisions. Instead, they need to identify long-term, strategic benefits as well as the potential for short-term ROI.
Advance planning should entail building an inventory of all applications that will run within the utility. Afterward, document all characteristics in detail—including the CPU, network, and storage resources required for allocation; the applications, middleware, and operating systems needed for provisioning; all data requirements; and all operational characteristics and policies, such as availability, batch window, billing, reporting, and response time—so the organization can adjust its resources dynamically.
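One way to capture such an inventory is a structured record per application. The field names below are illustrative, not a vendor schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AppProfile:
    """Characteristics of one application to be run inside the utility."""
    name: str
    cpu_cores: int                    # CPU to allocate
    network_mbps: int                 # network bandwidth to allocate
    storage_gb: int                   # storage to allocate
    os: str                           # operating system to provision
    middleware: List[str] = field(default_factory=list)
    availability_pct: float = 99.9    # operational policy: availability SLA
    batch_window: str = ""            # operational policy: batch window
    chargeback_code: str = ""         # for billing and reporting

inventory = [
    AppProfile("email", 16, 1000, 2000, "Windows", ["Exchange"], 99.95),
    AppProfile("print-service", 4, 100, 500, "Windows"),
]

# Aggregate requirements tell the utility how much capacity to pool.
total_cores = sum(a.cpu_cores for a in inventory)
print(total_cores)  # 20
```

Having every application's resource and policy requirements in one machine-readable place is what lets the utility adjust allocations dynamically rather than by manual audit.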
Many enterprises have a mishmash of IT environments—Windows, Unix, Linux, and mainframes (IBM's MVS)—and not all utility-software vendors can support all environments. This is the third obstacle, and it could limit the number of applications deployed in the near term to one or two supported environments. Most vendors say they intend to support all common environments in the future. But standards for a truly heterogeneous environment won't be available until 2008. Companies should start deploying a utility in a homogeneous IT environment such as Windows or Linux on Intel server platforms, then extend it to other environments as technologies mature.
The fourth inhibitor is the presence of multiple systems-management frameworks, such as Computer Associates Unicenter, Hewlett-Packard OpenView, IBM Tivoli, and Microsoft SMS/WLM. Many organizations have invested significantly in one or more of these, and the inability to manage a utility across multiple frameworks could seriously limit deployment. To start, enterprises should create a utility environment within each of their installed frameworks. Later, when so-called manager-of-managers technologies become more widely available and mature, organizations can merge the frameworks.
Preserving existing IT investments counts as the fifth inhibitor. Executives are wary of bringing their environments up to current levels just to deploy them in a utility. Clearly, this will increase the up-front cost. Organizations must determine the compatibility of all components they plan to deploy within the utility—servers, operating and storage systems, middleware, and independent software vendor applications—before they invest in new systems.
Another inhibitor involves budgeting, appropriation, and vendor disruption. Many businesses have a budget model that assumes a predetermined cost by month, quarter, or year. IT budgets and chargeback systems will likely have to adapt to the dynamic usage of resources within a utility. Old methods of capacity planning and costing will become obsolete within the utility—not only for the utility infrastructure as a whole, but also for applications' resource-usage charges.
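A usage-based chargeback replaces the fixed periodic fee with metered rates. The rates and meters below are hypothetical:

```python
# Hypothetical metered rates; real rates would come from the utility's
# cost model, not a fixed annual budget line.
RATES = {"cpu_hours": 0.12, "storage_gb_months": 0.05, "network_gb": 0.02}

def monthly_charge(usage: dict) -> float:
    """Bill a department for the resources it actually consumed."""
    return round(sum(units * RATES[meter] for meter, units in usage.items()), 2)

marketing = {"cpu_hours": 1200, "storage_gb_months": 500, "network_gb": 300}
print(monthly_charge(marketing))  # 175.0
```

Under this model a department's bill varies month to month with actual consumption, which is exactly the behavior that breaks budget plans built around a predetermined cost per period.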
Meanwhile, a utility-computing environment will tax IT vendors' ability to serve a dynamic market given to the kinds of peak and off-peak cycles seen in the energy and telecom industries. Many hardware vendors offer a capacity-on-demand option, whereby additional resources are billed to the customer as required. However, the dynamic capability of a utility will be especially disruptive to software vendors with pricing models based on the total capacity of the system to which their product is licensed. Some software vendors are beginning to respond to customer demand for pay-as-you-go licenses, and this will likely increase utility adoption.
Limited employee skills are another inhibitor. Many organizations, especially those that haven't deployed a systems-management framework, may lack the expertise needed to deploy a utility environment. Even if the deployment plan calls for third-party consulting or outsourcing, in-house training is a must, especially when it comes to developing and managing related policies and service-level agreements—tasks involving both IT and the business units.
The final inhibitor to utility computing is ROI. Many organizations require a minimum project ROI within a specified time—for example, a 15% return within 24 months. Unfortunately for CIOs, the deployment of utility computing is a strategic investment that commands high up-front costs for planning, training, software, and hardware setup, and it requires a long-term outlook toward realizing a substantial return.
It generally takes at least two years for utility computing to yield a positive ROI. But what if your company or business unit requires every IT investment to achieve payback within 12 months? At one U.S. consumer-brokerage firm, IT is complying by segmenting its deployment plan into measurable, process-specific chunks that can satisfy the rule when treated individually.
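The brokerage's approach can be sketched numerically: each project chunk must individually pay back within 12 months, even though the utility as a whole takes longer. All dollar figures are hypothetical:

```python
def payback_months(upfront_cost: float, monthly_saving: float) -> int:
    """Months of cumulative savings needed to cover the up-front cost."""
    if monthly_saving <= 0:
        raise ValueError("monthly_saving must be positive")
    months, cumulative = 0, 0.0
    while cumulative < upfront_cost:
        months += 1
        cumulative += monthly_saving
    return months

# Full utility: $240,000 up front, $12,000/month saved -> 20 months,
# which fails a 12-month payback rule.
print(payback_months(240_000, 12_000))  # 20

# Segmented: a $60,000 server-consolidation chunk saving $6,000/month
# pays back in 10 months and passes on its own.
print(payback_months(60_000, 6_000))    # 10
```

Segmenting works because each chunk carries only its own up-front cost against its own savings, so early projects can clear the payback hurdle while later, slower-returning pieces are deferred.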
After identifying the obstacles to utility computing, organizations can embark on an action plan. Full deployment assumes the availability of a complex set of functions in a mature, operational environment. Standards for communication among these interrelated functions, particularly in a heterogeneous environment, are also required.
Businesses looking to start a project in the first quarter of this year should identify a small subset of the enterprise IT environment that is homogeneous. This reduces the planning period for the first phase of deployment, since fewer applications, storage/data resources, and technologies will have to be analyzed and accommodated.
Determine readiness by asking:
- Has a server-consolidation project been completed?
- Has a chargeback system been implemented?
- Has a systems-management framework been deployed?
- Has network-attached storage or a storage-area network been installed?
Being able to answer "yes" to each of these questions will go a long way in preparing for utility computing.
The key success factor is a conservative approach that allows for incremental changes over time. In fact, most companies that use ROI or other financial metrics to justify deploying an IT utility environment also take an incremental, project-based approach to planning and executing deployments. Any other means usually leads to significant business disruption and increased costs. Organizations that employ a conservative, "eat the elephant one bite at a time" strategy for utility computing will minimize up-front costs, improve the chances for a positive ROI in the near term, and be able to take advantage of more complex functions as their utility environments mature in the latter half of the decade.
Jim Cassell is a senior program director at Saugatuck Technology focusing on enterprise-systems technologies and management. Bruce Guptill is a managing director at Saugatuck Technology, responsible for research strategy and operations.
Has your organization embarked on a utility-computing strategy? Tell us about it at [email protected].
See Related Articles:
Real Time Means Real Change, August 2004
Square Off: Is Pay-As-You-Go Computing Viable?, May 2004
Productivity Gains: Quantity Plus Quality, February 2004
Some organizations already see benefits from utility computing:
- Increased utilization of resources. No longer must IT resources be based on potential usage for an application—the utility system grabs resources from the pool as needed. For example, beginning in 2002, a multinational beverage and restaurant-services company with more than 40 data centers worldwide used a phased, incremental IT utility deployment to improve its IT availability. The organization reduced costs by better managing its resource allocation based on on-demand usage.
- Reduced staffing for systems management and operations. The utility automates many functions performed by the IT staff. Additional savings can be realized if hardware and software vendors offer capacity-on-demand, pay-as-you-go, or pay-per-use purchase and/or licensing options. One of the world's largest commercial and consumer insurance carriers cut IT, business-process staffing, and operations costs by deploying an internal IT utility environment. This, in turn, allowed more cost-effective, better-targeted outsourcing of IT and business processes, resulting in reduced overall business risks.
- Improved customer service. By dynamically adjusting resources to meet service-level agreements, companies can improve response times for customers. The multinational beverage producer cited above improved customer satisfaction by providing faster, better, and less-expensive customer-account support. Frequently, CRM solutions are deployed as "front ends" to IT utility environments to give companies and their customers better access to information, reduce miscommunication, and streamline sales and support functions based on more-transparent data flow.
- Better agility and flexibility. Utility computing helps companies meet emerging business requirements, such as the need to deploy a new application or make significant changes to a current one.
Rushing headlong into a utility-computing project could lead to the very headaches and costs you were trying to avoid. To play it safe, follow these deployment steps.
Month 1 > Establish goals
- Spell out goals. What do you plan to accomplish? What functions, processes, lines of businesses, or other business operations will be targeted?
- Identify obstacles.
- Establish an IT utility working group. Include executives from IT, finance, business operations, customer/supplier relations, and core IT vendors.
Month 2 > Identify targets and build the plan
- Map out investment ideas and targeted business areas to a manageable and affordable set of projects.
- Look for the most repeatable improvements, based on a common set of technology and process investments.
- Draft a realistic utility-computing framework.
- Focus plans on building blocks that enable streamlining and integration of technologies, processes, and systems.
Month 3 > Articulate, promote, and refine
- Draft a simple, yet comprehensive, document to articulate business advantages through utility investments.
- Review your plan with executives and managers in the selected target business and IT areas.
- Answer all questions; then incorporate questions and answers into a refined utility-computing plan document.
- Repeat the review process as necessary.