Slowly but surely... it's coming
Today's enterprise networks are complicated. Servers, routers, bridges. Storage devices and storage networks. Cables, switches, controllers... and that's just the hardware. Computing vendors and end-users alike would like to make these sprawling networks a whole lot easier to run. One phrase sums up this activity: utility computing.
In an attempt to explain utility computing, many vendors and analysts use electricity or telephone analogies, likening a utility service approach to picking up the phone or switching on the lights: a simple action on the end-user's part, carried out at will. Behind the scenes, what's happening is enormously complex, but the end-user is shielded from that complexity.
But Sun's Tony Siress, senior director of Advanced Services, doesn't like the usual electrical grid analogy for utility computing. "Electricity is a bad example," he said at a recent expo. It's more, he said, like a taxi. "Taxi cabs are a good example of a fully outsourced piece of infrastructure, and they're the right approach in some situations. The trick is understanding the mix of approaches that delivers the highest value and the least amount of risk to you."
In its widest meaning, utility computing refers to the practice of automatically matching centralized resources to an application's demands, then charging the business unit for usage. There is no single utility computing tool to accomplish this, and in some camps, utility computing is more a philosophy or management approach than a software or hardware tool. HP's Mark Linesch, VP of Adaptive Enterprise Programs, put it as well as anybody: "It's not about a big new technology. It's about establishing a tighter, more dynamic link between the business and its IT infrastructure."
Utility computing is the idea of keeping it simple by buying software, hardware and infrastructure that make computer resources easier to use. In practice, people often use "utility computing" interchangeably with outsourcing computing resources. That's not accurate anymore, although companies like IBM (itself a successful utility computing host) tend to talk about it that way. New utility computing initiatives are aimed squarely at complex networks anywhere, whether they're a company's own IT operation or a service provider's network.
Lots of companies would be thrilled with a workable utility computing scheme in their own IT departments. They're willing to start small with items like charge-back mechanisms for billing their own departments for the networking resources they consume. There is a bit of internal conflict, though: the IT departments are looking for accurate charge-back mechanisms, but the business units aren't all that wild about it. Who wants to be responsible for big computing budgets? That's right: no one.
The Market Today
Utility computing is working now, though not in the comprehensive sense that end-users would like and that vendors are working towards. The comprehensive approach is held back by a number of factors, including the limited ability to auto-provision storage resources.
The hosted model also has some issues. Some service providers have done very well by managing their data centers as a utility, but it's not easy to do. Utility computing environments, whether internal or hosted, need things like:
* Automated costing procedures for computing resources, with costs automatically assigned by project or business unit, whoever is sucking up the IT budget.
* Automated provisioning to meet the business unit's scaled-up (or scaled-down) needs.
* A way to charge back without calling it that. Business units hate charge-backs because they want computing to come out of IT's budget, no matter how big a project the business unit is gunning for. But more CIOs are saying that's too darned bad, or words to that effect. (A rough sketch of what such a charge-back calculation might look like follows this list.)
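The costing and charge-back mechanics in the list above boil down to metering and billing. The following is a minimal sketch of that idea, assuming a flat rate card and invented names like UsageRecord and charge_back; it is an illustration, not any vendor's actual billing engine.

```python
# Minimal charge-back sketch: meter resource usage per business unit,
# then bill it back against an assumed rate card. All names and rates
# here are hypothetical, for illustration only.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class UsageRecord:
    business_unit: str
    resource: str       # e.g. "cpu_hours", "storage_gb_months"
    quantity: float

# Assumed rate card; a real utility model would derive these from actual costs.
RATES = {"cpu_hours": 0.12, "storage_gb_months": 0.05, "backup_gb": 0.02}

def charge_back(records):
    """Roll metered usage up into a per-business-unit bill."""
    bills = defaultdict(float)
    for rec in records:
        bills[rec.business_unit] += rec.quantity * RATES.get(rec.resource, 0.0)
    return dict(bills)

usage = [
    UsageRecord("marketing", "cpu_hours", 1200),
    UsageRecord("marketing", "storage_gb_months", 800),
    UsageRecord("finance", "cpu_hours", 300),
]
print(charge_back(usage))   # roughly {'marketing': 184.0, 'finance': 36.0}
```

The point of the sketch is the structure, not the numbers: once usage is metered per business unit, billing it back (or simply reporting it) is a small step.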
Hard or not, something has to give. Through no fault of its own, IT has focused on providing computing resources to different groups and projects. The demands vary radically from project to project and department to department, and IT is trying to make everybody happy. Recently, everything has grown so large that it's increasingly difficult to manage separate little worlds of information, rather like rowdy moons making ragged orbits around the central IT function.
There are separate servers for separate applications, separate storage, separate service levels. Each system needs its own protection: backup and recovery, provisioning, replication, whatever. It's easy to add more servers and storage to meet service levels during processing spikes, but it gets harder and harder to manage and plan.
Technology
Utility computing requires dynamic provisioning and self-managing systems. Dynamic provisioning works by tracking different applications' current and future needs. It meets current needs on the spot by preparing and offering storage immediately, and it suggests adding storage capacity when projecting future needs. Dynamic provisioning should also include policy-driven levels, where the provisioning knows not to bump the CRM application in favor of stored .AVI files, and should have some way of observing service-level requirements. Some base technologies for utility computing include virtualization and automated operations.
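Before turning to those base technologies, here is a rough sketch of what policy-driven provisioning could look like, assuming an invented priority table and a shared storage pool; the names (StoragePool, provision, forecast_shortfall) are hypothetical, not a product API.

```python
# Hypothetical sketch of policy-driven dynamic provisioning: applications request
# storage from a shared pool, higher-priority workloads (e.g. CRM) are served
# before lower-priority ones (e.g. archived media files), and a simple forecast
# flags when the pool should be grown. Names and numbers are illustrative.

POLICY_PRIORITY = {"crm": 1, "erp": 2, "media_archive": 9}  # lower = more important

class StoragePool:
    def __init__(self, free_gb):
        self.free_gb = free_gb
        self.allocations = {}

    def provision(self, requests):
        """Serve (app, gb) requests in policy order; return those that can't be met."""
        unmet = []
        for app, gb in sorted(requests, key=lambda r: POLICY_PRIORITY.get(r[0], 99)):
            if gb <= self.free_gb:
                self.allocations[app] = self.allocations.get(app, 0) + gb
                self.free_gb -= gb
            else:
                unmet.append((app, gb))
        return unmet

    def forecast_shortfall(self, expected_growth_gb):
        """Suggest how much capacity to add when projected demand exceeds free space."""
        return max(0, expected_growth_gb - self.free_gb)

pool = StoragePool(free_gb=500)
leftover = pool.provision([("media_archive", 400), ("crm", 300)])
print(pool.allocations, leftover)    # CRM is served first; the archive request waits
print(pool.forecast_shortfall(600))  # extra capacity to plan for: 400
```

A real implementation would also weigh service-level requirements, not just raw priority; the priority table above is the simplest possible stand-in for such a policy.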
Virtualization: An underlying technology that makes it possible to quickly ready storage for incoming applications. Virtualization actually ranges from a visual screen where administrators can make changes to their storage assignments, up to automatic provisioning where the software does it for you.
Automation: Very important to utility computing. A number of things need to be automated, at least to a point; end-users aren't always wild about "lights out" or "hands off" computing. But a certain amount of automation really has to happen for utility computing to be useful. Automated processes already include simple levels of alerts, problem solving and repetitive tasks. They also need to include more sophisticated operations, such as the following (a rough sketch of such an automation loop appears after the list).
Discovery: Automatically identify storage networking devices (hosts, storage, etc.) and be able to apply them to specific business processes.
Provisioning: This is the big one. Automation should work to allocate computing power and storage capacity to shifting workloads. It should also know how to apply various settings (like user authentication and security policies) to various types of data and originating applications.
Configuration: Automatically implement network settings across environments, like system configurations, security settings and storage definitions.
Self-healing: Automate problem detection and subsequent correction or recovery.
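As a concrete (if simplified) picture of how those pieces fit together, the sketch below wires discovery, provisioning and self-healing into one control loop. Everything in it is assumed for illustration: the device list, the per-application settings table and the health check are stand-ins, not a real management API.

```python
# Rough sketch of an automation loop: discover devices, apply per-application
# settings when provisioning, then monitor and self-heal by retrying or alerting.
# All names and data here are hypothetical stand-ins for real management tooling.
import time

def discover_devices():
    # Stand-in for an SNMP or SAN-management scan of the environment.
    return [{"id": "array-01", "type": "storage"}, {"id": "host-07", "type": "host"}]

SETTINGS_BY_APP = {   # assumed security/configuration policy per application
    "crm": {"auth": "ldap", "replication": True},
    "media_archive": {"auth": "local", "replication": False},
}

def provision(device, app):
    # Apply the application's settings (authentication, replication, etc.) to the device.
    settings = SETTINGS_BY_APP.get(app, {})
    print(f"provisioning {device['id']} for {app} with {settings}")

def healthy(device):
    return True   # placeholder health check; a real one would query the device

def control_loop(app="crm", cycles=1, max_retries=3):
    # One discovery-and-provisioning pass, then monitoring passes with simple self-healing.
    for device in discover_devices():
        provision(device, app)
    for _ in range(cycles):
        for device in discover_devices():
            for attempt in range(max_retries):
                if healthy(device):
                    break
                print(f"self-heal: restarting {device['id']} (attempt {attempt + 1})")
            else:
                print(f"alert: {device['id']} still unhealthy, paging an operator")
        time.sleep(1)

control_loop()
```

The loop structure is the point: discovery feeds provisioning, provisioning applies policy, and monitoring closes the loop by detecting and correcting failures.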
As utility computing develops, many IT departments will retain ownership of their utility computing networks and procedures. This is largely due to human nature: few companies are comfortable with hosting important data at a site they do not own. That's not to say it's impossible; many companies blithely entrust their payroll information to companies like ADP. But these same companies also classify payroll as one of the last applications to come up in a disaster recovery scenario. The critical data (proprietary knowledge, compliance requirements, product information, and customer databases) is a different kettle of fish.
Many operations will find their way to remote service providers, particularly those with efficient costing models built on utility computing-enabled networks. But even in large utility computing deals like the one between American Express and IBM, in which Amex turned over large portions of its computing tasks to IBM, Amex held on to its most critical data and processing.
Utility computing is slowly but surely coming. It's a natural response to an epidemic problem of over-provisioning, tightening resources, and ever-increasing demands to manage and automate.
Christine Taylor writes about issues in enterprise computing and storage management (Wrightwood, CA)
www.keywordwriting.com
Copyright West World Publications, Inc. May 2004