On-Demand Computing Today
IT has commonly focused on providing computing resources to different groups and projects. The demands vary radically from project to project and department to department, and IT is trying to make everybody happy. Recently, everything has grown so large that it's increasingly difficult to manage separate little worlds of information - rather like rowdy moons making ragged orbits around the central IT function. There are separate servers for separate applications, separate storage, separate service levels. Each system needs its own protection: backup and recovery, provisioning, replication. It's easy to add more servers and storage to meet service levels during processing spikes, but it gets harder and harder to manage and plan.
Utility computing is a model that can go a long way towards taming the beast, but only if it evolves to meet both technology and business demands. Utility computing is a model where computing resources are automatically assigned to a user on an as-needed basis. The model is demanding: even basic utility computing requires multiple technologies, including storage and server virtualization, grid computing, automated provisioning, and security. Given the impressive list of requirements, utility computing is often consigned to a hosted model such as the large-scale services provided by companies like HP and IBM.
This has led to some confusion in the industry, which often talks as if utility computing were purely a hosted/service provider (SP) model. It's not; companies can (and perhaps should) deploy internal utility computing. But for now, the majority of utility computing investment is in hosted/public grid models. According to independent website Utility Computing, "The infrastructure required to deliver that reality is beginning to be put into place. IBM has been working on the idea for some time, but the major technology vendors are now all jostling for position. At this early stage, their offerings may be seen as IT outsourcing, where large corporations allow dedicated service providers to take care of all their IT needs." Internal deployments are out there and growing, with most adopters using professional services organizations that are proficient with utility computing structures.
Utility Computing Today
How does utility computing play out in today's storage and networking marketplace? Depending on who you talk to, utility computing might be an IT management approach, a business strategy or a software/hardware tool. HP's Mark Linesch, VP of adaptive enterprise programs, put it as well as anybody: "It's not about a big new technology...It's about establishing a tighter, more dynamic link between the business and its IT infrastructure."
That is because utility computing lives or dies on the integration of its parts. Utility networks exist today, but true utility computing requires close coordination between hardware components, the applications that run on them, and the data management tools that handle provisioning, storage pooling, and a myriad of tasks that require wide-scale automation across a utility network. The utility infrastructure must be able to automatically provision and deliver resources on demand, while tracking usage for later chargeback.
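To make that pairing of on-demand delivery and usage tracking concrete, here is a minimal Python sketch - all names hypothetical, not any vendor's API - in which every resource handed out produces a record that can feed a later chargeback run:

```python
# A minimal sketch (hypothetical names throughout) of the metering side of
# utility computing: each time a resource is delivered on demand, a usage
# record is written so it can be charged back later.
import time
from dataclasses import dataclass, field

@dataclass
class UsageRecord:
    department: str       # who to charge
    resource: str         # e.g. "storage_gb" or "cpu_hours"
    amount: float
    timestamp: float = field(default_factory=time.time)

class Meter:
    """Collects a usage record every time a resource is delivered."""
    def __init__(self):
        self.records: list[UsageRecord] = []

    def deliver(self, department: str, resource: str, amount: float) -> None:
        # Delivery and metering happen as one step, so nothing is handed
        # out without a matching record for later chargeback.
        self.records.append(UsageRecord(department, resource, amount))

meter = Meter()
meter.deliver("engineering", "storage_gb", 250.0)
meter.deliver("marketing", "cpu_hours", 12.5)
print(len(meter.records), "usage records captured")
```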
Such a level of flexibility and tracking requires management tools that are currently in their infancy, which explains why not every company is jumping on the utility bandwagon (basing your company's IT life on a bunch of relatively untried tools is only for the very brave or the foolhardy). But the real holdup for utility computing is that application providers have yet to move en masse toward UC-ready licensing models. "The software licensing models in particular are currently the barrier to utility pricing models," says Corey Ferengul, senior vice president at Meta Group. Ideally, utility computing pricing models would allow customers to pay "by the sip," much as we do with electricity and water. But software vendors are still predominantly selling their products on a per-seat or per-CPU basis, regardless of how much or how little an individual seat or CPU is utilized.
Like ILM, utility computing is more a strategic approach than a specific application or suite of applications. The idea behind utility computing is to provide unlimited computing power and storage capacity that can be used and reallocated for any application - and billed on a pay-per-use basis.
Ideally, utility computing brings some important benefits with it. These include:
* Simplified administration. Reduces time-consuming and complex administration overhead. This will happen faster when going to a reliable SP model, but internal deployment will yield the same benefits. Utility computing also needs scalable, standardized, and heterogeneous computing resources, and should not depend on highly proprietary hardware or software to work.
* Capacity to meet business needs. Enables administrators to manage fast growth and peaks-and-valleys capacity and processing demands. Avoids network downtime and lag by immediately provisioning for changing needs.
* Cost-effective. Leverages infrastructure costs to meet changing business requirements and serve business growth. Automated provisioning based on need yields excellent ROI on internal resources.
Basic Requirements for Successful Utility Computing
* Automating costing procedures for computing resources
Billing or chargeback information should be driven by the capacity required to support business processes. Once infrastructure is properly aligned with business processes in this way, the business expects IT to help minimize the costs of providing business services; a minimal billing sketch follows. Note that this sounds good on paper but can lead to heavy political infighting: Many business units hate chargeback because it adds costs to their bottom line. But in the face of spiraling IT costs - all of which are coming out of their budget - CIOs are increasingly unsympathetic.
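As a rough illustration of what automated costing looks like once usage records exist, here is a hedged Python sketch; the unit rates and record layout are assumptions for the example, not a real billing schema:

```python
# A hedged sketch of automated chargeback: roll monthly usage records up
# into a per-department bill. Rates and record shapes are illustrative
# assumptions, not a real billing schema.
from collections import defaultdict

RATES = {"storage_gb": 0.50, "cpu_hours": 0.10}   # assumed unit prices

usage = [
    ("engineering", "storage_gb", 250.0),
    ("engineering", "cpu_hours", 40.0),
    ("marketing", "storage_gb", 80.0),
]

bills: dict[str, float] = defaultdict(float)
for department, resource, amount in usage:
    bills[department] += amount * RATES[resource]

for department, total in bills.items():
    print(f"{department}: ${total:.2f}")
# engineering: $129.00, marketing: $40.00
```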
* Automated provisioning to meet the business unit's scaled-up or scaled-down needs
Without automated provisioning, IT departments have to resort to painful manual techniques to deal with impossibly complex server farms, a plethora of operating systems, multiplying storage systems and expensive management software. The better automated provisioning technology gets, the easier this critical piece of utility computing will become; the sketch below shows the basic decision loop such tools automate.
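Here is a minimal Python sketch of that decision loop; the thresholds and the scaling rule are illustrative assumptions, not any vendor's algorithm:

```python
# A minimal sketch of an automated-provisioning decision loop: compare
# observed demand against allocated capacity and scale up or down with
# headroom. Thresholds and scaling rules are assumptions for illustration.
HEADROOM = 1.2        # keep 20% spare capacity
SHRINK_BELOW = 0.5    # release capacity when utilization drops under 50%

def rebalance(allocated_gb: float, demand_gb: float) -> float:
    """Return the new allocation for one application's storage."""
    if demand_gb * HEADROOM > allocated_gb:
        return demand_gb * HEADROOM              # scale up before it hurts
    if demand_gb < allocated_gb * SHRINK_BELOW:
        return max(demand_gb * HEADROOM, 1.0)    # reclaim idle capacity
    return allocated_gb                          # leave well enough alone

allocation = 100.0
for observed in (80.0, 140.0, 30.0):   # simulated demand samples
    allocation = rebalance(allocation, observed)
    print(f"demand={observed:>6.1f} GB -> allocated={allocation:.1f} GB")
```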
* Virtualization
Virtualization is an underlying technology that makes it possible to quickly ready storage for incoming applications. Virtualization actually ranges from a visual console where administrators reassign storage by hand, up to automatic provisioning, where the software does it for you - as the sketch below illustrates.
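The following Python sketch shows the "software does it for you" end of that range: a virtual volume grows on demand by drawing extents from whichever physical pools have free space. Pool names and sizes are invented for illustration.

```python
# A minimal sketch of storage virtualization: applications see one virtual
# volume, while the software maps it onto whichever physical pools have
# free space. Pool names and sizes are made up for illustration.
class StoragePool:
    def __init__(self, name: str, free_gb: float):
        self.name, self.free_gb = name, free_gb

class VirtualVolume:
    """One logical volume backed by extents from many physical pools."""
    def __init__(self, pools: list[StoragePool]):
        self.pools = pools
        self.extents: list[tuple[str, float]] = []  # (pool, gb) pairs

    def grow(self, gb: float) -> None:
        # Automatic provisioning: satisfy the request from any pool with
        # space, hiding the physical layout from the application.
        for pool in self.pools:
            take = min(gb, pool.free_gb)
            if take > 0:
                pool.free_gb -= take
                self.extents.append((pool.name, take))
                gb -= take
            if gb == 0:
                return
        raise RuntimeError("no free capacity in any pool")

volume = VirtualVolume([StoragePool("array_a", 40), StoragePool("array_b", 200)])
volume.grow(100)          # spans both arrays; the application never knows
print(volume.extents)     # [('array_a', 40), ('array_b', 60)]
```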
* Other types of automation
Discovery. Automatically identify storage networking devices - hosts, storage, etc. - and be able to map them to specific business processes.
Provisioning. This is the big one. Automation should work to allocate computing power and storage room to shifting workloads. It should also know how to apply various settings - like user authentication and security policies - to various types of data and originating applications.
Configuration. Automatically implements network settings across environments - like system configurations, security settings and storage definitions.
Self-healing. Automate problem detection and subsequent correction or recovery, as the sketch after this list illustrates.
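As a toy illustration of the self-healing item above, this Python sketch polls a set of services and restarts whatever fails its health check; the probe and restart hooks are placeholders, not a real agent's API:

```python
# A hedged sketch of self-healing automation: poll each service's health
# check and restart anything that fails, logging the action for review.
def check_health(service: str) -> bool:
    """Placeholder probe; a real agent would ping the service itself."""
    return service != "billing"      # simulate one failing service

def restart(service: str) -> None:
    print(f"[heal] restarting {service}")

def healing_pass(services: list[str]) -> None:
    for service in services:
        if not check_health(service):
            restart(service)         # detect, then correct automatically

healing_pass(["web", "billing", "reports"])   # -> [heal] restarting billing
```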
* Flexible systems
Virtualization and automatic provisioning will have to work across operating systems and switches, and in multi-vendor environments. And yes, this is a tall order.
* Security
If you thought security was tough in a regular network environment, try a utility computing network that is serving hundreds or thousands of customers. A case in point is the recent denial-of-service attack that Sun suffered on the very first day that the company allowed users to buy Internet access to its much-hyped, and much delayed, public utility grid.
* Grid computing and SOA
Grid computing is a form of distributed computing where resources are often spread across different physical locations and domains. Grid computing is a foundation technology for models like utility computing, where computing resources are pay-per-use commodities.
SOA (Service-Oriented Architecture) is a computing architecture that undergirds the act of delivering IT as a service. SOA can be used for designing, building, and managing distributed computing environments, works best with standards-based computing resources, and efficiently enables utility computing infrastructure development.
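To show the flavor of delivering IT as a service, here is a hedged sketch of a provisioning operation exposed over HTTP using only the Python standard library; the /provision endpoint and payload shape are invented for the example, not any vendor's actual interface:

```python
# A minimal sketch of exposing a provisioning operation as a service.
# The endpoint name and payload shape are illustrative assumptions.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ProvisionHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/provision":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length) or b"{}")
        # In a real SOA deployment this would call the provisioning layer;
        # here we simply echo an acknowledgement.
        reply = json.dumps({"granted_gb": request.get("requested_gb", 0),
                            "status": "provisioned"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(reply)))
        self.end_headers()
        self.wfile.write(reply)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), ProvisionHandler).serve_forever()
```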
Utility Computing and SMB
Amazingly enough, utility computing might not be purely a matter for the enterprise. IT can be as complex for SMBs to manage as for the enterprise. SMBs commonly lack the internal IT skills to optimize their network infrastructure, and can benefit from a solid, reliable, high-performance hosted model. (Internally deploying a utility computing infrastructure runs into exactly the same challenges driving SMBs to utility computing in the first place. At this point, most SMBs adopting utility computing will outsource to an SP.)
There are differences between SMB and enterprise utility computing models, particularly the lack of a chargeback model in SMB. According to strategic consultancy THINKstrategies, SMB utility computing SPs depend primarily on four classes of tools to serve their SMB clients (a minimal monitoring sketch follows the list):
* Network Management tools to proactively monitor hardware states
* Performance Management tools to effectively measure network, system, and software performance
* Software distribution tools to automatically update operating systems and applications from a central console
* Software diagnostic tools to perform system and software analyses, and self-healing techniques
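As a taste of the first category, here is a minimal Python sketch of a network-monitoring poll: try a TCP connection to each managed host and flag anything unreachable. The host list and ports are illustrative assumptions.

```python
# A minimal sketch of the kind of network-monitoring poll an SMB service
# provider might run: attempt a TCP connection to each managed host and
# flag anything unreachable. Hosts and ports are invented for the example.
import socket

HOSTS = [("fileserver.local", 445), ("mailserver.local", 25)]

def is_up(host: str, port: int, timeout: float = 2.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for host, port in HOSTS:
    state = "up" if is_up(host, port) else "DOWN"
    print(f"{host}:{port} is {state}")
```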
Predictions
I expect utility computing to dovetail with developments in grid computing, SOA, automated provisioning and discovery, security, and other foundational technologies. Over time, storage, databases and applications will increasingly be made available for customers to access on demand over networks that appear as one large virtual computing system. Utility computing provides the enterprise with a chargeback function to support this business model. SMBs will increasingly turn to their own brand of utility computing, turning over network management to an SP.
Utility computing is ultimately about how companies can make better use of all their computing resources. By delivering fast and intelligent access to network resources, utility computing leverages computing infrastructure costs and reduces management overhead.
Copyright West World Publications, Inc. Mar/Apr 2006