Utility computing

From Wikipedia, the free encyclopedia


Utility computing, or computer utility, is a service provisioning model in which a service provider makes computing resources and infrastructure management available to the customer as needed, and charges for specific usage rather than a flat rate. Like other forms of on-demand computing (such as grid computing), the utility model seeks to maximize the efficient use of resources and/or minimize the associated costs. Utility computing is the packaging of system resources, such as computation, storage, and services, as a metered service. This model has the advantage of a low or no initial cost to acquire computer resources; instead, resources are essentially rented.
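The pay-per-use idea can be made concrete with a short sketch. The following Python snippet is only an illustration of metered billing; the resource names and per-unit rates are invented for the example and do not correspond to any provider's actual pricing.

    # Metered billing: charge for measured consumption, not a flat rate.
    # The resources and rates below are illustrative assumptions only.
    RATES = {
        "cpu_hours": 0.05,          # USD per CPU-hour consumed
        "storage_gb_months": 0.02,  # USD per GB stored for one month
        "egress_gb": 0.09,          # USD per GB of outbound traffic
    }

    def metered_bill(usage):
        """Sum the per-unit rate times the measured usage of each resource."""
        return sum(RATES[resource] * amount for resource, amount in usage.items())

    # One customer's measured usage for a billing period.
    usage = {"cpu_hours": 120, "storage_gb_months": 500, "egress_gb": 40}
    print(f"Metered charge: ${metered_bill(usage):.2f}")
    # 0.05*120 + 0.02*500 + 0.09*40 = 19.60, rather than a flat monthly fee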

This repackaging of computing services became the foundation of the shift to "on demand" computing, software as a service, and cloud computing models, which further propagated the idea of computing, applications, and networks as a service.

There was some initial skepticism about such a significant shift.[1] However, the new model of computing caught on and eventually became mainstream.

IBM, HP and Microsoft were early leaders in the new field of utility computing, with their business units and researchers working on the architecture, payment and development challenges of the new computing model. Google, Amazon and others started to take the lead in 2008, as they established their own utility services for computing, storage and applications.

Utility computing can support grid computing, which is characterized by very large computations or sudden peaks in demand that are handled by a large number of computers.

"Utility computing" has usually envisioned some form of virtualization so that the amount of storage or computing power available is considerably larger than that of a single time-sharing computer. Multiple servers are used on the "back end" to make this possible. These might be a dedicated computer cluster specifically built for the purpose of being rented out, or even an under-utilized supercomputer. The technique of running a single calculation on multiple computers is known as distributed computing.

The term "grid computing" is often used to describe a particular form of distributed computing, where the supporting nodes are geographically distributed or cross administrative domains. To provide utility computing services, a company can "bundle" the resources of members of the public for sale, who might be paid with a portion of the revenue from clients.

One model, common among volunteer computing applications, is for a central server to dispense tasks to participating nodes at the behest of approved end-users (in the commercial case, the paying customers). Another model, sometimes called the virtual organization (VO),[citation needed] is more decentralized, with organizations buying and selling computing resources as needed or as they go idle.
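As a rough illustration of the first, centrally dispatched model, the Python sketch below shows a coordinator that queues submitted tasks and hands one out to each participating node that polls for work. All class and method names here are hypothetical and do not reflect any particular grid or volunteer-computing middleware.

    # Toy sketch of the central-dispatch model: a coordinator queues tasks
    # submitted by approved end-users and assigns them to polling nodes.
    from collections import deque

    class Dispatcher:
        def __init__(self):
            self.pending = deque()   # tasks waiting to be assigned
            self.assigned = {}       # node_id -> task currently being worked on

        def submit(self, task):
            """An approved end-user (a paying customer) submits a task."""
            self.pending.append(task)

        def request_work(self, node_id):
            """A participating node polls for work; returns a task or None."""
            if not self.pending:
                return None
            task = self.pending.popleft()
            self.assigned[node_id] = task
            return task

        def report_done(self, node_id, result):
            """The node reports its result and is freed for the next task."""
            self.assigned.pop(node_id, None)
            return result

    # Example round trip.
    d = Dispatcher()
    d.submit("render frame 42")
    task = d.request_work("node-7")   # node-7 receives "render frame 42"
    d.report_done("node-7", "result for frame 42")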

The definition of "utility computing" is sometimes extended to specialized tasks, such as web services.

History


Utility computing simply means "pay and use" with regard to computing power. It is not a new concept; it has quite a long history. Among the earliest references is:

If computers of the kind I have advocated become the computers of the future, then computing may someday be organized as a public utility just as the telephone system is a public utility... The computer utility could become the basis of a new and important industry.

— John McCarthy, speaking at the MIT Centennial in 1961[2]

In the following two decades, IBM and other mainframe providers conducted this kind of business, often referred to as time-sharing, offering computing power and database storage to banks and other large organizations from their worldwide data centers. To facilitate this business model, mainframe operating systems evolved to include process control facilities, security, and user metering. The advent of minicomputers changed this business model by making computers affordable to almost all companies. As Intel and AMD increased the power of PC-architecture servers with each new generation of processor, data centers became filled with thousands of servers.

In the late 1990s utility computing re-surfaced. InsynQ, Inc. launched on-demand applications and desktop hosting services in 1997 using HP equipment. In 1998, HP set up the Utility Computing Division in Mountain View, California, assigning former Bell Labs computer scientists to begin work on a computing power plant that incorporated multiple utilities to form a software stack. Services such as "IP billing-on-tap" were marketed. HP introduced the Utility Data Center in 2001. Sun announced the Sun Cloud service to consumers in 2000. In December 2005, Alexa launched the Alexa Web Search Platform, a Web search building tool whose underlying power is utility computing; Alexa charges users for storage, utilization, and so on.

There is also room in the market for niche applications of utility computing aimed at specific industries. For example, PolyServe Inc. offers a clustered file system based on commodity server and storage hardware that creates highly available utility computing environments for mission-critical applications, including Oracle and Microsoft SQL Server databases, as well as workload-optimized solutions tuned for bulk storage, high-performance computing, vertical industries such as financial services, seismic processing, and content serving. Its Database Utility and File Serving Utility enable IT organizations to independently add servers or storage as needed, retask workloads to different hardware, and maintain the environment without disruption.

In spring 2006, 3tera announced its AppLogic service, and later that summer Amazon launched Amazon EC2 (Elastic Compute Cloud). These services allow the operation of general-purpose computing applications. Both are based on Xen virtualization software, and the most commonly used operating system on the virtual computers is Linux, though Windows and Solaris are supported. Common uses include web applications, SaaS, image rendering and processing, and general-purpose business applications.


References

  1. ^ "On-demand computing: What are the odds?". ZDNet. November 2002. Retrieved 2017-11-03.
  2. ^ Garfinkel, Simson (1999). Abelson, Hal (ed.). Architects of the Information Society, Thirty-Five Years of the Laboratory for Computer Science at MIT. Cambridge: MIT Press. p. 1. ISBN 978-0-262-07196-3.

Decision Support and Business Intelligence (8th ed.). p. 680. ISBN 0-13-198660-0.
