Cloud computing can be loosely defined as using scalable computing resources provided as a service from outside your environment on a pay-per-use basis. You use only what you need, and pay for only what you use. You can access any of the resources that live in the “cloud” at any time, and from anywhere across the Internet. You don’t have to care about how things are being maintained behind the scenes in the cloud.
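To make the pay-per-use model concrete, the sketch below shows how a monthly bill could be computed from metered usage. The resource names and unit rates are hypothetical examples chosen for illustration, not any particular provider's pricing.

```python
# Minimal sketch of pay-per-use billing: you are charged only for the
# units of each resource you actually consumed during the month.
# The resources and rates below are hypothetical, not real pricing.

UNIT_RATES = {
    "small_server": 0.05,   # dollars per server-hour
    "storage_gb":   0.001,  # dollars per GB-hour
    "messages_1k":  0.01,   # dollars per 1,000 messages
}

def monthly_bill(usage):
    """Sum metered usage multiplied by each resource's unit rate."""
    return sum(UNIT_RATES[resource] * amount for resource, amount in usage.items())

# Example: 3 servers running for a 720-hour month, 200 GB stored all month,
# and 5 million messages sent.
usage = {
    "small_server": 3 * 720,
    "storage_gb":   200 * 720,
    "messages_1k":  5_000,
}

print(f"This month's bill: ${monthly_bill(usage):.2f}")
```

Because the charge tracks actual consumption, capacity you release when demand drops simply stops costing you money, which is the essence of the utility model.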
The cloud is responsible for being highly available and responsive to the needs of your application. Cloud computing has also been called utility computing or grid computing. It represents a paradigm shift in how we architect and deliver scalable applications. In the past, successful companies spent precious time and resources building an infrastructure that, in turn, gave them a competitive advantage. It was frequently a case of “You build it first and they will come.” In most cases, this approach:
- Left large tracts of unused computing capacity that took up space in big data centers.
- Required someone to babysit the servers.
- Had associated energy costs.
The unused computing power wasted away, with no way to push it out to other companies or users who might be willing to pay for additional compute cycles.
With cloud computing, excess computing capacity can be put to use and sold profitably to consumers. This transformation of computing and IT infrastructure into a utility, available to all, somewhat levels the playing field. It forces competition based on ideas rather than computing resources.
The resources your applications and IT systems constantly need to meet growing demand (storage, compute capacity, messaging systems, and databases) are essentially commoditized. You can rent this infrastructure from whichever vendor offers you the best price and service. Simple, isn’t it? It’s a simple but revolutionary idea, and not an entirely new one.