The most obvious impediment to deployment addressed by utility computing is the need to provision and configure hardware again and again. Depending on the development methodology, the size of the application, and the desired level of redundancy, a single application may be integrated with hardware as many as six times. Every integration cycle inevitably introduces errors that must be found and corrected. Utility computing eliminates this continual rebuilding of infrastructure: the needed infrastructure is defined along with the application in a portable format, and the utility system creates that infrastructure dynamically every time the application is started.
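The idea of a portable infrastructure definition can be sketched as follows. This is a minimal illustration, not any particular product's format: the `Tier` fields and the `provision` function are hypothetical, standing in for whatever descriptor and allocation mechanism a real utility system provides.

```python
# Hypothetical sketch: an application's infrastructure described as data,
# so the utility system can recreate it every time the application starts.
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    instances: int  # desired redundancy for this tier

def provision(tiers):
    """Stand up each tier. A real utility system would allocate machines
    from a shared pool; here we just return the instance names."""
    return {t.name: [f"{t.name}-{i}" for i in range(t.instances)]
            for t in tiers}

# The same descriptor yields the same infrastructure on every start.
app = [Tier("web", 2), Tier("db", 1)]
infra = provision(app)
# infra == {"web": ["web-0", "web-1"], "db": ["db-0"]}
```

Because the definition travels with the application, every instantiation starts from the same description rather than from a manually rebuilt environment.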
However, that’s only the tip of the iceberg. Utility computing’s more significant impact on delivery will come from subtler changes in workflow, enabled by the fact that application infrastructure is portable and can be instantiated multiple times at minimal cost. Developers can run actual copies of the full application to unit test their code; eliminating simulation environments improves developer efficiency and increases the number of bugs found and fixed by the developers themselves. Test engineers can use the application infrastructure even before code is complete in order to build test suites. And during the test cycle, multiple copies of the application can be run in parallel to shorten the cycle. Utility computing also makes scalability and reliability testing more efficient, since large test configurations can be stood up cheaply and discarded when testing is done.
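The test-cycle point can be sketched as follows. The structure is illustrative only: `run_suite` is a hypothetical stand-in for executing a test suite against one freshly provisioned copy of the application, and the parallelism simply models several independent copies running at once.

```python
# Hypothetical sketch: shortening a test cycle by running suites against
# multiple independent copies of the application in parallel.
from concurrent.futures import ThreadPoolExecutor

def run_suite(copy_id, tests):
    # Each copy gets its own environment, so suites never interfere.
    # A real harness would execute the tests; we record them as run.
    return [(copy_id, name) for name in tests]

def parallel_test(suites):
    """Run each suite against its own application copy concurrently."""
    with ThreadPoolExecutor() as pool:
        chunks = pool.map(lambda args: run_suite(*args), enumerate(suites))
    return [result for chunk in chunks for result in chunk]

results = parallel_test([["login", "checkout"], ["search"]])
# Three tests ran, split across two application copies.
```

Since each copy is created from the same portable definition, adding copies trades a small provisioning cost for a proportionally shorter test cycle.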
Using utility computing enables developers and test engineers to focus on their core skills rather than worrying about hardware. The result is a more productive, streamlined process for taking applications to production.