Wincite Provides the Option to Implement Applications Using an Outsourced “Utility Computing” Model
Corporations are taking a fresh look at outsourcing selected computer applications that can be supported over the Internet by “utility computing” providers, also referred to as “cloud computing” or “managed hosting” services. A related outsourcing trend began during the e-commerce boom, when companies needed new web visibility quickly for marketing purposes and found it economical and timely to use outside web hosting services.
The same basic technology is now being selectively applied to support internal applications that network together internal users at any number of locations. The “utility computing” companies provide the processing and storage resources along with security and back-up services; in effect, the client rents a computer resource or service that is managed by the utility company. These companies have unique expertise in managing vast numbers of servers, usually in the tens of thousands. In some ways this type of external core resource, and the related network, is similar to how electrical power is generated at large power plants and then distributed over networks to the consumers who use it.
The “utility computing” business model can be tailored to the needs of both small and large companies. Smaller companies benefit from the provider’s economies of scale, while larger corporations may use the service to manage a portfolio of smaller specialized applications such as Wincite. The IT resources in large corporations are typically focused on the very large, complex, mission-critical enterprise transaction systems of the company, and as a result they have limited time to support the smaller tailored applications that typically serve the needs and productivity of knowledge workers.
Wincite Is Using “Utility Computing” in Designing and Implementing New Applications
Recent experience has shown that Wincite can design, develop, and implement new applications more efficiently and in less time using the “utility computing” service model. Clients can interact with Wincite design consultants as the consultants demonstrate the development tools inherent in the software. An ongoing dialog then takes place with client business users regarding the screens and reports being developed in accordance with the client’s requirements. The client can subsequently assist in testing the functionality of the system and contribute suggestions for enhancements. When the system is completed, the client has the option to continue using the “utility computing” service model or to download the application onto an in-house computer. Wincite has also developed a new model for service delivery referred to as “Software as a Service”, in which Wincite manages both the computer service provider and the customized Wincite software application for a client.
Reasons for Considering a “Utility Computing” Provider as an Outsourced Resource
Wincite applications are now running on these “utility computing” networks in support of Wincite competitive intelligence and knowledge management applications. From an end user’s point of view there are no changes: end users see the same screens and reporting features that are currently available. The “utility computing” aspect is a behind-the-scenes option for running Wincite.
The following is a brief summary of some of the potential advantages of using the “utility computing” option on a continuing basis.
Economies of Scale in Purchasing and Managing Hardware and Software
The “utility computing” companies have economies of scale in purchasing both computer hardware and related operating systems, as well as in managing the day-to-day operations of a data center. These companies are growing rapidly and are purchasing the newest computer technology at volume discount prices.
High Levels of Niche Technical Expertise
To be successful, the utility service companies must have a high level of unique expertise in networking together a very large number of servers and related web-based communication resources. This includes specialized expertise in security issues and automated back-up procedures.
Service Support as a Competitive Advantage
“Utility computing” companies frequently stress their level of support in terms of fast, accurate response and 24/7/365 availability. They view high levels of service as a means to establish and maintain a competitive advantage in the marketplace. As a result, corporate customers have substantial leverage in managing the level and quality of the services being provided by the utility.
Security Issues and Corporate Policies
Security is a major issue for large corporations. Until fairly recently, corporations were reluctant to use the utility model because of perceived security issues. The general lack of problems to date, along with advances in security-related technologies, appears to be reducing the risk factors associated with outsourcing systems services in general. The utility companies recognize the importance of this factor and realize that their existence depends heavily on the integrity of their security features.
Shifting Investment Costs to Service Expenses
Using the “utility computing” model is similar to moving from a front-end investment cost basis to a periodic lease-type expense. In some situations this may be an advantage in justifying a new application in terms of budgeting and capital investment approvals. Applications can always be moved to internal computer resources if and when a need arises.
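As an illustration of the investment-versus-expense trade-off described above, the short sketch below compares a one-time up-front hardware purchase against a recurring hosting fee. All dollar figures are hypothetical assumptions chosen for illustration only; they are not Wincite or provider pricing.

```python
# Illustrative comparison of an up-front capital purchase versus a
# recurring "utility computing" service fee. All figures are
# hypothetical assumptions, not actual pricing.

def cumulative_cost_inhouse(months, hardware=12000, monthly_ops=150):
    """Up-front hardware investment plus ongoing operating expense."""
    return hardware + monthly_ops * months

def cumulative_cost_utility(months, monthly_fee=400):
    """Pure periodic service expense; no initial capital outlay."""
    return monthly_fee * months

# Compare cumulative cost at several horizons; with these assumed
# numbers the two approaches break even at 48 months.
for months in (12, 24, 48):
    print(months,
          cumulative_cost_inhouse(months),
          cumulative_cost_utility(months))
```

The point of the sketch is simply that the utility model trades a large initial outlay for a predictable recurring expense, which can simplify budgeting and capital-approval discussions.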
Technical Background and Trends
The basic technology related to the “utility computing” model has been around for some time; however, growth has been limited by the speed of the communication networks that connect users to large server farm facilities. The availability of broadband Internet connections has greatly increased the viability of the utility concept. A number of “utility computing” companies exist today, and several of the leading technology companies are making very large investments in resources for entry into the market. Google is one of these companies, and it has already provided a proof of concept for the technology with its search services, which currently network together 500,000 or more servers located around the world. The basic rationale for the model is to balance resources with demand by dynamically assigning processing requirements to different servers. The core processing centers can easily be scaled up by adding new servers to match increases in demand. Typically these new units are mass-produced standard servers that are relatively inexpensive and can be easily replaced when maintenance is required.
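The dynamic-assignment idea described above can be sketched in a few lines of code. This is a minimal illustration only: the least-loaded dispatch rule, the class name, and the job sizes are assumptions made for the example, not a description of any particular provider’s scheduler.

```python
# Minimal sketch of dynamically assigning work to the least-loaded
# server in a pool, and scaling the pool up by adding inexpensive
# standard servers. The dispatch rule and numbers are illustrative
# assumptions, not any specific provider's scheduling algorithm.

class ServerPool:
    def __init__(self, count):
        self.load = [0] * count          # outstanding work units per server

    def assign(self, units):
        """Send a job to whichever server currently has the least load."""
        idx = self.load.index(min(self.load))
        self.load[idx] += units
        return idx

    def scale_up(self, extra):
        """Match rising demand by adding standard servers to the pool."""
        self.load.extend([0] * extra)

pool = ServerPool(3)
for job in (5, 2, 4, 1):                 # incoming processing requirements
    pool.assign(job)
pool.scale_up(2)                         # new capacity for new demand
```

Real schedulers also account for data locality, failures, and network cost, but the core rationale is the same: spread demand across many interchangeable servers and grow the pool rather than any single machine.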
What is in a Name?
Naming the underlying service model seems to be almost as much of a challenge as the technology itself. For example, the following terms have been used: cloud computing, grid computing, managed hosting, on-demand, Software as a Service, server farms, computer clusters, hosting services, etc. The bottom line is that the technology works, it is very cost effective, and it will take on even more meaning and use as Internet speeds continue to increase and, most important, access to the Internet becomes ubiquitous with the arrival of advanced communication networks. The technology is here; the challenges are the deployment of new access technologies and the acceptance of a new way of using computer services.