Desktop virtualization presents many advantages for information technology departments because it greatly simplifies desktop management. A single change to an operating system setting, for example in Windows, can be made once in a central location, and every employee receives the change the next time they access their virtual desktops. Previously, information technology departments either had to visit each desktop to make the change or push it down to each desktop from a central server.
Although desktop virtualization may appear to be a panacea for information technology departments struggling to stay current with software patches and other updates, the practical options for large-scale adoption are only just emerging, and its viability at massive scale has not been fully proven. Consequently, many information technology departments are not moving quickly to adopt the technology. Even so, many see it as having great potential for lowering the cost of owning and managing personal computers and desktops, which usually consume a large percentage of information technology budgets.
As with most innovations, there are risks involved. Because the personal computer essentially resides in the data center, the biggest risk is greater reliance on data center uptime: if the data center goes down, personal computer access becomes unavailable. In addition, desktop virtualization requires significant up-front investment in servers, storage, network bandwidth, licenses and thin-client hardware. Software as a service may be a cheaper and simpler alternative to implement, since the application resides on a server and is accessed through a web browser.
Two strategies exist for implementing desktop virtualization: the "fat image" approach of today and the stateless strategy of the future. With the fat-image approach, the operating system and applications are combined into a single image stored on a data center server and viewed on a thin client by means of various remote access protocols. The advantages of this approach are centralized storage and increased data security. In the stateless approach, every time end-users turn on their computing devices, the data center creates a temporary virtual image from a set of master operating system images and applications and delivers it to the device. End-users are given only the applications they need, based on who they are, what privileges they hold and what they are trying to do.
The disadvantage of the fat-image approach is that although the operating system and applications are stored in the data center, someone still has to apply patches. Patching is easier because it is done in the data center rather than on individual machines, but every virtual desktop still requires the same patching and management as a regular desktop. In a stateless environment, patching would be done in one place, and only those who require the patched application would receive it.
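The stateless model described above can be sketched in a few lines of code. This is only an illustration of the idea, not any real virtualization product's API; all names here (MASTER_OS_IMAGE, ROLE_APPS, provision_desktop, patch_master) are hypothetical.

```python
# Illustrative sketch of stateless desktop provisioning.
# All names are hypothetical, not taken from a real product.

# A single master operating-system image, patched centrally.
MASTER_OS_IMAGE = {"os": "Windows", "patch_level": 3}

# Entitlements: which applications each role receives.
ROLE_APPS = {
    "accounting": ["email", "spreadsheet", "erp_client"],
    "engineering": ["email", "ide", "cad_viewer"],
}

def provision_desktop(user_role: str) -> dict:
    """Build a temporary virtual desktop from the master image,
    adding only the applications the user's role entitles them to."""
    desktop = dict(MASTER_OS_IMAGE)  # copy of the single master image
    desktop["apps"] = ROLE_APPS.get(user_role, ["email"])
    return desktop

def patch_master(new_patch_level: int) -> None:
    """Patch once, in one place; every desktop provisioned
    afterwards inherits the update automatically."""
    MASTER_OS_IMAGE["patch_level"] = new_patch_level

# Patch the master once; all subsequent sessions pick it up.
patch_master(4)
desktop = provision_desktop("accounting")
```

The key contrast with the fat-image approach is that no per-desktop image exists to patch: the desktop is assembled on demand from the already-patched master, so one central update reaches everyone.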
As with all new technologies, adoption is slow, particularly in smaller organizations where desktop virtualization may not be a priority. For larger organizations, however, a wait-and-see approach could turn into a missed opportunity sooner than expected.
Visit http://www.OCRuggedLaptops.com for more information about the rugged laptop industry.
Article Source: http://EzineArticles.com/?expert=Mack_Harris