Standardisation = productivity
Standards make a world of difference. In any process, a well-implemented standard will increase the efficiency and consistency of results. At a personal level, standards may exist only in our heads, like a regular place to keep the car keys around the house. We are creatures of habit for a reason.
Declaring processes up front reassures those involved and removes the extra effort of deriving the context of information, clearing distractions from the task at hand. Defined expectations are the backbone of any kind of cooperative action. As such, the artificial world has been built on norms and regular patterns.
So how do we standardise information access? How can consistent expectations be defined for a multitude of highly mobile users?
Centralising data is the cornerstone of standardising information access. From shared drives to servers to cloud storage to cloud services, there is a natural progression of centralisation, standardising information resources for ever-growing networks of individuals. But there is still huge disparity in the services available to the end-point user across devices, even within the same organisation. For all of our reliance on standardisation, from the information access perspective, progress seems to be dragging.
Centralised computing has long seemed like the logical next step. But for all the hype around desktop virtualisation, the expected traction of VDI has not materialised.
Centralisation and virtualisation: some of the benefits
Server virtualisation dates back to the 1960s as a method of distributing mainframe resources amongst multiple processes in order to maximise efficiency. Prior to this, when a mainframe could only run a single application at a time, a large portion of hardware resources sat idle. Virtualisation allowed mainframes to run multiple applications simultaneously, each computing in a separate virtual environment. With the advent of the personal computer, computing became decentralised, with a shift towards rich clients. This led to vast networks of machines that were non-standard in both resources and applications, very few of which are likely to be maximising their hardware potential.
In virtual desktop infrastructure, a host machine runs a hypervisor, a piece of software or firmware responsible for creating and running the virtual machines. Desktop virtualisation draws many additional benefits from centralisation. An OS can be distributed across the whole system from the same image. Not only does this bring all virtual user environments up to the same standard, but VDI also allows any updates or changes to be implemented system-wide in a short window of time. For different job roles, user groups can be defined, each with a distinct set of applications relevant to its particular work.
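To make the idea concrete, here is a minimal Python sketch of the "single golden image plus per-group application sets" model described above. All names, groups and the provisioning function are hypothetical illustrations, not any vendor's actual API:

```python
from dataclasses import dataclass

# Applications baked into the base (golden) image, shared by everyone.
# Updating this one image updates every desktop provisioned from it.
GOLDEN_IMAGE_APPS = frozenset({"os-patches", "browser", "pdf-reader"})

# Per-role application sets layered on top of the golden image.
USER_GROUP_APPS = {
    "finance":     {"spreadsheet-suite", "erp-client"},
    "engineering": {"ide", "cad-viewer"},
    "reception":   {"scheduling-client"},
}

@dataclass
class VirtualDesktop:
    user: str
    group: str
    apps: frozenset

def provision(user: str, group: str) -> VirtualDesktop:
    """Clone the golden image and layer on the group's applications."""
    apps = GOLDEN_IMAGE_APPS | USER_GROUP_APPS.get(group, set())
    return VirtualDesktop(user, group, frozenset(apps))

desktop = provision("alice", "finance")
print(sorted(desktop.apps))
```

The point of the structure is that a single change to the golden image, or to one group's application set, propagates to every desktop built from it on the next provision.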
One of the most important benefits is that a user can access their personal virtual desktop from any enabled machine, letting them reach and search all of their documents consistently. On thin-client implementations, where the processing is done centrally, users have the same powerful resources available to them on cheap hardware and in mobile situations.
The obstacles to VDI implementation
In theory, these benefits sound convincing, but despite all the attention, VDI doesn't have the strong foothold that might have been expected. This is down to a multitude of issues confronting virtual desktops.
A major problem with desktop virtualisation is its dependency on a persistent network connection. In an office environment, this is not such an issue. However, to reap the benefits of a mobile virtual desktop, there is a constant struggle with low bandwidth, intermittent connections and VPN setup to provide a high-quality user experience. To provide a quality VDI experience for multimedia, it is best if some of the client-side resources can be leveraged. This pushes up the specification requirements, and thereby the cost, of thin clients.
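A back-of-envelope calculation shows why bandwidth dominates the experience. The figures below (resolution, frame rate, compression ratio) are illustrative assumptions rather than measured numbers for any particular remote display protocol:

```python
# Rough estimate of the bandwidth a remote display needs for
# full-screen video. All figures are illustrative assumptions.
width, height = 1920, 1080        # pixels
bytes_per_pixel = 3               # 24-bit colour
fps = 30                          # frames per second for smooth video
compression_ratio = 100           # optimistic codec compression

raw_bits_per_second = width * height * bytes_per_pixel * fps * 8
compressed_mbps = raw_bits_per_second / compression_ratio / 1_000_000
print(f"~{compressed_mbps:.1f} Mbit/s per user")  # ~14.9 Mbit/s
```

Even at roughly 15 Mbit/s per user, a handful of users on a shared uplink saturates it quickly, which is exactly why offloading media rendering to the client is so attractive, and why it drags thin-client costs upwards.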
Licensing is something of a mixed issue with respect to VDI. On the one hand, fewer licenses may be required, and the use of program licenses may be more efficient. However, one of the main reasons that vendors get so excited about SaaS is the potential to get more of customers' money. With VDI, it really comes down to the licensing specifics of that particular virtual implementation.
The main obstacle to VDI implementation in companies is the initial cost involved. Applications vary in how well they suit virtualisation, which can have a big impact on implementation complexity and cost. As with any major shift in infrastructure, VDI is going to incur big costs upfront. As opposed to upgrading rich-client PCs across the course of a year, switching to VDI requires funds to be poured in at around the same time. There are definitely cases where VDI could save money on hardware and licensing, but the ROI will be slow, discouraging companies from undertaking large-scale shifts to desktop virtualisation.
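A toy cash-flow comparison makes the point. Every figure here is a made-up round number chosen purely for illustration, not a real cost model:

```python
# Staggered rich-client refresh versus an up-front VDI rollout.
# All costs below are hypothetical assumptions for illustration.
users = 500
pc_cost = 900                       # rich-client PC, refreshed every 4 years
thin_client_cost = 300
vdi_servers_and_licences = 250_000  # up-front infrastructure + licensing

# Rich clients: spread the refresh evenly across 4 years.
pc_spend_per_year = users * pc_cost / 4

# VDI: pay for servers, licences and thin clients in year one.
vdi_year_one = vdi_servers_and_licences + users * thin_client_cost

print(f"PC refresh, per year:  ${pc_spend_per_year:,.0f}")   # $112,500
print(f"VDI rollout, year one: ${vdi_year_one:,.0f}")        # $400,000
```

Under these assumptions, year one of a VDI rollout demands three to four times the annual PC refresh budget; any savings only accrue in later years, which is precisely the slow ROI that puts companies off.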
The real weakness of VDI is applications. If virtual desktops are unable to provide a high-quality experience compared to rich clients, enterprises are more likely to take the route of cloud storage for information access, paired with individual computers for applications. That isn't to say that things look particularly bleak for VDI. Rich clients may be around for some time, but the idea of centralised computing does make a great deal of sense. Already, VDI has a clear market in areas with homogenised applications and low processing requirements, where the benefits of system standardisation and information access can be truly leveraged; healthcare is a good example. As application support grows and user experience improves, VDI can still be a strong contender against cloud computing services and DaaS.