Why desktop virtualization projects fail

Desktop virtualization is one of the hottest topics in IT today and a major initiative at many companies. Touted benefits include lower operating costs, simpler management and desktop mobility. Below we’ll explore the barriers to wide-scale adoption of desktop virtualization and some approaches to dealing with them. It’s not a fit for everyone in a company, but it can be for many.

Challenge #1: Assuming desktop virtualization makes sense because thin clients are cheap – Many people assume that virtualizing desktops will be dramatically cheaper because thin clients can be found for approximately $300-$400, whereas a PC can cost $500-$1200.

Tip: Client costs are only part of the picture – Desktop virtualization can reduce capital expenditures, but don’t expect that in the first year. Building the back-end infrastructure (storage, servers, licenses, etc.) is expensive, and first-year costs may match or exceed those of a traditional refresh. Consider repurposing existing PCs as clients instead of replacing them with thin clients: thin clients are cheaper than PCs, but the reduction in hardware costs may not be seen for a couple of years because the infrastructure has to be built first. More importantly, operational savings appear immediately, and that is where the true cost savings are found.

Challenge #2: Infrastructure people not understanding the desktop people – Server operations are not the same as desktop operations. Users have their own behaviors and expectations about how their desktops will function. It is easy to virtualize a Windows desktop, but delivering what the user expects is not.

Tip: Understand your users and identify your use cases – Learn what apps users need to use, how they use them, where they use them and what they expect. Do your users need different apps depending on their physical location? Do they need dual monitors or multimedia acceleration? How should you deliver user profiles? Is printing going to be an issue? Spend a bit of time identifying and categorizing your use cases so you can design your solution around them.

Challenge #3: Bad desktop practices follow you into the virtual world – Refreshing desktops will not be any easier if you allow users to install their own applications or store data inside a desktop image. Failing to enforce good security policies (screensaver locking and passwords) may leave desktops unprotected as users move from office to kiosk.

Tip: Identify unhealthy desktop practices and change what is feasible (in phases) – Start by thinking about what makes managing desktops difficult today. If users don’t absolutely need to install their own apps, set policies that stop that behavior; storage consumption, desktop refreshes and manageability will all improve. If security is lax, improve it with basic measures like auto-locking displays so no one can hijack a desktop left logged in.

Challenge #4: Not understanding Microsoft licensing – Microsoft bars OEM licenses from being transferred, and it also requires VECD (Virtual Enterprise Centralized Desktop) licensing for all Windows desktops that are virtualized. On top of that, there are additional per-seat licenses from VMware and other desktop virtualization vendors.

Tip: Understand the licensing before starting a pilot – At the time of this writing, VECD is a device-based subscription: $23/device/year for devices covered by Software Assurance (SA), or $110/device/year otherwise. An example from Microsoft’s website:

For example, a company with 10 thin clients and 10 laptops (not covered under SA) accessing a VDI environment requires a total of 20 Windows VECD licenses (20 x $110/year). However, if the same company has 10 thin clients and 10 laptops covered under SA, it will require 10 VECD licenses (10 x $110/year) and 10 VECD for SA licenses (10 x $23/year).
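The arithmetic in Microsoft’s example can be sketched as a small calculation. This is only an illustration using the list prices quoted above (which are current as of this writing and subject to change); the function name is mine, not Microsoft’s.

```python
# Prices as quoted above (per device, per year); assumptions, not official constants.
VECD_PER_DEVICE = 110  # devices NOT covered by Software Assurance
VECD_FOR_SA = 23       # devices already covered by Software Assurance

def annual_vecd_cost(devices_without_sa: int, devices_with_sa: int) -> int:
    """Return the total annual VECD licensing cost in dollars."""
    return devices_without_sa * VECD_PER_DEVICE + devices_with_sa * VECD_FOR_SA

# 10 thin clients + 10 laptops, none under SA: 20 full VECD licenses.
print(annual_vecd_cost(20, 0))   # 2200
# Same fleet, but the 10 laptops are under SA: 10 VECD + 10 VECD-for-SA.
print(annual_vecd_cost(10, 10))  # 1330
```

Note how SA coverage on half the fleet cuts the annual bill from $2,200 to $1,330, which is why it pays to audit SA status before a pilot.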

Challenge #5: Poor virtual desktop performance – The two biggest challenges after the ops piece are sizing and end-point selection. Desktops take a long time to boot, and Flash video is choppy. A virtual environment introduces limitations that didn’t exist when everyone had their own PC.

Tip: Work with a partner who can help size and architect a system – This is critical because of all the variables involved; the design is dictated by many of the answers from challenge #2. End-points (thin clients, PCs, web-based access) also differ in the user experience they deliver. If YouTube video is important, choose an endpoint that specifically accelerates Adobe Flash. If users are remote and network latency is high, either deploy WAN accelerators from companies like Riverbed or Cisco, or use thin clients like the Sun Ray from Sun Microsystems.

Desktop virtualization is still rapidly changing. The challenges and tips above are not an exhaustive list; they are meant to prompt some thought before jumping in, to give you a higher probability of success. Don’t take on too much at once; do things in phases. As always, feedback is welcome.

About the author

Ed Saipetch

Ed Saipetch is a virtualization practice lead and systems engineer. He has worked in both the end-user and value-added reseller space, with a focus on application infrastructure and web architecture scalability.
