The evolving reality of virtualisation
Fri, 1st Jul 2011

Virtualisation has been a hot topic in recent years and has grown into a mainstream must-have for businesses on the quest for savings. But with the new opportunities come new challenges – and in principle, virtualisation is not even new.

Revera’s general manager, Robin Cockayne, says his company’s founder, Roger Cockayne, has been working with virtualisation as a discipline for more than 20 years.

"The idea existed with the mainframe, partitioning machines into virtual segments, which people could leverage independently as if they were independent machines themselves,” Robin Cockayne says.

Network virtualisation followed, bringing the ability to take physical network hardware and split it into virtual, logical networks. Storage was virtualised next, so a virtual server could be connected to a number of different storage appliances and draw capacity from anywhere, with the connecting application none the wiser.

"It was these principles on which we built our virtual data centers. New Zealand customers didn’t have the money, the size, scale or expertise to buy rich enterprise hardware and get value out of it. So we’ve always used virtualisation to divide that up so that they can pay for what they use.”

Server virtualisation has been the final piece. In the past, physical servers connected to virtual infrastructure services such as network, storage, backup, data replication and some security services like firewalls. Now that servers are virtualised, multiple virtual systems can run on a single server.

"Virtualisation is being promoted as IT’s great redeemer and blade centres as the fastest fix for sustained performance. But in the rush for IT’s quick fix, computing managers often overlook the importance of synchronising abstracted capacity with physical IT and wider infrastructure management discipline and workflow. The oversight will be painful and quite unsatisfying,” Cockayne says.

Broadly implemented

Virtualisation has been widely adopted both in the New Zealand market and internationally, along with the move towards cloud computing. The IT security company Symantec has taken the temperature of the implementation progress in its 2011 Virtualisation and Evolution to the Cloud Survey, which examined how organisations plan to move business-critical initiatives to virtual and hybrid cloud computing environments. The survey covered server, client and storage virtualisation, storage-as-a-service and hybrid/private cloud technologies, drawing on more than 3,700 respondents from 35 countries worldwide.

The survey found that more than 75% of organisations are discussing private and hybrid cloud deployments. Server virtualisation has been implemented by 45% of the enterprises, with storage virtualisation close behind at 43%.

"Virtualisation is an enabler for private and hybrid clouds, and our survey shows that planning a seamless move is critical to achieving all the simplicity, affordability and efficiency that these environments have to offer,” says John Magee, vice president of virtualisation and cloud solutions, Symantec.

Magee noted that organisations investing in virtualisation and cloud technologies tend to follow a similar path: starting by virtualising less critical applications such as test and development environments, then progressing to more important applications such as email and collaboration, line-of-business systems, eCommerce and supply chain, and finally ERP/CRM.

Karl Sice, general manager Pacific at the IT company Acronis, expects virtual machines to continue to supplant dedicated physical servers until every server that can be virtualised has been.

"It’s likely that the only dedicated physical servers that will continue to exist will be for applications that require dedicated computing resources, and even those servers might be encapsulated as a single, rapidly moveable virtual machine,” he says.

"The reasons are obvious. Virtualisation return on investment is almost instantaneous, because multiple underused physical servers are replaced by virtual machines resident upon a single physical host. It also reduces IT resource total cost of ownership, because instead of applying dedicated resources to applications that don’t use them fully, existing resources are used more fully, reducing the amount of capital investment required.”

Sice says IT centres, especially those in small and medium-sized businesses, are struggling with the difficulties created by using different software solutions for physical, virtual and cloud backup operations.

"As a compromise solution, agent-based backup software has been adapted to work with proxy servers using snapshots, which has reduced the overhead on the virtual machines, but has required significant network bandwidth, proxy server resources and customisation to work.

"This circumstance isn’t unusual and can be considered a hidden cost associated with server virtualisation. In every case, these additional costs elongate the return on investment payback period and increase the virtual solution’s total cost of ownership,” Sice says.

Benefits bring challenges

The benefits of virtualisation beckon: reduced capital and operating expenses, because organisations need fewer physical servers, making the server infrastructure less costly and easier to manage. With fewer machines, companies also reduce their power, cooling and data-centre footprint requirements while improving security. Virtualisation also improves the availability of applications and services, as enterprises can more easily recover virtual machines (VMs) from failure.

However, Graham Schultz, regional director for Brocade in Australia and New Zealand, points out that companies can only realise these benefits if their virtual environments are properly managed. In this complex and expanding environment, companies need a fast, reliable data-centre network to ensure optimal performance.

The network must also extend to the storage systems that support the virtual environment, whether the systems are virtual, physical, or a mix of the two. All this with no additional funding or staff, while planning for continued growth.

"There are many challenges inherent in properly managing infrastructure components, beginning with the virtual and physical servers themselves. While virtualisation does reduce the number of physical servers, it doesn’t eliminate the time required to manage server operating systems, applications and data. There is still the need to manage all network connections while maintaining proper security profiles for users and applications, Schultz says.

"Similarly, there’s no shortage of management chores related to storage systems, be they physical or virtual. IT leaders must stay on top of backups and replication, and ensure proper storage tiers are in place for optimum trade-off between availability, cost and performance.”

All of this complexity puts stress on the network infrastructure, which ultimately must provide the performance, availability and application mobility required in a virtualised data centre. Schultz encourages IT leaders to consider these network-related challenges inherent in virtualisation:

  • Increasing VM density can dramatically increase the amount of traffic to and from any given server, putting greater demand on existing network bandwidth to maintain required performance levels (a rough sizing sketch follows this list).
  • The need to ensure that server-to-server communication is lossless, with deterministic latency.
  • VM mobility restrictions can limit maintenance and application availability options to either the physical server, the blade chassis or the server rack.
  • The need to provide end-to-end visibility from VMs to storage, which typically requires multiple tools.
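
As a back-of-the-envelope illustration of the bandwidth point above, the following sketch shows how uplink requirements grow with VM density. The per-VM traffic figure and utilisation target are assumptions, not numbers from Brocade or the article.

```python
# Hedged sizing sketch: per-host bandwidth demand as VM density grows.
import math

per_vm_mbps = 80            # assumed average traffic per VM (Mbit/s)
link_capacity_mbps = 1000   # one 1 GbE uplink
target_utilisation = 0.6    # keep links below ~60% busy to preserve headroom

for vms_per_host in (5, 10, 20, 40):
    demand_mbps = vms_per_host * per_vm_mbps
    links_needed = math.ceil(demand_mbps / (link_capacity_mbps * target_utilisation))
    print(f"{vms_per_host:>2} VMs/host -> {demand_mbps:>4} Mbit/s, "
          f"{links_needed} x 1 GbE uplink(s)")
```
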
While the traditional solution to network bandwidth and performance challenges has been to add more devices, ports and network tiers, Schultz insists the result is a network that continually grows in complexity and rigidity, becoming more difficult and costly to manage and maintain – the opposite of what IT leaders need right now.

"To achieve optimal performance in a virtual environment, the network must evolve. An application and virtual machine may no longer be locked into any physical infrastructure – be it a server, a specific port or even storage. This means the network infrastructure and the tools to manage it must improve.

"Virtualised data center networks must provide visibility and control over data flows, while also becoming simpler to operate, more flexible, resilient and scalable,” he says.

Meeting the challenges

The challenges ahead as organisations virtualise further will centre on manageability. Charles Clarke, strategic sales engineer for APAC at Veeam Software, says providing flexible and effective disaster protection will be vital as more and more critical services become virtualised. Business continuity is one of the promises of virtualisation, so tolerance of any downtime or data loss is likely to decrease.

"One curiosity around virtualisation that we’ve experienced is that although organisations often virtualise to consolidate server hardware, the number of servers as guest VM’s they have to manage tends to increase, resulting in so-called 'VM sprawl'. So having visibility into what’s happening at the virtualisation layer from an audit, configuration management and capacity planning perspective will be increasingly important,” Clarke says.

Management and monitoring for Virtual Desktop Infrastructure implementations will present challenges too, Clarke suggests, and he sees a consistent, scalable and proven toolset aligned to business processes as crucial to success.

Clarke says the primary key to successfully optimising virtualisation solutions is to realise that virtualisation represents a paradigm shift, and there needs to be an associated shift in thinking about backup, recovery and management tools.

"Ensure that reporting and monitoring tools are in place as early as they can be. Having accurate and ‘easy-to-consume’ reports around change management and capacity planning will make the inevitable expansion of the virtual environment much more likely to be successful.

"Look to technologies that have a dedicated focus on virtualisation and even have their origins in that space. In addition, retro-fitted tools often suffer from tunnel-vision focused on the guest VM, rather than taking the entire virtual infrastructure into account. This makes it impossible to get a complete picture of the environment and means issues that may impact the entire virtual stack can be difficult to diagnose and resolve.

"Virtualisation offers incredible capability in continuity and portability, and organisations should pick tools that utilise these features to their maximum potential,” Clarke insists.

Robin Cockayne from Revera agrees on the potential, but echoes the warning, saying that as people take to virtualised IT and move from physical to virtual, they must first model the impacts to avoid inflicting collateral damage.

"Before moving existing applications they must first be modelled in the new environment to correctly calculate the size of necessary server capacity. It’s about planning and design. And this goes much further than hardware and operating systems, and includes environmental impacts such as cooling and power,” Cockayne says.

Expectations and reality

Symantec’s survey uncovered disparities between expectations and reality as enterprises deploy virtualisation solutions. It shows CEOs and CFOs are concerned about moving business-critical applications into virtual or cloud environments due to challenges including reliability, security, availability and performance.

"Early investments have revealed gaps between expectations and reality, which indicate that organisations are still learning what these technologies are capable of and how to overcome the new challenges they bring with them,” Symantec’s John Magee says.

The survey figures show that server virtualisation projects were the most successful, with only a 4% average gap between expected and realised goals. The main gaps occurred in scalability, reducing capital expenditures and reducing operating expenditures.

The average shortfall in storage virtualisation was 33%, with disappointments coming in agility, scalability and reducing operating expenditures.

Respondents reported an average gap between expected and realised goals of 26% with desktop virtualisation. They cited disappointments in new endpoint deployment, application delivery and application compatibility.

Based on Symantec’s findings, Magee gives a word of advice to those implementing virtualisation: "Set realistic expectations and track your results. Remember that despite the hype, cloud is a new and still maturing market. Do your homework to set expectations that are realistic, then follow up and track results to identify ways to improve project efficiency going forward.”