Experiences with…VIRTUALIZATION


For this second article in the ‘Experiences with...' series I've chosen the subject of Virtualization. In the broadest sense, ‘virtualization' is the creation of a virtual (rather than a real or actual) version of something; more specifically, when referring to computer technology, it is the creation of a virtual operating system, server, storage device or network. In my last article on VPN we created ‘virtual' private networks: to all intents and purposes the network connection appeared and behaved just like an actual private network, while in reality it was software tunnelling our traffic across the public Internet.

I'm going to focus on the creation and use of Virtual Machines (VMs), which involves the construction of an entire computer (CPU, RAM, graphics adapter, fixed and removable disk drives, network interface cards, etc.) that exists only as software and data files.
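
As a rough illustration of that idea (the choice of the libvirt Python bindings, the ‘demo-vm' name and the disk path here are purely illustrative assumptions, not a description of any particular product or of my own setup), a whole ‘computer' can be described and started as nothing more than a small block of configuration plus an ordinary disk-image file:

import libvirt  # Python bindings for the libvirt virtualization API

# The entire "computer" is just this block of XML plus one disk-image file on
# the host; no physical CPU, RAM module or disk is set aside for it.
# (VM name and disk path are illustrative only.)
DOMAIN_XML = """
<domain type='kvm'>
  <name>demo-vm</name>
  <memory unit='MiB'>1024</memory>      <!-- virtual RAM -->
  <vcpu>1</vcpu>                        <!-- virtual CPU -->
  <os>
    <type arch='x86_64'>hvm</type>
  </os>
  <devices>
    <disk type='file' device='disk'>    <!-- 'hard disk' = an ordinary file -->
      <driver name='qemu' type='qcow2'/>
      <source file='/var/lib/libvirt/images/demo-vm.qcow2'/>
      <target dev='vda' bus='virtio'/>
    </disk>
    <interface type='network'>          <!-- virtual network card -->
      <source network='default'/>
    </interface>
  </devices>
</domain>
"""

def create_and_start_vm():
    conn = libvirt.open('qemu:///system')  # connect to the local hypervisor
    try:
        dom = conn.defineXML(DOMAIN_XML)   # register the VM's 'hardware'
        dom.create()                       # power the virtual machine on
        print("Started VM:", dom.name())
    finally:
        conn.close()

if __name__ == '__main__':
    create_and_start_vm()

The point is simply that everything listed in that configuration exists only as software and data on the host machine.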


To set the scene, I'd first like to recount my own history of working with PCs and how it led me to the use of VMs.

In the early 1990s (when I first began building PCs) the purchase of a computer was a major investment, so a single ‘home PC' had to perform a broad range of duties: it had to act as a gaming PC for the kids, run an office suite and various accounting and taxation applications for home management, connect to the internet via the phone line, and so on. Even as the price of PCs began to fall and households acquired more than one, these poor machines were still being overloaded with a mass of games and diverse applications. Not surprisingly, applications and PCs ‘locked up' regularly as they struggled to manage the conflicts and resource allocations of all this code, some of which was poorly written.

My PCs were no different, and as time went by and the data stored on these machines grew in both quantity and importance, I decided to network the machines together (using Windows for Workgroups 3.11) and began backing up critical files to floppy disk. This was not entirely successful: the infrastructure was only as reliable as the computers it ran on, and security was poor (I don't believe my wife ever deleted any critical files, but that was more by luck than judgement).

The solution seemed to lie in the acquisition of a server, running a server operating system, to manage data, security, back-ups and the network; and so entered Windows NT Server 4.0. At first this appeared to be a great success, but as time went by and an ever greater number of features were added to the server (DNS, DHCP, WINS, RRAS, antivirus, etc.) this machine, too, became less reliable; it perhaps never locked up, but it certainly required regular reboots to keep all of its applications providing a reliable service.

Not long after Windows 2000 Server was introduced it became clear to me that the ‘single server' model was not reliable enough to leave running unattended for long periods (months at a time). I decided to build several servers and dedicate a single duty, or a small group of duties and applications, to each of them. Over a period of several years I set up five individual boxes across two sites (see figure 1). It was a great success: the machines did not need regular restarting and would run for long periods while giving reliable service.