Today's businesses are flooded with new buzzwords and acronyms, one of them being 'virtualised infrastructure'. But what exactly is it? Is it something new, or has it been around for some time and only now, due to changes in our business models, have we decided to adopt it?

Virtualisation was first developed in the 1960s to partition large mainframe hardware. However, it was effectively abandoned during the 1980s and 1990s when client-server applications, inexpensive servers and desktops established a model of distributed computing. Rather than sharing resources centrally, organisations used these low-cost distributed systems to build up islands of computing capacity.

While distributed computing spreads computing power throughout a business, its downside is server underutilisation. Organisations tend to run only one application per server to limit the risks from that application's vulnerabilities. The result is large-scale underutilisation of server power and storage.

Another issue is the rising cost of acquiring and maintaining an IT infrastructure. The operational costs of supporting a growing physical infrastructure have steadily increased, and so have IT management costs: as computing environments become more complex, the level of specialised education and experience required of competent management personnel has increased, along with the associated costs of employing such personnel.

Companies are increasingly affected by the downtime of critical server applications and the inaccessibility of end-user desktops. The threats of security attacks, natural disasters, health pandemics and terrorism have all elevated the importance of business continuity planning for both desktops and servers.

Managing and securing desktops presents numerous challenges. Effectively controlling a distributed environment and enforcing management, access and security policies, without impairing the users' ability to work effectively, is both complex and expensive. In addition, you must constantly apply numerous patches and upgrades to eliminate any security vulnerabilities. However, most of these problems can be solved by adopting a virtualised infrastructure at your data centre.

Virtualisation is an abstraction layer that separates the physical hardware from the operating system, delivering greater resource utilisation and flexibility. It allows multiple virtual machines, each running its own operating system, to operate in isolation on the same physical machine. Each virtual machine has its own set of virtual hardware, such as memory, processors, network cards and disk drives, upon which the operating system and applications are loaded.

Virtualisation sits between the physical hardware and the operating system, hiding the physical hardware from the operating system so that the operating system sees a consistent, normalised set of hardware regardless of the actual physical hardware components.
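As a rough illustration, the sketch below defines a virtual machine and its virtual hardware in a few lines of Python, assuming a KVM host managed through the open-source libvirt toolkit and its Python bindings. The machine name, resource sizes and disk path are made up for illustration; the guest operating system sees only this declared hardware, whatever the physical host actually contains.

import libvirt

# Hypothetical virtual machine definition: 2 virtual processors, 4 GB of
# memory, a virtio disk and a virtio network card. These are the "hardware"
# the guest operating system will see, independent of the physical host.
DOMAIN_XML = """
<domain type='kvm'>
  <name>dev-web-01</name>
  <memory unit='MiB'>4096</memory>
  <vcpu>2</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
  <devices>
    <disk type='file' device='disk'>
      <driver name='qemu' type='qcow2'/>
      <source file='/var/lib/libvirt/images/dev-web-01.qcow2'/>
      <target dev='vda' bus='virtio'/>
    </disk>
    <interface type='network'>
      <source network='default'/>
      <model type='virtio'/>
    </interface>
  </devices>
</domain>
"""

conn = libvirt.open('qemu:///system')   # connect to the local hypervisor
dom = conn.defineXML(DOMAIN_XML)        # register the virtual machine
dom.create()                            # power it on
conn.close()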

Adopting a virtualised infrastructure gives developers more flexible control over their environments: they can self-provision complex arrangements and point another developer to a previously saved configuration of several virtual machines for testing and debugging, saving time during the machine set-up phase. It also helps IT administrators spend less time on repetitive tasks such as provisioning, configuration, monitoring and maintenance.
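Sharing a saved configuration can be as simple as taking a snapshot of a machine in a known-good state and rolling back to it later. The following sketch continues the earlier example, again assuming a libvirt-managed KVM host; the snapshot name and machine name are illustrative.

import libvirt

# Hypothetical snapshot description for a configured test environment.
SNAPSHOT_XML = """
<domainsnapshot>
  <name>clean-test-setup</name>
  <description>Known-good configuration for testing and debugging</description>
</domainsnapshot>
"""

conn = libvirt.open('qemu:///system')
dom = conn.lookupByName('dev-web-01')      # the machine defined earlier

# Save the current state so a colleague can return to it later ...
dom.snapshotCreateXML(SNAPSHOT_XML, 0)

# ... and roll back to that saved configuration when needed.
snap = dom.snapshotLookupByName('clean-test-setup', 0)
dom.revertToSnapshot(snap, 0)
conn.close()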

For more information, call Megabyte Ltd (F4, Technopark, Mosta) on 2142 1600, e-mail sales@megabyte.net or visit www.megabyte.net.
