VIRTUALIZATION
"Virtualization" is a framework or methodology of dividing the resources of a computer into multiple execution environments, by applying one or more concepts or technologies such as hardware and software partitioning, time-sharing, partial or complete machine simulation, emulation and quality of service. In sequel to achieve virtualization a vital component called the virtual machine monitor (VMM) runs directly on the real hardware - without requiring a host operating system (OS). Virtualization is not always used to imply partitioning, or breaking something down into multiple entities. It can be understood with an example of different (intuitively opposite) connotation: by taking N disks, and making them appear as one (logical) disk through a virtualization layer.
One such platform is the Parallel Virtual Machine (PVM), a software package that permits a heterogeneous collection of UNIX and/or Windows computers hooked together by a network to be used as a single large parallel computer. PVM is widely used in distributed computing, as shown in Fig. 1. Colloquially, "virtualization abstracts out things" [1], for the following main reasons:
a) The VMM can be hosted, running entirely as an application on top of a host OS.
b) All instructions that execute on a virtual machine are easily accessible to the VMM.
c) Non-privileged instructions are executed directly on the hardware (see the sketch after this list).
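As a rough illustration of point (c), the toy dispatcher below executes non-privileged instructions "directly" and traps privileged ones into a VMM emulation routine. The instruction names and handlers are purely hypothetical; no real VMM works at this level of simplicity, since it relies on hardware protection modes rather than a software loop.

# Toy trap-and-emulate dispatcher (all names are assumptions for illustration).
PRIVILEGED = {"HLT", "OUT", "LOAD_CR3"}   # instructions the guest may not run directly

def run_directly(instr, state):
    # Non-privileged work proceeds without VMM intervention (modelled trivially).
    state["executed"].append(instr)

def vmm_emulate(instr, state):
    # The VMM intercepts the privileged instruction and emulates its effect
    # against the guest's virtual state instead of the real hardware.
    state["emulated"].append(instr)

def dispatch(instruction_stream, state):
    for instr in instruction_stream:
        if instr in PRIVILEGED:
            vmm_emulate(instr, state)   # trap into the VMM
        else:
            run_directly(instr, state)  # execute directly

state = {"executed": [], "emulated": []}
dispatch(["ADD", "MOV", "OUT", "ADD", "HLT"], state)
print(state)  # {'executed': ['ADD', 'MOV', 'ADD'], 'emulated': ['OUT', 'HLT']}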
From Fig. 1, it can be seen that virtualization achieves the following:
i) many servers can run side by side on a far smaller number of physical servers;
ii) this leads to a tremendous reduction in cost, covering hardware, operations and management/maintenance, data centers (with large numbers of servers), etc. (a rough worked example follows this list).
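As a purely illustrative calculation of point (ii), the snippet below estimates how many physical hosts a set of virtualized servers needs and the resulting hardware saving. All of the figures (server count, consolidation ratio, cost per host) are assumptions introduced here, not data from the source.

# Purely illustrative consolidation arithmetic; every number is an assumption.
import math

servers = 40            # workloads that previously needed one physical box each
vms_per_host = 8        # assumed consolidation ratio (VMs per physical server)
cost_per_host = 5000    # assumed hardware cost per physical server

hosts_needed = math.ceil(servers / vms_per_host)           # 5 physical hosts
hardware_saving = (servers - hosts_needed) * cost_per_host  # 175000 saved on hardware alone

print(hosts_needed, hardware_saving)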
Reference:
[1] http://www.kernelthread.com/publications/virtualization/
Next Issue Topics:
Why “Virtualization”?
Relation with “Engineering and Technology”.