First, I would like to take an apologetic approach: although I’m an experienced Windows user, both server and desktop, I’m not an IT specialist, nor am I claiming to be one. This post simply expresses my wishes and thoughts concerning a future non-server Windows O/S, and naturally there are many possibilities and holes in those ideas.
Basically, I’m a Windows O/S “fan”. I’ve been using Windows for years, both professionally and for personal use. Like most “techies”, I also support quite a few family members and friends with the many things usually considered “tech stuff”: driver and hardware installations, application setups (yes, some consider this “tech stuff” too), backups, antiviruses, firewalls, routers and so on. As such, I have also installed and re-installed Windows (and MS-DOS) over and over, countless times, across many different versions of Windows and many servers and PCs. A re-installation is usually a last resort for me, and occurs when a certain application or driver has corrupted Windows “beyond repair”. This usually happens when a family member calls for help, swearing that he has “done nothing to cause this”, “installed nothing that caused this”, and Windows won’t boot anymore for “no apparent reason”. One day, “Windows just decided on its own” that it won’t boot, not even with “Last Known Good Configuration” or “Safe Mode”. I’m sure quite a few people reading these lines are nodding their heads now in agreement and understanding.
I do admit, though, that some 10 years ago and earlier, re-installing Windows was “fun”: post re-installation, Windows would run fast, there weren’t dozens of icons on the desktop, the Registry was new and “clean”, and if you had formatted the drive prior to re-installation you only had to restore your documents from a backup. Naturally, you had to re-install the various applications to allow that family member to continue working on his Word or Excel files. Over the years, after re-installing Windows so many times, I got fed up with it – it just took too much time, and you always ended up with something you either forgot to back up, now lost forever, or forgot to install (or lost the serial number/key for). A possible solution had been available for some time – simply create an image, and restore when necessary, right? Well, although this seemed like a magical solution, I usually ended up re-installing Windows anyhow. There were multiple reasons for that: either you had problems backing up your system drive, or you had to find a spare HDD to back up to, or the backup/restore process took forever; and except for the excellent DOS-based Ghost, which usually got the job done, I found the majority of backup software (including later versions of Ghost running in Windows) unreliable. Some of you are now angry with me and disagree. Well, don’t be – that was my personal experience.
Later on, Windows introduced “System Restore”. I tried this option 3 or 4 times before I gave up on it. Either it failed to “restore the system”, or it messed up my documents and files. Sometimes I just could not find a System Restore point that I was certain I had created quite some time before. So, what I initially thought would be a good solution for restoring a system to specific points in time proved disappointing.
There were other options, such as Windows Repair and SFC. Windows Repair was sometimes able to do the job and repair the system into a bootable state. SFC also helped me out once or twice when system files got corrupted due to malware. But these don’t help you revert to a previous state of Windows in order to regain faster performance, restore files that may have been deleted or corrupted, and so on. For some reason, SFC is something that, AFAIK, has no GUI in Windows, so it’s one of those things that if you haven’t come across it by chance, you have no way of knowing it exists. It’s also command-line based, which requires a “techie” to run it. Can you imagine your “tech-supported” family members or friends running SFC without your guidance?
Now Windows 8 introduces the latest in restoring your PC. I guess Microsoft also understood that their existing solutions are simply not good enough – most repair/restore solutions require a “techie”. So they have now come up with two new features, “Reset your PC” and “Refresh your PC”, which you can read about here. To make a long story short, “Reset your PC” re-installs Windows, possibly within minutes. “Refresh your PC” backs up your documents and settings (and Metro apps), re-installs Windows, and restores your documents and settings (and Metro apps). Microsoft also realizes that you may want to create a backup of a certain state of your machine, so they provide a third alternative, which is yet another command-line tool called recimg.exe. This allows you to create an image in a specific folder of your choice. I’m uncertain how this actually works, and how you are supposed to restore that image, as Microsoft did not elaborate on this in that post. They simply stated that the tool is in its early stages, so one can hope that they won’t eventually cancel it, and that it’ll come with a trivial GUI rather than remaining a somewhat hidden command-line utility like SFC.
Microsoft has acquired over the years the ability to perform virtualization. To the best of my knowledge, for Microsoft this started with the purchase of Virtual PC from Connectix. Virtual PC has been available for several years now, and Microsoft also provides an XP image which can be run on Virtual PC (“XP Mode“) that provides a kind of “ultimate backward compatibility” with Windows XP for those who must be able to run older applications. Microsoft also released a virtualization solution for servers, likewise originally by Connectix, called Microsoft Virtual Server. That product was replaced by today’s Windows 2008 virtualization feature and, more importantly for this post, Microsoft Hyper-V Server.
MS Hyper-V Server is a Server Core based virtualization solution. To put it simply, it is a stripped-down Windows 2008 server, without components such as Internet Explorer, Windows Explorer and a lot of the other stuff we’ve grown accustomed to, thus gaining performance and higher security. You interact with Hyper-V Server either via a command-line console or via an MMC snap-in from another machine, and can use it to create VMs, just like VMware solutions. Personally, I have used VMware-based solutions a lot more than Hyper-V, but seeing the stripped-down Hyper-V, with its DOS-like command console, gave me an idea: what if the next non-server Windows were a kind of Hyper-V solution (I’ll refer to it as “Hyper-V-8″)?
Think about it: Microsoft has already laid down the principles for this by creating a “Server Core” with Hyper-V. Imagine the next Windows as an O/S built of two components: a “Windows Core & Hyper-V” component, and a VM that runs on top of it! (Note that I’m not referring to running Hyper-V on Windows 8 similarly to running Virtual PC on Windows 7.) Just think of the possibilities! You install this “Hyper-V-8″ Windows just like any other Windows. The end-users need not know that it’s virtualization-based; for all they care, they are simply installing Windows. One of the final steps of the installation would be for Windows to automatically create the first snapshot of itself. Each user may then proceed to install whatever applications they desire, making snapshots either manually or automatically, just like “System Restore”. More advanced users will be able to jump between the different snapshots as required, create different branches of snapshots to experiment with different installations, and so on. This can also provide a dual-boot solution for users who want to install several O/S on their machines.
A faulty device driver just crashed your Windows? Just revert to an earlier snapshot (“last known good snapshot”). Want to test different applications? Create a snapshot and test all you like. A virus just destroyed Windows or some of your data? You can attempt to recover your lost data from an earlier snapshot. Want to “Reset your PC”? No problem – just revert to the first snapshot created by Windows. Want to upgrade your HDDs? Restore a snapshot onto your new HDD after installing the Windows Core. Want to upgrade your O/S or various applications but you’re uncertain whether you’ll like the result? You guessed it – use a snapshot.
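To make the idea concrete, here is a minimal sketch in Python of the bookkeeping such a snapshot mechanism implies. This is purely illustrative – real hypervisor snapshots are copy-on-write disk images, not in-memory objects, and all the names here are my own invention – but it shows how branching, reverting and “Reset your PC” all fall out of one simple tree structure:

```python
# Illustrative model of a branching snapshot tree, as described above.
# Real Hyper-V snapshots are disk-level; this only models the bookkeeping:
# parent/child links, a "current" pointer, and revert operations.

class Snapshot:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = []
        if parent:
            parent.children.append(self)

class SnapshotTree:
    def __init__(self):
        # The first snapshot Windows creates automatically at install time.
        self.root = Snapshot("post-install")
        self.current = self.root

    def take(self, name):
        """Create a new snapshot branching off the current state."""
        self.current = Snapshot(name, parent=self.current)
        return self.current

    def revert(self, snapshot):
        """'Last known good snapshot': jump back to any earlier state."""
        self.current = snapshot

    def reset_pc(self):
        """'Reset your PC' becomes just reverting to the first snapshot."""
        self.current = self.root

tree = SnapshotTree()
tree.take("office-installed")
experiment = tree.take("driver-experiment")
tree.revert(experiment.parent)   # faulty driver? revert one step back
print(tree.current.name)         # office-installed
```

Note that because each snapshot keeps its children, the “driver-experiment” branch is still there after the revert – you can jump forward into it again, exactly the branching experimentation described above.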
OK, OK, OK. I know – there are plenty of challenges to overcome for this to work. Here are just a few of them:
- The VMs would have to be really fast, so users can’t tell the difference between old-style Windows and a VM.
- When I say “fast”, this also means gaming. You can’t have a Windows that isn’t capable of running the latest high-end games on the latest high-end hardware, just like a regular Windows O/S would.
- On that same note, there should be complete “hardware transparency”. All the hardware should be accessible and working under Hyper-V-8 VMs just as it would with a regularly installed O/S.
- If we use as many snapshots as described above, they are going to take up a lot of space. Although HDDs are growing in size all the time, I guess snapshots would have to be extremely efficient and compressed. Naturally, a user should be able to store snapshots on other devices such as external HDDs, flash drives etc.
This is just a short list of issues; there are plenty more, such as security, app legality and licensing, correctly indexing what exists in each snapshot and so on.
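On the space problem: snapshots need not each cost a full disk’s worth of storage. A well-known approach (used by deduplicating backup tools) is content-addressed chunk storage, where identical chunks shared by many snapshots are stored only once. A toy sketch, with made-up sizes and names, just to show the principle:

```python
# Sketch of deduplicated, compressed snapshot storage: a snapshot is just
# a manifest (list of chunk hashes); identical chunks are stored once.
import hashlib
import zlib

CHUNK_SIZE = 4096

class ChunkStore:
    def __init__(self):
        self.chunks = {}          # sha256 hex digest -> compressed bytes

    def save_snapshot(self, disk_image: bytes):
        """Split an image into chunks; store each unique chunk only once."""
        manifest = []
        for i in range(0, len(disk_image), CHUNK_SIZE):
            chunk = disk_image[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in self.chunks:
                self.chunks[digest] = zlib.compress(chunk)
            manifest.append(digest)
        return manifest

    def restore(self, manifest):
        return b"".join(zlib.decompress(self.chunks[d]) for d in manifest)

store = ChunkStore()
base = b"A" * 4096 + b"C" * 4096            # pretend "post-install" image
snap1 = store.save_snapshot(base)
snap2 = store.save_snapshot(base + b"B" * 4096)  # image after a small change
print(len(store.chunks))   # 3 - two snapshots, but only 3 unique chunks stored
```

A second snapshot taken after a small change costs only the changed chunks, which is exactly why frequent, automatic snapshots could be affordable.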
Well, if we’re going to have extremely compressed and efficient snapshots, why not store them in the cloud and make them accessible “everywhere”? Cloud-based solutions for storing an entire installed O/S and its apps have been available for years now. Why not have the same capability with snapshots? There could be a great advantage to storing snapshots in a cloud: they would be accessible from anywhere, to any client “Hyper-V-8 Core” machine. If you have a hardware failure, your snapshots are safe and sound on a cloud somewhere, ready to be downloaded and used by your newly installed hardware.
Cloud integration should also be fast and possibly “lazy”. That is, it should be possible to download partial snapshots on demand, because you can’t have gigabytes downloaded every time you want to revert to a snapshot. So the snapshots have to be “smart”.
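Such a “lazy” snapshot could be sketched like this: chunks live remotely and are fetched only when the running system actually reads that part of the disk. Here a plain dictionary stands in for cloud storage, and the chunk size and naming are my own assumptions:

```python
# Sketch of a "lazy" cloud snapshot: only the chunks covering the bytes
# actually read are downloaded; everything else stays in the cloud.
CHUNK = 4096

class LazySnapshot:
    def __init__(self, manifest, cloud):
        self.manifest = manifest      # ordered list of chunk IDs
        self.cloud = cloud            # chunk ID -> bytes ("remote" storage)
        self.local = {}               # chunk ID -> bytes (downloaded cache)
        self.downloads = 0

    def read(self, offset, length):
        """Fetch only the chunks covering [offset, offset + length)."""
        data = b""
        first = offset // CHUNK
        last = (offset + length - 1) // CHUNK
        for idx in range(first, last + 1):
            cid = self.manifest[idx]
            if cid not in self.local:
                self.local[cid] = self.cloud[cid]   # simulated download
                self.downloads += 1
            data += self.local[cid]
        start = offset - first * CHUNK
        return data[start:start + length]

cloud = {"c0": b"A" * CHUNK, "c1": b"B" * CHUNK, "c2": b"C" * CHUNK}
snap = LazySnapshot(["c0", "c1", "c2"], cloud)
snap.read(0, 10)            # touches only the first chunk
print(snap.downloads)       # 1 - the other two chunks stay in the cloud
```

Rereading cached regions costs nothing, so after a revert the machine would become usable immediately and fill in the rest of the image in the background.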
Let me take this one step further. This relates to having “partial snapshots”, as described earlier for the cloud. I’ll start by describing a common practice that is (was?) a methodology for installing an O/S: keep your data on a separate HDD, or at least in a separate partition from your system. The logic is clear – if you end up re-installing your O/S, your data is safe elsewhere. Sadly, this approach requires effort. End-users (especially those family members) have to be educated to store their (vital) data somewhere other than the default My Documents, which usually (and unfortunately) is located on your system drive. I have tried this approach for years, but when I find myself about to re-install Windows on a family member’s machine, I always have to remind them to back up their data, and help them look for it on their system drive. And what if the data was corrupted by a virus or a power outage?
Well, if we have a snapshot of the data, we’re basically “safe”, right? I mean, all the data is stored within the snapshot and is therefore accessible. However, it’s more than likely that the data has to be accessible from several snapshots. Can you imagine having to search for data in one snapshot and move it to another? It would be a real drag to locate a specific piece of data in one snapshot, store it on the internet or a flash drive, load another snapshot and download the data into it. If you have several versions of the documents, you can easily lose track of where they are located, which is the latest, and so on.
One solution is to store data in the cloud or use sync software such as Dropbox. But I feel that while SkyDrive, Google Documents and others are providing such a convenient capability, it is not sufficient: users have to be connected to the internet, have to remember to upload (or sync) their data to the cloud (if they work with local software), or are simply reluctant to store their documents on the internet for security reasons. So I was thinking more along the lines of having a VM just for data, which is accessible from the O/S VM. This resembles storing the data on a different drive or partition as described earlier, but would have major advantages: it would be accessible to different snapshots, and documents would have multiple versions which can be loaded and restored. You could easily view or revert, just as in a source-controlled environment. This “partial snapshot” could be viewed and accessed by the currently running O/S snapshot, and the end-user should be able to “jump between snapshots” freely, without having to think about tracking the data. This would require, of course, that a Hyper-V-8 Windows installation automatically create a partial snapshot for the data, pointing the My Documents folders at that snapshot. This sounds like a virtual drive (resembling a TrueCrypt drive), but more advanced as far as file management goes (i.e. versioning). Naturally, it should not be affected by O/S snapshot changes. Moreover, it would be great if files or folders had a special attribute indicating that they are “user data”, so that Windows would automatically store them in that partial data snapshot. I mean – why bother hunting down your data files in order to back them up, if applications can simply mark them as data and have Windows store them in the proper place? Can you imagine saving a Word document and having it automatically saved in a VM snapshot dedicated to data?
How about deciding that all PDF files should be automatically stored and versioned in such a snapshot?
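The data snapshot described above is essentially a versioned store that lives outside the O/S snapshots, so documents keep their full history no matter which system snapshot is currently running. A tiny sketch of that idea in Python (the paths and API are illustrative, not any real Windows interface):

```python
# Sketch of the "data snapshot": every save appends a version instead of
# overwriting, source-control style, independent of any O/S snapshot.
import time

class VersionedStore:
    def __init__(self):
        self.files = {}               # path -> list of (timestamp, bytes)

    def save(self, path, content: bytes):
        """Saving a marked 'user data' file appends a new version."""
        self.files.setdefault(path, []).append((time.time(), content))

    def latest(self, path) -> bytes:
        return self.files[path][-1][1]

    def version(self, path, n) -> bytes:
        """View or revert to an older version, like in source control."""
        return self.files[path][n][1]

store = VersionedStore()
store.save("My Documents/report.docx", b"draft 1")
store.save("My Documents/report.docx", b"draft 2, revised")
print(store.latest("My Documents/report.docx"))      # newest version
print(store.version("My Documents/report.docx", 0))  # first draft survives
```

Because the store is keyed by path rather than by O/S snapshot, “jumping between snapshots” never touches it – which is exactly the point.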
The idea of a “Hyper-V-8″ Windows struck me quite some time ago, but I finally sat down and wrote about it when I read about “Reset/Refresh your PC”. Virtualization is already used successfully worldwide for commercial purposes. Servers nowadays already run multiple VMs supporting clouds, enterprise production machines or QA servers. True, the servers used for these are usually high-end machines, but technology just keeps getting better and cheaper all the time, so non-server machines are also advancing constantly. Home technology capable of running a VM desktop O/S in a “transparent” manner, as described above, may already be here. It is more a combination of already-existing pieces that should be put together to smoothly accomplish the ideas listed above (and much more). Maybe such features will be available in Windows 9 or 10?