vSphere upgrade experience, day 1

June 24th, 2009 by jason

A few nights ago, I began the VI3 to vSphere upgrade in my home lab and I thought I would share a few experiences. This day 1 post will cover vSphere management tools (vCenter, Update Manager, etc.) and not the hypervisor itself (ESX or ESXi).

My VI3 environment has been through some wear and tear over the years, including a few unexpected power outages which could have caused corruption on the vCenter server or the back-end databases. Although the part of me which desires peace of mind wanted to start “clean” with an empty database, I knew that I must go the upgrade route, maintaining my existing data, because frankly this is the method I will likely be using for most vSphere deployments.

I run a lot of what I would consider “production workloads” on my home lab, including domain controllers, messaging servers for registered domains, web servers, Citrix servers, this blog, etc. Failure is an option as well as a good learning experience (after all, this is a lab); however, a long-term outage of my production workloads is not an option. My vCenter server is virtualized on VMware Server 2.x, so I started out by shutting down its OS and taking a VMware snapshot. After the vCenter shutdown, I also captured a full backup of my SQL Server databases (both the vCenter database and the Update Manager database). I now have a solid backout plan which does not rely on crash-consistent data.
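
For what it’s worth, the SQL backups don’t need to be anything fancy. Below is a minimal sketch of the sort of thing I mean, assuming a local SQL Server instance, Windows authentication, hypothetical database names (VCDB and VUMDB), and a hypothetical backup path; adjust for your environment.

# backup_vc_dbs.py - quick pre-upgrade backups of the vCenter and Update Manager
# databases via sqlcmd. The instance, database names, and backup path are
# assumptions; substitute your own.
import subprocess

INSTANCE = "localhost"            # assumed local SQL Server instance
DATABASES = ["VCDB", "VUMDB"]     # hypothetical vCenter / Update Manager DB names
BACKUP_DIR = r"D:\Backups"        # assumed backup destination (must already exist)

for db in DATABASES:
    tsql = (
        f"BACKUP DATABASE [{db}] "
        f"TO DISK = N'{BACKUP_DIR}\\{db}_pre_vc40.bak' "
        f"WITH INIT, NAME = N'{db} pre-vCenter 4.0 upgrade'"
    )
    # -E uses Windows authentication; -b makes sqlcmd exit non-zero on a T-SQL
    # error so check=True will surface a failed backup.
    subprocess.run(["sqlcmd", "-S", INSTANCE, "-E", "-b", "-Q", tsql], check=True)
    print(f"Backed up {db}")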

I powered the vCenter VM back up. I then copied over the vCenter 4.0 .zip package and extracted it into a temp directory on the vCenter server. This was the first mistake I made. Not thinking clearly about my snapshotted VM, I had just inflated the VM’s delta file by a few GB. What I should have done was perform the vCenter copy and extraction before taking the snapshot. This is not the end of the world. After the installation of vCenter 4 and Update Manager, the snapshot would have grown by several hundred MB, if not a few GB, anyway. The .zip file and extracted contents were just a lot of extra non-contiguous I/O baggage.

I’m going to perform an upgrade of the databases, but I don’t care to actually “upgrade” vCenter and all of its components from 2.5 Update 4 to version 4.0. I’ve never ever had good luck with vCenter upgrades. My method, therefore, is a complete uninstall of vCenter and all of its components, then a new installation of vCenter which attaches to the existing database, which in turn gets upgraded. During the uninstall of vCenter, I typically find that the uninstall routine leaves bits and pieces behind in folder structures as well as the registry. I manually deleted the remaining pieces and rebooted the vCenter server for good measure and a clean start for the vCenter 4.0 installation. In retrospect, the manual deletion of leftover files and the uninstall of the vCenter license server would turn out to be my second and third mistakes, which I’ll talk about shortly.

After the reboot, I began the vCenter 4.0 installation. I also made sure my vCenter SQL account had DBO rights to the MSDB database, the vCenter database, and the Update Manager database. This is a new requirement during the installation of vSphere. I wasn’t too far into the installation when I ran into a failure and the installation routine rolled back. The error message was rather cryptic and I’m sorry I don’t have a screenshot, but it had to do with the installer’s inability to properly install and configure the local ADAM instance, which I believe is used for vCenter Linked Mode (linked vCenter servers). I quickly found a long thread on the VMTN forums which pointed me to the solution in VMware’s knowledge base. KB1010938 talks about the NETWORK SERVICE NTFS directory permission (Read) that is required on the root of the drive where vCenter is being installed. A quick check showed I lacked the necessary permission. I resolved this quickly and re-ran the installation.
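
For anyone prepping the same install, both prerequisites can be squared away ahead of time. The sketch below is only an illustration, assuming Windows authentication, a hypothetical SQL login name (vcenter_svc), hypothetical database names, vCenter going onto the C: drive, and icacls being available on the OS.

# vc40_prereqs.py - give the vCenter SQL login db_owner on MSDB plus the
# vCenter/Update Manager databases, and grant NETWORK SERVICE read on the root
# of the install drive per VMware KB1010938. Login name, database names, and
# drive letter are assumptions; icacls is assumed to be available on the OS.
import subprocess

SQL_LOGIN = "vcenter_svc"               # hypothetical vCenter SQL login
DATABASES = ["msdb", "VCDB", "VUMDB"]   # MSDB plus assumed vCenter/VUM DB names
INSTALL_DRIVE = "C:\\"                  # assumed drive where vCenter is installed

for db in DATABASES:
    tsql = (
        f"USE [{db}]; "
        f"IF NOT EXISTS (SELECT 1 FROM sys.database_principals WHERE name = N'{SQL_LOGIN}') "
        f"CREATE USER [{SQL_LOGIN}] FOR LOGIN [{SQL_LOGIN}]; "
        f"EXEC sp_addrolemember N'db_owner', N'{SQL_LOGIN}';"
    )
    subprocess.run(["sqlcmd", "-S", "localhost", "-E", "-b", "-Q", tsql], check=True)

# KB1010938: NETWORK SERVICE needs Read permission on the root of the drive
# where vCenter Server is being installed.
subprocess.run(["icacls", INSTALL_DRIVE, "/grant", "NETWORK SERVICE:(R)"], check=True)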

During the re-installation, I ran into my second problem (self inflicted). Way back when, I had set up SSL certificates for VI3. The certificate files are required to be present during the database upgrade because the certs are tattooed to the database as well. During my “cleanup” process I spoke about above, I had deleted the SSL folder containing the certificate files VMware had left behind. Turns out those files were left behind by design. Thankfully, when I performed the cleanup, all files and folders went to the Recycle Bin and I was able to quickly retrieve them. Without the certificate files, I would have been looking at a new database installation, which would have deleted all vCenter data including performance history.
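
In hindsight, copying the SSL folder aside before the uninstall and cleanup would have avoided the scare entirely. Something like the sketch below would do it, assuming the default VI3 certificate location on Windows Server 2003 (the path differs on Windows Server 2008) and a hypothetical destination folder.

# save_vc_ssl.py - copy the VirtualCenter SSL folder (rui.crt / rui.key /
# rui.pfx) aside before uninstalling, so the database upgrade can reuse the
# same certificates. Paths assume a default VI3 install on Windows Server 2003;
# the destination folder is hypothetical.
import os
import shutil

ssl_dir = os.path.join(os.environ["ALLUSERSPROFILE"],
                       r"Application Data\VMware\VMware VirtualCenter\SSL")
backup_dir = r"D:\Backups\VirtualCenter_SSL"   # hypothetical destination (must not exist yet)

shutil.copytree(ssl_dir, backup_dir)
print(f"Copied {ssl_dir} -> {backup_dir}")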

After restoring the certificate files, I reran the installation a third time. The installation of vCenter Server and all of its components was successful. I was able to open the vSphere client and connect to the vCenter server. My hosts, VMs, and all data were present. All looked to be successful until I tried a VMotion. The ESX hosts which were still on VI3 were no longer licensed. Refer to my comment further up about uninstalling the license server being a mistake. vCenter 4.0 license keys do not license VI3 legacy hosts. A VI3 license server or host-based license keys must be plugged into vCenter 4.0 in order to properly license VI3 legacy hosts. I resolved the issue by re-installing the VI3 license server on some junk VM in the interim and then plugging the license server name into vCenter 4.0’s licensing configuration. Voila! The ESX3 and ESXi3 hosts are now licensed and everything is working properly. After feeling confident in the installation, I removed the vCenter snapshot.

This was enough change for one night. The ESX host upgrades (rebuilds) will come a few days later. If I uncover any gotchas during host installations, I’ll be sure to share, but I expect those to be fairly uneventful. I’ve installed a lot of ESX4/ESXi4 hosts during the vSphere beta and it’s straightforward, not unlike ESX3/ESXi3 installations. Most of the ~150 changes in vSphere will be evident in vCenter and its various components. There are a few enhancements in the hypervisor installation, but nothing that hasn’t already been pointed out in various other blogs and installation videos.


Comments

  1. rbrambley says:

    Jason,

    The VC upgrade is always the hard part. Too many linked pieces. Always something different and unique with each implementation I’m involved in.

    Your idea for uninstall, cleanup, fresh install seems like a good decision as long as you remember to leave the License server in place until all ESX hosts are upgraded.

    I’m wondering how smooth a local SQL 2005 upgrade would be ….

  2. Aleks says:

    Very nice and helpful article... thanks, Jason!

  3. Eric Gray says:

    Thanks for sharing the experience, you really had me on the edge of my seat for a minute… I thought you were going to say, “…then I gave up and installed SCVMM with Hyper-V…” Heh heh. 🙂

    I remember lobbying hard to get the uninstaller to leave the SSL files in place… and then detect them during a subsequent installation without generating new ones. Glad you could recover.

  4. Roger Lund says:

    Jason,

    Let this be advice as to why you want to read the installation guide and best practices.

    It states not to remove the license server.

    Just giving you a hard time.

    Roger

  5. jason says:

    I read that guide completely the day it came out but I obviously forgot at least one important detail. My fault. Give me some credit, I remembered the SQL database requirement 🙂

  6. Sysadm says:

    Thank you for sharing your personal experience. We are also migrating to vSphere 4, but we have an issue with ADAM and a missing SSL port registry key (HKLM\System\CurrentControlSet\Services\VMwareVCMSDS\Parameters), along with other ADAM issues. Did you have this problem?
    Nice post!