Updating Windows at the source

Since the release of Windows Vista, Windows has been installed from a compressed image file known as a WIM file. This is what allows Microsoft to ship one disk containing the Home and other editions of Windows, unlike the multiple disks of the XP era. What makes a WIM file even more useful is that it can be mounted inside a running copy of Windows and have patches and drivers injected directly into the image. This is extremely handy when you realise that Windows 7 has been out for almost six years now and has a couple of hundred patches available. Anything that cuts down the wait for updates to install is a good thing, as is having a more secure system out of the box.

There are a couple of limitations however:

  1. You can’t inject all the update patches offline. Certain updates can only be installed when Windows is running.
  2. .NET Framework updates cannot be injected offline. The Framework and its patches will need to be installed after Windows is up and running.
  3. You can only inject patches if they are in CAB or MSU format. EXE files are not usable here.

To update Windows 7 (or 8 or Server editions for that matter) you will need the following:

  • Windows 7 media or ISO file. I don’t have access to OEM disks so I cannot say if those can be updated. What you really need is the install.wim file, found in the \Sources directory on the disk. It’s the single biggest file on the disk.
  • Windows 7 Automated Installation Kit or the later Windows 8.1 Assessment and Deployment Kit. You need this for the DISM tools which service the WIM file.
  • Access to the updates for Windows 7. There are many ways to get these, but I have found that looking in the C:\Windows\SoftwareDistribution\Download folder on a fully patched machine is one of the better ways to get them. Other tools have had mixed success for me.
  • Hard drive space and patience. Injecting updates, committing the changes to the WIM file and optionally recreating the ISO file will take time.

Here’s my step-by-step guide to this update procedure. A note before I begin: my guide is a little longer than strictly necessary. If you have access to ISO editing software, you could simply replace the install.wim file and be done. However, I am going to include the steps to rebuild an ISO image, including the option to make it UEFI-boot compatible.

Updating Windows

  1. Make 3 folders on a hard drive. For example C:\Win7, C:\Updates and C:\Mount.
  2. Copy the install.wim file from your ISO or DVD to C:\Win7.
  3. Install the Windows 7 AIK or Windows 8.1 ADK. Specifically, we are looking for the Deployment Tools option. We don’t need the rest for this process.
  4. Place all the updates for Windows 7 into the C:\Updates folder.
  5. Open up the “Deployment and Imaging Tools Environment” shortcut as an Administrator. The DISM commands will only run with Admin approval.
  6. Run the command dism /get-wiminfo /wimfile:C:\Win7\install.wim
    This will tell us about the various Windows editions present in the WIM file. Depending on the disk, it may include multiple editions or only one. Take note of the index number which corresponds to the edition of Windows you want to update; we will use it in the next command.
  7. dism /mount-wim /wimfile:C:\Win7\install.wim /index:X /mountdir:C:\Mount (replace X with the index number from step 6). DISM will mount that edition of the image at the C:\Mount folder.
  8. dism /image:C:\Mount /add-package /packagepath:C:\Updates
    DISM will now start to add all the MSU and CAB files it finds in the C:\Updates directory and apply them to the mounted image. This will take some time, so feel free to take a break. Some updates may cause an error; these updates are only meant to be installed when Windows is running. You will need to find out what updates caused the error and remove them. Type dism /unmount-wim /mountdir:C:\Mount /discard to discard all the changes and follow steps 7 & 8 again until the process is error free.
  9. dism /unmount-wim /mountdir:C:\Mount /commit
    This will commit the changes, save and unmount the WIM file.
  10.   If you want to update another edition of Windows 7, repeat steps 7-9 with the relevant index number for each edition. The whole mount, patch and commit cycle can also be wrapped into a small batch file, as shown in the sketch just below this list.
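
Purely as an illustration, here is how the commands from steps 7-9 could be wrapped into one small batch file. The file name, variable names and index number here are my own choices; adjust the paths and index to match your setup, and run the file from an elevated Deployment and Imaging Tools Environment prompt.

    rem patch-wim.cmd - hypothetical helper for servicing one edition of the image.
    set WIM=C:\Win7\install.wim
    set MOUNTDIR=C:\Mount
    set UPDATES=C:\Updates
    set INDEX=1

    rem Mount the chosen edition, inject every CAB and MSU file found in the
    rem updates folder, then commit the changes and unmount the image.
    dism /mount-wim /wimfile:%WIM% /index:%INDEX% /mountdir:%MOUNTDIR%
    dism /image:%MOUNTDIR% /add-package /packagepath:%UPDATES%
    dism /unmount-wim /mountdir:%MOUNTDIR% /commit

If add-package hits an update that cannot be applied offline, swap the /commit for /discard as described in step 8, move the offending file out of C:\Updates and run the batch file again.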

Building the new ISO for Windows 7

If you are planning to use the updated WIM file with Microsoft Deployment Toolkit, you are good to go and can use the updated install.wim file in conjunction with the rest of the Windows setup files. Otherwise, you’ll need to create a new ISO image that can be used virtually, burned to DVD or used on a USB flash drive for install purposes.

Open up the “Deployment and Imaging Tools Environment” shortcut as an Administrator again. Before running anything, copy all the files from your Windows 7 DVD or ISO to the C:\Win7 directory, taking care not to overwrite your newly updated install.wim with the original one, or you will have wasted your time. Then run the following command to build an ISO file that can boot on traditional BIOS-based systems as well as on UEFI systems. On the most modern UEFI systems, make sure Secure Boot is disabled before you install Windows 7, as it is not Secure Boot capable.

oscdimg.exe -u2 -udfver102 -bootdata:2#p0,bC:\Win7\boot\etfsboot.com#pEF,ebC:\Win7\efi\microsoft\boot\efisys.bin -o -lVOLUME_LABEL C:\Win7 C:\Win7\Win7.iso

Replace VOLUME_LABEL with a label of your choice (no spaces allowed).

You can now burn the ISO file to DVD, use it on a flash drive or as an ISO with any VM software.
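
If you would rather prepare a flash drive by hand instead of using a dedicated tool, the sketch below is one way of going about it. It assumes the flash drive shows up as disk 1 and ends up with the drive letter E:, so double-check the disk number before running clean, as it wipes the drive. Also bear in mind that UEFI booting needs FAT32, which cannot hold an install.wim that has grown past 4GB; in that case a tool like Rufus is the easier route.

    rem The commands between diskpart and exit are typed at the DISKPART> prompt.
    diskpart
    list disk
    select disk 1
    clean
    create partition primary
    format fs=fat32 quick
    active
    assign letter=E
    exit

    rem Copy the updated installation files across (the ISO itself is excluded,
    rem since the stick holds the raw files) and write the Windows 7 boot code
    rem so BIOS machines can boot the stick. bootsect.exe lives in the boot
    rem folder of the Windows media if it is not already on your path.
    robocopy C:\Win7 E:\ /E /XF Win7.iso
    bootsect /nt60 E: /mbr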

I have not tried this procedure with Windows 8.x, but I believe it should work the same way, as the layout of the relevant files and folders is near identical.

The Windows 10 upgrade experience

On Wednesday 29 July 2015, a new chapter opened up in the history of Microsoft’s Windows. Windows 10 was unleashed on the world: Microsoft’s attempt to remedy the largely cool reaction to Windows 8, as well as to stay relevant (at least in the eyes of a lot of tech bloggers) in the new app-centric world. The return of the Start Menu, an unprecedented public participation process via the Windows Insider program, free upgrades for a year, DirectX 12 for gamers and many more features all combined to build up a level of hype that has not been seen in the Windows world for a long time.

Like millions of other people, I reserved my copy of the Windows 10 upgrade via the app in the system tray that appeared after a round of Windows Updates a few months back. The idea was that this application would trickle-download Windows 10 in the background as soon as the system went RTM, so that on launch day you’d be ready to upgrade immediately. The only problem is that the trickle download started 1-2 days before the launch of Windows 10, which meant that with my slow ADSL speed it would be some time before I’d be ready to go, never mind the chance that I’d be in one of the later waves of the rollout. This is probably down to the fact that build 10240 only went RTM two weeks before the global rollout. Either way, I was impatient to get going.

Thankfully Microsoft thought clearly and made ISO images available for direct download or via the Windows 10 Media Creation Tool. I snagged a copy of the Media Creation Tool and used it to download a copy of Windows 10 at work, where I have access to a faster internet connection. Once the ISO file was built by the tool, I burned it to DVD for 3 other staff members who were interested. It’s legal to do this by the way, since each person gets their own upgrade key generated during the upgrade process. For myself, I used the excellent Rufus utility to write the image to a flash drive. Although the Media Creation Tool can write the image to a flash drive, I’ve come to trust Rufus, especially thanks to its ability to create properly booting UEFI-capable media.

Once at home, I simply inserted the flash drive, double clicked on setup.exe and let the upgrade process run. I had previously been running Windows 8.1 with all updates applied. The installation process ran smoothly and took about half an hour to move all my files, upgrade itself and get to the desktop. All of my software remained installed and I haven’t yet had any software compatibility issues. I did have some issues with my PC’s built-in Bluetooth adapter, but a couple of hours after the upgrade, a driver had been installed in the background and the adapter was good to go again. After the upgrade, I did manually install Nvidia’s latest graphics driver, since I already had it downloaded and couldn’t wait for Windows Update to deliver the driver.

So far, I mostly like Windows 10. It’s been stable despite being an in-place upgrade: no blue screens or crashes. As mentioned, all my software has remained working without issue. Speed wise it feels a little faster than Windows 8.1, but not by much; the difference may be more noticeable for users coming from Windows 7 or earlier. My biggest real gripe at the moment with Windows 10 is the severe regression in the OneDrive client, a much moaned-about topic on the net. Windows 8 and 8.1 spoiled me in that regard with placeholder sync that let me see the files that were on my OneDrive without actually needing to download them. The Windows 10 version basically takes us back to the Windows 7 version of the client, where you have to choose which folders and files to sync, which will then chew up space on your hard drive. I am not happy at all with this change, but I am holding out hope that the new client due by the end of the year will offer a better experience.

One small note: my copy of Windows 10 wouldn’t activate until a day after the install. While I kept thinking that somehow it was related to my Windows 8.1 key, it was simply that the activation servers were being hammered into oblivion. Over 14 million people upgraded in the first 24 hours, so I am not surprised that I struggled to activate. I am assuming that now, almost two weeks later, activation is happening immediately as normal again.

It’s been a common refrain in the reviews I’ve seen on the net that if there’s one thing Windows 10 needs, it’s more polish. Lots of little fit-and-finish issues keep cropping up as older legacy parts of Windows are moved into the modern framework. Different right-click menus, a Settings app that isn’t quite Control Panel yet, out-of-place icons and so on all need some time and attention before Windows 10 becomes its own unique system. With the promise of Windows as a Service, it’s likely that many of these issues will go away with time as the system keeps being updated and improved. One thing is for sure: it’s going to be an interesting ride indeed.

The long hunt for a cure

At the end of March 2014, our school took ownership of a new Intel 2600GZ server to replace our previous HP ML350 G5 server, which was the heart of our network. The HP had done a fantastic job over the years, but was rapidly starting to age and wasn’t officially supported by Windows Server 2012 R2. Our new server has 32GB of RAM, dual Xeon processors, dual power supplies, 4 network ports and a dedicated remote management card. Although a little pricier than what I had originally budgeted for, it matched what the HP had and would earn its keep over the next 5-7 years of service.

After racking and powering up the server, I installed firmware updates and then Server 2012 R2. The install was quicker than on any other server I’ve done in the past, thanks to the SSD boot volume. After going through all the driver installs, Windows Updates and so on, the server was almost ready to start serving. One of the last things I did was to bond all 4 network ports together to create a network team. My thinking was that having a 4Gb/s team would prevent any bottlenecks to the server when under heavy load, as well as provide redundancy should a cable or switch port go faulty. A good idea in theory, but in reality I’ve never had a cable or port in the server room go bad in 6+ years.

Looking back now, I’m not sure exactly why I bothered creating a team. While the server is heavily used as a domain controller, DHCP, DNS and file server, it never comes close to saturating 1Gb/s, let alone 4. Almost every computer in the school is still connected at 100Mb/s, so the server itself never really comes under too much strain.

Either way, once everything was set up, I proceeded to copy all the files across from the old HP to the new Intel server. I used Robocopy to bulk move the files, and in some cases needed to let the process finish up overnight since there were so many files, especially lots of small ones. Data deduplication was turned on, shares were set up and everything looked good to go.
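
For the curious, the copies were plain Robocopy jobs run from an elevated command prompt. A command along these lines does the bulk of the work; the server names, share names and log path below are placeholders rather than the exact command I used:

    rem Copy a share from the old server to the new one, including security
    rem information, with a short retry on locked files and a log for checking
    rem afterwards. /MT speeds things up considerably with lots of small files.
    robocopy \\OLDSERVER\Staff \\NEWSERVER\Staff /E /COPYALL /R:2 /W:5 /MT:16 /LOG:C:\Logs\staff-copy.log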

When school resumed after the holidays, the biggest problem came to light right on the first morning: users being unable to simultaneously access Office files. We have a PowerPoint slideshow that is run every morning in the register period that has all the daily notices for meetings, events, reminders, detention etc. Prior to the move, this system worked without fault for many years. After the move, the moment the 2nd or 3rd teacher tried to access the slideshow, they would get this result:

[Photo: the “Downloading” dialog box and the green progress bar in Explorer]
The green bar of doom would crawl across the navigation pane, while this odd Downloading box appeared, took forever to do anything and tended to lock Explorer up. Complaints naturally came in thick and fast, and the worst part was that I couldn’t pinpoint what the issue was, aside from my suspicion that the new SMB3 protocol was to blame. I had hoped that the big Update 1 release that shipped for Windows 8.1 and Server 2012 R2 would help, but it didn’t. Disabling SMB signing didn’t help either. At one point, my colleague and I even installed Windows 8.1 and Office 2013 on some test machines to try and rule out that possibility, but they ended up doing the same thing. As a stop-gap measure, I made a dedicated Notices drive on the old HP, which was still running Server 2008 and which handled concurrent access to the same file without issue. Online forums weren’t any real help and none of the other admins in Cape Town I spoke to had encountered the problem either.

In the last school holidays just gone by, we finally had a decent gap between other jobs to experiment on the new server and see if we could correct the problem. I broke the network team, unplugged 3 of the 4 cables and disabled LACP on the switch. After reassigning the correct IP to the now single network port, we did some tests opening files on two and then three computers at the same time. We opened 15MB Word documents, 5MB complicated Excel files, 200MB video files and more. The Downloading box never showed up once. Unfortunately, without heavier real-world testing by the staff, I don’t know if the problem has been resolved once and for all. I am intending to move the Notices drive during the next school holiday and we will see what happens after that.
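
As a side note, reassigning a static IP to the remaining port from the command line can be done with netsh. The adapter name and addresses below are only placeholders for whatever your own network uses:

    rem Give the remaining adapter its static IP, subnet mask and gateway,
    rem then point it at the local DNS server (all values are examples).
    netsh interface ip set address name="Ethernet" static 192.168.0.10 255.255.255.0 192.168.0.1
    netsh interface ip set dns name="Ethernet" static 192.168.0.10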

Chalk one up for strange issues that are almost impossible to hunt down.

Saving old memories

The school I work at will be 60 years old in 2017 – a pretty decent milestone for a school, though there are many older schools here in Cape Town. As with any institution that has survived this long, there are bound to be many old photos of events in years gone by. A school is a very busy place with hundreds of events each year: sports matches, outings, camps, tours domestically and/or internationally, dramatic presentations, musicals, concerts, prizegivings and more all lead to many potential photographic opportunities.

Unfortunately, for the last 30 years or so, the school has largely relied on one person to take photos and keep a visual record of the school: my direct boss. From when he arrived in the mid-1980s through to today, he has been building up a massive collection of photos. Since 2004, all the photos have been taken digitally, so there is a good archive that has built up over the last 11 years. Prior to that, however, everything in the school was done on 35mm film, and this is where the problem comes in. All the photos on colour slides are in a slow race against time to be preserved. Colour slides will discolour and fade in time, more so if they are not stored properly. Once the colours are gone (or shifted too badly to rescue), all the unique memories on those pieces of plastic are gone forever. Colour negatives are a bit more stable if stored in the plastic sleeves they came in from the photo store. Black and white negatives are probably the most stable of the lot.

At the end of 2014, through a chance discussion with the school librarian, I discovered that there was a batch of slides sitting in a box in one of her cupboards. I asked her if I could take them to start scanning, as we luckily have a Canon scanner that can scan slides and negatives. I had thought of having them professionally scanned by a specialised company here in Cape Town, but the price would quickly become prohibitive for the number of slides that needed to be converted. As such, I’ve been slowly chipping away at the first box of slides, scanning them at 4800 dpi and saving the resulting large JPEG files. My boss has promised to colour correct and touch up these slides in Lightroom/Photoshop Elements when I am done scanning, after which we can upload the photos to our dedicated Past Pupils page on Facebook.

So far I’ve managed to scan about 165 slides, most of which I’ve had to take out of their holders to do so, especially the glass-mounted ones. It’s become clear that many of the photos were soft or slightly out of focus when taken originally, but it probably wasn’t noticed at the time. Thirty-odd years of age on the film itself doesn’t help either. There’s still a pile of probably about a hundred to go, though I’ve managed to whittle out my boss’s private slides and slides that were too far gone to bother rescuing.

With the end of that box in sight, I went back to the library last week looking for anything more. As many slides as there were in the first box, they only cover a small period of the school’s history – 3 or 4 years at the most in the 1980s. After some more scratching around and an impromptu spring clean by the librarian, I took possession of another box of slides, as well as dozens of packets of negatives, both colour and monochrome, and some printed photos. Once the initial box of slides is done, I can focus on the negatives. Thankfully, scanning the negatives will be a little less time consuming, for the simple reason that I no longer need to take the film out of holders. I simply mount a strip of four negatives and scan away, saving an estimated five minutes per batch.

The biggest downside of 35mm film is that in today’s digital world, you cannot share the memories on those pieces of plastic if you don’t digitise them. Once digitised, you can share them online as well as use them inside the school for projection during events. Projecting slides today isn’t impossible, but getting hold of a slide projector isn’t easy, not to mention that the mere act of displaying the slides will reduce their lifespan even more due to the heat of the lamp. For archival purposes, having the photos in JPEG format allows the files to be replicated all over the show, avoiding any single point of failure. If the film is damaged or destroyed, there is nothing to fall back on, especially in the case of slides. While JPEG isn’t up to true archival quality or standards, in computing terms it’s probably the closest thing there is. Every consumer operating system since Windows 95 can view the files, which is a good 20-year track record now. It’s of course nowhere near film’s 130+ years of service, but for now, it’s a good enough solution.

Low end laptop pain

In the course of my job, I’ve been asked on occasion to give feedback or a recommendation to staff members regarding the purchase of a personal or family laptop. Unfortunately, due to the ever-changing nature of the IT field, the answers I give aren’t always what the person wants to hear.

I have three questions I ask the person before I make any recommendations:

  1. What do you intend to use the laptop for?
  2. What is your approximate budget for the laptop?
  3. How long do you intend to keep the laptop for?

The answer to the first question is usually pretty generic: typing documents, surfing the internet, preparing lessons, doing research, checking email. Using SMART Notebook also comes up now and then. Question 2 usually results in a range of about R4000-R6000 (roughly $380-$550; exchange rates make this number fluctuate). Question 3 results in answers ranging from three years up to five or longer.

I often specify a laptop that is slightly over the asker’s budget, with the justification that spending slightly more results in a better quality laptop that lasts longer and is less likely to drive the person up the wall in the long run. Bottom-of-the-line laptops have to cut so many corners that the experience is often highly frustrating. Low amounts of RAM, the lowest-end processors, slow mechanical hard drives, low-resolution low-quality screens, creaky plastic shells, poor trackpads and more leave a bad taste in the mouth, and that’s just on the hardware side of things. Software-wise, the lowest-end version of Windows is installed, including the Starter edition in the Windows 7 era. Bundled anti-virus applications, trialware and lots of often bloated, unneeded software is pre-installed by the manufacturer in order to try and recoup costs and eke out some sort of profit.

Over the last few years, I’ve come to be a firm believer in the power of the SSD. With the right drive, it can often seem like you are supercharging a laptop that would otherwise need to be replaced due to age. It won’t completely mask age or low specs on a laptop, but it comes close. Windows starts faster, applications load quicker, battery life is extended, noise is reduced and the user experience is often improved because you get less of that freezing/lockup sensation after boot. I don’t know if the drives will ever get as cheap as mechanical hard drives, but I believe that even a SATA3-based drive in most consumer laptops would go a long way towards increasing user satisfaction across the board. Unfortunately, marketing still spreads the word that 1TB and larger drives are a good thing to have, when in reality not that many people are going to use all that space on a laptop.

As much as I’ve moaned about low quality laptops in this piece, I am reminded that it’s thanks to the flexibility of Windows that there is such a wide range of devices available at all price points, from the most painful low-end devices that are affordable to most people, all the way up to high-end ultrabooks that are extremely pricey but have all the bells and whistles. Competition in the industry, plus attrition, has also helped to weed out some of the smaller or less interested players, and has led to a growing awareness that quality needs to increase in order to stand out. I can only hope that as time goes on, this trend continues and that the creaky, poor machines of the past become nothing more than a bad memory.

The joy of fixing Windows Update

Most of the time, Windows Update does its thing quietly and without problems. Updates are downloaded, installed and maintained without much fuss. The entire Microsoft/Windows Update channel is one of the most bandwidth-intensive services on the internet, especially on “Patch Tuesday.” Again, most of the time things just work, but occasionally they do end up breaking.

Last week I was finally able to clean up a PC in the school that had been having Windows Update issues for months. I hadn’t been able to work on the machine before because of heavy use by the librarian, but her absence last week let me tackle the issue head on without being pressured for time or interrupted. The PC didn’t have issues before; then one day I started noticing that it was being flagged with a red cross in the WSUS Management Console. Manually installing the offending updates didn’t solve the problem, so I was left with the mystery of trying to fix the PC without just nuking it and starting from scratch. What was more interesting is that only a small handful of updates were failing; other updates were being installed successfully.

The first thing I did was run the built-in System File Checker scan, which found and fixed some problems, but didn’t resolve the Windows Update errors. After that, I ran Microsoft’s System Update Readiness tool, which scans your Windows install in order to fix servicing problems. It too picked up some errors with missing files, which I replaced. It took a few run-throughs, but eventually the tool no longer reported any errors. Unfortunately, the Windows Update errors didn’t go away. Next I tried the Windows Update troubleshooting tool from Microsoft’s support pages. The tool took a while to scan the system, found the error code that Windows Update was reporting and also claimed to fix said error. However, the two offending updates still failed to install.
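
For anyone who hasn’t used it before, the System File Checker is run from an elevated command prompt:

    rem Scan all protected system files and repair any that are found to be corrupted.
    sfc /scannow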

Turning to the net, most advice for the 0x8007000D error code indicates that the actual update installation files have become corrupted or damaged on the hard drive. I thought I might have a sick hard drive on my hands, so I ran Seagate’s SeaTools to check that idea out. Turns out the hard drive is perfectly healthy according to SeaTools. Finally, I followed the other common piece of advice, which is to stop the Windows Update service, rename the C:\Windows\SoftwareDistribution folder to something else and reboot. After the reboot, Windows builds a new folder. I deleted the old folder, which recovered almost 5GB of space. I went back to Windows Update to check, and the same two updates were offered as before. I tried them and, lo and behold, they downloaded and installed without a hitch. All of a sudden, the PC was back up to date after a good few months of being out of compliance.
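
The rename itself is quick to do from an elevated command prompt. The steps below are a minimal sketch of that advice; the .old suffix is just my choice of name:

    rem Stop the Windows Update service, push the old download cache aside
    rem and reboot; Windows rebuilds the SoftwareDistribution folder itself.
    net stop wuauserv
    ren C:\Windows\SoftwareDistribution SoftwareDistribution.old
    shutdown /r /t 0

    rem After the reboot, once updates are installing cleanly again,
    rem the old folder can be deleted to reclaim the space.
    rd /s /q C:\Windows\SoftwareDistribution.old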

It’s always a good feeling to have a computer back in full working order. While nuking it would probably have saved me many hours, the hassle of rebuilding the librarian’s profile and reinstalling the library management software made the time spent curing the machine worth it. Hopefully from here on out, the machine behaves itself and doesn’t develop issues again down the line.

Bandwidth buffet

For anyone now in their late 20s to early 30s, the sound of a dial-up modem should be a familiar, yet fading, memory. Connecting to the internet more often than not meant listening to those squawking devices and hoping that the connection was clean and free of noise that would slow things down. As time went on, ADSL arrived, then 3G, HSDPA, VDSL and now fibre. Slowly but surely, the trickle of data coming down the pipes has become a raging torrent, at least for those who can afford the higher speed options.

Driven by the laws of supply and demand, websites have evolved to become a lot more feature-rich and complex than in years gone by. Images have gone from highly compressed GIF and JPEG files to higher-resolution JPEG and PNG files. Internet video, once an incredibly frustrating experience involving downloadable clips and QuickTime, Real or Windows Media Player, has largely evolved into slick, easy-to-use web-based players. Of course, video has also crept up from thumbnail-sized resolutions all the way to the current 4K. It’s become really simple: the bigger your pipe, the more you are going to drink from the fountain by consuming rich media, online streaming and more. Creating and uploading content is also more viable than ever before, which means symmetrical connections are becoming far more important to the end user than they ever were.

Take the picture below, taken from my school’s firewall logs for 1 January – 30 April 2015:

[Screenshot: firewall traffic statistics for 1 January – 30 April 2015]

That’s a total of 970GB of traffic on one ADSL line, the speed of which has fluctuated during the course of the year due to stability issues. We have another ADSL line which is only used by a few people, some phones and tablets, but I don’t have usage stats for that line. Taken all together, though, our school has definitely used over 1TB of data in 4 months. At this rate (970GB over 4 months works out to roughly 240GB a month, or about 2.9TB over a full year), we may end up pushing close to 3TB by the year’s end. Also bear in mind that these stats are without any wide-scale WiFi available to students. I shudder to think what the numbers will be once we have WiFi going, or if we get a faster upload so that things like Dropbox, OneDrive and so on become viable.

Here’s a second interesting picture as well:

[Screenshot: firewall web cache statistics for the same period]

Of all the web traffic going through the firewall, 82.5% was unique or uncacheable due to its dynamic nature. In earlier days, caching percentages were higher, since websites were less dynamic and had far more static HTML code, fewer scripts and so on. That being said, the cache did at least manage to serve about 15% of the total web traffic. Every little bit helps when you are on a slow connection.

In the end, it all goes to show that the more bandwidth you have, the more applications and users are going to end up making use of it. Thankfully, bandwidth prices are much lower than they have ever been, though on some connections the speed is throttled to make sure that the end user doesn’t gorge themselves to the detriment of other users.
