Most of the time, Windows Update does its thing quietly and without problems. Updates are downloaded, installed and maintained without much fuss. The entire Microsoft/Windows Update channel is one of the most bandwidth-intensive features on the internet, especially on “Patch Tuesday.” Again, most of the time things just work, but occasionally they do end up breaking.
Last week I was finally able to clean up a PC at the school that had been having WU issues for months. The machine sees heavy use by the librarian, so I'd never had a chance to work on it, but her absence last week let me tackle the issue head on without time pressure or interruptions. The PC had no issues before, until one day I noticed it was being flagged with a red cross in the WSUS Management Console. Manually installing the offending updates didn't solve the problem, so I was left with a mystery: how to fix the PC without just nuking it and starting from scratch. More interesting still, only a small handful of updates were failing; other updates were installing successfully.
First thing I did was run the built-in System File Checker, which found and fixed some problems but didn't resolve the WU errors. After that, I ran Microsoft's System Update Readiness tool, which scans the Windows installation for servicing corruption and attempts repairs. It too picked up errors about missing files, which I replaced. It took a few run-throughs, but eventually the tool no longer reported any errors. Unfortunately, the WU errors didn't go away. Next I tried the Windows Update troubleshooting tool from Microsoft's support pages. It took a while to scan the system, found the error code WU was reporting and claimed to fix it. However, the two offending updates still failed to install.
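For reference, the System File Checker step boils down to a single command from an elevated Command Prompt (the System Update Readiness tool is a separate download for Windows 7, and it logs what it finds to CheckSUR.log):

```
REM Scan all protected system files and repair any that fail verification
sfc /scannow

REM After running the System Update Readiness tool, its findings are logged to:
REM C:\Windows\Logs\CBS\CheckSUR.log
```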
Turning to the net, most advice for the 0x8007000D error code indicates that the update installation files themselves have become corrupted on the hard drive. I thought I might have a sick hard drive on my hands, so I ran Seagate's SeaTools to check that idea out. Turns out the drive is perfectly healthy according to SeaTools. Finally, I followed the other common piece of advice: stop the Windows Update service, rename the C:\Windows\SoftwareDistribution folder to something else and reboot. After the reboot, Windows builds a new folder. I deleted the old one, which recovered almost 5GB of space. I went back to Windows Update to check, and the same two updates were offered as before. I tried them and lo and behold, they downloaded and installed without a hitch. All of a sudden, the PC was back up to date after a good few months of being out of compliance.
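The stop-and-rename dance can be done from an elevated Command Prompt; this is a sketch of the sequence (the exact new folder name doesn't matter, Windows rebuilds the original on the next update check):

```
REM Stop the Windows Update service so the folder is no longer locked
net stop wuauserv

REM Rename the update cache out of the way
ren C:\Windows\SoftwareDistribution SoftwareDistribution.old

REM Reboot, or restart the service manually
net start wuauserv
```

Once the updates install cleanly, the renamed folder can be safely deleted to reclaim the disk space.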
It’s always a good feeling to have a computer back in full working order. Nuking it would probably have saved me many hours, but avoiding the hassle of rebuilding the librarian’s profile and reinstalling the library management software made the time spent curing the machine worth it. Hopefully from here on out, the machine behaves itself and doesn’t develop issues again down the line.
For anyone now in their late 20s to early 30s, the sound of a dial-up modem should be a familiar, yet fading memory. Connecting to the internet more often than not meant listening to those squawking devices and hoping that the connection was clean and free of line noise that would slow it down. As time went on, ADSL arrived, then 3G, then HSDPA, then VDSL and now fibre. Slowly but surely, the trickle of data coming down the pipes has become a raging torrent, at least for those who can afford the higher speed options.
Driven by the laws of supply and demand, websites have evolved to become far more feature-rich and complex than in years gone by. Images have gone from highly compressed GIF and JPEG files to higher resolution JPEG and PNG files. Internet video, once an incredibly frustrating experience involving downloadable clips and QuickTime, Real or Windows Media Player, has largely evolved into slick, easy-to-use web-based players. Of course, video has also crept up from thumbnail-sized resolutions all the way to the current 4K. It’s become really simple: the bigger your pipe, the more you are going to drink from the fountain, consuming rich media, online streaming and more. Creating and uploading content is also more viable than ever before, which means symmetrical connections matter far more to the end user than they ever did.
Take the following picture, taken from my school’s firewall logs for 1 January – 30 April 2015:
That’s a total of 970GB of traffic on one ADSL line, the speed of which has fluctuated during the course of the year due to stability issues. We have another ADSL line which is only used by a few people, some phones and tablets, but I don’t have usage stats for it. Taken all together, though, our school has definitely used over 1TB of data in four months. At this rate, we may end up pushing close to 3TB by year’s end. Also keep in mind that these stats are without any wide-scale WiFi available to students. I shudder to think what the numbers will be once we have WiFi going, or if we get a faster upload so that things like Dropbox, OneDrive and so on become viable.
Here’s a second interesting picture as well:
Of all the web traffic going through the firewall, 82.5% was unique or uncacheable due to its dynamic nature. In earlier days, caching ratios were higher, since websites were less dynamic and had far more static HTML and fewer scripts. That being said, the cache did at least manage to serve about 15% of the total web traffic. Every little bit helps when you are on a slow connection.
In the end, it all goes to show that the more bandwidth you have, the more applications and users are going to end up making use of it. Thankfully, bandwidth prices are much lower than they have ever been, though on some connections the speed is throttled to make sure that the end user doesn’t gorge themselves to the detriment of other users.
Microsoft Windows is an amazing piece of software. It powers an incredibly wide range of hardware, running on wildly different system specifications. One person may have a bargain basement Celeron or Pentium laptop, while another is running a fully tricked out Core i7 beast – Windows covers it all. With multiple OEMs making products, the consumer is generally spoiled for choice across a wide range of price points. The downside, however, is that Windows has often been associated with a race to the bottom of the barrel, while Apple, for example, refuses to go below a certain line and, rightly or wrongly, maintains a prestigious, upmarket image.
Part of the race to the bottom means that profits for OEMs are razor thin. Make a mistake and your competitors are going to pounce. Fail to keep up and likewise. Fail to cut costs and you risk going bust. Consumers have often been the victims of this fierce competition. Laptops are built with creaky plastic that doesn’t always sit flush, screen resolutions haven’t increased in years, mechanical hard drives are still king, confusingly similar models leave people unsure what the differences between them are, the amount of RAM is just enough to get by with, and cheap Realtek network and audio chips are the norm. On the software side of things, OEMs take money from anti-virus vendors to preload their wares onto the computers. Throw in CD/DVD burning trials, vendor backup programs and other useless vendor software and you are left with a horrible laptop/desktop experience. Users don’t love Windows, they just tolerate it.
The hardware issue is tricky, since it depends on economies of scale. An SSD, for example, would greatly improve people’s experience with their computers, but a 250GB SSD still costs much more than a 1TB mechanical drive. Screen resolution in laptops is slowly starting to move forward again, but it will take time. Trackpads are also finally starting to improve, but they’re still hit and miss. With desktops, it’s really become about cutting down on size as much as possible and going small.
The software side of things is where the most immediate improvement can be made. If OEMs followed Microsoft’s Windows Signature Edition experience, I think many a customer would be happy. Instead of being loaded down with bloatware, trials and other software, Windows would come clean out of the box, with a few minimal applications installed – Flash, Adobe Reader, Skype and Microsoft Security Essentials (for Windows 7). For Windows 8 based machines, OEMs should make sure devices ship with Windows 8.1 at minimum, ideally with Update 1 installed as well, which improves the experience on traditional laptops/desktops. OEMs should strive to keep their images as up to date as possible, so that the end user isn’t downloading a few GB worth of updates after their first boot. There’s nothing worse than powering up and watching Windows Update fire up and tear through a few GB of bandwidth as it pulls down patches.
Lastly, hardware in the computer should not require a vendor application to be installed just to get its driver. I’ve had this problem with Lenovo and Samsung laptops, where in order to get rid of an outstanding entry in Device Manager, I’ve had to install one of the Samsung/Lenovo utilities. Often these utilities don’t work well and just add frustration for the end user.
Famed Windows blogger Paul Thurrott has a few articles up where he goes right back to basics and does completely clean installs of Windows on some of his devices. As he notes, it’s sometimes the only way to truly be rid of all the bloatware OEMs like to install. Included are steps on how to legally download clean ISO images you can burn to disc or write to a USB stick for a clean install of Windows. You can find his articles here, here, here, here and here.
Over the last two weeks, we’ve slowly been ramping up our classroom computer swap program at work. Six-year-old Core 2 based computers with horrid chassis and power supplies are coming out, replaced with first-gen Core i3 boxes that are quieter, smaller and faster. However, a recent event almost threatened to derail the project.
I placed one of the replacement computers in a class and set it up as per usual, and all was going well. After rebooting, however, I noticed that the SMART Board (model SB-680) was not behaving properly. The board was either vanishing just before the computer finished booting into Windows, or it would constantly reset and be basically unusable. Changing USB ports did not help at all; it only gave a temporary fix that lasted until the next boot.
I got the board’s reseller involved to do deeper technical diagnostics, though in all honesty it was more a case of handing the problem over to someone else. This past Friday afternoon they arrived and we started a long troubleshooting process. The board was hooked up to the techie’s laptop and after some time it settled down and behaved normally. We then swapped out the controller card, swapped out the board itself and tested on the replacement PC. All to no avail; the problem kept coming back. We even tried a new USB booster cable and USB cable, with the same result.
In desperation, I went into the BIOS to change the USB settings on the newer classroom motherboards. The Intel DQ57TM motherboards had been running completely fine for the last four years in both our computer labs, so I couldn’t understand why they would give issues now. They are all flashed to the latest firmware Intel offers, so there would be no fix that way. It turns out that one simple BIOS setting may have caused the issue.
When I set up the computer via network boot and install using Microsoft’s Deployment Toolkit, I had to set the USB Backward Compatibility option to Disabled in the BIOS, as the keyboard and mouse were non-functional in Windows PE. After the install process was over, I didn’t bother to change the setting back, since I didn’t believe it would affect anything. Suffice it to say, enabling the option caused Windows to install a whole bunch of extra USB root hubs and related devices after the reboot, which in turn let the SMART Board behave properly. Our reseller’s techie learned something new, as did I. Now I know I must make sure the setting goes back to Enabled before a machine is installed in a classroom, so that headaches can be avoided.
The truly bizarre thing, however, is that the problem only seems to be triggered if the SMART Board is hooked up to the computer via a USB booster extension cable. If the board is close enough to the computer desk and doesn’t need the booster, it seems to work fine with the setting at Disabled. I have two classrooms where that is the case, and neither has reported issues with its board since the school year started.
Another quirky problem to add to the knowledge base of fun when it comes to SMART Boards.
After 10 weeks of study, thought-provoking questions and the odd bit of frustration, I finally finished the last module of my UCT IT Management short course last night. Offered in partnership between the University of Cape Town and a private company called GetSmarter, the course is aimed at widening the knowledge of IT managers from all walks of life. The site offers a wide range of other courses, running from 8 to 10 weeks, all of them online. I decided to do the management course in the hope of picking up some new skills and ideas, since these days I’m doing a lot more management than purely technical work. Shaping budgets and policy is something new to me, so all the more reason I was eager to take the course.
The IT course is 10 weeks long as mentioned, so a week per module. The entire course runs through GetSmarter’s VLE, a heavily modified version of Moodle. Five of the 10 modules are tested via online quizzes of the usual fare: multiple choice, true/false, pick the correct answer and so on. The other five are written assignments where you download a document containing a case scenario and questions. From there you have to answer the questions as well as outline various scenarios, all while watching a line count per answer. Once completed, the documents are uploaded back into the VLE for marking.
I found that as the course went past the halfway mark and into week 6, the content became quite theoretical and abstract and dealt less with current trends and topics. Coming from a network administrator’s position in a school, many of the terms and concepts I was exposed to were completely new to me. Shifting my thinking along business lines proved to be quite a challenge, since the corporate world moves quite differently from the educational world. Of the 10 modules, module 3 was definitely my least favourite, as it was incredibly densely packed with jargon and enormous amounts of theory.
Overall, I think the course is worth the money asked for it; extra study is always good for jogging the brain out of its set ways. However, if you are new to network administration or IT, it’s definitely not the course for you – vendor qualifications are more appropriate in that case. This course is more for techies and admins moving up towards managing IT in their place of work, though as mentioned it is almost completely focussed on the corporate world.
On an unrelated note, the course also showed me that Moodle can definitely work if enough effort is put into it – custom theme, disabling many end user features and so on. My experience is limited, but it’s been the best Moodle experience I’ve ever had.
Windows Update is usually a very reliable method of keeping Windows based computers up to date. Rough in the early days, it’s come a long way since then. Smooth and mostly transparent in the background, it isn’t often that bad updates slip through.
Unfortunately, after August’s Patch Tuesday, such an event occurred. After a number of updates were approved, either automatically or by myself, some computers blue screened and went into a reboot loop. Thankfully, out of almost 180 computers, only 5 have suffered the problem seen below:
All of the affected computers were running Windows 7 x64 SP1 with all updates applied. The first three times this happened, I couldn’t find a cure for the problem and ended up wiping and redoing the computer from scratch. Later in the week, I found instructions online on how to break out of the loop and get back to working order:
- Get into the Recovery Console, either from install media or by letting the Repair your Computer wizard run after a number of crashes.
- Open a Command Prompt and delete the FNTCACHE.DAT file located in C:\Windows\System32.
- Reboot the computer, and you should now be able to get back into Windows.
- Delete the FNTCACHE.DAT file again, as Windows will have recreated it.
- Lastly, go to Windows Update in the Control Panel, then view Installed Updates. Remove KB2982791 and optionally KB2970228. The other two updates mentioned out on the web only apply to Windows 8.1/Server 2012 and so are irrelevant to Windows 7 computers.
- Reboot after the patches are removed.
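For those who prefer the command line, the steps above look roughly like this (drive letters may differ inside the Recovery Console, and the updates can just as easily be removed via Installed Updates in the Control Panel):

```
REM From the Recovery Console's Command Prompt:
del C:\Windows\System32\FNTCACHE.DAT

REM After booting back into Windows, delete the recreated cache file,
REM then remove the offending updates with the standalone installer:
del C:\Windows\System32\FNTCACHE.DAT
wusa /uninstall /kb:2982791
wusa /uninstall /kb:2970228
```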
As I said, it’s not often anymore that bad updates slip through all of Microsoft’s testing, but it does happen. Although it’s frustrating, I don’t intend to modify how I approve patches. I’d rather take the risk of something like this happening than get hammered by Alureon or Conficker or some other nasty because I ignored security patches.