The long hunt for a cure

At the end of March 2014, our school took ownership of a new Intel 2600GZ server to replace our previous HP ML350 G5, which had been the heart of our network. The HP had done a fantastic job over the years, but was rapidly starting to age and wasn’t officially supported by Windows Server 2012 R2. The new server has 32GB of RAM, dual Xeon processors, dual power supplies, 4 network ports and a dedicated remote management card. Although a little pricier than I had originally budgeted for, it matched what the HP had and would earn its keep over the next 5-7 years of service.

After racking and powering up the server, I installed firmware updates and then Server 2012 R2. The install was quicker than on any other server I’ve done in the past, thanks to the SSD boot volume. After going through all the driver installs, Windows Updates and so on, the server was almost ready to start serving. One of the last things I did was to bond all 4 network ports together into a network team. My thinking was that a 4Gb/s team would prevent any bottlenecks to the server under heavy load, as well as provide redundancy should a cable or switch port go faulty. A good idea in theory, but in reality I’ve never had a cable or port in the server room go bad in 6+ years.
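For anyone curious, the team itself comes down to a single cmdlet on Server 2012 R2. This is only a sketch: the team and adapter names below are placeholders rather than the ones I actually used, and the switch ports need to be configured for LACP to match.

```powershell
# Sketch of a 4-port LACP team using the built-in Server 2012 R2 NIC teaming cmdlets.
# "Team1" and "NIC1".."NIC4" are placeholder names.
New-NetLbfoTeam -Name "Team1" `
    -TeamMembers "NIC1", "NIC2", "NIC3", "NIC4" `
    -TeamingMode Lacp `
    -LoadBalancingAlgorithm Dynamic
```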

Looking back now, I’m not sure exactly why I bothered creating a team. While the server is heavily used as a domain controller, DHCP, DNS and file server, it never comes close to saturating 1Gb/s, let alone 4. Almost every computer in the school is still connected at 100Mb/s, so the server itself never really comes under too much strain.

Either way, once everything was set up, I proceeded to copy all the files across from the old HP to the new Intel server. I used Robocopy to bulk move the files, and in some cases needed to let the process finish up overnight since there were so many files, especially lots of small ones. Data deduplication was turned on, the shares were recreated and everything looked good to go.
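For reference, this is the sort of Robocopy run and dedup setup I mean; the server names, paths and exact switches here are illustrative rather than the precise commands I ran.

```powershell
# Mirror a share from the old server, keeping NTFS permissions and timestamps.
# /MT helps a lot with the many-small-files case; /R and /W stop endless retries on locked files.
robocopy \\OLDSERVER\Staff D:\Shares\Staff /MIR /COPYALL /MT:32 /R:1 /W:1 /LOG:C:\Logs\staff-copy.log

# Then turn on Data Deduplication for the data volume (the feature must be installed first).
Install-WindowsFeature FS-Data-Deduplication
Enable-DedupVolume -Volume "D:" -UsageType Default
```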

When school resumed after the holidays, the biggest problem came to light on the very first morning: users being unable to access Office files simultaneously. We have a PowerPoint slideshow that is run every morning in the register period with all the daily notices for meetings, events, reminders, detentions and so on. Prior to the move, this system had worked without fault for many years. After the move, the moment the second or third teacher tried to access the slideshow, they would get this result:

[Photo: the “Downloading” dialog box in Explorer]
The green bar of doom would crawl across the navigation pane while an odd Downloading box appeared, took forever to do anything and tended to lock Explorer up. Complaints naturally came in thick and fast, and the worst part was that I couldn’t pinpoint the issue, aside from my suspicion that the new SMB3 protocol was to blame. I had hoped that the big Update 1 release for Windows 8.1 and Server 2012 R2 would help, but it didn’t. Disabling SMB signing didn’t help either. At one point, my colleague and I even installed Windows 8.1 and Office 2013 on some test machines to try and rule out that possibility, but they ended up doing the same thing. As a stop-gap measure, I made a dedicated Notices drive on the old HP, which was still running Server 2008 and handled concurrent access to the same file without complaint. Online forums weren’t any real help, and none of the other admins in Cape Town I spoke to had encountered the problem either.
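For what it’s worth, these are the kinds of checks I was running while chasing the SMB3 theory. None of them cured the problem; the signing change is shown only for completeness and should be reverted after testing.

```powershell
# On a Windows 8/8.1 client with the file open, see which SMB dialect was negotiated.
Get-SmbConnection | Select-Object ServerName, ShareName, Dialect

# On the Server 2012 R2 box, view and (temporarily) relax SMB signing while testing.
Get-SmbServerConfiguration | Select-Object EnableSecuritySignature, RequireSecuritySignature
Set-SmbServerConfiguration -RequireSecuritySignature $false -Force
```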

In the school holidays just gone by, we finally had a decent gap between other jobs to experiment on the new server and see if we could correct the problem. I broke the network team, unplugged three of the four cables and disabled LACP on the switch. After reassigning the correct IP to the now single network port, we tested opening files on two and then three computers at the same time: 15MB Word documents, 5MB complicated Excel files, 200MB video files and more. The Downloading box never showed up once. Unfortunately, without heavier real-world testing by the staff, I don’t know if the problem has been resolved once and for all. I intend to move the Notices drive across during the next school holiday and we will see what happens after that.
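Breaking the team and putting the IP back on a single port came down to something like the following, with placeholder names and addresses standing in for our real ones.

```powershell
# Remove the LACP team, then give the remaining single port its static IP and DNS settings again.
Remove-NetLbfoTeam -Name "Team1" -Confirm:$false
New-NetIPAddress -InterfaceAlias "NIC1" -IPAddress 192.168.0.10 -PrefixLength 24 -DefaultGateway 192.168.0.1
Set-DnsClientServerAddress -InterfaceAlias "NIC1" -ServerAddresses 192.168.0.10
```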

Chalk one up for strange issues that are almost impossible to hunt down.

Saving old memories

The school I work at will be 60 years old in 2017 – a pretty decent milestone for a school, though there are many older schools here in Cape Town. As with any institution that has survived this long, there are bound to be many old photos of events in years gone by. A school is a very busy place with hundreds of events each year: sports matches, outings, camps, tours domestically and/or internationally, dramatic presentations, musicals, concerts, prizegivings and more all lead to many potential photographic opportunities.

Unfortunately, for the last 30 years or so, the school has largely relied on one person to take photos and keep a visual record of the school: my direct boss. From when he arrived in the mid-1980s through to today, he has been building up a massive collection of photos. Since 2004, all the photos have been taken digitally, so a good archive has built up over the last 11 years. Prior to that, however, everything in the school was done on 35mm film, and this is where the problem comes in. The photos on colour slides are in a slow race against time to be preserved. All colour slides discolour and fade in time, more so if they are not stored properly. Once the colours are gone (or shifted too badly to rescue), all the unique memories on those pieces of plastic are gone forever. Colour negatives are a bit more stable if stored in the plastic strips they came back from the photo store in. Black and white negatives are probably the most stable of the lot.

At the end of 2014, through a chance discussion with the school librarian, I discovered that there was a batch of slides sitting in a box in one of her cupboards. I asked her if I could take them and start scanning, as we luckily have a Canon scanner that can scan slides and negatives. I did think about having them professionally scanned by a specialised company here in Cape Town, but the price would quickly become prohibitive for the number of slides that need to be converted. As such, I’ve been slowly chipping away at the first box of slides, scanning them at 4800 dpi and saving the resulting large JPEG files. My boss has promised to colour correct and touch up these scans in Lightroom/Photoshop Elements when I am done, after which we can upload the photos to our dedicated Past Pupils page on Facebook.

So far I’ve managed to scan about 165 slides, most of which I’ve had to take out of their holders first, especially the glass-mounted ones. It’s become clear that many of the photos were soft or slightly out of focus when they were originally taken, though it probably wasn’t noticed at the time. Thirty-odd years of age on the film itself doesn’t help either. There’s still a pile of probably about a hundred to go, though I’ve managed to whittle out private slides belonging to my boss and slides that were too far gone to bother rescuing.

With the end of that box in sight, I went back to the library last week looking for anything more. As many slides as there were in the first box, they only cover a small period of the school’s history – three or four years at most, in the 1980s. After some more scratching around and an impromptu spring clean by the librarian, I took possession of another box of slides, as well as dozens of packets of negatives, both colour and monochrome, and some printed photos. Once the initial box of slides is done, I can focus on the negatives. Thankfully, scanning the negatives will be a little less time consuming, for the simple reason that I no longer need to take the film out of holders. I simply mount the strip of four negatives and scan away, which should save about 5 minutes per batch.

The biggest downside of 35mm in today’s digital world is that you cannot share the memories on those pieces of plastic unless you digitise them. Digitised, the photos can be shared online as well as used inside the school for projection during events. Projecting actual slides today isn’t impossible, but getting hold of a slide projector isn’t easy, and the mere act of displaying a slide reduces its lifespan even further due to the heat of the lamp. For archival purposes, having the photos in JPEG format allows the files to be replicated all over the show, avoiding any single point of failure. If the film is damaged or destroyed, there is nothing to fall back on, especially in the case of slides. While JPEG isn’t up to true archival quality or standards, in computing terms it’s probably the closest thing there is. Every consumer operating system since Windows 95 can view the files, which is a good 20-year track record now. That is nowhere near film’s 130+ years of service, but for now, it’s a good enough solution.

Low end laptop pain

In the course of my job, I’ve been asked on occasion to give feedback or a recommendation to staff members regarding the purchase of a personal or family laptop. Unfortunately, due to the ever changing nature of the IT field, the answers I give aren’t always what the person wants to hear.

I have three questions I ask the person before I make any recommendations:

  1. What do you intend to use the laptop for?
  2. What is your approximate budget for the laptop?
  3. How long do you intend to keep the laptop for?

The answer to the first question is usually pretty generic: typing documents, surfing the internet, preparing lessons, doing research, checking email. Using SMART Notebook also comes up now and then. Question 2 usually results in a range of about R4000-R6000 (roughly $380 – $550, though exchange rates make this number fluctuate). Question 3 results in a range of three years up to five or longer.

I often specify a laptop that is slightly over the asker’s budget, with the justification that spending slightly more gets a better quality laptop that lasts longer and is less likely to drive the person up the wall in the long run. Bottom-of-the-line laptops have to cut so many corners that the experience is often highly frustrating. Small amounts of RAM, the lowest end processors, slow mechanical hard drives, low resolution, low quality screens, creaky plastic shells, poor trackpads and more leave a bad taste in the mouth, and that’s just on the hardware side of things. Software wise, the lowest end version of Windows is installed, including the Starter edition in the Windows 7 era. Bundled anti-virus applications, trialware and lots of often bloated, unneeded software is pre-installed by the manufacturer in order to try and recoup costs and eke out some sort of profit.

Over the last few years, I’ve come to be a firm believer in the power of the SSD. With the right drive, it can often seem like you are supercharging a laptop that would otherwise need to be replaced due to age. It won’t completely mask age or low specs, but it comes close. Windows starts faster, applications load quicker, battery life is extended, noise is reduced and the user experience is often improved because there is less of the freezing/lock-up sensation after boot. I don’t know if the drives will ever get as cheap as mechanical hard drives, but I believe that even a SATA3 based drive in most consumer laptops would go a long way to increasing user satisfaction across the board. Unfortunately, marketing still spreads the word that 1TB and larger drives are a good thing to have, when in reality not that many people are going to use all that space on a laptop.

As much as I’ve moaned about low quality laptops in this piece, I am reminded that it’s thanks to the flexibility of Windows that there is such a wide range of devices available at all price points, from the most painful low end devices that are affordable to most people, all the way up to high end ultrabooks that are extremely pricey but have all the bells and whistles. Competition in the industry, plus attrition, has helped weed out some of the smaller or less interested players, and has led to a growing awareness that quality needs to increase in order to stand out against the competition. I can only hope that as time goes on, this trend continues and that the creaky, poor machines of the past become nothing more than a bad memory.

The joy of fixing Windows Update

Most of the time, Windows Update does its thing quietly and without problems. Updates are downloaded, installed and maintained without much fuss. The entire Microsoft/Windows Update channel is one of the most bandwidth intensive services on the internet, especially on “Patch Tuesday.” Again, most of the time things just work, but occasionally they do end up breaking.

Last week I was finally able to clean up a PC in the school that had been having Windows Update issues for months. I hadn’t been able to work on the machine because the librarian uses it heavily, but her absence last week let me tackle the issue head on without being pressured for time or interrupted. The PC didn’t have issues before; then one day I noticed it was being flagged with a red cross in the WSUS management console. Manually installing the offending updates didn’t solve the problem, so I was left with a mystery: how to fix the PC without just nuking it and starting from scratch. What was more interesting is that only a small handful of updates were failing; other updates were being installed successfully.

The first thing I did was run the built-in System File Checker scan, which found and fixed some problems but didn’t resolve the Windows Update errors. After that, I ran Microsoft’s System Update Readiness tool, which scans the Windows servicing store for corruption. It too picked up some errors with missing files, which I replaced. It took a few run-throughs, but eventually the tool no longer reported any errors. Unfortunately, the Windows Update errors didn’t go away. Next I tried the Windows Update troubleshooting tool from Microsoft’s support pages. It took a while to scan the system, found the error code that Windows Update was reporting and also claimed to fix it. However, trying to install the two offending updates still failed.
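For completeness, these are the repair commands involved, run from an elevated prompt. Which readiness check applies depends on the version of Windows on the PC, which I haven’t noted here.

```powershell
# Built-in System File Checker scan.
sfc /scannow

# On Windows 7, the System Update Readiness Tool (CheckSUR) is a separate download from Microsoft.
# On Windows 8/8.1, the equivalent check is built into DISM:
dism /Online /Cleanup-Image /RestoreHealth
```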

Turning to the net, most advice for the 0x8007000D error code indicates that the update installation files themselves have become corrupted or damaged on the hard drive. I thought I might have a sick hard drive on hand, so I ran Seagate’s SeaTools to check that idea out; according to SeaTools, the drive is perfectly healthy. Finally, I followed the other common piece of advice, which was to stop the Windows Update service, rename the C:\Windows\SoftwareDistribution folder to something else and reboot. After the reboot, Windows builds a new folder. I deleted the old folder, which recovered almost 5GB of space, and went back to Windows Update to check. The same two updates were offered as before; I tried them and lo and behold, they downloaded and installed without a hitch. All of a sudden, the PC was back up to date after a good few months of being out of compliance.
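Sketched in PowerShell, that final fix boils down to the following; run it from an elevated session and expect the first check for updates afterwards to take a while as the folder is rebuilt.

```powershell
# Stop the Windows Update service, rename its download cache, then reboot.
Stop-Service wuauserv
Rename-Item C:\Windows\SoftwareDistribution SoftwareDistribution.old
Restart-Computer

# Once updates install cleanly again, the old folder can be deleted to reclaim the space:
# Remove-Item C:\Windows\SoftwareDistribution.old -Recurse -Force
```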

It’s always a good feeling to have a computer back in full working order. While nuking it would probably have saved me many hours, the hassle of rebuilding the librarian’s profile and reinstalling the library management software made the time spent curing the machine worth it. Hopefully from here on out, the machine behaves itself and doesn’t develop issues again down the line.

Bandwidth buffet

For anyone now in their late 20s to early 30s, the sound of a dial-up modem should be a familiar, yet fading, memory. Connecting to the internet more often than not meant listening to those squawking devices and hoping that the connection was clean and free of the noise that would slow it down. As time went on, ADSL arrived, then 3G, then HSDPA, and now VDSL and fibre have arrived too. Slowly but surely, the trickle of data coming down the pipes has become a raging torrent, at least for those who can afford the higher speed options.

Driven by the laws of supply and demand, websites have evolved to become a lot more feature rich and complex than in years gone by. Images have gone from highly compressed GIF and JPEG files to higher resolution JPEG and PNG files. Internet video, once an incredibly frustrating experience involving downloadable clips and QuickTime, Real or Windows Media Player, has largely evolved into slick, easy to use web based players. Of course, video has also crept up from thumbnail-sized resolutions all the way up to today’s 4K. It’s become really simple: the bigger your pipe, the more you are going to drink from the fountain by consuming rich media, online streaming and more. Creating and uploading content is also more viable than ever before, which means symmetrical connections are becoming far more important to the end user than they ever were.

Take the following picture, taken from my school’s firewall logs for 1 January – 30 April 2015:

[Image: firewall traffic statistics, 1 January – 30 April 2015]

That’s a total of 970GB of traffic on one ADSL line, the speed of which has fluctuated during the course of the year due to stability issues. We have another ADSL line which is only used by a few people and some phones and tablets, but I don’t have usage stats for it. Taken all together, though, our school has definitely used over 1TB of data in four months. At this rate, we may end up pushing close to 3TB by year’s end. Also keep in mind that these stats are without any wide scale WiFi available to students. I shudder to think what the numbers will be once we have WiFi going, or if we get a faster upload so that things like Dropbox, OneDrive and so on become viable.

Here’s a second interesting picture as well:

[Image: firewall web cache statistics for the same period]

Of all the web traffic going through the firewall, 82.5% was unique or uncacheable due to its dynamic nature. In earlier days, caching ratios were higher, since websites were less dynamic and had far more static HTML and fewer scripts. That being said, the cache did at least manage to serve about 15% of the total web traffic. Every little bit helps when you are on a slow connection.

In the end, it all goes to show that the more bandwidth you have, the more applications and users are going to end up making use of it. Thankfully, bandwidth prices are much lower than they have ever been, though on some connections the speed is throttled to make sure that the end user doesn’t gorge themselves to the detriment of other users.


The Clean Windows PC experience


Microsoft Windows is an amazing piece of software. It powers an incredibly wide range of hardware and runs on wildly different system specifications. One person may have a bargain basement Celeron or Pentium laptop, while another is running a fully tricked out Core i7 beast – Windows covers it all. With multiple OEMs making products, the consumer is generally spoiled for choice across a wide range of price points. The downside, however, is that Windows has often been associated with a race to the bottom of the barrel, while Apple, for example, refuses to go below a certain price point and, rightly or wrongly, maintains a prestigious, upmarket image.

Part of the race to the bottom means that profits for OEMs are razor thin. Make a mistake and your competitors are going to pounce. Fail to keep up and likewise. Fail to cut down on costs and you risk going bust. Consumers have sometimes been the victims of this fierce competition. Laptops are built with creaky plastic that doesn’t always sit flush, screen resolutions haven’t increased in years, mechanical hard drives are still king, confusingly similar models leave people unsure of the differences between them, the amount of RAM is just enough to get by and cheap Realtek network and audio solutions are used. On the software side of things, OEMs take money from anti-virus vendors to preload their wares onto the computers. Throw in trial CD/DVD burning suites, vendor backup programs and other useless vendor software, and you are left with a horrible laptop/desktop experience. Users don’t love Windows, they just tolerate it.

The hardware issue is tricky, since fixing it depends on economies of scale. An SSD, for example, would greatly improve people’s experience with their computers, but a 250GB drive still costs much more than a 1TB mechanical drive. Screen resolution in laptops is slowly starting to move forward again, but it will take time. Trackpads are also finally starting to improve, but it’s still hit and miss. With desktops, it’s really become about trying to cut down on size as much as possible and go small.

The software side of things is where the most immediate improvement can be made. If OEMs followed Microsoft’s Signature Edition experience, I think many a customer would be happy. Instead of Windows being loaded down with bloatware, trials and other software, it would come clean out of the box, with a few minimal applications installed – Flash, Adobe Reader, Skype and Microsoft Security Essentials (for Windows 7). For Windows 8 based machines, OEMs should make sure that devices ship with Windows 8.1 at a minimum, and ideally with Update 1 installed as well, which improves the experience on traditional laptops/desktops. OEMs should strive to keep their images as up to date as possible, so that the end user isn’t downloading a few GB worth of updates after their first boot. There’s nothing worse than powering up and watching Windows Update tear through a few GB worth of bandwidth as it pulls down patches.

Lastly, hardware in the computer should not require a vendor application to be installed just to get its driver. I’ve had this problem with Lenovo and Samsung laptops, where in order to get rid of an outstanding entry in Device Manager, I’ve had to install one of the Samsung/Lenovo utilities. Often these utilities don’t work well and just add frustration for the end user.

Famed Windows blogger Paul Thurrott has a few articles up where he goes right back to basics and does completely clean installs of Windows on some of his devices. As he notes, it’s sometimes the only way to truly be rid of all the bloatware OEM’s like to install. Included are steps on how to legally download clean ISO images you can burn to disk or USB stick for a clean install of Windows. You can find his articles here, here, here, here and here.

SMART Board and USB port fun


Over the last two weeks, we’ve slowly been ramping up our classroom computer swap program at work. Six-year-old Core 2 based computers with horrid chassis and power supplies are coming out, being replaced with first gen Core i3 boxes that are quieter, smaller and faster. However, a recent event almost threatened to derail the project.

I placed one of the replacement computers in a class, set it up as per usual and all was going well. After rebooting, however, I noticed that the SMART Board (model SB-680) was not behaving properly: the board would either vanish just before the computer had fully booted into Windows, or it would constantly reset and be basically unusable. Changing USB ports did not help at all; it only gave a temporary fix that lasted until the next boot.

I got the reseller of the board involved to do deeper technical diagnostics, though in honesty it was more a case of handing the problem over to someone else. This past Friday afternoon they arrived and we started a long troubleshooting process. The board was hooked up to the technician’s laptop and, after some time, it settled down and behaved normally. We then swapped out the controller card, swapped out the board itself and tested on the replacement PC. All to no avail; the problem kept coming back. We even tried a new USB booster cable and USB cable, with the same result.

In desperation, I went into the BIOS to change the USB settings on the newer classroom motherboards. The Intel DQ57TM motherboards had been running completely fine for the last four years in both our computer labs, so I couldn’t understand why they would give issues now. They are all flashed to the latest firmware Intel offers, so there would be no fix that way. It turns out that one simple BIOS setting may have caused the issue.

When I set up the computer via network boot and install using the Microsoft Deployment Toolkit, I had to set the USB Backward Compatibility option to Disabled in the BIOS, as the keyboard and mouse were non-functional in Windows PE. After the install process was over, I didn’t bother to change the setting back, since I didn’t believe it would affect anything. Suffice it to say, enabling the option caused Windows to install a whole bunch of extra USB root hubs and related devices after the reboot, and in turn the SMART Board started behaving properly. Our reseller’s technician learned something new, as did I. Now I know that I must make sure the setting goes back to Enabled before a computer is installed in a classroom, so that these headaches can be avoided.
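If you want to confirm what Windows actually sees, a quick before-and-after comparison of the USB controllers and hubs does the trick. This is just a generic WMI query, nothing SMART Board specific.

```powershell
# List the USB controllers and hubs Windows has enumerated, to compare
# before and after toggling the BIOS "USB Backward Compatibility" option.
Get-WmiObject Win32_USBController | Select-Object Name, Status
Get-WmiObject Win32_USBHub | Select-Object Name, Status
```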

The truly bizarre thing, however, is that the problem only seems to be triggered if the SMART Board is hooked up to the computer via a USB booster extension cable. If the board is close enough to the computer desk and doesn’t need the booster extension, it seems to work fine with the setting at Disabled. I have two classrooms where that is the case, and neither has reported issues with their boards since the school year started.

Another quirky problem to add to the knowledge base of fun when it comes to SMART Boards.
