Keeping Adobe Flash Player updated on a network

The Adobe Flash Player plugin is a pain in the arse. It’s a security nightmare, with more holes in the codebase than Swiss cheese. It seems every other week Flash makes the headlines when some security vulnerability or another is discovered and exploited. Cue the groans from network admins and users around the world as Flash has to be updated *yet* again. Unfortunately, one can’t quite get rid of it permanently just yet, as too many websites still rely on it. While you could get away with not using it at home, in a school where multiple people use a computer and visit different websites, one doesn’t really have much choice but to make sure Flash is installed.

On Windows 7 and below, the situation with Flash is a bit crazy. There’s a version for Internet Explorer (ActiveX plugin), a version for Firefox that is installed separately and Google Chrome bundles its own version – I’m not sure about smaller or niche browsers, but I think modern Opera inherits Flash via its relationship with Chrome’s engine. Thankfully with Windows 8 and above, Flash for Internet Explorer is distributed via Windows Update. It’s automatic and contains no 3rd party advertisements, anti-virus offers, browser bundling etc – all things Adobe have done in the past with their Flash installers. Trying to install Flash from Adobe’s website on Windows 8 and above will fail, which at least may help to kill off the fake Flash installer routine used by malware authors to trick unsuspecting users.

The usual method of installing Flash is highly cumbersome if you run a large network – not to mention that EXE files are much less flexible than MSI files when it comes to deployment and silent install options. Thankfully Adobe do make Flash Player in MSI format, but it’s not easy to get hold of directly: you have to sign a free enterprise deployment licence agreement to be able to legally distribute Flash and Reader in your organisation. The problem then becomes how to distribute the updates, especially if you aren’t running System Center or a similar product. Enter WSUS Package Publisher (WPP), indispensable if you make use of WSUS on your network.
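
To illustrate the difference: an MSI can be installed silently with nothing more than standard msiexec switches, whereas every vendor’s EXE installer has its own (often undocumented) flags. A sketch, with a hypothetical filename:

    msiexec /i install_flash_player.msi /qn /norestart   # /qn = no UI, /norestart = suppress reboot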

WPP allows you to use the enterprise update catalogs that Adobe and some other vendors offer. Using this, you essentially push the updates into your existing WSUS infrastructure, where they are delivered to the client computers like any other update. One thing you need to do is tweak each update as you publish it, so that it isn’t applicable to computers running Windows 8 upwards – if you don’t do this, the update will download on newer Windows versions, but will repeatedly fail to install and will need to be hidden. The other fix I’ve discovered is that the silent install command line switch needs to be deleted: when an MSI file is delivered via WSUS, it is automatically installed silently. I discovered this the hard way, since one of the Flash updates I imported was failing to install on every computer. Turning on MSI logging and searching for the error code eventually led me to what was wrong, after which I corrected the problem and now know what to do with every new update that comes out for Flash.
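
For anyone hunting a similar failure, this is roughly how I got at the MSI log: enable Windows Installer’s verbose logging policy (a documented registry value), reproduce the failure, then search the MSI*.log files in C:\Windows\Temp (WSUS installs run as SYSTEM) for “return value 3”. A sketch, run from an elevated prompt:

    # Enable verbose Windows Installer logging machine-wide ('voicewarmupx' = all logging flags)
    reg add HKLM\SOFTWARE\Policies\Microsoft\Windows\Installer /v Logging /t REG_SZ /d voicewarmupx /f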

Since using WPP, I’ve felt happier about the safety of my network, as I can usually get Flash pushed out within 2-3 days of the initial download. This is far better than having to visit each computer manually and keeping Flash up to date that way!

Curating digital photos

The rise of digital photography over the last 15 or so years has had many side effects. The most obvious is that analog film cameras have largely, though not completely, vanished. No longer do you have to pay and wait for processing, hoping that your photos came out or that you loaded the correct speed film. Digital gives you instant feedback, either on the camera or once the files are transferred onto your computer.

My school is 59 years old as of this post. We have an archive of digital photos reaching back to the year 2000. The previous 43 years of school history is on film, but sadly most of whatever was taken was either lost or destroyed as the years went by. It seems there wasn’t much effort to archive and protect the slides and negatives at the time. While some slides and negatives survive, a large portion of the school’s history may be irredeemably lost. This is a great pity, as what I have found, scanned and posted online has brought many happy memories back to people, most of whom may never have seen those photos before.

Digital photos are far easier to store, back up and replicate to more than one location, which gives a huge amount of protection over analog slides and negatives. However, the increase in megapixels and sensor quality over the years, combined with larger memory cards, has led to an unforeseen consequence: we have exponentially more photos now than we have ever had before. It’s so easy to take hundreds of shots of an event now compared to the days of film, when you were limited by how many spools you had and how much you were prepared to pay for developing and printing. Not only that, but since digital is now so ubiquitous, more people can contribute photos than ever before.

To give an example of this point: imagine a school sports day. Lots of activities all over the show that one photographer can’t cover on their own. Now imagine that there are 5-10 students taking photos as well, covering all areas. Say each person takes 250 photos, and suddenly you can end up with a total of 1500-2750 photos from one event – and that’s using a conservative figure. Obviously not all of these photos are going to be useful, which is where the time-consuming art of weeding out the bad photos comes in. Most amateur student photographers I’ve spoken to never take the time to actually curate their photos. In fact, most staff members who have taken photos of school events haven’t done so either. It’s too easy to simply dump the whole contents of a memory card into a folder on the server and leave it there. This is what has happened with our digital archives over the years, to the point where we have something like 138000 files taking up over 480GB of space on our photos network share.

That number was a lot higher before I decided to take on the task of curating and cleaning up the mess the share had become. Not all of the files on the drive were photos: I’ve deleted a number of Photoshop PSDs, PowerPoint presentations, AVI and MP4 movie clips and other odds and ends, as well as a huge number of duplicates. Last week I brought home a fresh copy of all the files on the drive and imported it into Adobe Lightroom. It took a long time, but Lightroom counted something like 128000-odd photos. I’m not sure about the discrepancy between that figure and what Windows Explorer tells me, but I suspect there were more duplicates that Lightroom ignored on import.

Now with the power of Lightroom, I’ve been able to really start going through some of the albums. I’ve curated 5 sub folders now, rejecting and deleting up to half of the photos in each case. Factors I look for when deleting photos include the following:

  • Focus. My most important metric, really. 99% of the time, I’m going to delete out-of-focus photos.
  • Resolution. Photos of 640×480 or smaller are of no real use to us, even as an archive. I made the call to delete these, even if they are the only record of an event.
  • Motion blur. Too much of this ruins the photo. It usually occurs because the shutter speed was too slow, and it leads to a strange-looking photo.
  • Framing. Things like cut off heads, people too distant, people partially in the edges of photos and so forth usually end up being binned.
  • Damaged files. Caused by bit-rot or by the camera/memory card being faulty, these are tossed.
  • Noise. Too much digital noise due to high ISO speeds or older sensors leads to very unpleasant, soft and noisy photos. I rescue where I can, but otherwise these too are binned.
  • RAW files. RAW files are fantastic for many things, but as part of an archive they are problematic. Every camera manufacturer has their own RAW format, which doesn’t always open well in 3rd party software. The alternative DNG format as created by Adobe is an option, but unless you take extra steps, they aren’t easily viewable. By contrast, JPEG files are universal and can be opened on just about any platform in existence.
  • Severe over- or under-exposure. Files that are badly exposed in either direction are usually useless, especially if they are JPEGs straight out of the camera.
  • Near-identical photos. When you shoot in burst mode, you’ll often end up with many photos that are near identical, with only small variations between frames. I usually pick the best 1 or 2 of the lot and delete the rest. This is especially true of sports/action shots.

I still have an incredibly long way to go. I’ve deleted well over 20000 files by now, but a mountain is still in front of me. Of course, as 2016 goes on and more photos get added to the 2016 archive, that mountain is only going to grow. Still, I’ve made a start and I am happy with what I’ve done so far. Thanks to the process, I’ve been able to upload many albums of decent, usable photos to our Facebook pages so that pupils, parents and staff can view, tag, share and download them.

In closing, I would suggest that anyone who enjoys their photo collection take the time to properly curate it. It isn’t always easy to delete photos, especially if one is the only record of a special event or person. However, unless you learn to be decisive, the collection will eventually grow to the point of overwhelming you. Take time to savour the quality, not the quantity.

Updating Windows at the source

Since the release of Windows Vista, Windows has been installed from a compressed image file, known as a WIM file. This is what allows Microsoft to ship one disk containing the home and other editions of Windows, unlike the multiple disks of the XP era. What makes a WIM file even more useful is that it can be mounted inside a running copy of Windows and have patches and drivers injected directly into the image. This is extremely handy when you realise that Windows 7 has been out for almost 6 years now and has a couple of hundred patches to its name. Anything that cuts down the wait for updates to install is a good thing, as is having a more secure system out of the box.

There are a couple of limitations however:

  1. You can’t inject all the update patches offline. Certain updates can only be installed when Windows is running.
  2. .NET Framework updates cannot be injected offline. These will need to be installed and patched after Windows is up and running.
  3. You can only inject patches if they are in CAB or MSU format. EXE files are not usable here.

To update Windows 7 (or 8 or Server editions for that matter) you will need the following:

  • Windows 7 media or ISO file. I don’t have access to OEM disks so cannot say if those can be updated. What you really need is the install.wim file, found in the \Sources directory on the disk. It’s the single biggest file on the disk.
  • Windows 7 Automated Installation Kit or the later Windows 8.1 Assessment and Deployment Kit. You need this for DISM, the tool which services the WIM file.
  • Access to the updates for Windows 7. There are many ways to get these, but I have found that looking in the C:\Windows\SoftwareDistribution\Download folder on a fully patched machine is one of the better ways to get the updates (see the example after this list). Other tools have had mixed success for me.
  • Hard drive space and patience. Injecting updates, committing the changes to the WIM file and optionally recreating the ISO file will take time.
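
If you go the SoftwareDistribution route, something along these lines will harvest the usable files – a sketch assuming default paths, since the layout of the download cache varies from machine to machine:

    # Copy only the offline-serviceable formats (CAB and MSU) out of the update cache
    robocopy C:\Windows\SoftwareDistribution\Download C:\Updates *.cab *.msu /S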

Here’s my step by step guide on how to do this update procedure. A note before I begin, however: my guide is a little longer than strictly necessary. If you have access to ISO editing software, you could just replace the install.wim file and be done. However, I am going to include the steps to rebuild an ISO image, including the option to make it UEFI boot compatible.

Updating Windows

  1. Make 3 folders on a hard drive. For example C:\Win7, C:\Updates and C:\Mount.
  2. Copy the install.wim file from your ISO or DVD to C:\Win7.
  3. Install the Windows 7 AIK or Windows 8.1 ADK. Specifically, we are looking for the Deployment Tools option. We don’t need the rest for this process.
  4. Place all the updates for Windows 7 into the C:\Updates folder.
  5. Open up the “Deployment and Imaging Tools Environment” shortcut as an Administrator. The DISM commands will only run with Admin approval.
  6. Run the command dism /get-wiminfo /wimfile:C:\Win7\install.wim
    This will tell us about the various Windows editions present in the WIM file. Depending on the disk, it may include multiple editions or only one. Take note of the index number which corresponds to the edition of Windows you want to update; we will use it in the next command.
  7. dism /mount-wim /wimfile:C:\Win7\install.wim /index:X /mountdir:C:\Mount (replace X with the index number from step 6). DISM will mount that edition of the image at the C:\Mount folder.
  8. dism /image:C:\Mount /add-package /packagepath:C:\Updates
    DISM will now take all the MSU and CAB files it finds in the C:\Updates directory and apply them to the mounted image. This will take some time, so feel free to take a break. Some updates may cause an error; these updates are only meant to be installed when Windows is running. You will need to find out which updates caused the error and remove them – adding the packages one at a time makes the culprit easier to spot (see the sketch after this list). Type dism /unmount-wim /mountdir:C:\Mount /discard to discard all the changes and follow steps 7 & 8 again until the process is error free.
  9. dism /unmount-wim /mountdir:C:\Mount /commit
    This will commit the changes, save and unmount the WIM file.
  10.   If you want to update another edition of Windows 7, go back to step 7 and use another index number. Go through steps 7-9 again for all editions you want to update.
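
As promised, here’s a sketch of step 8 done package by package, so that a failing update is easy to identify from the console output. It assumes the folder layout above and an elevated PowerShell prompt with dism.exe on the path:

    # Apply each CAB/MSU individually; an error then points at the offending file by name
    Get-ChildItem C:\Updates -Include *.cab,*.msu -Recurse | ForEach-Object {
        Write-Host "Adding $($_.Name)"
        dism /image:C:\Mount /add-package /packagepath:"$($_.FullName)"
    }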

Building the new ISO for Windows 7

If you are planning to use the updated WIM file with Microsoft Deployment Toolkit, you are good to go and can use the updated install.wim file in conjunction with the rest of the Windows setup files. Otherwise, you’ll need to create a new ISO image that can be used virtually, burned to DVD or used on a USB flash drive for install purposes.

For this step, first copy all the files from your Windows 7 DVD or ISO into the C:\Win7 directory, but leave out the old install.wim file so that your freshly updated one is used – otherwise you will have wasted your time.

Then open up the “Deployment and Imaging Tools Environment” shortcut as an Administrator again and run the command below to make an ISO file that can boot on traditional BIOS based systems or on UEFI systems. On the most modern UEFI systems, make sure Secure Boot is disabled before you install Windows 7, as it is not Secure Boot capable.

oscdimg.exe -u2 -udfver102 -bootdata:2#p0,e,bC:\Win7\boot\etfsboot.com#pEF,e,bC:\Win7\efi\microsoft\boot\efisys.bin -o -lVOLUME_LABEL C:\Win7 C:\Win7.iso

Replace VOLUME_LABEL with something of your choice. Note that the output ISO is written to C:\Win7.iso, outside the C:\Win7 source folder, so that oscdimg doesn’t try to capture its own output file.

You can now burn the ISO file to DVD, use it on a flash drive or as an ISO with any VM software.

I have not tried this procedure with Windows 8.x, but I believe it should work the same way, as the layout of the relevant files and folders is near identical.

The Windows 10 upgrade experience

On Wednesday 29 July 2015, a new chapter opened in the history of Microsoft’s Windows. Windows 10 was unleashed on the world: Microsoft’s attempt to remedy the largely cool reaction to Windows 8, as well as to stay relevant (at least in the eyes of a lot of tech bloggers) in the new app-centric world. The return of the Start Menu, an unprecedented public participation process via the Windows Insider program, free upgrades for a year, DirectX 12 for gamers and many more features all combined to build up a hype that has not been seen for a long time in the Windows world.

Like millions of other people, I reserved my copy of the Windows 10 upgrade via the app in the system tray that appeared after a round of Windows Updates a few months back. The idea was that this application would trickle download Windows 10 in the background as soon as the system went RTM, so that on launch day you’d be ready to upgrade immediately. The only problem was that the trickle download only started 1-2 days before the launch of Windows 10, which meant that with my slow ADSL speed it would be some time before I was ready to go – never mind the chance that I’d be in one of the later waves of the rollout. This is probably because build 10240 only went RTM 2 weeks before the global rollout. Either way, I was impatient to get going.

Thankfully Microsoft thought ahead and made ISO images available for direct download or via the Windows 10 Media Creation Tool. I snagged a copy of the Media Creation Tool and used it to download a copy of Windows 10 at work, where I have access to a faster internet connection. Once the tool had built the ISO file, I burned it to DVD for 3 other staff members who were interested. It’s legal to do this, by the way, since each person has their own upgrade key generated during the upgrade process. For myself, I used the excellent Rufus utility to write the image to a flash drive. Although the Media Creation Tool can write to a flash drive itself, I’ve come to trust Rufus, especially thanks to its ability to create properly booting UEFI capable media.

Once at home, I simply inserted the flash drive, double clicked on setup.exe and let the upgrade process run. I had previously been running Windows 8.1 with all updates applied. The installation ran smoothly and took about half an hour to move all my files, upgrade itself and get to the desktop. All of my software remained installed and I haven’t yet had any compatibility issues software-wise. I did have some issues with my PC’s built-in Bluetooth adapter, but a couple of hours after the upgrade, a driver had been installed in the background and the adapter was good to go again. After the upgrade, I did manually install Nvidia’s latest graphics driver, since I already had it downloaded and couldn’t wait for Windows Update to deliver it.

So far, I mostly like Windows 10. It’s been stable despite the upgrade: no blue screens or crashes. As mentioned, all my software has remained working without issue. Speed-wise it feels a little faster than Windows 8.1, but not by much; the difference may be more noticeable for users coming from Windows 7 or earlier. My biggest real gripe at the moment with Windows 10 is the severe regression in the OneDrive client, a much-complained-about topic on the net. Windows 8 and 8.1 spoiled me in that regard with placeholder sync, which let me see the files that were on my OneDrive without actually needing to download them. The Windows 10 version basically takes us back to the Windows 7 client, where you have to choose which folders and files to sync, which then chew up space on your hard drive. I am not happy at all with this change, but I am holding out hope that the new client due by the end of the year will offer a better experience.

One small note: my copy of Windows 10 wouldn’t activate until a day after the install. While I kept thinking that somehow it was related to my Windows 8.1 key, it was simply a case of the fact that the activation servers were getting hammered into oblivion. Over 14 million people upgraded in the first 24 hours, so I am not surprised that I struggled to activate. I am assuming that now, almost 2 weeks later, activation should be happening immediately as per normal again.

It’s been a common refrain in the reviews I’ve seen on the net that if there’s one thing Windows 10 needs, it’s more polish. Lots of little fit and finish issues keep cropping up as older legacy parts of Windows are moved into the modern framework. Different right-click menus, a Settings app that isn’t quite Control Panel yet, out of place icons etc. all need some time and attention before Windows 10 becomes its own unique system. With the promise of Windows as a Service, it’s likely that many of these issues will go away with time as the system keeps being updated and improved. One thing is for sure: it’s going to be an interesting ride indeed.

The long hunt for a cure

At the end of March 2014, our school took ownership of a new Intel 2600GZ server to replace our previous HP ML350 G5, which was the heart of our network. The HP had done a fantastic job over the years, but was rapidly starting to age and wasn’t officially supported by Windows Server 2012 R2. Our new server has 32GB of RAM, dual Xeon processors, dual power supplies, 4 network ports and a dedicated remote management card. Although a little pricier than I had originally budgeted for, it matched what the HP had and would earn its keep over the next 5-7 years’ worth of service.

After racking and powering up the server, I installed firmware updates and then Server 2012 R2. The install was quicker than on any other server I’ve done in the past, thanks to the SSD boot volume. After going through all the driver installs, Windows Updates and so on, the server was almost ready to start serving. One of the last things I did was to bond all 4 network ports together to create a network team. My thinking was that a 4Gb/s team would prevent any bottlenecks to the server under heavy load, as well as provide redundancy should a cable or switch port go faulty. A good idea in theory, but in reality I’ve never had a cable or port in the server room go bad in 6+ years.
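
For the record, the teaming setup on Server 2012 R2 boils down to a single PowerShell cmdlet – a sketch with hypothetical NIC names; the switch ports have to be configured as a matching LACP channel:

    # Bond the four ports into one LACP team (requires matching switch-side configuration)
    New-NetLbfoTeam -Name "ServerTeam" -TeamMembers "NIC1","NIC2","NIC3","NIC4" -TeamingMode Lacp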

Looking back now, I’m not sure exactly why I bothered creating a team. While the server is heavily used as a domain controller, DHCP, DNS and file server, it never comes close to saturating 1Gb/s, let alone 4. Almost every computer in the school is still connected at 100Mb/s, so the server itself never really comes under too much strain.

Either way, once everything was set up, I proceeded to copy all the files across from the old HP to the new Intel server. I used Robocopy to bulk move the files, and in some cases needed to let the process finish overnight since there were so many files, especially lots of small ones. Data deduplication was turned on, shares were shared and everything looked good to go.
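
The copies were along these lines – a sketch with hypothetical paths; /COPYALL preserves NTFS permissions and ownership, and multi-threading helps with the mass of small files:

    # Mirror a share from the old server, keeping security info; log the run for review
    robocopy \\OLDSERVER\Photos D:\Shares\Photos /MIR /COPYALL /MT:16 /R:2 /W:5 /LOG:C:\Logs\photos.log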

When school resumed after the holidays, the biggest problem came to light right on the first morning: users being unable to simultaneously access Office files. We have a PowerPoint slideshow that is run every morning in the register period that has all the daily notices for meetings, events, reminders, detention etc. Prior to the move, this system worked without fault for many years. After the move, the moment the 2nd or 3rd teacher tried to access the slideshow, they would get this result:

[Photo: the “Downloading” dialog box that appeared in Windows Explorer]
A green bar of doom would crawl across the navigation pane, while this odd Downloading box would appear, take forever to do anything and tend to lock Explorer up. Complaints naturally came in thick and fast, and the worst part was that I couldn’t pinpoint what the issue was, aside from my suspicion that the new SMB3 protocol was to blame. I had hoped that the big Update 1 release that shipped for Windows 8.1 and Server 2012 R2 would help, but it didn’t. Disabling SMB signing didn’t help either. At one point, my colleague and I even installed Windows 8.1 and Office 2013 on some test machines to try and rule out that possibility, but they ended up doing the same thing. As a stop-gap measure, I made a dedicated Notices drive on the old HP, which was still running Server 2008 and which handled concurrent access to the same file without a problem. Online forums weren’t any real help, and none of the other admins in Cape Town I spoke to had encountered the problem either.

In the school holidays just gone by, we finally had a decent gap between other jobs to experiment on the new server and see if we could correct the problem. I broke the network team, unplugged 3 of the 4 cables and disabled LACP on the switch. After reassigning the correct IP to the now single network port, we ran some tests opening files on 2 and then 3 computers at the same time. We opened 15MB Word documents, 5MB complicated Excel files, 200MB video files and more. The Downloading box never showed up once. Unfortunately, without heavier real world testing by the staff, I don’t know if the problem has been resolved once and for all. I intend to move the Notices drive over during the next school holiday and we will see what happens after that.
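
Undoing the team was equally short – again a sketch with hypothetical names and addresses; the LACP channel on the switch was disabled separately:

    # Dissolve the team, then put the server's static IP back on one physical port
    Remove-NetLbfoTeam -Name "ServerTeam" -Confirm:$false
    New-NetIPAddress -InterfaceAlias "Ethernet 1" -IPAddress 10.0.0.5 -PrefixLength 24 -DefaultGateway 10.0.0.1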

Chalk one up for strange issues that are almost impossible to hunt down.

Saving old memories

The school I work at will be 60 years old in 2017 – a pretty decent milestone for a school, though there are many older schools here in Cape Town. As with any institution that has survived this long, there are bound to be many old photos of events in years gone by. A school is a very busy place with hundreds of events each year: sports matches, outings, camps, tours domestically and/or internationally, dramatic presentations, musicals, concerts, prizegivings and more all lead to many potential photographic opportunities.

Unfortunately, for the last 30 years or so, the school has largely relied on one person to take photos and keep a visual record of the school: my direct boss. From when he arrived in the mid-1980s through to today, he has been building up a massive collection of photos. Since 2004, all the photos have been taken digitally, so a good archive has built up over the last 11 years. However, prior to that, everything in the school was shot on 35mm film, and this is where the problem comes in. All the photos on colour slides are in a slow race against time to be preserved. Colour slides will discolour and fade in time, more so if they are not stored properly. Once the colours are gone (or have shifted too badly to rescue), all the unique memories on those pieces of plastic are gone forever. Colour negatives are a bit more stable if stored in the plastic strips they came back from the photo store in. Black and white negatives are probably the most stable of the lot.

At the end of 2014, through a chance discussion with the school librarian, I discovered that there was a batch of slides sitting in a box in one of her cupboards. I asked her if I could take them to start scanning them, as we luckily have a Canon scanner that can scan slides and negatives. I had thought about having them professionally scanned by a specialised company here in Cape Town, but the price would quickly have become prohibitive for the number of slides that needed converting. As such, I’ve been slowly chipping away at the first box of slides, scanning them at 4800 dpi and saving each as a large JPEG file. My boss has promised to colour correct and touch up these scans in Lightroom/Photoshop Elements when I am done, after which we can upload the photos to our dedicated Past Pupils page on Facebook.

So far I’ve managed to scan about 165 slides, most of which I’ve taken out of their holders to do so, especially the glass-mounted ones. It’s become clear that many of the photos were soft or slightly out of focus when they were originally taken, though it probably wasn’t noticed at the time; 30-odd years of age on the film itself doesn’t help either. There’s still a pile of probably about a hundred to go, though I’ve managed to weed out private slides belonging to my boss, as well as slides that were too far gone to bother rescuing.

With the end of that box in sight, I went back to the library last week looking for anything more. As many slides as there were in the first box, they only cover a small period of the school’s history – 3 or 4 years at most, in the 1980s. After some more scratching around and an impromptu spring clean by the librarian, I took possession of another box of slides, as well as dozens of packets of negatives, both colour and monochrome, and some printed photos. Once the initial box of slides is done, I can focus on the negatives. Thankfully, scanning the negatives will be a little less time consuming, for the simple reason that I no longer need to take the film out of holders: I simply mount the strip of 4 negatives and scan away, saving an estimated 5 minutes per batch.

The biggest downside of 35mm in today’s digital world is that you cannot share the memories on those pieces of plastic unless you digitise them. Digitised, the photos can be shared online as well as used inside the school for projection during events. Projecting actual slides today isn’t impossible, but getting hold of a slide projector isn’t easy, not to mention that the mere act of displaying a slide reduces its lifespan even further due to the heat of the lamp. For archival purposes, having the photos in JPEG format allows the files to be replicated all over the show, avoiding any single point of failure. If the film is damaged or destroyed, there is nothing to fall back on, especially in the case of slides. While JPEG isn’t up to true archival quality or standards, in computing terms it’s probably the closest thing there is: every consumer operating system since Windows 95 can view the files, which is a good 20 year track record now. It’s of course nowhere near film’s 130+ years of service, but for now, it’s a good enough solution.

Low end laptop pain

In the course of my job, I’ve been asked on occasion to give feedback or a recommendation to staff members regarding the purchase of a personal or family laptop. Unfortunately, due to the ever changing nature of the IT field, the answers I give aren’t always what the person wants to hear.

I ask three questions before I make any recommendations:

  1. What do you intend to use the laptop for?
  2. What is your approximate budget for the laptop?
  3. How long do you intend to keep the laptop for?

The answer to the first question is usually pretty generic: typing documents, surfing the internet, preparing lessons, doing research, checking email. Using SMART Notebook also comes up now and then. Question 2 usually results in a range of about R4000-R6000 (roughly $380-$550, though exchange rates make this number fluctuate). Question 3 results in a range of 3 years up to 5 or longer.

I often specify a laptop that is slightly over the asker’s budget, with the justification that spending slightly more buys a better quality laptop that lasts longer and is less likely to drive the person up the wall in the long run. Bottom of the range laptops have to cut so many corners that the experience is often highly frustrating. Low amounts of RAM, the lowest end processors, slow mechanical hard drives, low resolution low quality screens, creaky plastic shells, poor trackpads and more leave a bad taste in the mouth – and that’s just on the hardware side of things. Software-wise, the lowest end version of Windows is installed, including the Starter edition in the Windows 7 era. Bundled anti-virus applications, trialware and lots of often bloated, unneeded software is pre-installed by the manufacturer in order to try and recoup costs and eke out some sort of profit.

Over the last few years, I’ve become a firm believer in the power of the SSD. With the right drive, it can often seem like you’re supercharging a laptop that would otherwise need to be replaced due to age. It won’t completely mask age or low specs, but it comes close. Windows starts faster, applications load quicker, battery life is extended, noise is reduced and the user experience is often improved because you get less of that freezing/lockup sensation after boot. I don’t know if the drives will ever get as cheap as mechanical hard drives, but I believe that even a SATA3 drive in most consumer laptops would go a long way towards increasing user satisfaction across the board. Unfortunately, marketing still spreads the word that 1TB and larger drives are a good thing to have, when in reality not many people are going to use all that space on a laptop.

As much as I’ve moaned about low quality laptops in this piece, I am reminded that it’s thanks to the flexibility of Windows that there is such a wide range of devices available at all price points: from the most painful low end devices that are affordable to most people, all the way up to high end ultrabooks that are extremely pricey but have all the bells and whistles. Competition in the industry, plus attrition, has helped to weed out some of the smaller or less committed players, and has led to a growing awareness that quality needs to increase in order to stand out. I can only hope that as time goes on, this trend continues and that the creaky, poor machines of the past become nothing more than a bad memory.
