
Lessons learned from migrating to Office 365

May 30, 2017

My migration of staff email accounts from our onsite Exchange Server to Office 365 continues as I write this, though now at a somewhat quicker pace. With just under 50 mailboxes left to move, I should be done by the end of this school term. So far the move has been mostly trouble free, with no email being lost. There have been some small incidents that have helped to shape future mailbox moves and have provided valuable lessons. In no particular order, here's some of what I've learnt along the way:

  • If you plan to migrate your users' existing mailboxes up to the cloud, you absolutely need a fast internet connection. 20 Mbps minimum in both directions, but the faster the better.
  • If possible, get your users to perform mail cleanups before you move their mailboxes. The fewer items in a mailbox, the less time it takes to move it into the cloud. There's also less clutter for users after the move, which usually makes people happy, since less clutter is always a good thing.
  • If you are doing a staged migration, try to move as many mailboxes as you can per batch, so that you don’t draw the process out too long. The longer you run two systems, the more risk of something breaking or going wrong along the way.
  • Watch out for user accounts that have been renamed, i.e. people with surname changes. If this isn't cleaned up properly before being synced to the cloud, it can come back to bite you in the ass. Cue frantic searching and entering arcane commands into PowerShell.
  • Users don’t always appreciate or use manuals you may have written. Write a manual anyway, so that you’ve covered your ass.
  • Mailbox moves often don’t happen as fast as you think they should. Budget extra time for a large move.
  • Modern Outlook Web App is a really nice mail client. Light years ahead of the Exchange 2007 version, obviously.
  • Use Office 2016 for fixed desktop users to connect to Exchange where possible. Previous versions aren't going to get the same attention and support from Microsoft in case of trouble.
  • Office 2016 perpetual (i.e. the volume-licensed version that uses the MSI installer) won't get feature updates over its lifespan. This means no new and cool features like Focused Inbox.
  • Some programs that interface with Outlook don’t like the 64 bit version of Office.
  • Direct users to the stand alone Outlook apps on Android and iOS. The built in mail client should connect without too much hassle, but Android and Exchange have always had a slightly rocky relationship in my view.
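
To get a feel for why connection speed and mailbox size matter so much, here's a rough back-of-the-envelope calculator. It's only a sketch: the 50% efficiency factor is my own assumption to account for protocol overhead, Office 365 throttling and other traffic on the line, not a measured figure.

```python
def estimate_move_hours(mailbox_gb, link_mbps, efficiency=0.5):
    """Rough estimate of how long a mailbox move takes.

    mailbox_gb: mailbox size in gigabytes
    link_mbps: upload bandwidth in megabits per second
    efficiency: assumed fraction of the link the move actually achieves
    """
    megabits = mailbox_gb * 1024 * 8            # GB -> megabits
    seconds = megabits / (link_mbps * efficiency)
    return seconds / 3600

# A 10 GB mailbox over a 20 Mbps uplink at 50% efficiency:
print(round(estimate_move_hours(10, 20), 1))    # → 2.3
```

Multiply that by a few hundred mailboxes and it's obvious why a slow line, or a handful of giant mailboxes, can drag a staged migration out for weeks.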

I’m in the process of moving the last giant mailboxes over in the coming week. Once that’s done, the pace of migration should pick up as I move users with more “normal” sized mailboxes over. Once everyone has moved, it’s a case of testing to make sure everything is OK, then changing MX records to cut over to direct email delivery to the cloud, cutting out mail coming onsite and then back out again.

Those WTF moments

October 14, 2016

Sometimes in the world of IT, you have moments where all you can do is scratch your head and ask WTF happened. Such was the case on Monday this week, before I even got back into work and before the term started. I received a text from my head of IT who said that he was unable to access one of our (virtual) servers to post our PowerPoint daily notice we show our learners. In-between getting dressed and packing my bags for work, I remoted in to take a look.

I couldn’t see the server on the network, nor could I Remote Desktop into it. Surprisingly, ping worked, but that seemed to be about it. Going to Hyper-V Manager revealed the server was on and I could connect via the console. I picked up a clue as to what could be wrong when I noticed that the Heartbeat status wasn’t being reported to Hyper-V Manager. This indicated that the service had stopped running for some reason.

The previous Friday I had rebooted all our servers in order to finish their update cycles, as well as to prepare them for the term ahead. This particular server had come up from the reboot OK, so I didn’t do an in depth check. It had never given me an issue like this before, so I made the mistake of assuming all was well. Anyway, after connecting, I could see that none of the Hyper-V services were running inside the VM. Manually trying to start them didn’t work. I tried to upgrade the Integration Components, since Hyper-V indicated that my other VMs needed an update for the components. No matter how I ran the setup file, it would not execute on the sick VM. By this time I had to leave to get to work, so the problem had to wait until I got in.

After arriving at work and settling in, I cloned the VM to my PC so I could play around more easily. Numerous attempts at a cure all failed, until I came across a post on the internet that described the same symptoms as I had. There was a link to a Microsoft KB article, which included steps on how to fix the problem. The KB dated from a few years back, so I found it incredibly bizarre that the problem only hit us now. Still, the sick server is running Server 2008, so I went ahead and made the change in the registry as documented. A reboot later and the server on my PC was suddenly working normally again. All relevant services were starting up correctly again and the server was back in action.

Since it was successful on my local cloned image, I went ahead and made the same change on the sick VM itself. Sure enough, one reboot later and we were back in business. In the aftermath, I spent a lot of time trying to figure out what caused this issue. While I did have IIS installed on the server years ago, I don’t recall there ever being an SSL certificate on that server. How exactly we ended up in this situation is probably something I’ll never fully know. As I said to my colleague, we’ve both seen random stuff over the years, but this one was really a WTF moment in a big way.

Categories: General, Software

Salvaging old equipment a.k.a. dumpster diving

Last week I watched a couple of videos on YouTube where old computers were rescued from the kerb or dumpsite and refurbished for use. This saves on e-waste and also provides cheap computers to those who cannot afford a new machine. This got me thinking about all the equipment I have discarded, sold or donated while at my school, as well as the actual value of refurbishing old equipment.

As time has gone on, I estimate I’ve gotten rid of over 100 old computers, ± 40 projectors, ± 20 printers and countless individual parts such as dead hard drives, power supplies, motherboards etc. Some of this went into the trash, while the rest was donated or sold off to raise some funds for the school. In fact, we cleaned up 6 computers for sale over the first week of holidays. However, the process is time consuming, especially with old equipment like that. The process goes something like this:

  • Physically inspect the chassis to look for loose panels, missing screws, worn/sticky buttons etc.
  • Open the chassis and blow out all the dust using our air compressor, then perform a visual inspection of the motherboard, looking for swollen/blown capacitors, loose cable connections etc.
  • Power on the PC and listen for fans that need lubrication. Most often this is the power supply, graphics card or chassis fan. Fans that grind are a sure sign that the fan will seize up completely in the not too distant future.
  • Lubricate the fans that require it, which means removing the part from the PC to get to the fan’s lubrication cover.
  • Install as much RAM as possible as well as a working DVD drive if required.
  • Wipe the hard drive and install Linux Mint/FreeDOS as a free operating system, as we cannot sell the computers with Windows on them.
  • Leave the PC running for a while to verify basic stability.

This leaves us with a working PC, but it is time consuming, even if it only needs minimal checks and a dust blow out.

It made me think about how far back one can and should go with refurbishing old PCs. While there are plenty of Pentium 4 and Pentium D based computers out there, they have the disadvantage of running very hot, using a lot of electricity and, in the P4’s case, being single threaded chips. Couple that with IDE or SATA 1 speed hard drives and the computer is unpleasant to use, even with a freshly installed operating system. Again, while this will provide a computer to a charity or needy person who has never had one before, the economics of using such an old machine weigh heavily against it.
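
To put rough numbers on the running-cost argument, here's a quick sketch comparing electricity use over a school year. The wattages, hours and tariff below are illustrative assumptions of mine, not measurements, but the shape of the result holds either way.

```python
def annual_kwh(watts, hours_per_day=6, days=200):
    """Approximate yearly energy use for a PC used on school days only."""
    return watts * hours_per_day * days / 1000

# Illustrative figures: an old Pentium 4 tower vs a modest modern desktop.
p4_kwh = annual_kwh(150)      # assumed ~150 W average draw for a P4 system
modern_kwh = annual_kwh(45)   # assumed ~45 W for a newer small form factor PC

tariff = 2.0                  # assumed rand per kWh
print(f"P4: {p4_kwh:.0f} kWh per year, about R{p4_kwh * tariff:.0f}")
print(f"Modern: {modern_kwh:.0f} kWh per year, about R{modern_kwh * tariff:.0f}")
```

On those assumptions the old machine burns roughly three times the electricity, which adds up quickly across a lab full of them and eats into whatever was saved by not buying newer hardware.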

Printers are easier, in the sense that they generally just need new toner or ink cartridges. The problem with older devices, though, is that they may use the now defunct parallel port, or, as HP loves to do, lack drivers for modern versions of Windows. I had to replace all our old HP LaserJet 1018s in the school because they flat out refused to run stably under Windows 7. I’ve got a 4 colour laser MFP in the office that I have to discard, as the device will not behave properly under anything newer than Windows Vista at best. HP has not put out usable modern drivers for this machine, instead recommending that you buy a modern, supported printer. This to me is a tragedy, as the device has less than 8000 pages on the counter. There is nothing physically wrong with the machine, but unless we run it on an old version of Windows, it’s become little more than a glorified door stop.

Projectors have the problem of lamps that need replacing, colour wheels that die (DLP projectors only) or failing internal LCD panels on LCD models. When you ask for a quote on a repair or a new lamp, it often turns out to be more cost effective to buy a new projector than to repair the existing one. Not to mention, most older projectors won’t have modern ports like HDMI or network ports on them, so they are less useful in today’s changing world.

In the end, this is all part of the vicious cycle of technological progress. Unless we can somehow convince manufacturers to better support their products, we are going to be locked into producing tons of e-waste. Reusing old computers is a good start, but there also comes a point where it is no longer viable to use older equipment. One thing that could definitely be improved is much more visibility for e-waste recyclers. Equipment can be properly stripped and salvaged by these firms, who then get the components properly recycled and also avoid polluting areas with toxic chemicals that leach out of electronics as they decompose. It would also help if more people took an interest in repairing their own stuff when it breaks, rather than just throwing it away. There’s a thrill that comes from fixing something with your own hands, a thrill that more people should want to experience.

Low end laptop pain

In the course of my job, I’ve been asked on occasion to give feedback or a recommendation to staff members regarding the purchase of a personal or family laptop. Unfortunately, due to the ever changing nature of the IT field, the answers I give aren’t always what the person wants to hear.

I have three questions I ask the person before I make any recommendations:

  1. What do you intend to use the laptop for?
  2. What is your approximate budget for the laptop?
  3. How long do you intend to keep the laptop for?

The answer to the first question is usually pretty generic: typing documents, surfing the internet, preparing lessons, doing research, checking email. Using SMART Notebook also comes up now and then. Question 2 usually results in a range of about R4000-R6000 (roughly $380 – $550; exchange rates make this number fluctuate). Question 3 results in a range of 3 years up to 5 or longer.

I often specify a laptop that is slightly over the asker’s budget, with the justification that spending slightly more results in a better quality laptop that lasts longer and is less likely to drive the person up the wall in the long run. Bottom of the line laptops have to cut so many corners that the experience is often highly frustrating. Low amounts of RAM, lowest end processors, slow mechanical hard drives, low resolution low quality screens, creaky plastic shells, poor trackpads and more leave a bad taste in the mouth, and that’s just on the hardware side of things. Software wise, the lowest end version of Windows is installed, including the Starter edition in the Windows 7 era. Bundled anti-virus applications, trialware and lots of often bloated, unneeded software is pre-installed by the manufacturer in order to try and recoup costs and eke out some sort of profit.

Over the last few years, I’ve come to be a firm believer in the power of the SSD. With the right drive, it can often seem like you’re supercharging a laptop that would otherwise need to be replaced due to age. It won’t completely mask age or low specs on a laptop, but it comes close. Windows starts faster, applications load quicker, battery life is extended, noise is reduced and the user experience is often improved because you have less of the freezing/lockup sensation after boot. I don’t know if the drives will ever get as cheap as mechanical hard drives, but I believe that even a SATA 3 based drive in most consumer laptops would go a long way to increasing user satisfaction across the board. Unfortunately, marketing still spreads the word that 1TB and larger drives are a good thing to have, when in reality not that many people are going to use all that space on a laptop.

As much as I’ve moaned about low quality laptops in this piece, I am reminded that it’s due to the flexibility of Windows that there is such a wide range of devices available at all cost points. From the most painful low end devices that are affordable to most people, all the way up to high end ultrabooks that are extremely pricey but have all the bells and whistles. Competition in the industry plus attrition has also helped to weed out some of the smaller or less interested players, as well as leading to a growing awareness that quality needs to increase in order to stand out against the competition. I can only hope that as time goes on, this trend continues and that the creaky poor machines of the past become nothing more than a bad memory.

The Clean Windows PC experience

February 15, 2015

Microsoft Windows is an amazing piece of software. It powers an incredibly wide range of hardware, as well as running on wildly different system specifications. One person may have a bargain basement Celeron or Pentium laptop, while another person is running a fully tricked out Core i7 beast – Windows covers it all. With multiple OEMs making products, the consumer is generally spoiled for choice across a wide range of price points. The downside to this, however, is that Windows has often been associated with a race to the bottom of the barrel, while Apple, for example, refuses to go below a certain line and, rightly or wrongly, maintains a prestigious, upmarket image.

Part of the race to the bottom means that profits for OEMs are razor thin. Make a mistake and your competitors are going to pounce. Fail to keep up and likewise. Fail to cut down on costs and you risk going bust. As a result of this fierce competition, consumers have sometimes been the victims. Laptops are built with creaky plastic that doesn’t always sit flush, screen resolutions haven’t increased in years, mechanical hard drives are still king, model lineups often leave people confused as to the differences between one model and another, the amount of RAM is just enough to get by with, and cheap Realtek network and audio solutions are used. On the software side of things, OEMs take money from anti-virus vendors to preload their wares onto the computers. Throw in trial CD/DVD burning solutions, vendor backup programs and other useless vendor software, and you are left with a horrible laptop/desktop experience. Users don’t love Windows, they just tolerate it.

The hardware issue is tricky, since it depends on economies of scale. An SSD, for example, would greatly improve people’s experience with their computers, but a 250GB drive still costs much more than a 1TB mechanical drive. Screen resolution in laptops is slowly starting to move forward again, but it will take time. Trackpads are also finally starting to improve, but it’s still hit and miss. With desktops, it’s really become about trying to cut down on size as much as possible and go small.

The software side of things is where the most immediate improvement can be made. If OEMs followed Microsoft’s Windows Signature Edition experience, I think many a customer would be happy. Instead of being loaded down with bloatware, trials and other software, Windows would come clean out of the box, with a few minimal applications installed – Flash, Adobe Reader, Skype and Microsoft Security Essentials (for Windows 7). For Windows 8 based machines, OEMs should make sure that the devices ship with Windows 8.1 at minimum, but ideally Update 1 should be installed as well, which improves the experience on traditional laptops/desktops. OEMs should strive to keep their images as up to date as possible, so that the end user isn’t downloading a few GB worth of updates after their first boot. There’s nothing worse than powering up and watching Windows Update firing up and tearing through a few GB worth of bandwidth as it pulls down patches.

Lastly, hardware in the computer should not require a vendor application to be installed just to get its driver. I’ve had this problem with Lenovo and Samsung laptops, where in order to get rid of an outstanding entry in Device Manager, I’ve had to install one of the Samsung/Lenovo utilities. Often these utilities don’t work well and just add frustration for the end user.

Famed Windows blogger Paul Thurrott has a few articles up where he goes right back to basics and does completely clean installs of Windows on some of his devices. As he notes, it’s sometimes the only way to truly be rid of all the bloatware OEM’s like to install. Included are steps on how to legally download clean ISO images you can burn to disk or USB stick for a clean install of Windows. You can find his articles here, here, here, here and here.

UCT IT Management short course

October 28, 2014

After 10 weeks of study, thought provoking questions as well as the odd bit of frustration, I finally finished off the last module of my UCT IT Management short course last night. Offered in partnership between the University of Cape Town and a private company called GetSmarter, the course is aimed at widening the knowledge of IT managers from all walks of life. There is a wide range of other courses on offer from the site, running from 8 – 10 weeks, all of them offered online. I decided to do the management course in the hope that I would pick up some new skills and get some new ideas, since these days I’m doing a lot more management rather than just purely technical work. Shaping budgets and policy is something new to me, so all the more reason I was eager to take the course.

The IT course is 10 weeks long as mentioned, so a week for every module. The entire course is run through GetSmarter’s VLE, which is a heavily modified version of Moodle. 5 of the 10 modules are tested via online quizzes of the usual fare, i.e. multiple choice, true/false, pick the correct one etc. The other 5 modules are written assignments where you download a document with a case scenario in it as well as questions. From there you have to answer the questions as well as outline various scenarios, all while watching a line count per answer. Once completed, these documents are uploaded back into the VLE for marking.

I found that as the course went past the halfway mark and into week 6, the content became quite theoretical and abstract and dealt less with current trends and topics. Coming from a network administrator’s position in a school, a large number of the terms and concepts I was exposed to were completely new to me. Shifting my thinking along business lines proved to be quite a challenge, since the corporate world moves quite differently from the educational world. I know that out of the 10 modules, module 3 was definitely my least favourite, as it was incredibly densely packed with jargon and enormous amounts of theoretical knowledge.

Overall, I think the course is worth the money asked for it; extra study is always good for jogging the brain out of its set ways. However, if you are new to network administration or IT, it’s definitely not the course for you – vendor qualifications are more appropriate in that case. This course is more for techies and admins who are moving up towards managing IT in their place of work, though as mentioned the course is almost completely focussed on the corporate world.

On an unrelated note, the course also showed me that Moodle can definitely work if enough effort is put into it – custom theme, disabling many end user features and so on. My experience is limited, but it’s been the best Moodle experience I’ve ever had.

Cabling and De-cabling

The school I work at is 57 years old this year. This means that over the life of the school buildings, lots and lots of cables have been installed for various systems. Discounting electrical cable, there are cables for the computer network, the burglar alarm system, the classroom and corridor intercom system, the old analogue and later PABX phone systems and even some analogue CCTV cables lying around somewhere in the roof.

After a recent venue reshuffle, my colleague and I have had the fairly rare chance to really go wild and rip out as much of the useless cable as we can find in the areas that were reshuffled. We decided not to just cut off the cable at the most convenient spot, but to follow it as much as is practically possible all the way back to its source. This method takes a lot more time but is a much more thorough cleansing than if we just snipped when it vanished from view.

As we’ve followed the cables and opened trunking, it’s become abundantly clear that cable was never removed in the past. The phone system in particular is an example. From what I can figure out and from talking to long time members of staff, it appears we once had an analogue phone system from our national phone company. There were no internal extensions, only direct lines all over the show. This meant that the phone company ran a lot of thick multi-core cabling all over the show before using only one or two pairs for the end jacks. We’ve discovered that many of these thick multi-core cables simply taper off to a dead end.

When the school got a Samsung analogue/digital PABX, the telecoms company that installed it simply ran their floor cables all over the show without removing the older cables first. Hundreds of metres of cable were run to support the Samsung PABX. Often cables were glued to the walls, in door frames or wherever else, leading to a very messy appearance.

Two years ago, we moved to an Avaya VoIP system, which runs on the existing network cables. Although the telecoms company responsible for that system did rip out quite a bit, they didn’t bother going after the floor cables or anything like that. So for the last 2 years, we’ve been sitting with a lot of dead cable all over the show. When we eventually get to ripping it out, you end up with a pile like this:

[Photo: the pile of cable we removed]

That whole pile is either multi core cable from the original phone system, or from the Samsung PABX. We had about 2 other piles of similar size when we cleaned up other venues, though those did include network cables and some power cable as well.

With the removal of so much cable, it becomes possible to install smaller and more compact trunking, which not only looks neater but also makes for easier cable management. Getting cables to stay in place while you hammer the cover back onto a 100mm x 40mm piece of trunking is not an easy feat.

When I head back to work after my break, we’ll no doubt continue the hunt for cables and rip out as much of the dead stuff as we can find. I just wish it was easier to recycle these cables than it currently is.

Categories: General