Archive

Archive for the ‘Personal’ Category

Retro Gaming blues

One of PC gaming’s biggest selling points is its incredibly rich back catalogue, reaching back decades at this point: thousands upon thousands of games in every possible genre and in dozens of world languages, released by studios big and small all over the world. This is truly one of the great strengths of PC gaming, but it comes with a downside that doesn’t seem to get much coverage, at least that I’ve seen – though the internet is a big place and I don’t frequent the gaming sites the way I used to when I was younger.

For myself in particular, I’m not really talking about 90s games so much as games made in the mid to late 2000s. The thing is, more often than not the game engine itself will run fine even under Windows 11 x64; getting hold of the game and getting it installed is by far the bigger challenge. Let me explain…

Until the major rise of Steam and other digital download services, most gamers got their games on optical media – CD or DVD – and thanks to rampant piracy in the industry, publishers turned to copy protection systems such as SafeDisc and SecuROM, amongst many others. None of these systems made real headway in curbing piracy, but they left behind an unintended toxic legacy for anyone trying to legally play these games years later on modern computers.

If you bought a lot of games on physical media and built up a collection, chances are you own games that are not available on any digital download service such as GOG.com, for whatever reason. That means the only way to play them is to install from the original disk. Unfortunately, back in 2015 or so Microsoft issued patches that completely kill the ability of SafeDisc and earlier versions of SecuROM to run. These protection systems will not work at all on Windows 10 or 11, and a fully patched older Windows 7/8 retro gaming PC won’t play these games either. The only option left is to find a crack so that you can run the game without the disk in the drive. All of this presumes you still have a PC with an optical drive in it – a great many computers no longer have one at all.

Let me use an example of a game I own: Medal of Honor: Airborne. The game uses Unreal Engine 3 and, whilst it is a good 15 years old now, it is still more than playable. However, I had a terrible time installing it on Windows 11 off the disk:

  1. The setup process struggled mightily to go past a point whilst reading off the disk. The disk is not faulty or rotten, but neither of my Blu-ray drives wanted to get past this point. I had to copy the files to a folder on a hard drive and install from there to get the game installed.
  2. The setup process is hard coded to install a now ancient version of AGEIA PhysX, which will not work if you already have a modern Nvidia version installed. If the setup process cannot install its included PhysX, it will abort and refuse to continue.
  3. To get around this, I had to remove my existing PhysX, install the game with its included version, remove that, and then install the Nvidia PhysX Legacy package, which lets you run older games whilst at least having a more modern runtime than what shipped with the game. Lastly, I had to reinstall my modern version of PhysX as well.

Apparently the version of the game available on the EA Play platform doesn’t have these hassles, but I specifically wanted to go through the manual install process to see if I could work my way through them.

Another game I have is Pariah, made by Digital Extremes on the Unreal 2 engine. The game installs fine, but cannot run even with the disk in the drive, as it uses SafeDisc. This game is not available on any digital platform for whatever reason, so the only way to play it is to download a crack to circumvent the dead copy protection, which I duly did. I have no moral qualms about that: I paid for the game and I have all the original packaging. Is it my fault that the game can’t run directly on modern Windows? No, so I am fine with using a crack.

GOG.com does an amazing job of finding older games, fixing them up, stripping out the DRM and re-releasing them, but as wide as their catalogue is, there are unfortunately plenty of games that aren’t available.

It’s easy to be nostalgic about older games and perhaps feel concerned that a service like Steam has become the metaphorical 800lb gorilla, but it really does make gaming easier: no need to find and manage patches manually, installations that are guaranteed to work, and many other benefits. Don’t get me wrong, I would always prefer to get my games on physical media so that I own a copy outright, plus have a nice manual and any other included items, but I realise the industry will never go back to that method of distribution. I suspect the current generation of gaming consoles might be the last to include an optical drive, and once they are replaced in a few years’ time, the only way to really own your games will be to hope they’re available on GOG, so that you always have an offline installer and can back up the install files at will.

The agony and misery of load shedding

For the last 15 or more years, my country, South Africa, has been at the mercy of its national power company Eskom as the country’s supply of electricity has become more and more constrained. Regular rolling blackouts have increasingly plagued the country as Eskom tries to prevent a complete grid collapse, which would necessitate a “black start” as the power stations are brought back up bit by bit.

As the years have gone by, Eskom’s fleet of mostly coal powered plants has become more and more unreliable. The causes: huge amounts of corruption at Eskom, a brain drain as skilled engineers and technicians left or retired, and a general lack of maintenance, as the political party in charge of the country forced Eskom to run all the power stations flat out and skip maintenance so it could claim to have stopped load shedding and thus be worthy of your vote. Now the chickens have come home to roost, so to speak: the plants are knackered, the company is overstaffed, corruption rings are fighting to protect themselves and their ill-gotten plunder, and electricity demand never stops growing.

My school has been lucky to have a 60kW solar panel installation for the last 2 years, but without battery storage, the panels can’t stop our school from suffering a blackout. They certainly help reduce load and can even send power back into the grid, but they’re not a guaranteed power supply. My server and network racks have UPS units in them, but when your power is going out 3 times a day, you can barely get the batteries recharged before it goes off again. Servers were never designed to tolerate constant on/off cycles, which increases wear and tear as well as the risk of data loss from unclean shutdowns. Sudden surges when the power comes back on don’t help matters either, as they can either immediately damage something or cause long term cumulative wear to a power supply.
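As a rough illustration of why this is a losing battle for the batteries, here is a back-of-the-envelope sketch. Every figure in it (battery capacity, discharge depth, charge rate, charging efficiency) is an assumed round number for illustration, not a measurement from my actual UPS units:

```python
# Back-of-the-envelope UPS recharge maths. Every figure here is an
# assumed, illustrative number -- not a measurement from a real unit.

def recharge_hours(capacity_wh: float, depth_of_discharge: float,
                   charge_rate_w: float, efficiency: float = 0.85) -> float:
    """Hours needed to put back the energy drained from the battery."""
    drained_wh = capacity_wh * depth_of_discharge
    return drained_wh / (charge_rate_w * efficiency)

# Say a modest 1kVA line-interactive UPS: ~840Wh of battery, drained
# 60% during an outage, recharged at roughly 100W once power returns.
hours = recharge_hours(capacity_wh=840, depth_of_discharge=0.6,
                       charge_rate_w=100)
print(f"~{hours:.1f} hours to recharge")  # ~5.9 hours
```

With three outages a day, the gaps between cuts can easily be shorter than that recharge time, so the batteries slowly fall behind and your protected runtime shrinks with each cycle.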

We have had bad bouts of load shedding before, but we are now at a stage we’ve never reached before: going into its 3rd week, we have had continuous disruptions 24/7. Keeping track of what load shedding is happening where and when is becoming an increasingly difficult task. Throw in the fact that the City of Cape Town is often able to mitigate some stages of the shedding, and the published schedules may not be accurate depending on what the City can do. In its own odd way this adds stress too: you can’t adequately plan ahead until you know what the heck is going on later today or tomorrow, and that’s hard when you are literally waiting for word from the City, which can also change at a moment’s notice.

Our city is trying to procure its own power independently of Eskom, but even though the ball got rolling a while ago, it’s still going to take time to bring this capacity online. Even then, the greatest problem is obtaining enough battery storage so that overnight cuts are mitigated as well: solar can’t produce energy at night and wind is erratic. The cost of battery storage has fallen dramatically over the last few years, but not far enough, not just yet.
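To get a feel for why storage is the sticking point, here is a hedged sizing sketch. The load and dark-hours figures are assumed round numbers for illustration, not the City’s actual data:

```python
# Rough sizing of battery storage to bridge a no-generation period.
# All numbers below are illustrative assumptions, not real figures.

def storage_needed_kwh(avg_load_kw: float, dark_hours: float,
                       usable_fraction: float = 0.8) -> float:
    """Nameplate battery capacity (kWh) needed to carry avg_load_kw
    through dark_hours, when only usable_fraction of the pack is
    practically usable (depth-of-discharge limits, conversion losses)."""
    return (avg_load_kw * dark_hours) / usable_fraction

# Example: an average overnight load of 20kW across 10 dark hours
# needs 250kWh of nameplate capacity -- and that's just one small site.
print(storage_needed_kwh(20, 10))
```

Scale that up to a whole city and it becomes clear why falling battery prices still haven’t fallen far enough.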

As I write this, we are potentially looking at another week or two of these constant interruptions until Eskom can stabilise enough of the plants to stop load shedding. I am not well off enough financially to afford a generator for my home, so we make do as best we can. For my school, however, should this bout continue into the start of the new term, things are going to get extremely ugly.

Categories: General, Personal

PC RGB is a mess

I recently performed a bit of an upgrade on my personal PC, namely transplanting the innards from my old gigantic Cooler Master Cosmos II Ultra tower case into a much smaller Cooler Master CM 694 case. Besides being a decade old, the Cosmos II also lacks a window on the side and doesn’t have any front mounted USB-C ports. The CM 694 offers these amenities in a case that is a lot more friendly on my desk compared to the old behemoth.

As part of this upgrade, I decided to finally go full RGB and get everything matching as much as is humanly possible. What I quickly discovered is that the PC RGB space is a royal mess. Different connectors and different control software are frustrating enough to deal with, but peripherals being largely incompatible between software packages is the real killer annoyance. It’s somewhat understandable, in that each manufacturer wants to lock you into their ecosystem of devices, but the ideal build rarely stays within one brand. For example, you may want an Asus motherboard with an EVGA graphics card, Corsair RAM and case fans, a Logitech mouse and a SteelSeries keyboard. All of these support beautiful RGB, but you would have to run at least 4 different software packages to control all the effects, which eats system resources, introduces potential security holes and can lead to the RGB programs not working correctly as they fight each other for control of devices. Not to mention, if you aren’t running Windows, your options are seriously limited.

There’s a good video on YouTube that explains much of this as well as taking a good long look at the types of connectors you would commonly encounter on case fans and RGB strips:

However, the video doesn’t get into the annoying software side of things: peripherals such as keyboards, mice, headsets, mouse pads, headphone stands etc. are generally locked to their manufacturer’s ecosystem, e.g. an Asus mouse won’t work with Logitech software.

Thankfully, people got annoyed by this and there are at least 2 promising software solutions out there – OpenRGB and SignalRGB.

OpenRGB is open source software that aims to support every possible RGB device in one piece of software, across Windows, Mac and Linux to the maximum possible extent. Their list of supported hardware is long and grows almost every day.

SignalRGB has a similar goal, but is closed source and only runs on Windows at this point. The hardware list is very impressive and, like OpenRGB’s, seems to grow every day. The program comes in two versions, one free and one paid, with the paid tier adding extras such as game integrations.

That being said, both SignalRGB and OpenRGB are reverse engineered products and should something go wrong, the original manufacturer can be petty and refuse to honour a warranty if they find out you were using a 3rd party program for example. Also, some manufacturers really cut corners in their RGB implementations, so neither program will have good control over those devices – ASRock motherboards come to mind here for one thing.

It is my hope that eventually someone big steps up and forces an industry standard, but as with anything in the PC space, this seems unlikely – the USB interface for peripherals alone means it’s too easy to get an ecosystem going and try to tie your users into it. The two software applications above really are your best bet for cross brand RGB, but they still don’t remove the need for the vendor software to update firmware or set configuration options for keyboards and mice – things that can only be done through the original software. The other alternative is to stick to one vendor, do your research and buy products that are guaranteed to work with each other.

Thoughts on the AMD Ryzen 3 PRO 4350G

Apparently this APU is not meant to be sold to the general public as a standard SKU, but is restricted to OEMs who build and sell complete PCs. My guess is that thanks to the great chip squeeze of 2020/21, AMD decided to enact this policy and focus instead on getting the Zen 3 based Ryzen 5000 APUs out to the public at a later stage.

The PRO branding simply means that some business security features have been added to an otherwise standard APU, similar to what Intel does with vPro on some of their chips. Having never used vPro or the specific Ryzen PRO enterprise features in any meaningful sense, I’m not particularly concerned whether they are effective or not. All I was looking for was an entry level AMD CPU with built in graphics that wouldn’t break the bank and had enough features to last another 5+ years.

AMD have admitted that due to the silicon shortage, they have been focussing on the higher end chips that bring in more revenue. As such, finding a Zen+ based Ryzen 3000 APU has been nigh on impossible: every local online shop I tried simply did not have stock and had no indication of when, or if, they would be able to restock. This was a problem, as I was busy rebuilding my dad’s PC and needed an APU to complete the project. I was getting seriously frustrated, and then one day a search turned things around out of the blue.

Searching on Takealot, I found this Ryzen 3 PRO 4350G APU available, seemingly from nowhere. I put in the order for the chip, along with the other parts I needed to finish my dad’s build. After waiting about 2 weeks for everything to show up, I went to collect, only to discover that the APU had been cancelled out of my order. Yet when I followed the links, it was marked as on sale again in the store. Thankfully the refunded store credit matched the price and I immediately re-ordered the chip. Two days later it was delivered to my work without any fuss. Unlike the retail boxes, I got a very plain box with no external markings of note. Included was an SR-2 cooler and the chip itself, wrapped in some foam. I wasn’t impressed by this, as that is just begging for the pins to be bent.

Sure enough, when I attempted to install the chip, it wouldn’t fit into the socket. Resisting the urge to force it in, I got a business card and ran it through the rows of pins where I saw the most resistance. It took a couple of tries, but I was eventually able to straighten the bent pins enough that they went into the socket. Apart from my Ryzen 2700X, I haven’t worked with CPUs with pins since the Pentium 3 days (I never fiddled with Socket 478 for Intel, or any AMD socket prior to my 2700X).

Once powered up, the annoying ASRock motherboard detected the APU and has worked solidly ever since. I did have to flash the board first with my own 2700X installed to get the firmware updated to support the APU. Unfortunately, despite being a high end motherboard, ASRock did not see fit to include a BIOS Flashback type feature that lets you update firmware without a compatible CPU installed. Having that would have saved me a good few hours, to say the least.

In the three weeks since the build was finished, the system as a whole has run superbly. Thanks to the advancement of technology, the chip has the same 4 cores / 8 threads as the previous Intel based PC, but is so much more powerful it’s not even funny. Power consumption has also more than halved: proof enough is feeling the metal of the case above the power supply – cool to the touch compared to the heat the old system generated. The chip itself has more than enough grunt for my dad’s office type workloads and has sped up his day to day work thanks to the improved responsiveness of the system.

I doubt this is a chip that will ever get any form of retail love and attention going forward, as AMD have already announced the Zen 3 based 5000 series APUs. That being said, this is a competent little chip that is more than powerful enough for office or light workloads. Since the built in graphics are decently powerful, you also don’t need a PCIE graphics card, reducing heat and lowering power consumption. If you are looking to build something entry level, this chip comes highly recommended.

Raidmax Exo Special Edition of cheapness

October 31, 2020

I recently decided that I was going to resurrect the first PC I ever built and get it operational again, serving a second life as a retro Windows XP gaming PC. I still had the CPU, GPU, RAM, motherboard and Creative X-Fi sound card; all I needed was a DVD drive, case and power supply. I took a step forward this week by obtaining a case, namely the Raidmax Exo “Special Edition”.

I did not want to break the bank buying an uber expensive case for a 13 year old computer system, but I did need an ATX sized case that had 2 x 5.25” drive bays, something that is becoming a rarity on modern cases as optical drives fade away. Bonus points for a front USB3 port of some kind to transfer files at faster speeds. The Raidmax Exo SE was one of the cheapest cases on Takealot, our local version of Amazon. It ticked all of my requirements boxes, so I bought it. Bizarrely enough, the case includes a built in SD/Micro-SD card reader, which is nice to have, though it is capped at USB2 speeds. The side panel is see through, but it’s not glass. Some kind of plastic or Perspex I think.

When it arrived and I opened the box, I understood why the hell the case was so cheap. It’s made of incredibly thin metal, for one thing; one gets the impression that you could probably bend this case in half with your bare hands if you were determined enough. Not great, but I can live with it. However, the promised 2 x 5.25” drive bays are a lie. Technically they are there, but only the top bay is actually usable. The bottom bay is blocked by a metal grille that sits behind most of the front fascia of the case, onto which the (only) preinstalled 120mm case fan mounts. Try as I might, I couldn’t see any way of removing this grille, and the section wasn’t designed to bend off or out of the way. There’s no paper instruction manual included, and the grille looks riveted in place anyway. I think the grille is also there as a supposed mount for up to a 360mm radiator, but I cannot see anyone hooking up water cooling of any sort in a case like this.

Damn. The case was officially not fit for use for my project. I could have returned it, but that would have involved extra effort. Luckily, my dad came up with the ultimate answer – swap the guts of his PC into the case and he’ll make use of it. This would work and let me use his current case, which was actually my original PC case – now the original system would be coming home so to speak.

I proceeded to swap the guts around into the Raidmax case. Nothing too complicated, and nothing I haven’t done hundreds of times before. The full sized ATX motherboard fit as advertised, but space is definitely at a premium inside the case. I took the chance to swap out the enormous GeForce GTX 480 in my dad’s PC for a far more modest GeForce GT 610, as my dad doesn’t play games and doesn’t need that monster card sucking up oodles of power for nothing. As a result, his PC now runs quieter and more efficiently than before. I still need to replace the TIM on the heatsink, as I suspect that after many years of use the stuff is quite dry and not conducting heat the way it should.

I haven’t connected the front 120mm fan yet, as it is powered via a Molex plug – another sign of corners cut. This means the fan runs at full speed non stop, making excess noise. I have a couple of 120mm fans powered by a 3 pin connector that will connect to the motherboard instead; I just need to get a pair of needle nose pliers from work to help me remove the front fascia of the case so I can swap out the fan.

Overall, the system looks nice enough, even though it is extremely budget orientated. You are definitely not getting high end features, but this would work well for an average office or school PC type scenario. Come to think of it though, many systems can be replaced now with NUC mini PC type systems, as those contain all the power an office or school user will generally need. The heyday of the full tower case is long over for casual use to be honest. I wouldn’t ultimately recommend this case unless you were on an extreme budget or need something incredibly basic for a first build. There are much better cases out there that will serve your needs better in the long run. There’s potential in the design and looks of this case, but ultimately too many corners were cut in order to reduce costs and it shows in the build and quality of the final product.


USB compatibility issues in 2020?!?

October 25, 2020

USB is such a part and parcel of our daily computing lives that we don’t even think of compatibility issues anymore. With Windows 10 in particular, you don’t need to find drivers for the USB controller on the computer and in most cases if you plug in a device, Windows will go and fetch the driver files for you from Windows Update, provided the manufacturer supplied drivers to Microsoft of course.

However, I have run into an incompatibility with my AMD Ryzen based computer: my Canon EOS 5D Mark IV absolutely hates the USB 3 ports (front panel USB 3.0, back panel 3.1 Gen 1 and Gen 2) on my ASUS ROG Strix X570-F Gaming motherboard, but will happily work on the front panel USB 2 ports – the only USB 2 ports this PC has.

When connected via USB 3, the camera shows up and you can see the SD and CF cards in the body, but when you open a folder of images, the camera essentially hangs – File Explorer’s Green Bar of Doom™ crawls ever so slowly onwards but nothing further happens. Trying to import using the Windows 10 Photos app or Lightroom will fail and hang the application until you power off the camera or kill the application.

A bit of internet sleuthing shows I am not the only one who has this issue:

For completeness’ sake, here is a picture of my Device Manager with my USB Controllers section expanded:


As per some of the troubleshooting tips in the links above, I plugged my camera into a couple of different systems I had access to. Results were:

  1. Intel Core i5 7th gen work computer – no problem.
  2. Apple 2014 MacBook Pro with Intel Core i7 4th gen – no problem.
  3. Dell Venue Pro 11 tablet with Intel Core i3 4th gen – no problem.
  4. Home built Asus ROG desktop with Intel Core i7 1st gen and Renesas USB – no problem.
  5. Huawei MateBook D15 with AMD Ryzen 5 3500u – no problem.

The fact that the Huawei MateBook D15 doesn’t have the same problem as my desktop computer is odd, since they are both AMD based systems. It could be due to the difference in chipsets, as my desktop is the X570 chipset and the MateBook is something lower end/mobile focussed.

It’s a bizarre issue, and from what I have read at the links above, it also seems to only affect Windows. Users have reported that the issue doesn’t occur under Linux, which suggests the problem lies somewhere in the USB driver code AMD has given Microsoft for their systems, or that Microsoft’s USB drivers for AMD are somewhat buggy, rather than it being a pure hardware issue.

I had the same issue with my previous ASRock X470 Taichi Ultimate motherboard across different versions of Windows 10, so the issue is either related to the high end AMD chipsets or to some bad USB driver code that only affects AMD based systems. The fact that the MateBook worked ok makes me wonder whether Microsoft’s USB driver needs extra code for quirks the higher end chipsets have compared to the lower end/mobile ones.

Ultimately, I just bought a small, cheap USB 3 card reader to solve the issue. I don’t like plugging USB connectors into camera ports anyway, as I always worry about wear and tear on those somewhat frail connectors – replacing a USB connector on a camera motherboard is not cheap! It seems like a bit of a mountain-out-of-a-molehill situation in the end, since a card reader is the preferred way to transfer lots of images anyway, but I do get annoyed by incompatibilities like this, and my nature is to want to get to the bottom of things and fix them!

At this point it’s probably too much to ever expect some combo of AMD, Canon and Microsoft to fix the issue. Canon have moved on to newer cameras now and AMD probably won’t ever bother spending the time and effort to debug this issue and send in updated code to Microsoft so that the Windows 10 USB drivers can be updated.

Categories: Computer Hardware, Personal

Samsung Pay is awesome

September 5, 2020

Samsung Pay very recently turned 2 here in South Africa and in celebration, I thought I’d write this post detailing my experience using the service. First, some background.

Samsung Pay is a payment platform developed by Samsung that enables you to use select Samsung Android phones and Galaxy Watches to pay for goods at any point of sale terminal that takes tap ‘n go (contactless) credit/debit cards. A modern credit/debit card has an NFC chip embedded inside it to provide this feature, whereas Samsung Pay uses the NFC chip inside your phone or watch. On some phones you can also use Magnetic Secure Transmission (MST) in case the terminal is an older device that doesn’t support tap ‘n go (these are getting rarer, as banks have been on a mission to get shops to upgrade their terminals).

Worldwide, Samsung Pay is one of a number of competitors in this field, the best known probably being Apple Pay, which does the same thing but is limited to Apple hardware and doesn’t offer MST for older terminals. Huawei have Huawei Pay, but it is currently very region limited; LG Pay has similar limitations. Google Pay works with any compatible Android phone, but is again region limited. So as far as I can tell, the 2 best options worldwide are Apple Pay and Samsung Pay. Here in South Africa, only Samsung Pay has launched successfully and has been running for the past 2 years, beating Apple to the punch.

Banks have the ability to code their own payment solution into Android based phone apps so that they too can support tap ‘n go payments, but the problem with that method is that you lose out on the Apple market, as Apple doesn’t permit any other NFC wallet solution on their phones – it’s Apple Pay or nothing for tap ‘n go.

Anyway, back to Samsung Pay. I initially installed it on my Galaxy S9+ once my bank’s cards were supported by the app – it took about 6 months before my bank worked with SP. I made about 3 or 4 payments with my phone, feeling very impressed with myself. Once I used MST to pay at a restaurant, and the waiter was super surprised to see it work. However, the bliss did not last. About a month or so after signing up, SP suddenly wouldn’t work with my cards. I tried removing and re-adding them, updating the app and more, but my payments failed miserably, which was not only embarrassing in store, it also meant I had to carry my card anyway and held up queues as the teller had to repeat the transaction.

What I wasn’t aware of at the time is that there was an issue with SP and First National Bank, some tech issue on the backend that prevented FNB credit cards from working properly. Unfortunately as I didn’t know this, I gave up on SP and went back to fully using my card, hoping to come back to it another time down the line. Time passed by and I simply never did try again later with the S9+.

Earlier this year I upgraded to a Galaxy S10+ on a new contract. I installed SP and connected my cards again, but didn’t make use of the app simply out of a lack of urgency. However, I finally got around to testing it about 3 weeks ago and to my great joy, it worked again like it should. Open the app, prepare for payment, tap phone, payment done. I am starting to feel the confidence to leave my wallet at home for outings or trips, which is one less thing to worry about and carry around. It also improves my safety somewhat, as the card on the phone is more secure than the plastic kind.

Here are some screenshots of SP in action:

Figure 1: Main screen of Samsung Pay after opening

Figure 2: Getting ready to pay after verification with fingerprint

Figure 3: Tap the card to see a purchase history

In addition to payment cards, you can also load up loyalty cards. I think these can only be cards with barcodes that a teller scans, not the kind that requires a magnetic stripe read or has a gold contact chip (which is not an NFC chip). I still need to experiment with adding something of mine to the app.

As mentioned earlier, SP is the only payment solution of its kind here in South Africa. Huawei Pay was said to be close to launch, but after Huawei’s strangling by the US Government, I can’t see their Pay solution surviving. LG Pay is currently only available in the USA and South Korea. Google Pay is more widespread but doesn’t exist in SA; Apple Pay doesn’t exist here either. I wish all of these options were available in SA, which would help spread the word about the ease of paying with your phone instead of carrying your cards. Admittedly the situation has improved a lot over the last 2-3 years as merchants have upgraded their terminals to support tap ‘n go cards, and it’s now probably quite rare to find a non NFC enabled terminal.

I would like to get a Galaxy Watch at some point soon so that I can make life even easier for myself and maybe blow some more minds at the till, since I don’t think paying by watch is yet a very common sight.

Categories: Cellular phones, Personal

How things change over time…

February 16, 2020

I recently came across a YouTube channel that goes by the name of RetroSpecter78. This channel has quite a few videos where the author restores older computers from the late 80’s and 90’s and gets them up and running again, using period correct operating systems as well. It’s nostalgic and a lot of fun, so check out the channel if you can.

That being said, watching these videos triggered some thoughts in my head about my time in IT, my education in IT, how things have changed and so on. I ended up having a chat with my father about this, reminiscing about how my private college diploma really didn’t prepare any of us for the world out there, how overpriced it was, and how the college was only too happy to bleed students dry of money by offering course extensions. That’s a rant for another blog post really, but combined with my earlier thoughts, it made me want to write this one.

I finished high school at the end of 2003. In that era you were dealing with late Pentium 3 and early Pentium 4 (plus AMD equivalent) based PC's. Windows XP had been on the scene for a good two and a third years, with Windows 2000 Server established on the server side. USB was established. Hard drive sizes were starting to go up and DVD-ROM drives were no longer high end options. RJ-45 connectors and 100Mb/s Ethernet were the norm. Hubs had given way to switches in most places by then.

When I went to college in 2004, the course I did was a mix of technologies. A+ taught us about hardware on the PC side, N+ taught us about network fundamentals, i-Net+ taught us about internet related things and so on. Unfortunately the course also had legacy material like Windows NT4 Server thrown in when Server 2003 was already on the scene. There was a Linux component as well, using the then already outdated Red Hat Linux 8.

Thinking back, the course was designed for an 8 month period and gave one a decent foundational education in the field, but it also lacked a lot of stuff one would encounter out in the real world:

  • Any form of hardware/software firewall technology.
  • Any form of corporate mail/groupware systems such as Exchange or Lotus Notes.
  • Proxy servers for internet access – WinProxy, ISA Server, Kerio products, Squid etc.
  • Proper managed switch training, although I can understand the lack of this due to the many different manufacturers.
  • Firmware updating of PC's, servers, switches etc.
  • Advanced networking concepts such as VLANs, routing and so on.
  • Working with actual servers and server grade equipment such as racks, rackmount servers, remote management tools, SCSI drives etc.

In the intervening years since then, a lot has changed. Hardware standardized around various ports, slots and standards. Wi-Fi went from a slow curiosity to something we can’t live without. Groupware systems have moved into the cloud. Hypervisors matured and virtualization became part and parcel of the job. Hard drive sizes increased stupendously and backup systems had to evolve. Networking moved to gigabit as standard and 10 gigabit is steadily making its way down into the mortal realm. Fibre optic cable runs are cheaper than ever. Broadband has totally dominated internet access and costs have come tumbling down. UEFI replaced the BIOS and despite a somewhat rough start has settled in well. Linux is in more places than ever before.

One thing I can say is that I have learned more in my 2 jobs than I ever did at college. I guess that is a given, but it's always made me think that the diploma we did back then was designed to churn out low level PC technicians, not real network administrators. Exposure to new technologies and equipment, plus the drive and curiosity to learn, has taken me far over these last 16 years. PC equipment has also come a long way, lasting longer than before. Performance has largely plateaued CPU wise, and NVMe based SSD's will ensure that PC's stay snappy for much longer. It seems these days my job is more about management, paperwork and big ideas than the nitty gritty hands on work, although I always try to stay involved with that where possible so that I don't lose my roots.

I sometimes wish I could erase all my memories and start fresh in today's world; there is so much to learn and keep your mind occupied. But then I would never have experienced co-ax network cables, Novell NetWare, Mercury and Pegasus Mail, Exchange 2000 and all sorts of fun stuff over the years. What the future holds I cannot say, although I can predict that the cloud is going to be even more entangled in IT, printers won't go away, PC's (and servers) will last longer, technology will both get more complex yet also stagnate, and incremental revisions to hardware, ports, slots and software will be the norm.

Saving old memories

The school I work at will be 60 years old in 2017 – a pretty decent milestone for a school, though there are many older schools here in Cape Town. As with any institution that has survived this long, there are bound to be many old photos of events in years gone by. A school is a very busy place with hundreds of events each year: sports matches, outings, camps, tours domestically and/or internationally, dramatic presentations, musicals, concerts, prizegivings and more all lead to many potential photographic opportunities.

Unfortunately, for the last 30 years or so, the school has largely relied on one person to take photos and keep a visual record of the school: my direct boss. From when he arrived in the mid 1980's through to today, he has been building up a massive collection of photos. Since 2004, all the photos have been taken digitally, so there is a good archive that has built up over the last 11 years. However, prior to that, everything in the school was done on 35mm film, and this is where the problem comes in. The photos on colour slides are in a slow race against time to be preserved. All colour slides will discolour and fade in time, more so if they are not stored properly. Once the colours are gone (or have shifted too badly to rescue), all the unique memories on those pieces of plastic are gone forever. Colour negatives are a bit more stable if stored in the plastic strips they came in from the photo store. Black and white negatives are probably the most stable of the lot.

At the end of 2014 through a chance discussion with the school librarian, I discovered that there was a batch of slides sitting in a box in one of her cupboards. I asked her if I could take them to start scanning them, as we luckily have a Canon scanner that can scan slides and negatives. I was thinking of having them professionally scanned by a specialised company here in Cape Town, but the price would quickly become prohibitive for the number of slides that needed to be converted. As such, I’ve been slowly chipping away at the first box of slides, scanning them at 4800 dpi and saving the resulting large JPEG file. My boss has promised to colour correct and touch up these slides in Lightroom/Photoshop Elements when I am done scanning, after which we can upload these photos to our dedicated Past Pupils page on Facebook.
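
As a side note, a workflow like this gets easier to manage if the scanned files follow a consistent, sortable naming scheme from the start, rather than the scanner's default names. A minimal sketch in Python of how that could be scripted; the folder layout, default filenames and the "slides_box1" prefix are purely illustrative assumptions, not part of my actual setup:

```python
# Minimal sketch: rename scanner-default files (e.g. IMG_0001.jpg) in a
# folder into a consistent, sortable scheme like "slides_box1_001.jpg".
# Folder and prefix names are illustrative assumptions.
import os

def rename_scans(folder, prefix="slides_box1"):
    # Only touch JPEG files; leave anything else in the folder alone.
    jpegs = sorted(f for f in os.listdir(folder)
                   if f.lower().endswith((".jpg", ".jpeg")))
    renamed = []
    for i, name in enumerate(jpegs, start=1):
        new_name = f"{prefix}_{i:03d}.jpg"
        os.rename(os.path.join(folder, name), os.path.join(folder, new_name))
        renamed.append(new_name)
    return renamed
```

Run once per box of slides with a different prefix, the batches then sort cleanly next to each other in any file manager.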

So far I’ve managed to scan about 165 slides, most of which I’ve had to take out of their holders to do so, especially the glass mounted ones. It’s become clear that many of the photos were soft or slightly out of focus when taken originally, but it probably wasn’t noticed at the time. Thirty-odd years of age on the film itself doesn’t help either. There’s still a pile of probably about a hundred to go, though I’ve managed to whittle out private slides of my boss, or slides that were too far gone to bother rescuing.

With the end of that box in sight, I went back to the library last week looking for anything more. As many slides as there were in the first box, they only cover a small period of the school’s history – 3 or 4 years at the most, in the 1980’s. After some more scratching around and an impromptu spring clean by the librarian, I took possession of another box of slides, as well as dozens of packets of negatives, both colour and monochrome, plus some printed photos. Once the initial box of slides is done, I can focus on the negatives. Thankfully, scanning the negatives will be a little less time consuming, for the simple reason that I no longer need to take the film out of holders. I simply mount the strip of 4 negatives and scan away, which should save about 5 minutes per batch.

The biggest downside of the 35mm formats is that in today’s digital world, you cannot share the memories on those pieces of plastic unless you digitise them. Digitised, you can share them online as well as use them inside the school for projection during events. Projecting slides today isn’t impossible, but getting hold of a slide projector isn’t easy, not to mention that the mere act of displaying the slides reduces their lifespan even more due to the heat of the lamp. For archival purposes, having the photos in JPEG format allows the files to be replicated all over the show, avoiding any one point of failure. If the film is damaged or destroyed, there is nothing to fall back on, especially in the case of slides. While JPEG isn’t up to true archival quality or standards, in computing terms it’s probably the closest thing there is. Every consumer operating system since Windows 95 can view the files, which is a good 20 year track record now. It’s of course nowhere near film’s 130+ years of service, but for now, it’s a good enough solution.
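
Replication only really protects the archive if you can also tell whether a copy has gone bad. One common approach is a checksum manifest: record a hash of every file once, then re-check any copy against it later. A minimal sketch in Python using only the standard library; the folder layout is an assumption for illustration:

```python
# Minimal sketch: build a SHA-256 manifest for a folder of scanned JPEGs,
# so replicated copies of the archive can be verified against it later.
import hashlib
import os

def build_manifest(folder):
    # Map filename -> SHA-256 hex digest for every regular file in the folder.
    manifest = {}
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if not os.path.isfile(path):
            continue
        with open(path, "rb") as f:
            manifest[name] = hashlib.sha256(f.read()).hexdigest()
    return manifest

def verify(folder, manifest):
    # Return the files that are missing or whose contents have changed.
    current = build_manifest(folder)
    return [name for name, digest in manifest.items()
            if current.get(name) != digest]
```

Running `verify` on each replicated copy every so often flags silently corrupted or missing scans while a good copy still exists to restore from.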

DStv Explora Setup and review

Here in South Africa, one doesn’t have too many options when it comes to TV channels. The public broadcaster has 3 free-to-air channels, while a fourth free-to-air channel, e-tv, is a private business. On the pay TV side of things, there is either DStv or StarSat (previously known as Top TV), both of which are satellite broadcasters.

We’ve had DStv since 2008, when we purchased the then top of the line SD PVR decoder. The device could display 2 different TV channels at the same time, while also recording a 3rd channel in the background. The resolution was standard definition, which wasn’t a problem when all the TV’s in the house were small CRT based things. However, since I got the large TV in the lounge a few years ago, putting up with SD quality on that screen has been slowly driving me nuts. Throw in the PVR’s occasional instability and I found myself itching to upgrade.

DStv introduced some HD decoders a few years back, but apart from one device that offered the same features as the SD decoder, they were limited to 1 view, 1 record. Throw in the fact that these decoders were even more unstable and I decided to wait a little longer.

Last weekend, I finally ended up purchasing the new DStv Explora. The Explora is a new and modern HD decoder, although still sadly limited to 1 view, 1 record. The interface is a lot more modern than on any other decoder DStv has produced, and it has a 2TB hard drive inside, which ensures much more space for recordings. With the SD decoder I often found myself butting up against the recording limit.

The Explora is securely packaged in the box, wrapped in a nice layer of bubble wrap. The device isn’t too heavy, but feels solidly built despite being mainly plastic. There were no creaks or other defects out of the box. Unfortunately, for whatever reason, the power supply has migrated from being internal to being an external power brick. I suppose it makes sense that if there is a power surge or something, it’s much easier to replace a power brick than the whole decoder. Still, power bricks are often unsightly and contribute to cabling clutter.

The old SD decoder is quite noisy, with a very distinct fan drone emanating from the machine at all times. The Explora is a lot quieter, and seems to run cooler as well, despite its vastly upgraded internals. Hard drive noise is also far less evident, thanks to modern drives being a lot quieter than the 250GB model in the SD decoder.

I chose to install the Explora myself, without making use of an installer. There was no need to pay someone to do the job, since we already have a large enough dish and have a twin cable feed coming in from the dish. From there, the process is simple:

  • Screw cables from the dish into the top inputs on the included multi-switch.
  • Connect one output cable on the side of the multi-switch to the Explora.
  • Connect two cables from the bottom of the multi-switch into the inputs of the existing SD decoder.
  • Use an F connector splitter to split the feed from the RF output of the SD decoder. One cable goes to the RF input port of the Explora, while the other runs to the secondary TV that was always hooked up.
  • Use an HDMI cable to hook the Explora up to my amp, which in turn feeds the TV.

The reason to interconnect the 2 decoders is to enable DStv’s Extraview feature. With this feature enabled, you are able to use 2 interlinked decoders on the same subscription for a nominal amount every month. With my particular setup, we can theoretically watch 3 completely separate TV channels, whilst recording 2 different programs at once.

The installation really isn’t difficult if you already have a previous DStv installation in your house and it meets the requirements for the Explora. The rest is just an exercise in patience as you connect multiple cables. Depending on whether you are making use of Extraview to interlink 2 decoders or not, you may need to purchase 3 extra co-axial cables and an F connector splitter.

So far, so good. The Explora has been running a week with no problems that I’ve detected. Most of the channels are still SD resolution, but they are being upscaled better than the old SD decoder could ever do. HD content on the other hand looks lovely, if not quite Blu-ray lovely. Still makes a huge difference in things like live sport though.

Overall, the Explora is a worthwhile upgrade. From any SD decoder it’s a big leap, while the increased space and stability puts it above the older HD decoders. Time will ultimately tell how stable the Explora will be, but I am strangely optimistic the device will hold up well over the coming years. Although the device is quite pricey, it has been on special a few times already.