
Fun with Linux on old hardware

Most of the world’s desktop computers run some flavour of Microsoft Windows. Many corporations and enterprises have volume licensing in place, using a Key Management Service (KMS) server or Active Directory activation to keep their PCs automatically licensed. When it comes time to dispose of these PCs, they cannot in most cases legally be sold with Windows still installed, and with a KMS/AD activation system in place, a PC’s activation will in any case expire at most about six months after disposal, since it can no longer contact the internal activation servers.
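
If you're ever curious how long such a machine has left, Windows can report its own licensing state. A quick check from an elevated command prompt (the exact output wording varies by Windows version):

    :: Show detailed licence info; for KMS clients, look for the
    :: "Volume activation expiration" line - a KMS activation is only
    :: valid for 180 days unless it is renewed against the KMS server.
    cscript //nologo C:\Windows\System32\slmgr.vbs /dlv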

The PCs can either be sold after a software wipe, leaving whoever purchases them to install their own operating system, or one can be charitable and install a Linux distro so that the PC is ready for use out of the box, so to speak. This is what I have been doing at my school as I dispose of old computers: even if the buyer intends to install Windows again, with Linux pre-installed the computer is at least immediately usable for many tasks.

In most cases, Linux Just Works™ and the computer is ready after about an hour or so of work. Whilst I am not in general a fan of Ubuntu and its family, I install Kubuntu on these PCs due to KDE’s resemblance to Windows, as well as the OEM install mode that makes first setup for the end user much more like a Windows out-of-box experience. Why more Linux distros don’t have this OEM install mode, I don’t know.
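
For anyone who hasn't seen it, the OEM flow boils down to one extra command. You pick the OEM option in the installer, which creates a temporary "oem" account for you to install drivers and updates under, and when the machine is ready to hand over you run the following and shut down (a sketch of the standard Ubuntu oem-config mechanism):

    # Run as the temporary "oem" user once the machine is fully prepared.
    # The next boot presents the buyer with the first-run wizard
    # (language, keyboard, user account, timezone) - much like Windows OOBE.
    sudo oem-config-prepare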

However, sometimes you hit issues thanks to hardware – more often than not because of an Nvidia graphics card of some sort. Most AMD and Intel graphics solutions just work out of the box thanks to the open source drivers in Mesa, and you don’t have to do much of anything else. Nvidia cards, though, are a different story, and this is where I hit a brick wall this past week.

The PC in question is a 2nd-gen Intel Core i3 system on a DQ67SW motherboard, with an Asus GeForce GT 430 dedicated GPU. Installing Kubuntu 23.10 wasn’t a problem and everything worked perfectly except for one thing: the cooling fan on the GPU ran at 100% under the open source Nouveau drivers. My research and fiddling unfortunately yielded no improvement. It may also be that Nouveau can’t control the fan on this Asus card in particular, as a note on one of the Nouveau pages says it cannot control fans connected via the I²C bus. I couldn’t in good conscience sell this PC with the fan running like that, so that left me looking at the official closed source Nvidia driver. This is where the pain/fun started…
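
For what it's worth, you can at least check whether Nouveau has any handle on a card's fan through the kernel's standard hwmon interface (a sketch; the card and hwmon numbers vary per system):

    # List hwmon attributes for the first GPU - card0/hwmon numbers vary.
    ls /sys/class/drm/card0/device/hwmon/hwmon*/
    # If pwm1 and pwm1_enable are present, manual control may be possible:
    #   echo 1   -> pwm1_enable  (switch to manual control)
    #   echo 128 -> pwm1         (duty cycle 0-255, so roughly 50%)
    # If no pwm* attributes appear at all, the driver simply cannot drive
    # that fan - consistent with the I2C note on the Nouveau pages.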

The GT 430 is supported up to the 390 driver series, which is not available for Kubuntu 23.10. More reading online revealed that Nvidia has ended all support for these old cards (totally understandable), so the driver series is abandoned. Because of this, and because the Linux kernel offers no stable driver API, the 390 driver apparently does not work with kernels beyond version 6.2. Since a kernel that old wasn’t available in the repositories for 23.10, I stepped back to 22.04, which offers a wider range of older kernels and also still ships the 390 driver.

I downgraded to the 5.19 kernel and installed the 390 driver without much fuss. The GPU fan spun down to reasonable levels and the system appeared stable enough. Just one problem though – I had lost output from the onboard audio. The logs showed that the device couldn’t be configured during boot, so it had been disabled. This was a showstopper, so I kept working. I knew the 6.5 kernel in 22.04 worked fine with the audio, so my thought was to see if a 6.1 kernel would solve the sound issue whilst still letting the GPU work with the 390 driver. The 6.1 kernel did not fully fix the issue – it would now properly load a driver for the sound card, but complained about missing codec support. Different issue, same outcome: no audio.
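
For anyone retracing these steps, the sequence on 22.04 looked roughly like this (a sketch: the exact 5.19 package version is illustrative, so check what apt actually offers before installing):

    sudo apt update
    apt list 'linux-image-5.19.0-*-generic'    # see which 5.19 builds exist
    sudo apt install linux-image-5.19.0-50-generic linux-headers-5.19.0-50-generic
    sudo apt install nvidia-driver-390         # legacy driver, still packaged in 22.04
    sudo apt-mark hold linux-image-5.19.0-50-generic   # keep apt from pulling a newer kernel
    sudo reboot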

No matter what I tried, I wasn’t able to get the onboard audio to work. Eventually I gave up and went rummaging in our storeroom, where I found a very basic C-Media-based PCI sound card, which I installed. I disabled the onboard audio, disconnected the front panel header and booted back up – lo and behold, the audio worked immediately with the dedicated card.
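
As an aside, when no BIOS toggle is available, the onboard codec can also be silenced in software by blacklisting its driver (a sketch, assuming the onboard device uses snd_hda_intel; the C-Media card loads its own snd_cmipci module, so it is unaffected):

    # Stop the broken onboard HDA device from being probed at boot.
    # Note: snd_hda_intel also drives HDMI audio on the Nvidia card,
    # so this is only sensible when audio comes from another device.
    echo "blacklist snd_hda_intel" | sudo tee /etc/modprobe.d/no-onboard-audio.conf
    sudo update-initramfs -u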

Whilst the PC is now ready for sale, I still feel frustrated by the whole mess. I know a large part of it is due to Nvidia’s closed source driver, but the blame also lies with the Linux kernel devs who don’t want to create a stable API/ABI, not to mention the zealots who want pure open source code for all drivers. I can’t blame the Nouveau authors for not fully supporting the card, as this thing is 15-odd years old and I don’t know if anyone cares enough about such old hardware at this point to bother improving and truly completing the drivers.

The funny thing is that even though the hardware is now ancient, I can install Windows 10 22H2 on the machine and install Nvidia’s 390 Windows driver to at least get the card going. Sure, performance is horrible and the driver is abandoned and possibly even a security risk, but it works, thanks to the stability of the API/ABI.

Rant aside though, this may also be a fairly unique situation thanks to the GT 430 card. I’ve prepared other PCs with a GeForce GT 610 card and the cooling fan on those never ran at 100% under Nouveau – the 610 cards were made by Gigabyte versus Asus for the 430, so maybe Gigabyte implemented fan control in a way Nouveau can actually use. Thankfully I don’t have many of these ancient pieces left, so future systems will be sold off with either a GT 610 or 710 card or onboard Intel graphics, all of which should just work without any fuss. I’m also entering the stage where the machines being sold off are all UEFI based, not old-school BIOS or the fussy early UEFI implementations from before Intel’s 70-series chipsets.

Install blues – Clive Barker’s Jericho

Continuing the theme from my last post, I tried to install my boxed copy of Clive Barker’s Jericho, a game I never completed and would now like the chance to actually finish. Once again, it’s a game that isn’t currently available on any digital platform for some reason, neither GOG.com nor Steam.

Popping the disk in, I was unable to run the Autorun menu as it requires Adobe Flash, long since discontinued of course. Fine; I next tried to run the setup.exe file directly off the DVD, only to be hit with a User Account Control prompt telling me that an administrator had blocked access to the program, with no option to override and continue the install – see screenshot below:

Jericho installer

Unfortunately, what has happened here is that the installer was digitally signed by Codemasters, the original publisher of the game. That is normally great, as it goes a long way to prove that the file is legitimate and isn’t malicious. Except there are two problems: first, the signature uses SHA-1, an algorithm Microsoft deprecated a good few years ago for security reasons, and second, the signing certificate expired in 2010, whilst the rest of the chain has either expired, been revoked, or both. I’ve only ever encountered one other piece of software with this kind of problem, and in that case I gave up trying to get it installed. This time it was a game I wanted installed, so I was more motivated to find a solution.

The internet wasn’t directly helpful, as I couldn’t phrase my search query well enough to get a proper answer. However, one post eventually led me in the right direction. Basically: enable the built-in Administrator account in Windows 11, log in as said user and run the installer that way. To my relief, I was able to do so and install the game successfully. Once complete, I logged out, logged back into my normal account and disabled the Administrator account again.
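
For reference, toggling the built-in account doesn't need any management consoles; two standard commands from an elevated command prompt do it:

    net user Administrator /active:yes
    :: ...log out, sign in as Administrator, run the installer...
    net user Administrator /active:no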

To play, simply launch the Jericho executable from inside the Bin folder of the install location. The only other thing to watch out for is to make sure that the Nvidia PhysX legacy package is installed, as the game needs PhysX to run. Thankfully, unlike Medal of Honor Airborne, Jericho doesn’t install the old Ageia runtime by default, nor does its installer hard-require it before continuing.

This time horrible old DRM wasn’t to blame, thankfully, but it’s still highly annoying to have to jump through extra hoops to get the game installed. The game is long since abandoned of course, and unless it ends up on GOG.com, where it would get a new installer, it will continue to present this extra installation hurdle to anyone who wants to play it.

LATE ADDITION: I discovered a small tool called delcert, which you can download from the XDA forums. Since the game DVD is a read-only medium, first copy the install files off the DVD into a folder on your PC, then copy the delcert.exe file into the same folder and, from a command line, run delcert.exe setup.exe. Delcert strips the signature off the executable, leaving you with a file that you can run as normal. Install the game using that setup file and all should be well.
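
Put together, the whole workaround is only a few commands (drive letter and folder name are just examples; delcert.exe sits in the same folder as the copied files):

    xcopy D:\*.* C:\JerichoSetup\ /E /H
    cd /d C:\JerichoSetup
    delcert.exe setup.exe
    setup.exe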

Retro Gaming blues

One of PC gaming’s biggest selling points is its incredibly rich back catalogue, reaching back decades at this point: thousands upon thousands of games in every possible genre and dozens of languages, released by studios big and small all over the world. It comes with a downside, though, one that doesn’t seem to get a lot of coverage that I’ve seen – though of course the internet is a big place and I don’t frequent the gaming sites the way I used to when I was younger.

For myself in particular, I’m not even really talking about 90s games so much as games made in the mid-to-late 2000s. The thing is, the game engine itself will more often than not run fine even under Windows 11 x64; actually getting the game installed is by far the harder challenge. Let me explain…

Until the major rise of Steam and other digital download services, most gamers got their games on optical media – CD or DVD – and thanks to rampant piracy in the industry, publishers turned to copy protection systems such as SafeDisc and SecuROM, amongst many others. None of these systems made headway in truly curbing piracy, but they left behind an unintended toxic legacy: making it hard to legally play these games years later on modern computers.

If you bought a lot of games on physical media and built up a collection, chances are you own games that are not available on any digital download service such as GOG.com, for whatever reason. This means that the only way to play them is to install from the original disk. Unfortunately, back in 2015 or so Microsoft issued patches that totally kill the ability of SafeDisc and earlier versions of SecuROM to work. These protection systems will not work at all on Windows 10 and 11, and if you fully patched an older Windows 7/8 retro gaming PC, it too will not play these games. The only option left is to find a crack so that you can run the game without a disk in the drive. That is presuming you still have a PC with an optical drive in it – a great many computers no longer have optical drives at all.
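
(As an aside: on Windows 7/8 the patch in question reportedly only disabled the SafeDisc driver service, secdrv, rather than deleting it, so it can be re-enabled at your own risk from an elevated prompt – the driver is unsupported and was pulled for genuine security reasons. Windows 10 and 11 removed it entirely, so no such option exists there.)

    :: Windows 7/8 only, entirely at your own risk:
    sc config secdrv start= demand
    sc start secdrv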

Let me use the example of a game I own: Medal of Honor Airborne. The game uses Unreal Engine 3 and whilst it is a good 15 years old now, it is still more than playable. However, I had a terrible time installing it on Windows 11 off the disk:

  1. The setup process struggled mightily to get past one point whilst reading off the disk. The disk is not faulty or rotten, but neither of my Blu-ray drives wanted to get past it. I had to copy the files to a folder on a hard drive and install from there.
  2. The setup process is hard-coded to install a now-ancient version of AGEIA PhysX, which will not work if you already have a modern Nvidia version installed. If the setup process cannot install its included PhysX, it aborts and refuses to continue.
  3. To get around this, I had to remove my existing PhysX, install the game with its included version, remove that, and then install the Nvidia PhysX legacy package, which lets you run older games whilst at least having a more modern runtime than what shipped with the game. Lastly, I had to reinstall my modern version of PhysX as well.

Apparently, the version of the game available on the EA Play platform doesn’t have these problems, but I specifically wanted to go through the manual install process to see if I could work my way through the hassles.

Another game I have is Pariah, made by Digital Extremes on the Unreal 2 engine. The game installs fine, but won’t run even with the disk in the drive, because its SafeDisc check no longer works. This game is not available on any digital platform for whatever reason, so the only way to play it is to download a crack to circumvent the dead copy protection, which I duly did. I have no moral qualms about that, as I paid for the game and I have all the original packaging. Is it my fault that the game can’t run directly on modern Windows? No, so I am fine with using a crack.

GOG.com does an amazing job of finding older games of all sorts, fixing them, stripping the DRM and re-releasing them, but as wide as their catalogue is, there are unfortunately plenty of games that aren’t available.

It’s easy to be nostalgic about older games and maybe feel concerned that a service like Steam has become the metaphorical 800 lb gorilla, but it really does make gaming easier: no need to find and manage patches manually, installation guaranteed to work, and many other benefits. Don’t get me wrong, I would always prefer to get my games on physical media so that I always own a copy, plus have a nice manual and any other included items, but I realise that the industry will never really go back to that method of distribution. I suspect that the current generation of gaming consoles might be the last to include an optical drive, and once they are replaced in a few years’ time, the only way to really own your games will be to hope a title is available on GOG, so that you always have an offline installer and can back up the install files at will.

When optical disks suck…

November 13, 2022

I recently purchased a brand new Pioneer Ultra HD-compliant Blu-ray writer for my PC. I believe it’s capable of reading and writing anything that is still relevant in optical media in 2022, which makes it a great backup for my decade-old LG XL drive. What is the point of this intro? Simply put, anything but Blu-ray in optical media terms is dead to me.

I have a spindle of burned DVD-R disks containing many older games (pirated in my younger days, to be honest) and I thought it would be a good idea to consolidate many physical DVDs down to one or two Blu-ray disks. Nearly a decade ago, most of these games were burned to CD; I simply created ISO images of each disk, which were then burned to DVD to save physical space. I have long been aware of burned media failing to read after enough time has passed, but as usual I didn’t think the problem would hit me personally for a few more years. Sadly, the issue has hit me, and a lot harder than I was expecting.

So far, three of the disks in the spindle have developed unreadable sectors, making them unsalvageable. As mentioned, the content is old PC games, most of which I can now legally purchase from a site like GOG.com – not only would that support a good company, it would also ensure that the games actually work on modern PCs, with none of the horrible copy protection that may no longer function on modern Windows. The problem is the games that aren’t available on GOG, with no other real legal source for them. Then again, I am not going back to pirating games, and to be honest, most of them are so old that they aren’t worth my time and effort in the end.

This little experience has shown me why optical media in general has fallen by the wayside and won’t be having a comeback the way vinyl or photographic film have. CD-R and RW disks are basically limited to 700 MB, and dual-layer DVD to about 8.5 GB. Not only is that capacity seriously low in this modern era, read and write speeds are also really frustrating. Even a good USB 2.0 flash drive will read and write files faster than any DVD drive, silently at that, not to mention being available in stupidly large capacities.

That being said, I’m still a fan of Blu-ray – mostly for movies, but I am coming around to it on the PC side as well. Granted, I don’t know the long-term stability of burned Blu-ray media just yet, but by all accounts it has got to be better than DVD and CD, formats dating back to the 90s and 80s respectively. I must admit, though, that the process of burning my first BD-RE disk wasn’t a lot of fun. I copied and pasted files onto the disk as if it were a giant flash drive, and whilst Windows had no problem doing this and burning the disk, the write speeds were slow and both my drives did a lot of spinning up and down whilst creating the BD-RE. It doesn’t help that the disk I was using was limited to only 2x speed. I’ve burned a BD-R before at close to 6x and that was a much better experience. Nonetheless, I managed to consolidate a large pile of DVD-R disks down to one BD-RE disk.

Whilst burned media is far more likely to develop the readability issues mentioned above, commercially pressed disks aren’t immune. I’m currently struggling to make an image of my store-bought copy of Medal of Honor Airborne, for example. According to ImgBurn, layer 0 of the disk read just fine, but on transitioning to layer 1 the read speeds cratered. The disk is thrashing a lot in the drive, and whilst there are no read errors so far, the drive is reading only 32 sectors (64 KB) at a time, at roughly 44 kB/s. This makes me wonder just how many of my other disks may have developed issues that I don’t know about…

In closing, I’m feeling a mixed bag of emotions about optical media. Having a game or movie or TV series on disk means that you own the contents and that they can’t be arbitrarily taken away from you, the way content on a streaming service can be when the rights owners decide they want to build their own platform or want more licensing money. Games and software that require activation are another story: if those activation servers go offline, you are left with a piece of useless polycarbonate plastic. Most people don’t miss optical disks anymore, and with good reason. Optical disks can be good, but when they suck, man do they suck.

Updating the firmware on Ricoh MFDs

In the not-so-old days, getting the firmware updated on your multi-function devices, a.k.a. the photocopier, required booking a service call with the company you leased the machine from, then having a technician come out with an SD card or USB stick and update the firmware that way. It was one of those things that just wasn’t done unless necessary, as by and large the machines just worked. Issues were usually mechanical in nature, rarely software related, so updating firmware wasn’t a major concern per se.

That being said, modern MFDs are a lot more complex than machines from a few years ago. Now equipped with touch screens, running apps on the machine to control and audit usage, scanning to document or commercial clouds and more, the complexity has increased dramatically. Increased complexity means more software, which in turn requires more third-party code and libraries. If just one of those links in the chain has a vulnerability, you have a problem. Welcome to the same issue the PC world has faced since time immemorial.

I can’t speak to what other manufacturers are doing, but I discovered that Ricoh bit the bullet a few years ago and made a tool available to end users so that they can update the firmware on their MFDs themselves, without a support technician coming in to do it. This saves time and money, and also helps customers protect themselves, since they can upgrade firmware immediately rather than waiting for a tech to arrive. Some security threats are so serious that they need urgent patching, and the best way to get that done is to make the tools and firmware available.

Enter the Ricoh Firmware Update Tool:

Ricoh 1

The tool is quite simple to use, though it is Windows-only at this point in time. Either enter your MFDs by hand, or let the tool scan your network for compatible devices. Ricoh has a listing of supported devices here, which is a rather nice long list. I would imagine that going forward, all new MFDs and networked printers released by Ricoh will be supported as well. I can understand that older machines won’t be on the list, as they weren’t designed with this sort of thing in mind.

Once the device(s) have been discovered, you let the tool download information from Ricoh to determine what the latest updates are. The Need Update column will tell you if an update is required or if the machine is already on the latest version.

Once you are ready, you click on the Update Firmware/App button, which brings up the next screen:

Ricoh 2

You can click on a device and then Detailed Information to see more about the copier, including the existing firmware versions. Clicking on Administrator Settings lets you choose whether to just download the firmware into a temp folder, download and install in one go, or install from a temp folder (downloaded on another PC, perhaps). Once you’ve made your choice, clicking the Execute button starts the update process. The tool offers solid feedback at every stage along the way. The firmware is downloaded and pushed to the machine(s), which then proceed to digest and install it. There are a number of steps that happen on the devices themselves, and it can take between 30 minutes and an hour per machine to complete.

Once completed, you can see the results by clicking on a device in the main window and selecting Detailed Information. This will bring up this final window which contains a change log of what was updated:

Ricoh 3

As you can see above for our MP 2555 device, the update was pretty substantial. That makes sense, as we’ve had this copier for almost five years and it had never been updated before this. Three of the four devices on the network needed updating, which has since been completed. The process was painless, if time consuming – there are a lot of different parts of an MFD that need attention, so it’s understandable in the end. MFDs have become a bigger security issue over the years, and making firmware updates available for end users to apply themselves is a big step in the right direction. I hope that all manufacturers in this space end up doing the same thing, as quite frankly, having a tech come out to update firmware is a very old and inefficient way of doing things in this modern connected world of ours.

The agony and misery of load shedding

For the last 15 or more years, my country, South Africa, has been at the mercy of its national power company Eskom as the country’s supply of electricity has become more and more constrained. Regular rolling blackouts have increasingly plagued the country as Eskom tries to prevent a complete grid collapse, which would necessitate a “black start” as the power stations are brought back up bit by bit.

As the years have gone by, Eskom’s fleet of mostly coal-fired plants has become more and more unreliable, hampered by huge amounts of corruption at Eskom, a brain drain as skilled engineers and technicians have left and/or retired, and a general lack of maintenance, as the political party in charge of the country forced Eskom to run all the power stations flat out and skip maintenance so it could tell the story that it had stopped load shedding and was therefore worthy of your vote. Now the chickens have come home to roost, so to speak: the plants are knackered, the company is overstaffed, corruption rings are fighting to protect themselves and their ill-gotten plunder, and electricity demand never stops growing.

My school has been lucky to have a 60 kW solar panel installation for the last two years, but without battery storage, the panels can’t stop our school from suffering a blackout. The panels certainly help reduce load and can even send power back into the grid, but they are not a guaranteed power supply. My server and network racks have UPS units in them, but when your power is going out three times a day, you can barely get the batteries recharged before it goes off again. Servers were never designed to tolerate constant on/off cycles, which increase wear and tear as well as the possibility of data loss from unclean shutdowns. Sudden surges when the power comes back on don’t help matters either, as they can either immediately damage something or cause long-term cumulative wear to a power supply.

We have had bad bouts of load shedding before, but we are now at a stage we’ve never seen. Going into its third week now, we have had continuous disruptions 24/7, and keeping track of what load shedding is happening where and when is becoming an increasingly difficult task. Throw in the fact that the City of Cape Town is often able to mitigate some stages of the shedding, so the published schedules may not be accurate depending on what the City can do. In its own odd way this adds stress too: you can’t adequately plan ahead until you know what the heck is happening later today or tomorrow, and that’s hard when you are literally waiting for word from the City, which can also change at a moment’s notice.

Our city is trying to procure its own power independently of Eskom, but even though the ball got rolling a while ago, it’s still going to take time to bring this capacity online. Even then, the greatest problem is obtaining enough battery storage so that overnight cuts are mitigated as well. Solar can’t produce energy at night and wind is erratic. The cost of battery storage has fallen dramatically over the last few years, but not far enough, not just yet.

As I write this, we are potentially looking at another week or two of these constant interruptions until Eskom can stabilise enough of its plants to stop load shedding. I am not well off enough financially to afford a generator for my home, so we make do as best we can. For my school, however, should this bout continue into the start of the new term, things are going to get extremely ugly.


Coming Soon

It took about six months thanks to the global chip shortage hammering all parts of the electronics industry, but our new Aruba 5412R switch has arrived, to my immense joy. Behold our new core switch:

The new Aruba 5412R core switch

As written about previously, this new switch replaces four older switches in our network core. Three of those switches will be retired, while one will move to another venue, replacing an older 2610 model. Once that happens, I will have no more switches that still use Java in their interface, which is a big step forward. Accessing management web pages built on Java apps from a modern PC isn’t easy, and this will go a very long way toward completely removing Java from my PC at work. I still have a server whose remote KVM runs on Java, unfortunately, but I hope to replace that server next year with one that has HTML5 remote access instead.

Back to the switch above. Whilst it may look big and complex, the web interface is the same HPE/Aruba interface I’ve grown to love over the last five years. Everything is immediately familiar, including the command-line configuration, unlike the CX 6100 series switches that use a new OS. Once the switch is installed in the upcoming holidays, not only will a large number of PCs jump up to a gigabit link, my fibre backbone will also largely jump to 10 Gbit around basically the entire school. In the future, it would be easy enough to swap one of the modules for another to support things such as 2.5 Gbit links or QSFP connectors for 40 Gbit links and so on. Mixing and matching will be easy thanks to the modular nature of this chassis. A chassis system like this is something a vendor like Ubiquiti is missing from their current UniFi line-up, which would prevent their kit from being used in some locations. UniFi has its own strengths and weaknesses, but now that I’ve handled a chassis switch like this, I am convinced it’s something they are missing.

In a little over two weeks from writing this post, this big boy will go in and the biggest project I’ve ever undertaken at the school will essentially be complete. I still have some edge switches that need replacing, but those should be finished off next year. Eventually I will need to look at replacing switches again to start getting 2.5 Gbit ports for Wi-Fi APs, but that is further in the future.

A bonus of being such a large machine is that the fans will run so much quieter than the existing 40 mm screamers in three of the four switches being replaced. The server room will be a lot quieter after this big boy goes in!


PC RGB is a mess

I recently performed a bit of an upgrade on my personal PC, namely transplanting the innards of my old gigantic Cooler Master Cosmos II Ultra tower case into a much smaller Cooler Master CM 694 case. Besides being a decade old, the Cosmos II lacks a window on the side and doesn’t have any front-mounted USB-C ports. The CM 694 offers these amenities in a case that is a lot friendlier on my desk than the old behemoth.

As part of this upgrade, I decided to finally go full RGB and get everything matching as much as is humanly possible. What I quickly discovered is that the PC RGB space is a royal mess. Different connectors and different control software are frustrating enough to deal with, but peripherals being largely incompatible between software packages is the real killer annoyance. It’s sort of understandable, in the sense that each manufacturer wants to lock you into their ecosystem of devices, but the ideal build rarely stays within one brand. For example, you may want an Asus motherboard with an EVGA graphics card, Corsair RAM and case fans, a Logitech mouse and a SteelSeries keyboard. All of these support beautiful RGB, but you would have to run at least four different software packages to control all the effects, which eats system resources, introduces potential security holes and can lead to the RGB programs not working correctly as they fight each other for control of devices. Not to mention, if you aren’t running Windows, your options are seriously limited.

There’s a good video on YouTube that explains much of this, as well as taking a good long look at the types of connectors you would commonly encounter on case fans and RGB strips.

However, that video doesn’t get into the annoying software side of things: peripherals such as keyboards, mice, headsets, mouse pads, headphone stands etc. are generally locked to their manufacturer’s ecosystem – an Asus mouse won’t work with Logitech software, for instance.

Thankfully, people got annoyed by this, and there are at least two promising software solutions out there – OpenRGB and SignalRGB.

OpenRGB is open source software that aims to support every possible RGB device in one piece of software, across Windows, Mac and Linux to the maximum possible extent. Their list of supported hardware is long and grows almost every day.
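
Handily, OpenRGB can also be driven from its command line, which makes it easy to script a whole mixed-brand system to one colour (a sketch; device indices depend on what hardware is detected):

    openrgb --list-devices                  # enumerate every detected RGB device
    openrgb --mode static --color FF0000    # set everything to static red
    openrgb --device 0 --color 00FF00       # set just the first device to green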

SignalRGB has a similar goal, but is closed source and only runs on Windows at this point. Its hardware list is very impressive and, like OpenRGB’s, seems to grow every day. The program comes in two versions, one free and one paid, the latter adding extras such as game integrations.

That being said, both SignalRGB and OpenRGB are reverse-engineered products, and should something go wrong, the original manufacturer can be petty and refuse to honour a warranty if they find out you were using a third-party program. Also, some manufacturers really cut corners in their RGB implementations, so neither program will have good control over those devices – ASRock motherboards come to mind here, for one.

It is my hope that eventually someone big steps up and forces an industry standard, but as with anything in the PC space, this seems unlikely – the USB interface for peripherals alone means it’s too easy to build an ecosystem and try to tie your users into it. I’d say the two software applications above really are your best bet for cross-brand RGB, but they still don’t solve the need to have the vendor software installed to update firmware or change configuration settings for keyboards and mice – things that can only be done through the original software. The other alternative is to stick to one vendor, do your research and buy products that are guaranteed to work with each other.

Gigabyte Q-Flash Plus to the rescue

This past week, I built yet another new computer for my school. Nearly identical to the last build I did, the main difference was that I went with a Gigabyte B550M Gaming motherboard instead of an Asus Prime board. Why? Simply put, to avoid the issue I had last time, where the motherboard wouldn’t POST with the Ryzen 5000 series CPU in it. The crucial difference between the two motherboards is that Gigabyte include their hardware-based Q-Flash Plus system on this board, which lets me update the UEFI even if the installed UEFI is too old to boot the new generation of CPUs.

The Q-Flash Plus button on the motherboard

That little button on the motherboard is a life and time saver. Of course, the concept isn’t exactly new – Asus had a feature like this on their higher-end motherboards a decade ago already. However, it’s one of those absolutely awesome features that has taken a long time to trickle down to the budget/entry level, and to this day many motherboards still don’t sport it, even though it would drastically improve the life of someone building a computer. There’s nothing quite like spending time building a PC, getting excited to hit the power button, and all of a sudden seeing everything spin up but no display signal come out. That scenario makes you start to question your sanity.

So what does it do? Simply put, whilst your motherboard may support a new generation of CPUs, it more than likely requires new firmware to do so. This becomes a chicken-and-egg situation: you buy the motherboard and CPU, but the board left the factory with older firmware on it and as such can’t boot your new CPU. In the past, the only way around this was to use a flashing CPU – a CPU that the motherboard supports out of the box – to flash the firmware, then remove the flashing CPU and put in your new one. This works, of course, but it is tedious and increases the risk of damage, since you are now doing twice the amount of CPU insertion and removal.

Q-Flash Plus and other systems like it basically let you flash the firmware, even if your system has no CPU, RAM or other components installed. I used it to great success this week to get the new motherboard flashed to the latest firmware. The process is pretty easy and goes like this:

  1. Connect the motherboard to your power supply correctly, i.e. both the 24-pin and 8-pin connectors.
  2. Take a smaller-capacity flash drive, format it as FAT32, place the latest firmware on it and rename the file to GIGABYTE.bin (see the sketch after this list).
  3. Place the flash drive into the correct port – usually just above the button, but check your motherboard manual to be sure.
  4. Press the button once and sit back. The LED indicator near the button should start rapidly blinking, as should the USB flash drive if it has an indicator light. After a few minutes, the PC powers up whilst the Q-Flash indicator blinks in a slower pattern.
  5. When done, the PC (in my case) restarted itself and POSTed properly.
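
The USB preparation in step 2 amounts to the following from a Windows command prompt (the drive letter and downloaded firmware filename are examples; the renamed file must be exactly GIGABYTE.bin):

    :: E: is the flash drive; the .F15 file is whatever Gigabyte's site supplied.
    format E: /FS:FAT32 /Q
    copy B550MGAMING.F15 E:\GIGABYTE.bin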

In my case, the PC restarted after the flash completed, but I already had all the components installed. Others flash the system completely bare, so in that case the motherboard may just turn off when complete. This isn’t something I have a lot of experience with, so your results may vary.

After using this to get the new PC up and running, I’ve come to the conclusion that it is a feature all motherboards should have, no matter what segment they occupy. It no doubt adds some manufacturing cost, but it’s one of those features that is simply too handy, too useful, to be reserved for higher-end motherboards only.

Fun with a new Ryzen PC

This past week at work, I built a brand new PC for the school for the first time since I’ve been there. For pretty much all previous PC builds I’ve used a past pupil of the school, but since he became completely uncompetitive on laptops, I started sourcing those from other vendors. After pricing this new PC through our local e-commerce giant Takealot, the cost was about what I would have paid the guy anyway, so I decided to just build the PC myself.

There was a delay on the order due to third-party vendors struggling a bit (thanks, global supply chain mess!) but eventually all the parts arrived. A modern PC has fewer parts and cables than ever, so building one is faster and neater than ever, really. I went with an AMD Ryzen 5600G CPU, an Asus PRIME B550M-K motherboard, 8 GB of DDR4-3200 RAM, a 250 GB NVMe SSD, a 350 W power supply and a surprisingly nice Cooler Master chassis. Nothing else, as I already had the keyboard, mouse and monitor in place at school. It’s not going to set the world on fire performance-wise, but it’s a very nice, powerful system compared to the rest of the PCs in the school.

Just one problem though – the motherboard supports 5000 series Ryzen CPUs, but generally only from a fairly recent BIOS revision. I knew this could pose a problem if the firmware on the board wasn’t up to date. It’s the luck of the draw, depending on when the board was manufactured: if an early production run board has been sitting on a shelf for two years, the BIOS revision flashed at the factory will be much older than what has been released since. Such was my luck, as the motherboard was sporting revision 1202, whilst the CPU needed revision 2403 to POST and work.

Now what to do? I had a brand new system out of the box with components that all seemed to be working, but no POST. The fans spun and the small onboard RGB strip glowed, but there was no output on any of the display ports. Returning the motherboard wouldn’t really do any good and would take a lot of time, with no guarantee that a swapped board would be up to date. The solution? Drop in a CPU the board supports out of the box, flash the board to the latest firmware version, then drop the new CPU back into place and voila! Luckily for me, we have another Ryzen system in the school, using an older 3200G CPU.

Popping the 3200G into the B550 board worked like a charm: I had a POST on the first power-on attempt. From there it was really just a case of getting into the Asus EZ Flash feature on the board and updating the firmware – nothing I haven’t done hundreds of times before. After the flash was done, I swapped in the new CPU and crossed my fingers. Thankfully the new CPU POSTed just fine, and it was then just a matter of screwing the coolers back down in both systems and closing the cases back up.

The new Ryzen build

That being said, after the flash was done, I looked at the B550 motherboard box and noticed that it specifically said the 3200G and 3400G APUs weren’t supported by the board! The reason is that despite being named in the 3000 range, those two chips are actually from the earlier 2000-range Ryzen generation, with the Radeon GPU bits added in, and the AMD B550 chipset apparently doesn’t support pre-3000 series Ryzen CPUs. Score one for AMD’s increasingly confusingly named CPU/APU series… I was just glad that the flash worked regardless, as I could otherwise have ended up with a bricked motherboard.

AMD used to offer something called a boot kit for situations exactly like this: they would send you a really low-end CPU that was just enough to let you do the flash. Unfortunately I think that program ended a long time ago, and besides, such a flashing CPU wouldn’t have been supported on this board either. The only real solutions these days are either a motherboard with some sort of offline “flashback” capability, or a compatible CPU on hand to flash with. Either option incurs a cost: a spare CPU costs money to keep around for what may be a single use, whilst the “flashback” feature hasn’t quite worked its way down to the entry level and budget motherboard segments yet.

Nonetheless, I feel I have a good solid combination of hardware in this PC and I am confident it will have a long life in service at the school. It’s also going to be the first of quite a few more that I build going forward, budget depending. I may just need to look at a slightly more expensive motherboard with a “flashback”-type feature so that I don’t need to go disassembling PCs to swap CPUs every time!