A few weeks ago came the surprising but not totally unexpected news that Microsoft was purchasing Nokia’s handset division. Since the day Stephen Elop announced the partnership with Microsoft to run Windows Phone, many pundits had predicted that something like this would eventually happen. When the news broke, plenty of snide comments about Trojan horses, amongst other things, were bandied about.
Lots of comments lamented the sale, with people saying that if only Nokia had embraced Android or continued with Meego, they would have been in a much better state. I dispute these claims, and this is why:
- In the Android world, Samsung is the 800lb gorilla in the room. Samsung ran early with Android, and this paid off handsomely as they racked up huge sales of the Galaxy S and S2. It’s also tough to compete when Samsung itself makes pretty much every part needed in a phone. This vertical integration has killed or wounded just about every other competitor, in and out of the Android world. Even Nokia in their prime didn’t have this level of integration. Witness how every other company is fighting for scraps.
- Meego was pretty and earned some very enthusiastic reviews. However, as Elop pointed out in his “Burning Platform” memo, it’s not so much about individual phones anymore as it is about platforms. Meego didn’t have a very large platform to start with, and in all honesty, Nokia didn’t have the financial muscle to turn it into a large one. By the time Nokia switched, the Windows Phone platform had already started to pick up a nice head of steam, though it was still a bit player compared to Android and iOS. Without the apps that were popular on other platforms, Meego would have died a slow, ignoble death. Look at BlackBerry, which chose to stick with its own new BB10 platform; that is what would have happened to Meego.
What I find very odd is all the people who would simply be happy with an iOS/Android duopoly in the phone world. That would ultimately lead to stagnation. The worldwide market is big enough to support 3 players, and Microsoft is a company that needs to be in mobile to help draw and retain customers for its entire ecosystem. BlackBerry is a faded star in the phone world, and its future increasingly looks like one of being sold off or broken up into its various sub-components.
There will be some interesting times ahead, no doubt. Microsoft needs to tread very carefully so as not to disrupt the steady growth of Windows Phone, driven largely by Nokia’s Lumia phones. That being said, with the handset division now part of the mother ship, Microsoft may be able to innovate faster and churn out products more quickly. The handset division now has an effectively unlimited budget to work with, so expect greater marketing and hopefully even more wonderful phones.
Lastly, the Nokia mother ship sheds a division that was causing it to sink. While the company will now be a lot smaller than before, it is free to compete better through its NSN division and the HERE maps platform. Nokia becomes more nimble and lives on, having reinvented itself yet again.
It’s not often that I end up ranting about something twice. EA’s Origin unfortunately has now earned my wrath for a second time. You can read about my previous grumblings here.
Last week, the Humble Origin Bundle went on sale. At the price the bundle was offered, I couldn’t resist ordering. Of the lot, Dead Space 3 is the newest game (Feb 2013), while the other games are anything from 1-4 years old. A lot of people have purchased this bundle, since it is great value and the proceeds are going to charity. More cynical people acknowledge that part, but state that the real idea is to draw people in to use EA’s Origin platform.
Unfortunately, it seems EA were not prepared for the influx of new users. Keys could not be redeemed, as the servers had, for all intents and purposes, melted under the load. Luckily I didn’t suffer this problem. What has been a problem, however, is downloading the games. I’m on a paltry 1Mb/s ADSL connection at home, and seeing the size of some of these downloads made it clear I couldn’t do it at home. Luckily I have the capability of doing so at work. I installed Origin and started by trying to download Dead Space 1. It starts off well enough, but then the download simply hangs after a while. Clicking the Pause button does no good, as the download never pauses. At this point I have to hard exit or kill the Origin task with Task Manager.
The line at work could have finished Dead Space over the weekend, but since the connection simply wouldn’t stay alive, I’ve had to babysit the download in chunks. Not fun, and not the way Origin used to work. Origin didn’t have this problem with Mass Effect 3, so I don’t know if it’s one of their updated releases that did it or what. Numerous reports on the EA forums bear witness to the fact that I’m not the only one suffering this problem. All I want to do is download the game at work and transfer it to my home PC, something else Origin is woefully inept at doing. Steam makes this process idiotically easy.
Also, the latest Origin update has now decided it wants to download and reinstall all my Mass Effect 2 DLC, because for whatever reason it can’t pick up the fact that the DLC is already installed. The previous version of Origin finally fixed that problem, but it now appears to be back.
Digital downloads keep being trumpeted as the future of how content is distributed, but experiences like this only make me want to cling onto my physical media all the harder. Add the fact that internet speeds here in South Africa are simply not great, and the situation becomes somewhat painful. I can’t help but feel that part of the reason EA offered the bundle was to load test Origin to breaking point, and then finally sort out their service. Some commentators have said that we should look at what Steam was like after its first two years on the market, but I’m not sure that is a valid point. EA would have seen Steam’s problems, and could have worked to avoid them from the start. Everything Steam does wrong, EA could have done right, but it seems that this will never happen.
And to think, PC gaming once used to be simple and enjoyable mere minutes after buying a game….
This past Friday, my colleague and I installed the Synology RS812 NAS into a cabinet in our second computer room, roughly 7 months after the device and cabinet were purchased. Powering it up and getting it running on the network was something of a minor victory for us, after what seemed like endless setbacks. Let me explain.
About a year ago, the head of IT at my school started talking about wanting the backups kept as far away from the server room as possible, so that if there was any sort of calamity there would be a much better chance of the backups surviving. For a few months the idea got talked about, but nothing was really done about it. The backups used to be performed to a FreeNAS virtual machine that was hosted on our big server in the server room. In a nutshell, if the server room is destroyed, so are the backups.
Rackstation 812. Photo courtesy of Synology’s website
Near the end of the year, the topic came up again. After looking into the matter, I made it known that what we needed was a device that could act as an iSCSI target, like the FreeNAS box that was already doing the job. The RS812 was cheaper at the time than a dedicated 1U server, so we ended up purchasing it along with the rail mount kit, 2x 2TB hard drives and, in the end, a 6U swing frame cabinet so that the device could be mounted in our second computer room. The rail kit has gone unused, as it cannot fit into any cabinet except our big server cabinets in the server room. Since we weren’t installing in the server room, the rails are redundant.
The cabinet only got mounted earlier this year, as other priorities came up in the course of last year. After the install of the cabinet, I went to check up on the work and make sure the ground staff had properly installed the cabinet. Turns out they hadn’t, and a simple tug by me brought the whole cabinet off the wall. After a second remount, the cabinet was firmly in place. I asked for a power lead to be run from the nearest electrical box to the cabinet, and I was promised it would be done “now now”. I’ll let you guess how that turned out…
Eventually last week, my colleague and I decided to mount the thing and be done with it. It took a bit of effort, and it almost didn’t work due to the finger handles sticking so far out of the chassis. The unit just about made it into the rack, but if the “ears” were left on, we wouldn’t be able to close the cabinet door. Luckily, the “ears” are held onto the rest of the mounting bracket by 2 screws. Once these were removed, the unit sat flush and the door could be closed and locked. The extension lead running down to the nearest plug isn’t exactly pretty, but that can be solved by having a strip of trunking installed.
The device itself is a nice enough NAS. Dual gigabit Ethernet ports are very nice, and the drives are all hot swap capable. The unit runs quite quietly compared to some of the screaming 1U vacuum cleaners I’ve heard in my time. Once the NAS is turned on, you use one of the tools on the provided CD to find it on the network, and then install the operating system, which I think may be embedded in the unit somewhere. After that, configuring the NAS is pretty straightforward. Synology’s DSM operating system seems to want to be a jack of all trades, from a serious storage device to a wannabe server. Despite this, it’s pretty well laid out, and the GUI is really nice for configuring things. I find it a bit more user friendly than the last version of FreeNAS I used, though that product has come a long way in its own right.
So far so good; the iSCSI part has worked like a charm. Performance is decent, even though it appears to be running an ARM processor and only has 512MB of RAM. Then again, for what I’m using the device for, it’s more than enough to get by. If there are any memory problems in the future, I can use a DDR3 laptop-sized module to upgrade the memory.
All in all, this is a decent rack mounted NAS/storage server, and for schools or medium-sized business environments, it’s well worth a look. You may be able to build your own 1U or home-brewed NAS somewhat cheaper than what this device costs, but the DSM operating system helps this unit offer a complete, polished package.
When our school started migrating to Windows 7, we discovered that our collection of HP LaserJet 10xx series printers was not very Windows 7 compatible. You could get a basic driver from HP for the device, but it lacked many features compared to the XP version. Added to that, printing a PDF document often hung the print spooler service. No matter how high and low we searched, the conclusion we drew was that HP were more interested in selling you a new model printer than in providing robust driver support for a reliable workhorse of a printer.
Late last year, we purchased 2 Samsung ML-2160 laser printers to start replacing the HPs. They were cheaper than the similar modern HP LaserJets. The Samsungs had the benefit of full Windows 7 compatibility and a smaller size than the existing HPs. However, they lacked an envelope/bypass tray, and the cartridge price was about on par with HP. Nonetheless, in they went and we had no problems with them as trial devices.
Earlier this year, we bought an additional 6 of them to replace most of our HPs. At first, like the 2 from last year, we had no issues or complaints – until one day I got called upstairs because, after a cartridge change, the printer refused to print. I swapped cartridges, inserted a new one, restarted the printer and even tried to do a printer reset. All to no avail – both of the LED lights on the printer stayed orange/red in colour. A few days later, another one of the batch did the same. We then took the printers back to the place of purchase, who sent them away as they were under guarantee. About 2 weeks later we got the printers back. Apparently the printed circuit boards had been replaced.
Those 2 printers have gone back into service with no complaints so far. We breathed a sigh of relief, thinking this was just a freak occurrence. That is, until the Principal’s printer did the same thing last week. That printer too has gone back for a circuit board replacement. That’s a 50 percent failure rate for the batch of printers bought from the same shop. Either they got a bum batch of printers from Samsung, or the model itself is faulty. I think it may be a bum batch from Samsung, as it seems this particular printer is rather popular countrywide. There have been massive toner cartridge shortages, which I think indicates some sort of popularity.
In closing – if you have this model Samsung printer, be aware that the PCB might just randomly die one day, leaving you unable to print. There is nothing you can do to fix the problem yourself, so it’s best to return it if it’s still under guarantee. If not, a printer repair technician may be able to repair the device, but it may cost almost as much as simply buying a new printer would. That to me is the rather scary and almost ridiculous part.
Our school recently purchased 3 new computers using Mini ITX motherboards. I originally wanted an Intel based motherboard, but it wasn’t in stock, and our supplier ended up getting us Gigabyte H77 Wifi motherboards that had more bells and whistles on them. However, said motherboard has a Realtek network card on board, specifically the 8168 model. Nothing wrong with this card, except getting it to work with Microsoft Deployment Toolkit 2010.
These days, the computers I’m in charge of at the school are set up using MDT rather than installed from a DVD. The process is largely automated after the initial first steps, and my task sequence has been set up to pull in all the latest Windows patches from our WSUS server. However, this all depends on the network card driver working in the Windows PE environment. The Realtek card is not natively supported by the Windows PE image in MDT 2010, so you have to get drivers injected. The same goes for any network card not supported out of the box by Windows 7, as that is what MDT 2010’s Windows PE is built from.
Thankfully, getting things like storage and network drivers injected isn’t a difficult task. I used the drivers off the support DVD and injected them into the Windows PE image. I started up the new workstation, only to be greeted by the error message below (sorry about the poor quality of the photo, it was taken with my phone).
I found this odd, since the drivers were off the motherboard DVD, and they work fine in Windows itself. When you double click on a driver file in MDT, you can see all the models that the driver supports. It only became clear after a while what the potential problem was. The error message has a “&REV_06” part at the end of the hardware ID. When I looked closer, I realised that though there was indeed a line with the exact same vendor and device ID, it was for “&REV_01”.
This means that while the driver will indeed support the network card, Realtek or Gigabyte have forgotten to add this extra line to the *.inf file that comes with the driver. I copied and pasted the existing line, and modified the end of it to read REV_06. This should, in theory, add support for this revision of the chip in Windows PE and let me image these computers.
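For anyone wanting to script the same edit rather than doing it by hand, here is a minimal sketch of the idea. The file name and hardware IDs below are illustrative placeholders, not taken from the actual Gigabyte/Realtek driver package – check your own .inf and the error message for the exact strings:

```python
# Minimal sketch: duplicate the existing REV_01 hardware-ID line in the
# Realtek .inf and change the copy to REV_06, so Windows PE can match the
# newer chip revision. Path and IDs are placeholders for illustration.
from pathlib import Path

INF_PATH = Path(r"C:\Drivers\Realtek\netrtl.inf")   # hypothetical file name
OLD_ID = r"PCI\VEN_10EC&DEV_8168&REV_01"            # line that already exists
NEW_ID = r"PCI\VEN_10EC&DEV_8168&REV_06"            # revision reported at boot

# Note: some .inf files are saved as UTF-16; pass encoding="utf-16" if needed.
lines = INF_PATH.read_text(errors="ignore").splitlines()
patched = []
for line in lines:
    patched.append(line)
    # Model lines look roughly like: %Desc% = InstallSection, PCI\VEN_...&DEV_...&REV_..
    if OLD_ID in line and NEW_ID not in line:
        patched.append(line.replace(OLD_ID, NEW_ID))  # add the REV_06 twin

INF_PATH.write_text("\n".join(patched) + "\n")
print("Done - remember to re-import the driver and regenerate the boot image.")
```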
However, my theory did not work out. Despite modifying the driver package and completely regenerating the Windows PE images, I could not get the client PC to pick up the network card at boot. Manually loading the driver using drvload at the command prompt worked, but it defeats the purpose of using MDT in the first place.
My hope is that when I upgrade to MDT 2012, the problem will be solved. Since the Windows PE version in MDT 2012 will be based on Windows 8, I am hoping that there will be a generic or native driver built in that will save me from having to inject Realtek drivers into the image. While I would prefer to only use Intel network cards, it sometimes simply isn’t possible – many laptops I’ve seen lately use Realtek network cards for their Ethernet connection.
It’s 2013. Most, if not all, new motherboards have transitioned over to using the UEFI standard for computer start-up instead of the legacy BIOS. There are numerous benefits to this, though there will still be some quirks as manufacturers find their feet with the (relatively) new technology. Prior to the introduction of UEFI, many motherboard manufacturers allowed you to update your BIOS directly during the POST sequence. Press the marked key, insert your flash drive with the BIOS file on it and you were pretty much good to go. Much more convenient than booting off a floppy disk to run a DOS based flashing tool…
That being said, if you work in an environment where you are exposed to older computers from around 2005-2008, you may encounter motherboards that cannot be flashed at boot time. To make matters worse, most of these boards were not designed to be flashed from inside Windows. The only supported method is to flash from MS-DOS. All very well and good, but getting down to MS-DOS in 2013 is not that easy. Floppy drives all but vanished 4 (or more) years ago. Also, for something as vital as a BIOS flash, you would want to use a fresh new disk that stands less chance of being corrupted halfway through the flash. Problem is, it’s not easy to find new disks these days, nor should you have to struggle with such antiquated methods.
The most logical next step is to use a bootable flash drive. Most motherboards will allow you to boot off a flash drive. You could try to use FreeDOS, though I have no idea how well that would work, as many of the tools expect MS-DOS and are hardcoded as such. There are ways of preparing an MS-DOS bootable flash disk, though it’s a little time consuming. See this post for more info on how to create an MS-DOS boot flash disk: http://www.sevenforums.com/tutorials/46707-ms-dos-bootable-flash-drive-create.html
However, there is another way. There is an open source tool called flashrom that will update your computer’s BIOS for you, if it supports the chipset on the board. For motherboards of the 2005-2008 era, it shouldn’t be a problem to update. Flashrom comes bundled with many Linux distros, including Parted Magic, which is the distro I used to perform some updates. Updating the BIOS on a computer with flashrom is as simple as running flashrom -w <filename>. I’ve used it with some Foxconn 965x7AA and MSI P965 Neo motherboards in the last week, and it worked without a problem.
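If you flash more than a couple of machines, it is worth reading a backup of the existing image with flashrom’s -r option before writing the new one. Here is a minimal sketch of that sequence, assuming flashrom is installed and the script is run as root; note that some flashrom builds default to the internal programmer while others want -p internal spelled out:

```python
# Minimal sketch: back up the current BIOS image with flashrom, then write
# (and verify) the new one. Assumes flashrom is on the PATH and run as root;
# adjust or drop the "-p internal" argument to suit your flashrom version.
import subprocess
import sys

def run(args):
    print(">", " ".join(args))
    subprocess.run(args, check=True)          # stop immediately if a step fails

def flash_bios(new_image, backup="bios-backup.bin"):
    run(["flashrom", "-p", "internal", "-r", backup])     # save the current image first
    run(["flashrom", "-p", "internal", "-w", new_image])  # write the new image

if __name__ == "__main__":
    flash_bios(sys.argv[1])
```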
I must admit, I vastly prefer having the ability to flash a BIOS at boot, but flashrom is the next best tool. Now you don’t need to keep multiple versions of DOS based flashing tools around for each type of motherboard you want to flash. One more useful tool for any techie to have around when the need arises to flash an older computer.