
Archive for the ‘Internet’ Category

The perils of software activation

September 27, 2020

In our modern, internet-connected era, most people who use computers probably don’t think much about software activation anymore. Generally speaking, things just work, and as long as you don’t try to exceed your license limit, there usually aren’t any issues to worry about. Yet I can remember a time 20 years ago when the idea of software activation raised a huge ruckus, especially as the world was a lot less internet connected back then.

Microsoft Office XP wasn’t the first software product – from Microsoft or anyone else – to require activation over the internet, but it was by a mile the most high profile at the time. Microsoft instituted activation mainly due to rampant piracy of its products: keys may have been unique on the packaging or on a computer’s certificate of authenticity, but those keys could be used anywhere else running the same software versions. This made piracy stupidly simple and easy. Next up was Windows XP. Whilst the later leaking of corporate activation keys did enable piracy, for many casual users Windows suddenly became a lot harder to pirate at will.

Of course, the open source movement and the Electronic Frontier Foundation decried what they saw as intrusive, monopolistic and unethical practices in limiting users that way. Much internet ink was spilled predicting doom and gloom and the end of all freedoms that we had known until that point. Next, your computer would be spying on you and all the rest of the melodrama. Despite this, many more companies followed suit with online activation of software to help cut down on piracy. It became the norm eventually.

If you are a big company like Microsoft or Adobe, you can keep licensing servers running almost indefinitely and even if you’ve discontinued products, you can usually still keep the licensing servers running for users who need to reactivate the old products. However, if you are a smaller company, you may not have that luxury and this is where things get tricky. Take the following screenshot as evidence:

SMART server issue

This is a screenshot of the SMART Notebook 15.2 activation checker, built into the activation process itself. About a month ago, I wanted to reinstall this version of Notebook onto a laptop for a teacher. I have a valid license key for this software and though the premium features have expired, the key is still valid for this version of Notebook which includes the basic core functionality of the product.

Since none of those activation servers are available, I was unable to activate the product. These values are hard coded into the software and cannot be changed by us as end users. SMART have long since moved on to more modern versions of Notebook and have heavily pushed their subscription service for licensing. They have not issued any patches for older versions of Notebook to reflect the change in licensing servers or procedures, with the end result that I am no longer able to reuse a validly purchased piece of software that is only 5 years old…

Faced with a situation like this, I understand why open source folk and the EFF decry this kind of activation and licensing, as I’ve been locked out of my purchased software with no means of fixing the issue. Contacting SMART will only result in being told to upgrade to version 19 and that they are sorry for the problem, but that is the only remedy.

I do actually have another piece of software that also appears to be broken by the shutdown of licensing servers. Luckily I do have a newer version of that software that was made freely available, but it was still another nasty surprise.

Protecting your software from piracy is a valid and good thing to do, but if you can’t keep licensing servers up forever, consider releasing a patch that either removes activation for older software or finds a way to clearly notify all users that old version X will no longer be able to activate after XYZ date. But I guess in today’s non-stop world, such courtesies are maybe too much to ask for.

The long road to Fibre is finally completed

Around July 2016, our school applied for a 100Mb/s fibre optic internet connection. We couldn’t cope on the tired old, speed-deprived ADSL any longer, and a brief opening up of our Wi-Fi to staff immediately killed all bandwidth to the desktop computers. While the cost would go up quite a bit, the benefits were deemed enough to outweigh the cost concerns.

After the contract was signed, the wait began for the fibre infrastructure to be built and brought into the school. Around November 2016, a site survey was completed and the build route laid out. Plans were formally drawn up and submitted and we hoped that construction would be starting late January or February 2017, so that we could go live before the end of the first term 2017. Sadly, this never happened.

What did happen was a long descent into utter frustration as we played the waiting game. The contractors who were building the link, Dark Fibre Africa, seemed to hit every brick wall and bureaucratic pitfall possible. One delay after another saw the months slip away, a lot like sands through the hourglass (so are the Days of Our Lives).

The excuses and explanations stayed largely the same, however: the real hold-up was the City of Cape Town. Eventually it got to a point where I had to call in the ISP to have a meeting with the principal of our school, as he wanted answers. The ISP and an account manager from DFA came in and placated the principal somewhat, but they couldn’t give him anything more than I had already told him. This was frustrating because in the end, it reflected poorly on my annual performance appraisal.

It actually got to the point where, in November 2018, we gave notice to cancel the contract, as no construction had started and we were tired of waiting. During the waiting period, we’d contracted with a wireless ISP who helped us a lot, but by the end of our time with them we were suffering badly degraded speeds, not to mention that the bandwidth wasn’t guaranteed and was shared with a neighbouring primary school.

I’m not sure whether it was the threat to cancel or whether the ducks had finally all lined up in a row by that time, but the ISP pleaded with us not to cancel, as DFA were ready to begin the build. We at the school were sceptical, but decided to give it one last shot at redemption. From experience, we knew that applying for fibre with a new ISP would start the cycle of wayleave issues and permitting problems all over again, whereas the existing ISP had fought through all of that already.

Lo and behold, construction began in late November 2018 and moved at a rapid pace. I was completely surprised at how fast DFA and its subcontractors were able to move; before the school closed on 15 December, all the construction work – trenching, ducting, manholes and the rest – was complete. Now it was just a case of blowing the fibre down the tubes and splicing us into the network in the new year.

Sure enough, during January this year the remainder of the work was completed. One technician came to blow the fibre, another to terminate it into a splice tray. Finally DFA came with their layer 2 media converter/CPE unit and installed that. The device essentially converts the fibre into Ethernet, though I assume it also does some sort of management on the line. Configuration details were sent to me, which didn’t work initially. Interestingly, the connection was completely IP based – no dialling up or router needed. Just set your switch or firewall to the same VLAN as the connection, and the rest was simply configuring your IP address and default gateway.

A day of troubleshooting later, it turned out that I had been given the wrong address for the default gateway – it had actually been transposed with the network address. A simple enough mistake, and once the correct value was punched into my firewalls, internet flowed. Come Monday 28 January, I started switching traffic over to the fibre, and by end of day everything was running over it. Solid, stable and fast, the connection has not had an issue since.
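For the record, the handover really is that simple. On a Linux firewall, the equivalent configuration would look something like this sketch – the VLAN ID and addresses are invented for illustration, and the real values come from the ISP’s handover sheet:

```shell
# Tag the uplink interface into the ISP's VLAN (VLAN 100 is an assumption)
ip link add link eth0 name eth0.100 type vlan id 100
ip link set eth0.100 up

# Assign the handed-over address. On a /29 like this, take care not to
# confuse the network address (192.0.2.8) with the gateway (192.0.2.9)!
ip addr add 192.0.2.10/29 dev eth0.100
ip route add default via 192.0.2.9
```

Mixing up the network address and the gateway, as happened here, produces exactly the symptom described: the interface comes up but no traffic flows.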

The wait was long and 100Mb/s is no longer quite as speedy as it once seemed, but I now have an incredibly stable connection that I can rely on. No interference on the Wi-Fi spectrum, no Telkom exchange issues with ADSL or wet copper lines – just bliss and peace of mind really. It was worth the wait in the end, and best of all, it just takes an email to the ISP to boost the speed of the connection upwards. No other install, no new equipment, just a quick mail or phone call to make the change. Internet life at work is now bliss!

Categories: Internet, Networking

The road to DMARC’s p=reject

DMARC is sadly one of the more underused tools out there on the internet right now. Built to work on top of the DKIM and SPF standards, DMARC can go a very long way to stopping phishing emails stone cold dead. While SPF tells servers whether mail has been sent from a server you control or have authorised, and DKIM signs the email using keys only you should have, DMARC tells servers what to do when a mail fails both the DKIM and SPF checks. Mail can be let through, quarantined to the Spam/Junk folder or rejected outright by the recipient server.

Since moving our school’s email over to Office 365 a year ago, I have had a DMARC record in place. I had the record set to p=none so that I could monitor the results over the course of time. I use a free account at DMARC Analyzer to check the results and have been keeping an eye on things over the last year. Confident that all mail flow is now working properly from our domain, I recently modified our DMARC record to read “p=reject;pct=5”. Now, if a destination server checks mail coming from our domain and the mail fails both the SPF and DKIM checks, the mail is rejected 5% of the time. 5% is a good low starting point, since according to DMARC Analyzer, I have not had any mails completely fail the DKIM or SPF checks in a long time. Some mail is modified by some servers, which does alter the alignment of the headers somewhat, but overall it still passes the SPF and DKIM checks.
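For reference, a staged rollout like this lives in a single DNS TXT record at _dmarc.&lt;domain&gt;. A sketch of the record described above, using a placeholder domain and reporting address rather than our school’s real ones:

```
_dmarc.example.org.  IN  TXT  "v=DMARC1; p=reject; pct=5; rua=mailto:dmarc-reports@example.org"
```

Ramping the policy up later is then just a matter of editing the pct tag in place.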

My next goal is to ramp up that 5% to 20% or 30%, before finally removing the pct variable completely and simply leaving the policy as p=reject. Not only will I be stopping any potential phishing incident arising from our school’s domain, I am also being a good net citizen in the fight against spammers.

Of course, this doesn’t help if a PC on my network gets infected and starts sending mail out via Office 365, as then the mail will pass the SPF and DKIM checks and will have to be caught by the normal filtering methods such as Bayesian filtering, anti-malware scans and so on. That is the downside of SPF, DKIM and DMARC: they can’t prevent spam from being sent from inside a domain, so domains still need to be vigilant for malware infections, bots and the like. At least with the policies in place, one avenue for spammers gets shut down. As more and more domains come on board, spammers will continue to get squeezed, which is always a good thing.

Categories: Internet, Networking

Being a good net citizen: SPF, DKIM and DMARC records

Spam and phishing emails are some of the more visible scourges of the modern internet. No one enjoys opening up their mailbox and seeing junk clutter up the place, or seeing a mail that tempts you to enter credentials somewhere because it looks legitimate. The war against spam and phishing is an ongoing battle, with many tools deployed to try and keep a user’s inbox clean.

If you own or manage a domain on the internet and that domain makes use of email, it’s only right to be a good net citizen and set up SPF, DKIM and DMARC records. Together those three make a three-pronged fork that can be stabbed into the heart of junk mail, though they each do a slightly different thing. Let’s take a look at them:

SPF essentially denotes who is allowed to send mail for your domain. Anything that doesn’t match the details in the record is to be considered an attempt to spoof your domain and should ideally be rejected, provided the record is set up as such. If you have a small domain with simple records, SPF is incredibly easy to set up. It becomes harder if you are a giant corporation or have lots of mail being sent by third-party bulk mailers, but even those use cases can be brought into line so that you have a valid SPF record. If Microsoft, Google and others can do it, why can’t we?
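As a concrete sketch, a domain that sends all of its mail through Office 365 might publish a TXT record like this (example.org is a placeholder; the include host is the one Microsoft documents for Exchange Online):

```
example.org.  IN  TXT  "v=spf1 include:spf.protection.outlook.com -all"
```

The -all at the end tells receiving servers that mail from any source not covered by the include should hard-fail the check.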

DKIM is a little trickier. DKIM-enabled servers sign outgoing mail with a digital signature and let receiving servers validate the signature using the public key published in the DKIM DNS record. This way, mail can be verified as having been sent from the domain abcdefg.com because the signature can be checked against the DKIM record in abcdefg.com’s DNS. If the validation fails, it’s either because the mail was forged or because the message was modified en route. Since spammers aren’t running your mail server, they can’t validly sign outgoing messages with your private key, so when a destination server checks the signature, the check will fail.
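The public key itself is published as a TXT record under a selector name, something like this sketch (the selector and domain are invented, and the key value is truncated for illustration):

```
selector1._domainkey.example.org.  IN  TXT  "v=DKIM1; k=rsa; p=MIGfMA0GCSqGSIb3DQEBAQUAA4GN..."
```

A receiving server reads the selector and domain out of the mail’s DKIM-Signature header, fetches this record, and verifies the signature against the key it finds there.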

DMARC sits on top of SPF and DKIM. While SPF contains syntax for what to do when mail fails a check, DKIM does not. DMARC essentially tells a recipient mail server what to do with mail that fails the SPF/DKIM checks. Mail can either be allowed through to be processed as the destination sees fit, sent to the Spam/Junk folder, or rejected outright. Set to reject mode, and along with a -all qualifier in SPF, this will ensure that spammers cannot spoof mail from your domain (in theory).
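The policy itself is just a small string of tag=value pairs, so it is easy to inspect programmatically. A minimal Python sketch (the record string below is an invented example, not a real domain’s record):

```python
def parse_dmarc(record: str) -> dict:
    """Split a DMARC record like 'v=DMARC1; p=reject; pct=20' into a tag dict."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

record = "v=DMARC1; p=reject; pct=20; rua=mailto:reports@example.org"
policy = parse_dmarc(record)
print(policy["p"])               # action to take when both checks fail: reject
print(policy.get("pct", "100"))  # percentage of failing mail the policy covers: 20
```

If the pct tag is absent, the policy applies to all failing mail, which is why the lookup falls back to "100".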

It’s not perfect though. For the three records to be effective, the destination mail server needs to check them. If the server doesn’t, and simply accepts mail as is, junk mail from forged senders will make it into the inbox. The records also don’t help if a spammer compromises a legitimate account in a domain with all three records: when the mail is sent out via that domain, it will pass all checks on the destination end, since it was sent from a domain with valid records. To prevent this, you’ll need to set up rules to detect outgoing spam and block it from being sent. Each mail server will have different instructions on how to do this.

Office 365 and G-Suite both include records for SPF, while DKIM takes a few more steps to set up in Office 365. G-Suite also supports DKIM as far as I know, but since I don’t use the product, I don’t know how hard or easy it is to set up.

While nothing is ever perfect in the war against spammers, a huge amount of junk mail could be stopped cold if more domains published valid SPF, DKIM and DMARC records. Banks and financial institutes that are a favourite target of fraudsters could save themselves a lot of grief by having destination domains reject all mail that isn’t legit. IP block lists and content filtering will remain an important part of the game, but if more junk mail could be stopped at the edge before being accepted and processed, the better off the entire internet will become.

Categories: Internet, My tips and tricks

Testing Exchange Server connectivity

If you are fairly new to administering Exchange servers, you’ll often wonder if you have configured all the connectivity options correctly. One way of checking this is to test things from both inside and outside your organisation, but sometimes you don’t have the necessary hardware or software to do these tests. Enter a great solution provided by Microsoft: the Exchange Remote Connectivity Analyzer.

Exchange Remote Connectivity tester

This tool will let you test things such as ActiveSync, Outlook Anywhere and so forth. All you need to do is use a valid user account, and point the tester to your server(s), and wait while it attempts to connect. If it isn’t successful, it shows a log of all the steps it took, along with the point where it failed. This makes it an excellent tool for troubleshooting.

Credit for this goes to a post on the EduGeek.net forums, where I discovered this little gem. I don’t know if this works with Exchange 2003, but it definitely works with Exchange 2007 and 2010.

Internet Explorer 9 Platform Preview 2

I downloaded this in the week, as I was curious to see what the fuss was all about. Being a Platform Preview, the application is a minimal GUI over the Trident engine that powers IE. The install file was about 16MB, and it didn’t require a restart. You can download a copy here: http://ie.microsoft.com/testdrive/

The browser window is as spartan as Notepad, which in a bizarre way is actually quite cool. When you start it, it takes you to the Microsoft Demo site, where you can play around with the demos that show off some of the new features in the engine, its speed and improved web standards. Most of these are pretty boring after a while, so I opened up some real websites. I tested my school’s website (built in Joomla) and some other fairly heavy websites.

I was pleased to find that IE9 PP2 already feels faster than IE8, though of course this is just the raw Trident engine. My school’s website, which relies on a lot of PHP and JavaScript due to being built in Joomla, opens a lot faster than in IE8. A lot of things don’t quite work in the admin backend, but I’m sure this will improve as IE9 continues to grow. Other websites showed a similar boost in speed.

I’ve often been asked why we don’t install Firefox or Chrome on the school network, and I reply that until the day you can manage them out of the box like you can with IE through Group Policy, they won’t be installed. Most people never knew that, especially the students who like to show off their “l33t skillz”.

I don’t know when IE9 will be released, nor if it will help Microsoft regain some points in the browser wars, but increased speed and standards support is a very big welcome. It’s unlikely to get users of other browsers to switch back, but it may stop new users from switching in the first place.

SEACOM bandwidth first hand

March 17, 2010

Last year, the SEACOM undersea fiber optic cable landed in South Africa. This cable was many times faster than the existing SAT3 cable that most of South Africa relies on for internet access. Along with everything else, it was expected that SEACOM would bring competition and innovative packages to market as ISPs got connected to the cable.

Two weeks ago this became a reality for my school. Our ISP caters only for schools, and they had managed to obtain a slice of bandwidth for new packages they would be offering schools. We were offered an uncapped 2Mbit ADSL line for less than what we were paying for our uncapped 1Mbit line. The contention ratio would be fixed at 2:1, which was guaranteed, unlike our previous solution with Internet Solutions.

A new router was installed on site, our firewall quickly reconfigured and we were away. A few DNS records had to be changed on their side, but once that was done everything was back to normal.

Speed is fantastic on the line. One could say that of any 2Mbit line I guess, but with contention ratios fixed, we are getting steady solid bandwidth. YouTube videos can often now be viewed without a lot of buffering for example. Downloads go so much quicker than ever before.

General browsing speed has also increased nicely, though sometimes it isn’t as noticeable.

It has reached the point that using this crummy 384k ADSL line at home is almost physically painful at times. As ADSL competition continues in South Africa, I can hope that the 384k lines will eventually get upgraded for free. I can’t live with this slow speed anymore after having SEACOM based goodness for 2 weeks now at work.

Although SEACOM hasn’t yet totally opened the market up, it is having many good effects already. Once the rest of the fiber cables land in SA, we may just start living the internet good life.

Telkom Mega 105WR: Follow up after almost a year

December 16, 2009

I previously wrote about the Telkom Mega 105WR router here, and after almost a year’s worth of use, I thought I would do a follow up.

Most of my original comments on the device still stand: it is quite powerful and feature rich for a telco-supplied device. However, one thing that has not improved is the routine freezing of the device. Almost like clockwork, the router needs to be restarted every 7-8 days, when I can no longer access the web console and the wireless struggles to remain connected.

Doing some research on the router revealed that it appeared to have a high return rate, for this reason among diverse others. I looked on the manufacturer’s website for an updated firmware, to no avail. I even tried emailing their support address, and I have still not gotten a reply to this day.

A few weeks ago, I decided to take the router back to the shop to try and get a replacement model before the guarantee ran out. I duly packed up everything and went to the Telkom shop. There I was informed that they could not exchange the router directly: I first had to call the national ADSL support line, go through the troubleshooting tips and then get a reference number. Come back with that, they said, and we’ll do a swap. Then came the interesting news: “Telkom no longer supply this router, so you will be getting another model”.

I am guessing that the problem rate with this router was high enough to cause Telkom to move to another model. I looked around the shop, and it appeared that the most likely replacement would be another Telkom-branded router, called the Duoplus 300WR. After looking up the specs of this thing on the internet, I was left less than pleased. The wireless antenna is only 3dBi, half the strength of the current model’s. It would mean me having to buy a new aerial, which I don’t want to do. I could be wrong, however, as there were also Netgear routers in the shop, which may be an option. I’m not fully sure.

To cut a long story short, I am still running the old Mega 105WR at home, and I’m not yet sure what I plan to do about it. I am tempted to buy a proper name brand router, but at this point I’m trying to find one that has all the features I need and is not too expensive.

My advice is that if you have one of these devices and it routinely freezes up like mine, get the reference number, swap the model out, and hopefully the newer router will perform better. I can’t recommend this router anymore; it just doesn’t perform as expected. Rather buy a name-brand router – it will serve you better in the long run.

Microsoft Web Platform Installer

October 11, 2009

For years, the LAMP stack has been a success story of the open source world. Based on Linux, the Apache web server, MySQL and PHP/Perl/Python, it has enabled many fantastic applications to be built: WordPress, Joomla, Drupal, Moodle and more. Its ease of use for developers and admins has led to this stack being almost the de facto standard for hosting these applications.

Generally these packages also worked under Microsoft’s web server, IIS, but usually with some difficulty in setting up and maintaining the site. Most of the projects’ help forums are geared towards those who run them on the LAMP stack. Downloads are usually zipped and need to be set up manually. Linux distros such as Debian may have the packages in their repositories, but they may be a little out of date.

Microsoft finally decided to help people who run Windows servers to easily join in on the party. The result was the Web Platform Installer; more information can be found here.

While the focus of the tool seems to be to promote ASP.Net based applications, PHP based applications are also available and supported. Indeed, supporting these PHP applications has made installing products like Moodle and Gallery a lot simpler, as the package has scripts that set up the correct rights on folders, connection to the database and more.

The Installer sets up various aspects of IIS for you, installs a Microsoft SQL Express Database, PHP and other tools you will need for the applications.

At the moment, the number of applications is still quite small, but it is growing. Applications I wish to see in the future include PHPBB3, MediaWiki, and most importantly, Joomla. Microsoft does not package the applications themselves, but rather provides the guidelines and tools to create the packages. Hopefully members of the above mentioned communities will band together to package the apps to eventually have them appear in the Installer.

The Installer makes life easy in many ways, but it is not the be-all and end-all. Admins still need to test the security of their websites, directory permissions and so on, to ensure the most secure website possible.

Overall, I really like this tool and hope that in time it will continue to grow and offer more and more killer applications. Microsoft have done sterling work to get PHP to run better and faster under Windows, and this is hopefully a sign of even more things to come.

Using multiple web browsers

September 10, 2009

Using two internet browsers is an increasing trend for many people these days, as they switch between browsers for speed, security or features. The most popular browsers are Internet Explorer and Firefox, and between those two, most people have all they need.

However for some users, more browsers are needed, whether for web design work or a craving to experiment with different browsers. Here is a snip of my Quick Launch Toolbar:

From left to right: Internet Explorer 8, Maxthon 2.5, Firefox 3.5.1, Opera 10, Chrome 2, Safari 4

That is a total of 6 browsers. The reason I have so many is because of the work I do in Joomla for my job’s website. I like to test the pages in multiple browsers to get a feel for speed, font and layout issues as well as general usability.

I find that each of my browsers has pros and cons. Some have more features that are useful, some are faster, some have the right balance. Chrome is useful for raw speed on sites, Opera for layout issues due to its strict method of handling web standards, IE for certain sites, especially some useful Microsoft websites. My default browser however is Maxthon. It has the perfect blend of features I need to make my life easier. While not the fastest, I am more than happy with it.

For all the hype around it, I’ve never been all that huge a fan of Firefox, and as such I rarely use it. Safari rarely gets used as well, as the layout and controls feel strange to me.

Thanks to competition, IE has been forced to grow and improve, which is a good thing for those who just want to get on with the job. In turn, other browsers have been forced to innovate to stand out, and this has led to a win-win situation for the customer. There’s certainly never been a better time to be a web surfer.