The calm after the storm – coping with an email spoofing campaign

We’ve had a stormy week in Worcester. On Tuesday evening I had to drag two trees from the lane to get home – one I managed myself, the other was a touch bigger and needed a tractor and a friendly local farmer. Yesterday presented a very different set of blockers for us!

Things were going fine until about 1:15pm, when we noticed our email server struggling. That was followed by a big increase in our bandwidth use. And then the phones went bonkers! The calls were from people who’d received an email, apparently from us, confirming a payment of £120. It couldn’t actually be from us, could it? No, it couldn’t be us… we’re locked down like a bunker at the best of times, but the messages started to ring panic bells.

Emails have data hidden in them to help diagnose delivery problems. These ‘headers’ showed the message originated from one of our servers. How? Worse, the email contained malware and was being sent on a massive scale. Our email server was choking on all the out-of-office replies – and there were tens of thousands of them.
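For the curious, the ‘Received’ headers record every server a message passes through, which is how we could see (apparently) one of ours at the start of the chain. Here is a minimal sketch of reading them – it assumes Python and a message saved to a file called suspicious.eml, which is just an illustrative name:

```python
# Minimal sketch: print the routing headers of a saved message.
# "suspicious.eml" is a hypothetical filename; only the standard library is used.
from email import policy
from email.parser import BytesParser

with open("suspicious.eml", "rb") as f:
    msg = BytesParser(policy=policy.default).parse(f)

# Each relay prepends a Received header, so reading them in order
# walks back from the final recipient towards the claimed origin.
for hop in msg.get_all("Received", []):
    print(hop)

print("Claimed sender:", msg["From"])
print("Return-Path:", msg["Return-Path"])
```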

We locked everything down immediately, even though none of it made sense yet.

All hands on deck

We couldn’t see how anything could have got out of the building, but we hunted around anyway. Meanwhile the phones were in meltdown! Literally everyone in the building was answering them (I’m sure some of them had never used a phone with a cable before!). It was brilliant to see everyone pulling together so quickly.

Now, interestingly, the people calling us weren’t our customers. This was good: if our data had been stolen, the targets would have been our own customers, so at least we hadn’t lost any data. But where were these emails coming from? Suddenly we realised that as well as copying the contents of an original message from us, the attackers had copied its headers, including our internal server names. Sneaky. Digging a bit deeper took us east. East, to a large Russian botnet that was sending them out… about 1.5 million of them.

So even though we weren’t the source or the cause, we were now at the centre of a massive communications mess. First things first, we recorded a message on our phones to let people know what was going on. That was the single best thing we did. It took about 30 minutes to work out how to actually get it into the phone system, but once it was there things calmed down. Next, the website – we cleared everything off the homepage and put up a simple message. Again, much better.

The spam email

Then one of our devs, who goes by the name SillsForce (he is basically the love child of Salesforce and Gloucestershire), noticed that the emails contained a banner logo that was still pointing at our site. That explained the increase in traffic – and it gave us an opportunity to tell people what was happening. So we switched out the logo for a large red notice highlighting that the email was spam. That got us 1,000,000 internet points from a Twitter follower and probably saved a huge number of people from a nasty virus. It also gave us time to give the Information Commissioner a heads-up, as well as the local police.
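For anyone wondering how the switch-out works: the spoofed emails hotlinked the banner from our servers, so serving a different image at that same URL changes what every opened message displays. A rough sketch of the idea – Flask, the URL path and the filenames are all illustrative assumptions, not how our site is actually built:

```python
# Rough sketch: answer the hotlinked logo URL with a red warning image,
# so every spoofed email that is opened shows the spam notice instead.
# Flask, the URL path and the filenames are hypothetical.
from flask import Flask, send_file

app = Flask(__name__)

@app.route("/images/email-banner.png")  # the path the spam emails hotlink (made up)
def banner():
    return send_file("static/spam-warning.png", mimetype="image/png")

if __name__ == "__main__":
    app.run()
```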

Things have broadly returned to normal now, but there were some clear lessons learned. Essentially, we had to work the problem and communicate it. Doing both isn’t easy with every alarm, phone and warning system going off around you. But we also realised what a strong position we’re in as a techie house: we know how to reroute phone calls on the fly, we can change our site in a heartbeat, and we can handle a surge in bandwidth that we were expecting next week for Black Friday, not yesterday!

It also highlights how rubbish email can be. There are well-established techniques to stop people spoofing other email addresses – simple sender checks such as SPF, DKIM and DMARC – all of which we already employ, but very few email providers actually respect or look at them. This would not have been an issue if they had taken those checks on board. Ironically, Google was the next to be spoofed that day – at least Gmail put the message in spam and identified the attachment as a virus. Let’s hope that others consider implementing the same precautions.
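To give a feel for how cheap those sender checks are to consult, here is a sketch that looks up the SPF and DMARC records a domain publishes in DNS – it assumes the third-party dnspython package, and example.com is just a placeholder domain:

```python
# Sketch: fetch the SPF and DMARC policies a domain publishes in DNS.
# A receiving mail server that honours them can reject messages that
# claim to be from the domain but arrive from unlisted servers.
# Requires the third-party dnspython package (pip install dnspython).
import dns.resolver

def txt_records(name):
    try:
        return [r.to_text().strip('"') for r in dns.resolver.resolve(name, "TXT")]
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        return []

domain = "example.com"  # placeholder domain
spf = [r for r in txt_records(domain) if r.startswith("v=spf1")]
dmarc = [r for r in txt_records(f"_dmarc.{domain}") if r.startswith("v=DMARC1")]

print("SPF:  ", spf or "none published")
print("DMARC:", dmarc or "none published")
```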