How to discourage spammers from sending a lot of requests to my sites

Hi,

When I checked my Awstats I noticed that I am getting a lot of hits from certain countries (I am sure you can guess which ones), and there is no doubt that those hits are coming from low webmasters who are on the dark side of the Internet. I have no idea what their goal is, but they are certainly keeping my server busy for nothing.

So, without blocking a whole country at the IP level, what are my options to discourage such people from visiting my website, sending excessive requests, and filling my Awstats logs with spam sites?


No, I can’t guess what countries they are. Why don’t you tell us? Just because you’re getting traffic from a given country, that doesn’t mean that it’s somehow disreputable. There are good guys and bad guys everywhere in the world.

I also see a lot of traffic from countries where I wouldn’t expect my site to attract an audience. But it’s not something I worry about.

Of course, if you have good reason to believe that this traffic is from “low webmasters who are on the dark side of the Internet”, that’s another matter. In that case, it would be good to know exactly why you think that and what you are worried about.

Mike

Look at your logs and use mod_rewrite to block the offending IP addresses - in httpd.conf or httpd-vhosts.conf if possible, because those files are only loaded once (when Apache starts) rather than being re-read on every request the way .htaccess files are.
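As a rough sketch, a rule like that could sit inside your <VirtualHost> block along these lines (the IP address is just a placeholder from the documentation range - substitute whatever turns up in your logs):

    <VirtualHost *:80>
        ServerName example.com
        DocumentRoot /var/www/example

        RewriteEngine On
        # Refuse everything from this address with a 403 Forbidden
        RewriteCond %{REMOTE_ADDR} ^203\.0\.113\.45$
        RewriteRule ^ - [F]
    </VirtualHost>

Because this lives in the server configuration rather than .htaccess, it's parsed once at startup, which is the whole point.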

However, I have to agree with Mike that not all foreign visitors are bad. Some may actually be search engines attempting to index your website, some may be genuine visitors looking for information, and some may be, as you suspect, probing for attack vectors on your website (don't worry, the "pros" won't even let you know they're around until after you've been compromised - IMHO, it's the "script kiddies" that you have to worry about entertaining).

Regards,

DK

I know what you mean, nayen; I hate seeing all the crap in my logs as well. I update my mod_rewrite rules every now and again to block the spam referrers, but there are always new ones.
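The rules themselves are nothing fancy - something along these lines, where the domains are just stand-ins for whatever shows up in the logs:

    RewriteEngine On
    # Turn referrer spam away with a 403; [NC] makes the match case-insensitive
    RewriteCond %{HTTP_REFERER} spamdomain\.ru [NC,OR]
    RewriteCond %{HTTP_REFERER} another-spam-site\.com [NC]
    RewriteRule ^ - [F]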

in httpd.conf or httpd-vhosts.conf if possible, because those files are only loaded once (when Apache starts) rather than being re-read on every request the way .htaccess files are.

I will have to check this out, @dklynn.

David, thanks for your suggestion. I know that blocking IP addresses one by one is one of my options, but I won't do that because not everyone uses a static IP, least of all spammers and hackers. I am thinking of a more systematic solution, something like "CAPTCHA to prevent spambot comments", if possible.

nayen,

I"m glad to read that you understand the uselessness of blocking IP addresses (proxy and botnet issues for those who didn’t) even more than dynamic IP addresses.

CAPTCHA is an excellent tool, but only a tool. As was probably mentioned above, even CAPTCHA (or encoded links to e-mail addresses which will open an e-mail client) cannot stop a human from manually gathering e-mail addresses and sending spam.

It was mentioned above that you should use a form to e-mail messages to you (NEVER to anyone else) and that at least one CAPTCHA method should be used, but I would also recommend that your PHP script examine the contents of the message and simply use the "bit bucket" (stop processing and DELETE the message) to eliminate spammy messages (ones with links, HTML code or "bad words"). This can be done without the spammer even knowing that he's been wasting his time … without wasting yours!
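As a rough illustration of that idea (the form field name and the word list are made up for the example - adjust them to suit your own form):

    <?php
    // Minimal sketch of the "bit bucket" approach: silently drop spammy submissions.
    $message = isset($_POST['message']) ? $_POST['message'] : '';

    // Placeholder word list - extend it with whatever keeps turning up in your inbox
    $badWords = array('viagra', 'casino', 'payday loan');

    $looksSpammy =
        stripos($message, 'http://') !== false ||   // links
        stripos($message, 'https://') !== false ||
        $message !== strip_tags($message);          // HTML code

    foreach ($badWords as $word) {
        if (stripos($message, $word) !== false) {
            $looksSpammy = true;
            break;
        }
    }

    if ($looksSpammy) {
        // Pretend everything went fine so the spammer is none the wiser,
        // then stop processing - the message is never sent.
        exit('Thank you for your message.');
    }

    // ... otherwise carry on and mail() the message to yourself.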

Regards,

DK

David,

I don't know why you went on about spam comments; preventing comment spam is something else and not my concern here. I want to discourage the webmasters who keep my servers busy. When I check my Awstats, I see the following:

Ukraine, Russia and China are always in the top five countries. I know that most of this traffic comes from blackhat webmasters trying strange things on websites. Some IPs rack up hundreds of pageviews and hits and waste bandwidth. They populate my "Links from an external page" section with .ru sites and all that crap. Their hits cause hundreds of 404 errors, which clearly shows that they are using scrapers or bots to probe websites for security vulnerabilities.

This is the type of traffic I am trying to discourage if I can't prevent it entirely. Is there really nothing I can do?

Security vulnerability scans use very few resources, as they mostly just return 404 responses. In the case of high-volume scans, you can block them with mod_evasive or similar, which will block rapid successive requests. You can also use a web application firewall, but the cost of implementing one is generally a lot higher than the alternative of simply increasing your hosting resources if these requests are pushing the limits.
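If you go the mod_evasive route, the configuration is only a handful of directives; the thresholds below are purely illustrative, not recommendations:

    <IfModule mod_evasive20.c>
        # Block an IP that requests the same page more than 5 times within 1 second ...
        DOSPageCount        5
        DOSPageInterval     1
        # ... or makes more than 100 requests to the site as a whole within 1 second
        DOSSiteCount        100
        DOSSiteInterval     1
        # Blocked IPs receive a 403 for this many seconds
        DOSBlockingPeriod   60
    </IfModule>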

Hi nayen,

SPAM is e-mail. What you're trying to control is visitors from specific countries. Unfortunately, IP addresses are not assigned in tidy blocks by country, so you'll need to use a service (or build your own massive database) to determine the IP ranges for the countries you want to ban. Checking those ranges adds work to EVERY request, so (IMHO) going down that road is a waste of time.

If you have a half dozen IP addresses that waste a lot of your bandwidth, however, simply block them using the %{REMOTE_ADDR} variable and specifying the IP addresses. If they are hackers, though, or even search engines with an IP block rather than a dedicated IP address for their spider, they'll get around the %{REMOTE_ADDR} block in a flash.
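For a handful of addresses, that looks much like the earlier sketch, just with the conditions chained together (the addresses below are placeholders from the documentation ranges):

    RewriteEngine On
    # List each offender in its own condition; [OR] chains them together
    RewriteCond %{REMOTE_ADDR} ^203\.0\.113\.45$ [OR]
    RewriteCond %{REMOTE_ADDR} ^198\.51\.100\.7$ [OR]
    RewriteCond %{REMOTE_ADDR} ^192\.0\.2\.19$
    RewriteRule ^ - [F]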

IMHO, best to relax and make sure that your website is secure.

Regards,

DK