Website search feature (MySQL-based) being exploited

Hello,

I have a website with all my data stored in a MySQL database. A few weeks ago I added a search feature. Everything was going fine until a few days ago, when my website became extremely slow and unresponsive. After troubleshooting with my hosting company, as well as Cloudflare (anti-DDoS company), we figured out the problem was with the search script. Once I disabled the search feature, the load on the server went down significantly.

I created this search feature all by myself, and I think something may not be written as well as it should be. The script works exactly as I want it to; it’s just that there seems to be room for exploitation by nefarious individuals, which sends the server load very high and slows down the website.

Some background: this is a high-traffic site, and it resides on its own managed dedicated server with lots of cores and RAM. Normally the site loads very fast, but once the search feature is being exploited, the server load skyrockets and the site becomes very slow.

So, I would very much appreciate if someone could take a look at my code, and suggest any edits to make it safer.

My code is as follows, spread across three pages. The first is the search form:

<form method='post' action='search.php'>
<input type='text' name='search' size='10'>
<input type="image" src="images/search.gif" alt="Go" />
</form>

Then the search.php file is as follows:

$search = $_POST["search"];
header("Location: http://www.domain.com/find.php?find=$search");

and the find.php file:

$searchterm = $_POST["find"];
$query  = "SELECT * FROM websiteDB WHERE keywords LIKE '%$searchterm%' OR title LIKE '%$searchterm%' OR description LIKE '%$searchterm%' OR url LIKE '%$searchterm%' ORDER BY time DESC";        
$result = mysql_query($query) or die("Error: " . mysql_error()); 

That script is very vulnerable to SQL injection; it must change.
The mysql_* extension has been removed from PHP (as of PHP 7), so it should no longer be used.
Moving to mysqli or PDO will not just keep you current, but will let you use “prepared statements” for queries, which help protect your database.
You should also sanitize any user input.
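
For example, here’s a minimal PDO sketch of the same search using a prepared statement. The host, database name, and credentials are placeholders (not from the original post); the table and column names are taken from the query above:

<?php
// Minimal PDO sketch -- host, database name, and credentials are placeholders.
$pdo = new PDO(
    'mysql:host=localhost;dbname=mydb;charset=utf8mb4',
    'dbuser',
    'dbpass',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

// After the redirect the term arrives in the query string, hence $_GET.
$term = '%' . ($_GET['find'] ?? '') . '%';

// The term is bound as a parameter, so it can never break out of the SQL.
$stmt = $pdo->prepare(
    "SELECT * FROM websiteDB
      WHERE keywords LIKE ? OR title LIKE ? OR description LIKE ? OR url LIKE ?
      ORDER BY time DESC"
);
$stmt->execute([$term, $term, $term, $term]);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);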

I don’t get the three scripts. :confused:
You first use POST, then in the second script pass the POST data to a URL variable.
Then in the third, you read from POST again, even though after the redirect the value actually arrives in the query string. I don’t see the logic in that.


Yes, having the separate search.php file does seem wasteful.

form sends to search.php
search.php sends to find.php
find.php queries database

when it could be

form sends to find.php
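
In code, a single find.php along these lines would do it (a sketch; note the input is renamed to “find” so one page can both show the form and read the term from $_GET):

<?php
// find.php sketch -- the form submits straight here; no search.php hop.
if (!empty($_GET['find'])) {
    // ... run the parameterised search query here, as in the PDO example above ...
}
?>
<form method="get" action="find.php">
<input type="text" name="find" size="10">
<input type="image" src="images/search.gif" alt="Go" />
</form>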

In addition to what SamA74 posted about the PHP needing a rewrite: depending on the size of the websiteDB table, doing open-ended wildcard searching on four fields (a leading % means no index can be used) is prone to eventually becoming a real pain point, if it isn’t already.

The exploitation issue is a PHP code issue, not a Database issue.

So, focusing on the poor query:

SELECT * FROM websiteDB
 WHERE keywords LIKE '%$searchterm%'
  OR title LIKE '%$searchterm%'
  OR description LIKE '%$searchterm%'
  OR url LIKE '%$searchterm%'
 ORDER BY time DESC

Do you really need every field (the *) or only certain known fields?
Are the WHERE … OR conditions ordered so that the one most likely to match comes first?
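
For instance, if visitors only ever see the title, URL, and description (an assumption; use whichever columns the results page actually displays), the query could be narrowed to something like this, with $searchterm still bound as a parameter per the earlier posts, and the condition order here only illustrative:

SELECT title, url, description
  FROM websiteDB
 WHERE title LIKE '%$searchterm%'
    OR keywords LIKE '%$searchterm%'
    OR description LIKE '%$searchterm%'
    OR url LIKE '%$searchterm%'
 ORDER BY time DESC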


Can you be more specific about the traffic and database record count?

A similar search problem arose and was solved using PDO.

Dynamic search demo

Source is included.

Edit
Are there indexes for all of the search fields?
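
A quick way to check from the MySQL console (sketch):

SHOW INDEX FROM websiteDB;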

Given the explanation above, I would suppose that someone is scraping your site’s information.

If you’d implemented the search properly, using the GET method instead of POST, you would have gotten a clear picture from the site access log of what has been searched.

So first of all I’d suggest three things:

  1. change the method to GET
  2. at least change this line to:
    $searchterm = mysql_real_escape_string($_POST["find"]);
  3. Check the timing of the above query; if it runs slowly, you have to optimize it (see the timing sketch below).
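
For point 3, a minimal timing sketch; $db is an assumed mysqli connection and $query is the search query from the opening post:

<?php
// Sketch: measure and log how long the search query takes.
$start = microtime(true);
$result = mysqli_query($db, $query);
$elapsed = microtime(true) - $start;

// If searches routinely take more than a fraction of a second, optimize.
error_log(sprintf('search "%s" took %.3fs', $searchterm, $elapsed));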

You can use this info to identify bot user agents and IPs, and set up blocks in .htaccess or elsewhere to turn a number of them away.
Creating blocks has greatly reduced the number of attempted SQL attacks on a site I manage, where I monitor and record such activity.
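
For example, a minimal .htaccess sketch for Apache 2.4; the user agent string and IP address below are placeholders, not real offenders from this thread:

# Block a known-bad user agent (placeholder name).
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} badbot [NC]
RewriteRule .* - [F]

# Block a single abusive IP (placeholder address).
<RequireAll>
    Require all granted
    Require not ip 203.0.113.7
</RequireAll>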

Cloudflare didn’t block the SQL attacks?

In my experience, bots target GET requests more readily than POST, so I don’t think GET would help in this respect. However, GET is the more appropriate method for search result pages and I think it should be used in such cases. To reduce the number of bots doing searches, it may be a good idea to make the search available only via JavaScript (for example, creating the search form in JS, or using Ajax to fetch result pages, etc.).

Also, searching large database tables like this can take a long time. If that is the case, then consider caching the results for a period of time.
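
A very rough file-based caching sketch; the cache path, the key scheme, the 10-minute TTL, and the runSearchQuery() helper are all illustrative assumptions:

<?php
// Sketch: cache results per search term for 10 minutes.
$cacheFile = __DIR__ . '/cache/' . md5($searchterm) . '.json';

if (is_file($cacheFile) && time() - filemtime($cacheFile) < 600) {
    // Fresh enough: serve the cached result set.
    $rows = json_decode(file_get_contents($cacheFile), true);
} else {
    // Stale or missing: query the database and refresh the cache.
    $rows = runSearchQuery($searchterm); // hypothetical helper wrapping the prepared query
    file_put_contents($cacheFile, json_encode($rows));
}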

I merely meant that GET requests can easily be traced from the access log. My concern was to investigate first whether it’s bots at all.

This topic was automatically closed 91 days after the last reply. New replies are no longer allowed.