FILTER EVERYTHING - awesome or foolish? you decide

$_GET = filter_var_array($_GET, FILTER_SANITIZE_STRING);
var_dump($_GET);

The above code sanitizes everything in $_GET according to the FILTER_SANITIZE_STRING filter. Try it. It strips HTML tags, for example.
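To see the effect, here is a runnable sketch (the `comment` field name is invented for illustration):

```php
<?php
// Simulate a query-string value arriving in $_GET (hypothetical field name).
$_GET['comment'] = "<script>alert('xss')</script>Hello & \"world\"";

// Sanitize every element of $_GET in place.
$_GET = filter_var_array($_GET, FILTER_SANITIZE_STRING);

// FILTER_SANITIZE_STRING strips tags and encodes quotes,
// leaving the text between the tags behind.
var_dump($_GET['comment']);
```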

Do you think this is AWESOME? Or a really bad idea?

Share your thoughts… because I like it.

There is certainly nothing wrong with globally applying filtering to all fields to get rid of characters that would not be valid in any of them. That may simplify the additional validation required to ensure that each field only contains valid values.

For what it is worth, if you really did want to always apply a certain filter to user input (you already are, whether you know it or not) then there are the filter.default and filter.default_flags INI options (manual).
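For reference, those directives look something like this in php.ini (full_special_chars is one of the built-in filter names; check the manual for the full list):

```ini
; php.ini - apply a default filter to all GET/POST/COOKIE input
filter.default = full_special_chars
; optional flags for the default filter (none here)
filter.default_flags = 0
```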

Each field needs to be separately validated for what that particular field is allowed to contain. Only if all your fields need the same validation is it worth applying processing in bulk.
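A sketch of that per-field approach, still using the filter extension (the field names and rules here are invented for illustration):

```php
<?php
// Hypothetical form fields, each with its own validation rule.
$definitions = [
    'id'    => ['filter'  => FILTER_VALIDATE_INT,
                'options' => ['min_range' => 1]],
    'email' => FILTER_VALIDATE_EMAIL,
    'name'  => ['filter' => FILTER_SANITIZE_STRING], // sanitize, don't validate
];

// filter_var_array accepts a definitions array instead of one filter for all.
$input = filter_var_array(
    ['id' => '42', 'email' => 'not-an-email', 'name' => 'Bob <b>!</b>'],
    $definitions
);

var_dump($input['id']);    // int(42)
var_dump($input['email']); // bool(false) - failed validation
```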

Removes HTML? What if your user is trying to update an HTML template/file in a CMS?

Use the right tools for the job. If you need to strip and sanitize all input, then this is not awesome, it’s simply common sense. :slight_smile:

I think I read that this should not be used as your only way of safeguarding against attacks.

(I didn’t read Salathe’s link)

I think the best way is to create your own functions for your needs, functions that check the data against your own rules.
The more you generalise, the more data loss you'll get.

Or anywhere you may be offering TinyMCE or similar editing.

I had no idea about the default filter settings in php.ini; if you want to filter everything, they would be a better solution than what I had.

If you are accepting html, then obviously you wouldn’t want to use my example, but I think for a lot of simple sites, filtering everything could save them some headaches.

With that being said, the majority of you don’t seem to care for the idea, but it really depends on the type of site you’re running. If your site takes in many forms of data, filtering everything would be a bad idea. For example, I don’t see a file upload working if you filtered your strings to death, because they are sent by POST.

So, I would conclude, the consensus is that filtering everything is not awesome.

Filtering globally is a bad idea in my opinion, as it can lead to possible exploits.

Instead, filter the data you will use. That way you will only spend “server resources” filtering the parts you will need.

If you filter the entire GET, POST, etc. arrays, you can end up filtering a lot more data than you expected if I send more data your way. For example, you might expect 10 values over POST; I can send you thousands per request, and include content that takes longer to filter. Then, by making multiple such requests to your server at the same time, I will use up your server’s resources and deny your real users a good experience on the website, or even access to it at all.

I don’t need to “kill” your server to mount a successful denial-of-service attack; the most efficient method is to find places to exploit like that one, as at some point the server(s) will be overloaded and it will affect your legitimate users.

Though, please note I am not saying this will happen in your case; the chance of that is extremely small. I just want you to keep the possible security risks in mind while programming.

There’s nothing stopping you from filtering everything with something more aggressive than the defaults, and then doing the usual filtering of individual values as one normally would. At least that way, $_GET, $_POST and family will be affected globally (with the bonus side-effect of making you use the filter extension if you want a different/more-/less-sanitised value).
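To illustrate that last point: with filter.default set globally, $_GET and friends arrive pre-sanitised, and the filter extension is the way back to a raw (or differently filtered) value. A sketch, where the `comment` key is hypothetical:

```php
<?php
// Assuming php.ini contains: filter.default = full_special_chars
// $_GET['comment'] would already be HTML-encoded by the time we see it.

// Bypass the default filter to read the original, raw input:
$raw = filter_input(INPUT_GET, 'comment', FILTER_UNSAFE_RAW);

// ...or apply a different filter to the original input instead:
$asEmail = filter_input(INPUT_GET, 'comment', FILTER_VALIDATE_EMAIL);

// filter_input returns null when the variable is not present at all.
var_dump($raw, $asEmail);
```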