Bandwidth and disk space abnormally high?

If you were getting abnormally high bandwidth in your stats and abnormally high disk space used, what would be the first couple of things that you would check please?

  1. Check to see that the stats were calculated correctly

For instance, in my stats package the date range for the stats extraction defaults to the last 7 days, but that setting might have got lost and you're now looking at stats for a whole month instead, or something.

  2. Check the files behind the disk space used

You should definitely know which files you have and how big they are, so if the disk space used is high there must be one or more new big files in there. Find out what they are and where they came from (see the sketch after this list).

  3. Check which module has the highest bandwidth associated with it

Does the increase correspond to a legitimate increase in traffic, i.e. HTTP requests?

Only when several checks like this have been made should you worry about having been h4x0red.
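For point 2, a rough sketch, assuming a Linux host with shell access (the paths are placeholders, and the thresholds are just examples):

```sh
# Summarise disk usage per item in the current directory, biggest first,
# to see which files or folders are eating the space.
du -sh ./* | sort -rh | head -n 20
```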

:slight_smile:

Many thanks - it’s appreciated.

This all seems to be for the correct period.

How do I check this please? Is there somewhere where you can see the files in order of size?

Sorry, I don’t understand what you mean by a module?

One other thing - when you’re FTPing files, is the FTP transfer counted against the bandwidth or the disk space?

Look at what was using the extra bandwidth and where the large files are, and check whether the two are linked - i.e. has someone been “borrowing” your FTP server to dump large files on and share them?

You can do this by checking logs: if your Apache log processor shows an increase, which files and which scripts are involved? Is your FTP log showing anything abnormal?

Is it a Windows or Linux server? You can write a script or run an app to show you where the disk space is being used.
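For the Apache side, a rough sketch, assuming a combined-format access log at /var/log/apache2/access.log (the path is an assumption - your host may keep it elsewhere):

```sh
# Sum the bytes sent (field 10 in the combined log format) per requested URL
# and list the 20 biggest bandwidth consumers.
awk '{ bytes[$7] += $10 } END { for (u in bytes) printf "%12d %s\n", bytes[u], u }' \
  /var/log/apache2/access.log | sort -rn | head -n 20
```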

Thanks Tim, it’s really appreciated. How do I do the above please? I’m on Linux

Have a look into the find command - http://www.computerhope.com/unix/ufind.htm

You can search with that for files over a certain size, for example, or list them in order of size.

Thanks Tim, but where and how do I run this code:

find . -type f -size +1000k

Log in using SSH and run that command; as an example, it’ll find files that are over 1MB in size.
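If you also want the results in order of size (as asked above), a sketch along these lines should work on most Linux hosts - adjust the size threshold to suit:

```sh
# List files over 1 MB with human-readable sizes, biggest first.
find . -type f -size +1000k -exec du -h {} + | sort -rh
```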

For high space usage, I would check to see if anyone is uploading onto my servers. Also look around for old files you don’t need any more and delete them. If you have a blog that is wildly popular, look at the content storage: users may have uploaded images with their posts.

For high bandwidth usage, look for leeching. A lot of people could be using your media files, such as pictures and videos, by hot-linking them.
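If hot-linking does turn out to be the cause, one common countermeasure on Apache is a referer check in .htaccess. A minimal sketch only, assuming mod_rewrite is enabled and with example.com standing in for your own domain:

```sh
# Append hot-link protection rules to .htaccess (example.com is a placeholder).
cat >> .htaccess <<'EOF'
RewriteEngine On
# Allow empty referers (direct requests, some proxies) and your own site.
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Refuse image requests coming from anywhere else.
RewriteRule \.(jpe?g|png|gif)$ - [F,NC]
EOF
```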

Can anyone suggest some web hosting companies with unlimited space and bandwidth? Thanks in advance…

No one will offer truly unlimited space/bandwidth - there will always be some form of limit in there.

Disk space - Thanks Tim, I can’t run a script as I don’t have access to the shell. Any other way to do it?

Also, for the high bandwidth, it’s using 60Mb for unknown robots?? How do I stop that please?

I would guess you have some way to view the disk usage through a control panel, then.

60Mb for unknown robots is nothing to worry about.

It’s plenty to worry about - anyone know how to stop or greatly reduce the bandwidth used by unknown robots or unknown crawlers?

60Mb is absolutely nothing in terms of traffic - and it’ll just be a spider that your log reporter doesn’t know about.

OK, but with enough domains all showing roughly the same, it adds up to quite a lot. Any ideas would be much appreciated.

Surely you want your site to appear in search engines; leave it…

I definitely do, but Google, Yahoo, MSN etc. are not unknown robots - are they?

There are other search engines/systems beyond just those few… in this case it’s just one your stats package doesn’t recognise.
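If you still want to trim the crawl traffic from well-behaved bots, one option is a robots.txt with a crawl delay. A sketch only - Crawl-delay is not honoured by every crawler, and badly behaved bots ignore robots.txt entirely:

```sh
# Write a robots.txt in the web root asking crawlers to slow down
# and to skip a hypothetical /private/ area (adjust to suit your site).
cat > robots.txt <<'EOF'
User-agent: *
Crawl-delay: 10
Disallow: /private/
EOF
```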

Hey, I was away from my desk… if you have any suggestions, please let me know…