Page 1 of 3
Results 1 to 25 of 57
  1. #1
    SitePoint Wizard
    Join Date
    Feb 2007
    Posts
    1,274
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)

    Massive, automated SQL injection attack menaces the Internet

    It appears that a massive, automated SQL injection attack is wreaking havoc across thousands of sites (600,000+ pages and counting).

    The attack apparently leverages Google to find candidate pages. The automated process then probes each candidate for potential SQL injection vulnerabilities. If it likes what it sees (i.e. error messages coming straight from the SQL server), it immediately launches an all-out attack, injecting SQL to find every text field in the database and append malicious JavaScript to the value of every row.

    The SQL dialect used is SQL Server specific, so at this time only sites using that server are infected. However, it would be trivial for the attackers (or anyone else) to write a MySQL, Oracle or PostgreSQL variant. This is not a server-specific vulnerability; it is an automated attack that finds badly coded applications vulnerable to SQL injection. Most infected applications are based on the older classic ASP (i.e. not ASP.NET), although any scripting language can be vulnerable. The attack is not ASP, PHP or ColdFusion specific; it doesn't care what scripting technology or web server technology is used.

    The injected JavaScript attempts to redirect visitors to a malicious website hosted in China (although it could be moved), which in turn attempts to infect them. At present the malicious sites have been taken down, but since the attack is ongoing, the attackers could simply redirect to other sites.

    So far the malicious sites have only tried to exploit already-patched vulnerabilities in IE, but that could easily change. There are known 0-day exploits in QuickTime which could potentially affect both PC and Mac users (so far verified only on XP and Vista).
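    The class of flaw being exploited is SQL assembled by string concatenation. As a minimal illustration (Python with the standard sqlite3 module; the table and data are invented for the example), here is why a stray apostrophe exposes a vulnerable query, and how a parameterized query closes the hole:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER, name TEXT)")
conn.execute("INSERT INTO products VALUES (1, 'Widget')")

def lookup_unsafe(user_input):
    # Vulnerable: user input is pasted directly into the SQL text,
    # so a quote character changes the statement itself.
    return conn.execute(
        "SELECT name FROM products WHERE id = " + user_input).fetchall()

def lookup_safe(user_input):
    # Parameterized: the driver passes the value separately from the SQL,
    # so the input can never change the statement's structure.
    return conn.execute(
        "SELECT name FROM products WHERE id = ?", (user_input,)).fetchall()

# The probe described above: a stray apostrophe triggers a raw SQL error.
try:
    lookup_unsafe("1'")
except sqlite3.OperationalError as e:
    print("leaked SQL error:", e)

print(lookup_safe("1'"))  # → [] (treated as data; no error, no matches)
print(lookup_safe("1"))   # → [('Widget',)]
```

    The error leaked by the unsafe version is exactly what the attack's probe looks for; the safe version never produces one, no matter what the input contains.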

  2. #2
    lo0ol's Avatar
    Join Date
    Aug 2002
    Location
    Palo Alto
    Posts
    5,329
    Mentioned
    1 Post(s)
    Tagged
    0 Thread(s)
    It seems to be fairly Microsoft-centric right now, but it's not really surprising that someone is brute-forcing SQL injection attacks against every architecture at this point. XSS has been well known for quite a long time now, and it's always been that thorn in your side that you should watch out for. It just goes to show that you should always try to counter these sorts of exploits in your own code, no matter how old the attack is.

  3. #3
    Non-Member
    Join Date
    Jan 2004
    Location
    Seattle
    Posts
    4,328
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    If your websites are targeted by SQL injection, then you can fix the problem by merely replacing your online database tables with the original DB tables from your local computer - right?

  4. #4
    SitePoint Wizard
    Join Date
    Feb 2007
    Posts
    1,274
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by geosite View Post
    If your websites are targeted by SQL injection, then you can fix the problem by merely replacing your online database tables with the original DB tables from your local computer - right?
    Unless your database actually held transactional data, in which case you're toast.

    And you still have to plug the attack vector. This is an automated attack (nothing personal) and unless you plug the hole it will get you again.

  5. #5
    In memoriam Dan Schulz's Avatar
    Join Date
    May 2006
    Location
    Aurora, Illinois
    Posts
    15,476
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    And folks, this is a great reason why we should continuously update our code and our online applications.

  6. #6
    SitePoint Author wwb_99's Avatar
    Join Date
    May 2003
    Location
    Washington, DC
    Posts
    10,629
    Mentioned
    4 Post(s)
    Tagged
    0 Thread(s)
    Also a great reason to use Defense in Depth to protect your apps. Random web users should not be able to enumerate tables and such in most cases.
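    A least-privilege database account is one concrete layer of that defense. A rough sketch in SQL Server syntax (all names are hypothetical) of a web application login that can run vetted stored procedures but cannot touch base tables or metadata directly:

```sql
-- Hypothetical names throughout; adapt to your own schema.
CREATE LOGIN webapp_login WITH PASSWORD = 'use-a-strong-password-here';
CREATE USER webapp_user FOR LOGIN webapp_login;

-- The application may only execute vetted procedures...
GRANT EXECUTE ON dbo.GetProduct TO webapp_user;

-- ...and may not read or write tables, or browse metadata such as sysobjects.
DENY SELECT, INSERT, UPDATE, DELETE ON dbo.Products TO webapp_user;
DENY VIEW DEFINITION TO webapp_user;
```

    With grants like these, even a successful injection through the application cannot enumerate the tables or rewrite every text column, which is exactly what this attack does.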

  7. #7
    SitePoint Member
    Join Date
    Nov 2006
    Posts
    17
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Has this attack made any splash in the media?

  8. #8
    SitePoint Evangelist praetor's Avatar
    Join Date
    Aug 2005
    Posts
    479
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    It has, on some news sites, where they blame an IIS vulnerability...

  9. #9
    SitePoint Wizard Darren884's Avatar
    Join Date
    Aug 2003
    Location
    Southern California, United States
    Posts
    1,616
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Hmmm interesting that it uses Google
    Have a good day.

  10. #10
    Function Curry'er JimmyP's Avatar
    Join Date
    Aug 2007
    Location
    Brighton, UK
    Posts
    2,006
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    I assume they're just querying google for strings matching SQL errors... right?

  11. #11
    SitePoint Wizard
    Join Date
    Feb 2007
    Posts
    1,274
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by Darren884 View Post
    Hmmm interesting that it uses Google
    Attackers have been using that resource for some time now. If an attacker becomes aware of an exploit in a product he will try to find sites using that product by searching for characteristics specific to that product. There is no such thing as "security through obscurity".

    By googling you can even find database dumps (text files) with sensitive information. Basically the admin dumped the database to a file for downloading, but Google got it indexed. Even deleting the file will not suffice, because Google keeps a cache. Bugger.

    This particular attack just searched for anything with e.g. an ?id=xxx parameter. Having found that, it would probe with an apostrophe character to see if an SQL error ensued. If so, the site was probably vulnerable.

    Because SQL Server allows command batching, it was easier for the attacker to automate this attack, basically letting the sites' own servers do the job. As wwb pointed out, there must have been several additional faults, e.g. running the site under an account which 1) can modify product tables and 2) can read sysobjects. Regardless of command batching, any site that synthesizes SQL from user input is vulnerable. Some database systems may require the attacker to do more work to automate the attack, but it is still entirely possible.

    Expect to see more attacks like this in the future.
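    For auditing your own pages, the probe described above is easy to replicate defensively: request each page with an apostrophe appended to its query-string parameters and check whether the response leaks a raw database error. A small sketch in Python (the error signatures listed are illustrative, not exhaustive):

```python
import re

# Strings a scanner (or your own test suite) might look for in a response
# after sending a stray apostrophe in a query-string parameter.
ERROR_SIGNATURES = [
    r"Unclosed quotation mark",               # SQL Server / classic ASP
    r"Microsoft OLE DB Provider",
    r"You have an error in your SQL syntax",  # MySQL
]

def looks_injectable(response_body: str) -> bool:
    """Return True if the page body leaks a raw database error."""
    return any(re.search(sig, response_body, re.IGNORECASE)
               for sig in ERROR_SIGNATURES)

print(looks_injectable("Unclosed quotation mark before the character string"))
print(looks_injectable("<html>Product 42</html>"))
```

    If a check like this fires on one of your own pages, the page is synthesizing SQL from user input and needs fixing, not just error suppression.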
    Last edited by honeymonster; Apr 30, 2008 at 03:08.

  12. #12
    Function Curry'er JimmyP's Avatar
    Join Date
    Aug 2007
    Location
    Brighton, UK
    Posts
    2,006
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by honeymonster View Post
    By googling you can even find database dumps (text files) with sensitive information. Basically the admin dumped the database onto a file for downloading, but google got it indexed.
    How does google know these files exist if there is no link to them?

  13. #13
    SitePoint Wizard
    Join Date
    Feb 2007
    Posts
    1,274
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by JimmyP View Post
    How does google know these files exist if there is no link to them?
    I suppose there must be some link, possibly on a page believed to be private - or the server allowed directory enumeration. Some sites treat requests from Google's spider differently (SEO), and some even configure those requests to allow directory listing in an attempt to have all content indexed. It could also be caused by an error message which spilled too much information - an often-seen vulnerability.

  14. #14
    SitePoint Wizard Karl's Avatar
    Join Date
    Jul 1999
    Location
    Derbyshire, UK
    Posts
    4,411
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    I often think that Google's toolbar plays a part in that: if you've got it installed and you visit your private page/file that isn't linked from anywhere, Google gets the URL back to them, then adds it to their index. I don't know if it's true, but that's how it has appeared to me in the past.
    Karl Austin :: Profile :: KDA Web Services Ltd.
    Business Web Hosting :: Managed Dedicated Hosting
    Call 0800 542 9764 today and ask how we can help your business grow.

  15. #15
    SitePoint Wizard
    Join Date
    Feb 2007
    Posts
    1,274
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Good point, Karl - never thought of that. It would be scary - really scary - if that's the case.

  16. #16
    Function Curry'er JimmyP's Avatar
    Join Date
    Aug 2007
    Location
    Brighton, UK
    Posts
    2,006
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    @Karl - surely that would be in breach of some legislation!

  17. #17
    SitePoint Wizard
    Join Date
    Feb 2007
    Posts
    1,274
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by JimmyP View Post
    @Karl - surely that would be in breach of some legislation!
    Sure about that? If you read the license for the toolbar they reserve the right to monitor your browsing and report back (anonymously) what pages you visit. I believe that is one of the inputs to the pagerank system.

  18. #18
    SitePoint Wizard
    Join Date
    Feb 2007
    Posts
    1,274
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Somewhat related, check out this attempt at SQL injecting automatic number plate scanners.

    Most likely a Photoshop job, but funny just the same.

  19. #19
    Non-Member
    Join Date
    Oct 2007
    Location
    United Kingdom
    Posts
    622
    Mentioned
    2 Post(s)
    Tagged
    0 Thread(s)

    You can block Google from indexing certain pages or directories using a file called robots.txt. To block, say, a "mycms" directory, put the following in that file:

    User-agent: Googlebot
    Disallow: /mycms/

    Or to block a page, say "mypasswords.php", you should include the following:

    User-agent: Googlebot
    Disallow: /mypasswords.php

    It should probably be a process that should be carried out from now on to help protect sensitive information better.


    *edit start*
    I have been informed that this is actually not the best way to do it (read further on in the thread).

    However, the following does the same job more effectively. Quote from Google:

    To prevent all robots from indexing a page on your site, you'd place the following meta tag into the <HEAD> section of your page:

    <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
    It might be a good idea to include that in the head of documents that should not be listed on search engines, particularly with the new SQL attacks described.

    Please note that NO sensitive files should be kept online, and if they MUST be online, they should be effectively password protected.

    *edit end*


    hope this could be of some help to someone

    ro0bear
    Last edited by ro0bear; Apr 30, 2008 at 08:12.

  20. #20
    SitePoint Wizard
    Join Date
    Feb 2007
    Posts
    1,274
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by ro0bear View Post
    It should probably be a process that should be carried out from now on to help protect sensitive information better.
    I disagree. This advice would merely promote more "security through obscurity". It doesn't work; there is no such thing. There are far too many ways the content can spill out. The *only* way to secure information is to ensure that only authenticated (by password or certificate) users/browsers can get to the content.
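    For static files, the simplest form of that is HTTP authentication at the web server. A minimal sketch for Apache (the paths, realm and file names are illustrative):

```apache
# .htaccess in the directory to protect; the .htpasswd file itself
# must live outside the web root.
AuthType Basic
AuthName "Restricted area"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

    Unlike robots.txt or a noindex meta tag, this refuses the content to anyone without credentials, spiders included.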

  21. #21
    Function Curry'er JimmyP's Avatar
    Join Date
    Aug 2007
    Location
    Brighton, UK
    Posts
    2,006
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by honeymonster View Post
    Sure about that? If you read the license for the toolbar they reserve the right to monitor your browsing and report back (anonymously) what pages you visit. I believe that is one of the inputs to the pagerank system.
    That's one addon that I'll never use again!

    Reminds of all the media hype surrounding the phorm initiative ... http://news.bbc.co.uk/1/hi/technology/7301379.stm

  22. #22
    SitePoint Addict chestertondevelopment's Avatar
    Join Date
    Dec 2005
    Location
    Essex, UK
    Posts
    241
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by ro0bear View Post
    You can block google from indexing certain pages or directories using a file called robots.txt [...] It should probably be a process that should be carried out from now on to help protect sensitive information better.

    hope this could be of some help to someone

    ro0bear
    This also allows any human reading your robots.txt file to see where sensitive information is stored....

  23. #23
    Non-Member
    Join Date
    Oct 2007
    Location
    United Kingdom
    Posts
    622
    Mentioned
    2 Post(s)
    Tagged
    0 Thread(s)
    Hmmm, I see your point. So which is better: making sure Google doesn't display the files, at the cost of anyone being able to see them listed in the robots.txt file? Or letting Google pick up the files and display them?

    I certainly agree with
    The *only* way to secure information is to ensure only authenticated (by password or certificate) users/browsers can get to the content.
    What are your thoughts?

    ro0bear

  24. #24
    SitePoint Addict chestertondevelopment's Avatar
    Join Date
    Dec 2005
    Location
    Essex, UK
    Posts
    241
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    Quote Originally Posted by ro0bear
    hmmm I see your point, is it better to make sure google doesnt display the files but people being able to look at the robots.txt file? or for google to pick up the files and display them?
    Neither. If you don't want people viewing something, don't put it online. Whether it's photos on Facebook or something like this, the same rule applies: you don't know who will access it, and you don't know where it will be copied.

  25. #25
    SitePoint Wizard Hammer65's Avatar
    Join Date
    Nov 2004
    Location
    Lincoln Nebraska
    Posts
    1,161
    Mentioned
    0 Post(s)
    Tagged
    0 Thread(s)
    We continually see code here from new coders that they got off "some web site" somewhere and that has these vulnerabilities. There are far too many books and tutorials that prefer to make things "easy" rather than teaching security from the start. If someone posts insecure code from one of these sources, find out where it came from and give the person responsible for it hell. We can do all we want to secure our own code, but all of us use the internet, and at some point we could have our personal information put at risk by the spread of insecure code by those who don't know any better.
    Visit my blog
    PHP && Life
    for technology articles and musings.

