Can you subpoena Google if someone published copyrighted content on their website?

We had a customer purchase thousands of copyrighted PDFs, upload them to their web server, and put them in a public directory.

Google crawled their site, and now if you search for our product in Google, our website shows up as result #1 and their PDF of the product shows up as #2. Clicking their link opens the actual PDF.

Can we subpoena (or request) Google to tell us how many times they referred someone to our customer’s URLs? Every time a user clicked that search result, it gave the product away for free, and the click counts would prove damages. For example:

theirwebsite.com/directory/product3.pdf
…etc

I don’t see why you shouldn’t ask nicely for the links to be removed from the search results. It’s worth a try. But it sounds as though your real beef is with your customer for uploading the documents to their website.


The results have been removed by the client, but what we don’t know now is the damage done by anyone Googling the content looking to purchase it and getting it for free because Google indexed our client’s website.

So if Google can tell me they referred 200 people to theirwebsite.com/directory/product1.pdf and 100 people to theirwebsite.com/directory/product2.pdf, we can say our customer gave 300 people our product for free. Since the product is the actual PDF (hundreds of pages of proprietary research), anyone who clicked on a Google link essentially downloaded the product in full.

Make sense?

It sounds like the problem was caused by uploading the documents to a publicly accessible folder. Search engines will index the contents of those folders unless you deny access via robots.txt (and even then, some engines will ignore robots.txt!).
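For what it’s worth, a minimal robots.txt sketch for that. The /directory/ path is a placeholder, and as noted it is only a request to crawlers, not real access control:

```
# robots.txt at the site root: asks crawlers not to index the folder
User-agent: *
Disallow: /directory/
```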

The better way of doing this would be to store the documents above the web root so they can’t be crawled or indexed at all.

You can’t really blame Google; all they have done is index content which your client put there, as they do for all sites.

Not blaming Google at all; I’m asking whether Google would cooperate and provide stats on how many times these indexed links were accessed via their search. If they could provide something like:

301 clicks - theirwebsite.com/directory/product1.pdf
104 clicks - theirwebsite.com/directory/product2.pdf
502 clicks - theirwebsite.com/directory/product3.pdf

This would give us a way to gauge what kind of damages were caused.

I wouldn’t worry about what damage has been done. It will likely only cause self-torment.

Much better to focus on stopping it and preventing it from happening again.

Sorry if I misread things 🙂

Look in your server logs; you should be able to find out how many times each file was accessed.
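As a starting point, here is a minimal Python sketch. It assumes the server writes the common Apache/nginx “combined” log format (which records the Referer header); the access.log filename and the /directory/ prefix are placeholders to adjust for your setup:

```python
import re
from collections import Counter

# Matches the request path and the Referer field of a "combined"
# format log line. The /directory/ prefix is a placeholder.
LINE_RE = re.compile(
    r'"GET (?P<path>/directory/[^ "]+\.pdf)[^"]*"'  # request line
    r'.*?"(?P<referer>[^"]*)"'  # first quoted field after it: Referer
)

hits = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        m = LINE_RE.search(line)
        # Count only requests that arrived via a Google search result;
        # drop the referer check to count all downloads instead.
        if m and "google." in m.group("referer"):
            hits[m.group("path")] += 1

for path, count in hits.most_common():
    print(f"{count} clicks - {path}")
```

One caveat: browsers don’t always send a full Referer (Google’s HTTPS results typically send just https://www.google.com/), so treat the Google-referred count as a lower bound.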

Wooo Hooo! That was a blunder by your customer.
First, please ask them to block the directory that previously held the PDFs, either via robots.txt or by password-protecting that directory (see the sketch below).
Do not expect Google to give you any such data. They don’t provide case-by-case reports on issues like this, especially when it’s not a problem created by Google and it isn’t a life-threatening matter.

Improve your policies and sales terms and conditions.
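If they go the password-protection route on Apache, here is a minimal sketch. It assumes mod_auth_basic is enabled and that AllowOverride permits AuthConfig for the directory; the file paths are placeholders:

```apacheconf
# .htaccess inside the formerly public PDF directory
AuthType Basic
AuthName "Restricted"
# Keep the password file outside the web root
AuthUserFile /home/siteuser/.htpasswd
Require valid-user
```

Create the password file with htpasswd -c /home/siteuser/.htpasswd someuser (the path and username are placeholders to adjust).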
