
Limit all non-USA crawlers? US traffic is probably the worst offender when it comes to scraping content.

Personally, I wouldn't even bother with this. As long as other sites aren't outranking you with your own content, I don't see a problem. If they are outranking you with your own content, then you need to reevaluate your SEO strategy.



Google will apparently remove sites that are copying copyrighted content: http://www.google.com/support/websearch/bin/answer.py?hl=en&...

So he will definitely outrank them if he owns the content.


You could add some unique words/sentences to your pages, which would make googling for mirrors easy (i.e. automatable), so sending copyright notices to Google could be almost fully automated.
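
A minimal sketch of that idea, assuming you query the Google Custom Search JSON API; the canary phrase, domain, API key, and engine ID below are all placeholders, not anything from the thread:

  import requests

  # Hypothetical canary phrase planted somewhere on your pages; any indexed
  # page containing it that isn't on your own domain is a likely mirror.
  CANARY = "xq7-zebra-lantern-4931"  # placeholder: invent your own unique string
  MY_DOMAIN = "example.com"          # placeholder for your site

  # Google Custom Search JSON API credentials (placeholders).
  API_KEY = "YOUR_API_KEY"
  ENGINE_ID = "YOUR_SEARCH_ENGINE_ID"

  def find_mirrors():
      """Search for the canary phrase; return result URLs not on our domain."""
      resp = requests.get(
          "https://www.googleapis.com/customsearch/v1",
          params={"key": API_KEY, "cx": ENGINE_ID, "q": f'"{CANARY}"'},
          timeout=30,
      )
      resp.raise_for_status()
      items = resp.json().get("items", [])
      return [item["link"] for item in items if MY_DOMAIN not in item["link"]]

  if __name__ == "__main__":
      for url in find_mirrors():
          print("possible mirror:", url)  # feed these into your DMCA workflow

Run on a schedule, the output is a candidate list of mirrors; drafting and filing the actual removal notices would still need a separate step.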


Google will also pay the sites infringing your copyright to display its pay-per-click ads.

They will comply with some removal requests, but sending a letter for every instance of infringing content does not scale well.



