How do I add an anti-scraping feature to a website?

By Dextor
Oct 11, 2013
  1. I am producing some text content on a site and it is very easy to copy. Are there good ways to protect the content from web scrapers?
  2. jobeard

    jobeard TS Ambassador Posts: 11,166   +986

    The only real control is to force users to log in to the site, and even that doesn't stop scraping - - it just limits access to known users and blocks things like the Google crawler. Of course, that also stops your pages from being indexed for Google search.
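    A hedged sketch of what such a login gate looks like at the HTTP level, using only the Python standard library. The cookie name `session` and the token set are placeholder assumptions, not anything from the original post:

    ```python
    # Minimal sketch of gating pages behind a login, assuming a cookie named
    # "session" whose value must appear in a server-side set of known tokens.
    # Note this only limits access to known users; a logged-in user (or a
    # script holding a valid cookie) can still scrape every page it can see.

    KNOWN_SESSIONS = {"abc123"}  # placeholder: tokens issued at login time


    def is_authorized(headers: dict) -> bool:
        """Return True if the request's session cookie matches a known login."""
        cookie = headers.get("Cookie", "")
        for part in cookie.split(";"):
            name, _, value = part.strip().partition("=")
            if name == "session" and value in KNOWN_SESSIONS:
                return True
        return False
    ```

    A real site would couple this check to a login form and token expiry; the point here is only that the gate sits in front of every request, which is also why it blocks crawlers like Google's.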

    Users can also use Save Page As and store pages on their PCs.

    Consider how a website operates: anyone can open a connection to it and issue HTTP GET /page requests. All of that can be done with simple socket programming - - or even scripting languages like Perl and PHP.
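    As a sketch of how little is needed, here is a raw-socket GET in Python (the Perl/PHP equivalents are just as short); `example.com` is only a placeholder host, but any public site answers the same way, which is why scraping can't be blocked at the protocol level:

    ```python
    # Minimal sketch: fetching a page with nothing but a TCP socket and a
    # hand-written HTTP GET request, as described above.
    import socket


    def build_get_request(host: str, path: str = "/") -> str:
        """Assemble a plain HTTP/1.1 GET request by hand."""
        return (
            f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            "Connection: close\r\n"
            "\r\n"
        )


    def fetch(host: str, path: str = "/", port: int = 80) -> bytes:
        """Open a socket, send the GET, and return the raw response bytes."""
        with socket.create_connection((host, port), timeout=10) as s:
            s.sendall(build_get_request(host, path).encode("ascii"))
            chunks = []
            while True:
                data = s.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks)


    if __name__ == "__main__":
        # Placeholder host; prints the status line of whatever comes back.
        response = fetch("example.com")
        print(response.split(b"\r\n", 1)[0])
    ```

    Nothing in this requires a browser, so any server-side "protection" that relies on normal browser behavior is trivially bypassed.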

    Your sole protection is a copyright notice:
    (c) nnnn DomainName. All Rights Reserved.
    as a footer on every page, and likewise a watermark on every graphic. People can still access, save, and steal anything, but doing so is then an enforceable offense.
