How do I add an anti-scraping feature to a website?

I am producing some text content on a site and it is very easy to copy. Are there good ways to protect the content from web scrapers?
 
The only real option is to force users to log in to the site, and even that doesn't stop scraping; it just limits access to known users and blocks things like the Google crawler, which of course also stops your pages being indexed for Google Search. A rough sketch of that approach follows.
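As an illustration of the login-only approach, here is a minimal sketch using Flask (my choice of framework, not something from the original question); the route names, credentials, and secret key are placeholders. Note that a scraper holding a valid account can still fetch every page.

    # Minimal login-gated page: only visitors with a session cookie get the content.
    from flask import Flask, request, session

    app = Flask(__name__)
    app.secret_key = "change-me"  # placeholder; use a real random secret

    @app.route("/login", methods=["POST"])
    def login():
        # Hypothetical credential check; swap in a real user store.
        if request.form.get("user") == "alice" and request.form.get("pw") == "secret":
            session["user"] = "alice"
            return "logged in"
        return "login failed", 401

    @app.route("/page")
    def page():
        if "user" not in session:   # the gate: anonymous visitors (and Googlebot) are refused
            return "login required", 401
        return "<p>The protected text content...</p>"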

Users can also simply use Save Page As and store your pages on their PCs.

Consider how a website operates: a client opens a connection to the server and issues an HTTP GET /page request. All of that can be done with simple socket programming, or with a few lines of scripting in, e.g., Perl or PHP, as the sketch below shows.
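To make the point concrete, here is a minimal sketch in Python (rather than Perl or PHP) of a scraper fetching a page with nothing but a raw socket; the host and path are placeholders.

    # Fetch a page with nothing but a TCP socket and a hand-written HTTP request.
    import socket

    host, path = "example.com", "/page"   # placeholders
    request = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"

    with socket.create_connection((host, 80)) as s:
        s.sendall(request.encode())
        chunks = []
        while (data := s.recv(4096)):
            chunks.append(data)

    print(b"".join(chunks).decode(errors="replace"))   # headers + HTML, ready to save or parse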

Your only real protection is a notice such as
(c) nnnn DomainName. All Rights Reserved.
in the footer of every page, and a similar watermark on every graphic. Visitors can still access, save, and copy everything, but doing so then becomes a legally actionable copyright infringement.
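For the graphics, a watermark can be stamped on each image before it is published. The sketch below uses the Pillow library (an assumption on my part, not something named in the original post); the filenames and notice text are placeholders.

    # Overlay a semi-transparent copyright notice on an image before publishing it.
    from PIL import Image, ImageDraw

    base = Image.open("photo.jpg").convert("RGBA")        # placeholder filename
    overlay = Image.new("RGBA", base.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(overlay)
    draw.text((10, base.size[1] - 20), "(c) nnnn DomainName", fill=(255, 255, 255, 160))

    marked = Image.alpha_composite(base, overlay)
    marked.convert("RGB").save("photo_marked.jpg")        # flatten back to JPEG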
 