Restrict Website Copier/Offline Browsing Tools from Copying your Blog

There are many reasons why you might not want your blog to be copied by offline browser or website copier tools. To list a few:

1. Your blog has copyrighted material.
2. Your blog contains private information such as the addresses or phone numbers of your users.
3. You don't want your users' email addresses to be grabbed by email harvesters, and much more.


The HTTrack website (HTTrack is an offline browser) has a page that explains the problems these tools can cause for websites. One of them is "bandwidth abuse": a handful of users consume your website's bandwidth and cause delays for everyone else. For example, if your website is with a hosting company that gives you a limited bandwidth allowance of around 50 GB, you should protect it from these tools to save that bandwidth.
However, we are concerned here only with blogs on blogspot.com, so bandwidth is not an issue for us. But if your blog has copyrighted content and you want to prevent its illegal distribution, read on.


To stop your blog from being copied, we will use the custom robots.txt feature of the Blogger platform. You do not need any website protection software for this.
If you do not know what a robots.txt file is, you can read more about it on the Google support page.
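In short, a robots.txt file is a plain-text list of rules: each User-agent line names a crawler, and the Disallow lines under it list the paths that crawler should not fetch. For example, a file like this (the /private/ path is only for illustration) asks every crawler to stay out of the /private/ folder:

User-agent: *
Disallow: /private/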

1. Open your Blogger Dashboard and go to Settings > Search Preferences > Crawlers and Indexing.
2. Now open a new tab in your browser, go to "http://www.yourblogname.blogspot.com/robots.txt" and copy all of its content into Notepad (this is the default robots.txt file of your blog, and we need this code too). For example, here is my default robots.txt file.

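If your blog is a standard Blogspot blog, the default file will typically look something like this (the exact lines, especially the Sitemap one, can differ from blog to blog, so always copy your own file rather than this sample):

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.yourblogname.blogspot.com/feeds/posts/default?orderby=UPDATED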

3. Now add the following code to the previously copied text in notepad.
User-agent: httrack
Disallow: /
User-agent: NetCaptor
Disallow: /
User-agent: Offline Explorer
Disallow: /
User-agent: SpiderKU/0.9
Disallow: /
User-agent: Steeler
Disallow: /
User-agent: WebCopier v3.3
Disallow: /
User-agent: WebCopier v3.2a
Disallow: /
User-agent: WebCopier
Disallow: /
User-agent: webcrawler
Disallow: /
User-agent: Web Downloader/4.9
Disallow: /
User-agent: Web Downloader/5.8
Disallow: /
User-agent: WebGather 3.0
Disallow: /
User-agent: WebStripper/2.56
Disallow: /
User-agent: WebZIP/3.65
Disallow: /
User-agent: WebZIP
Disallow: /
User-agent: Wget
Disallow: /
User-agent: Zao
Disallow: /
User-agent: Zeus 2.6
Disallow: /
User-agent: *
Disallow: /cgi-bin/
HTTrack, Web Downloader, WebZIP and the other entries above are the names of offline browser utilities, and the Disallow: / line under each one blocks that tool from crawling or copying your blog for offline use.
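To see how a rule-abiding tool reads these lines, here is a small sketch using Python's standard urllib.robotparser module (the user-agent strings and the post URL here are made up for illustration):

from urllib.robotparser import RobotFileParser

# The same kind of rules we just added to the custom robots.txt
rules = """User-agent: WebZIP
Disallow: /

User-agent: *
Disallow: /cgi-bin/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A well-behaved copier identifying itself as "WebZIP" is told it may not fetch any page
print(rp.can_fetch("WebZIP", "http://yourblogname.blogspot.com/2013/01/some-post.html"))    # False

# An ordinary crawler is still allowed everywhere except /cgi-bin/
print(rp.can_fetch("Googlebot", "http://yourblogname.blogspot.com/2013/01/some-post.html"))  # True

Keep in mind that robots.txt is a polite request rather than a hard lock, so this only stops tools that actually respect it (HTTrack, for example, respects robots.txt by default).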

4. Now paste the complete text into the Custom robots.txt box and click Save changes.

Now your blog is protected from these offline browsing tools.
Check your blog's robots.txt file by typing "http://www.yourblogname.blogspot.com/robots.txt" in your browser.
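If you prefer the command line, you can also fetch the file with curl (assuming curl is installed on your computer); the output should show the default rules followed by the User-agent blocks you added:

curl http://www.yourblogname.blogspot.com/robots.txt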

This security trick will help protect your site from getting mirrored or copied.
Thanks for Reading. 

