
Admin Notes: Setting up robots.txt on your Domino server

So, you're running a Domino server, and one of your co-workers finds the Domino help databases on your server indexed through Google.

What do you do?  

The development team wants them left there so that they can be used, but they don't want to have to log in to use them.

How do you get them unlisted from Google's index?

Easy: use a robots.txt file.   But how do you create one? Which database does it go in? And how do you get Google to remove the listing of your databases?

Create a Robots.txt
I like this site for creating my robots.txt files.   Simple, easy to use.   It handles 99% of my needs.
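
For the help-database scenario above, the finished file can be very small. Here is a minimal sketch, assuming the help databases are served from the default /help/ path; adjust the path to wherever yours actually live:

    # Keep all crawlers out of the Domino help databases
    User-agent: *
    Disallow: /help/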

Where to put Robots.txt on a Lotus Domino server
The robots.txt file goes in the 'domino/html' sub-directory of the Lotus Domino data directory, not in a specific database.   Crawlers only look for it at the root of your site (http://server.com/robots.txt), so anywhere else it will be ignored.
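
A quick way to confirm the file is actually being served from the root is to request it over HTTP. Here's a minimal check in Python, with server.com standing in for your own host name:

    # Fetch robots.txt from the site root and print it back
    from urllib.request import urlopen

    with urlopen("http://server.com/robots.txt") as resp:
        print(resp.status)           # expect 200
        print(resp.read().decode())  # should match the file placed in domino/html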

Validating the Robots.txt file
There are lots of tools that will validate your robots.txt file.  I like this one.
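
If you would rather script the check than paste into a web validator, Python's standard library can parse the live file and tell you whether a given path is crawlable. A small sketch, where server.com and the help-database path are placeholders:

    # Parse the live robots.txt and test whether Googlebot may fetch a help database
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("http://server.com/robots.txt")
    rp.read()
    print(rp.can_fetch("Googlebot", "http://server.com/help/help85_admin.nsf"))  # expect False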

Removing sites from Google
With Google, you do have the option of blocking your server with a Robots.txt and then waiting for the results to drop off the radar.

If you need it gone faster, use the Google Site Removal tool.   It's located here: https://www.google.com/webmasters/tools/removals

Comments

  1. You can put the robots.txt in a database. The only thing you need to do is create a Redirection document on your server, named robots.txt, that redirects to the URL of the robots.txt in the database.
    We have been using this method for some years, and Google is not complaining.

  2. Any tips on how to restrict access across multiple web domains all hosted on the one server, using robots.txt? We have all sites under /data/web/sitedomain1/site1.nsf, /data/web/sitedomain3/site3.nsf, etc. Can each /sitedomain#/ folder or site#.nsf have a robots.txt file?

    Cheers
    Ian


