Gowikibot is the web crawler for Gowiki, a privacy-focused search engine currently in development. Our bots identify themselves with the following user-agent:
Mozilla/5.0 (compatible; Gowikibot/1.0; +http://www.gowikibot.com)
Our bots respect robots.txt rules, the robots meta tag, and rel="nofollow" directives on individual links.
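For example, a page can opt out of indexing or link-following with a standard robots meta tag, and individual links can be excluded with rel="nofollow" (illustrative markup, not specific to Gowikibot):

```html
<!-- Ask crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- Ask crawlers not to follow this particular link -->
<a href="https://example.com/members" rel="nofollow">Members area</a>
```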
In the robots.txt file, our bots obey the rule group addressed to User-agent: Gowikibot (or User-agent: gowikibot); if neither is present, they fall back to the User-agent: * group.
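As an illustration (the paths are hypothetical), the following robots.txt would let Gowikibot crawl everything except /private/, while all other crawlers would fall under the wildcard group:

```
# Rules addressed to Gowikibot specifically
User-agent: Gowikibot
Disallow: /private/

# Fallback rules for all other crawlers
User-agent: *
Disallow: /
```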
We try not to burden servers: we crawl only a few pages of a site at a time and wait a reasonable delay between requests.
At this time, we crawl only HTML and PDF files, plus robots.txt files. We appreciate your understanding and your continued willingness to allow our crawlers access.
Please note that our search engine (https://www.gowiki.com) is currently accessible only from the United States.