# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
# Ticket #8398
# To ban all spiders from the entire site uncomment the next two lines:
# User-Agent: *
# Disallow: /
### Search Engines
# allow Google
User-Agent: Googlebot
# allow Yahoo
User-Agent: Yahoo! Slurp
# allow Bing
User-Agent: bingbot
# allow Wayback Machine, archive.org
User-Agent: archive.org_bot
# allow Twitter
User-Agent: Twitterbot
# allow Baidu
User-Agent: Baiduspider
# allow Yandex
User-Agent: YandexBot
# allow Nutch
User-Agent: nutch
# allow DuckDuckGo
User-Agent: DuckDuckGo-Favicons-Bot
### Browsers
# allow Mozilla
User-Agent: Mozilla
# allow Internet Explorer
User-Agent: MSIE
# allow Lynx
User-Agent: Lynx
# allow Opera
User-Agent: Opera
# allow Mobile Safari
User-Agent: MobileSafari
# allow Google
User-Agent: Googlebot-Mobile
# allow Google
User-Agent: Googlebot-M
# allow Wget
User-Agent: Wget
# allow curl
User-Agent: curl
### SEO sites
# allow https://bot.onpage.org/
User-Agent: OnPageBot
# allow https://www.seobility.net
User-Agent: Seobility
# allow https://www.seokicks.de
User-Agent: SEOkicks-Robot
### Others
# allow http://www.domaincrawler.com/
User-Agent: DomainCrawler
# allow http://commoncrawl.org
User-Agent: CCBot
# allow http://doczz.net/
User-Agent: doczz_net
# allow http://www.uptime.com/uptimebot
User-Agent: Uptimebot
### Disallows
Disallow: /users/
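# Note: the User-Agent lines above form a single group (no blank lines separate
# them), so this Disallow rule applies to every crawler listed in this file.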