Can Google access a password-protected subdomain?

Asked: 2014-03-05 05:03:29

Tags: .htaccess

I added the following to the .htaccess file in the subdomain's folder:

AuthName "beta server"
AuthType Basic
AuthUserFile /var/.htpasswd
Require user username

I just want to double-check that this will stop Google (and other bots) from crawling the content, so that I don't have to deal with duplicate-content issues.
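For context, HTTP Basic auth works by the client sending an `Authorization: Basic <base64(user:password)>` header with each request; any request without valid credentials (which includes crawler requests) receives a 401 response and never sees the page body. A minimal sketch of what the header looks like on the wire, using a hypothetical `username:secret` pair:

```python
import base64

# Hypothetical credentials; Basic auth sends them base64-encoded
# in the Authorization header (this is encoding, not encryption,
# which is why Basic auth should be used over HTTPS).
credentials = "username:secret"
token = base64.b64encode(credentials.encode("ascii")).decode("ascii")
header = f"Authorization: Basic {token}"
print(header)  # → Authorization: Basic dXNlcm5hbWU6c2VjcmV0
```

Since Googlebot never sends this header, every request it makes to the protected subdomain is rejected before any content is served.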

2 Answers:

Answer 0 (score: 0)

There are several options to choose from. They are all easy to implement, and since they don't conflict with one another, combining them gives you extra peace of mind.

The first is to block bad bots, protect the .htaccess file from being viewed, and disable directory listings:

            # Block bad bots
            RewriteEngine On 
            RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]
            RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
            RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
            RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
            RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
            RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
            RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
            RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
            RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
            RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
            RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
            RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
            RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
            RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
            RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
            RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
            RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
            RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
            RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
            RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
            RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
            RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
            RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
            RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
            RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
            RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
            RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
            RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
            RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
            RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
            RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
            RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
            RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
            RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
            RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
            RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
            RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
            RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
            RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
            RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
            RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
            RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
            RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
            RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
            RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
            RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
            RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
            RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
            RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
            RewriteCond %{HTTP_USER_AGENT} ^Zeus
            RewriteRule ^.* - [F,L]

            # Prevent viewing of the .htaccess file
            <Files .htaccess>
            order allow,deny
            deny from all
            </Files>

            # Prevent directory listings
            Options All -Indexes

This was taken from this free .htaccess generator: http://www.htaccessredirect.net/index.php

Next, create a robots.txt file. Google has a good page on this: https://support.google.com/webmasters/answer/156449?hl=en
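As a sketch, a robots.txt that disallows everything for all user agents can be verified with Python's standard `urllib.robotparser` (the `beta.example.com` subdomain here is a hypothetical stand-in):

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that disallows the whole site for every user agent.
robots_txt = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Compliant crawlers such as Googlebot would skip every URL.
print(parser.can_fetch("Googlebot", "https://beta.example.com/any/page"))  # → False
```

Note that robots.txt only works if crawlers can actually fetch it: if the Basic auth protection covers the robots.txt file too, bots will get a 401 when requesting it, so the auth itself is doing the real blocking.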

And third, keep the password protection you already have in place.

It would be great if others posted their suggestions, ideas, and critiques; we could turn this page into a reference for directory protection.

Answer 1 (score: 0)

Yes, this will indeed stop Google from indexing your content.

Google itself recommends this as the "simplest and most effective way" to block URLs: https://support.google.com/webmasters/answer/93708