Cannot open a file through a Squid proxy

Time: 2017-04-25 10:07:39

Tags: squid

I ran the Nikto tool to check a PC:

- Nikto v2.1.6
---------------------------------------------------------------------------
+ Target IP:          10.xx.xx.xx
+ Target Hostname:    10.xx.xx.xx
+ Target Port:        8028
+ Start Time:         2017-04-25 04:46:05 (GMT-4)
---------------------------------------------------------------------------
+ Server: squid/4.0.17
+ Retrieved via header: 1.1 localhost.localdomain (squid/4.0.17)
+ The anti-clickjacking X-Frame-Options header is not present.
+ The X-XSS-Protection header is not defined. This header can hint to the user agent to protect against some forms of XSS
+ Uncommon header 'x-cache-lookup' found, with contents: NONE from localhost.localdomain:8028
+ Uncommon header 'x-cache' found, with contents: MISS from localhost.localdomain
+ Uncommon header 'x-squid-error' found, with contents: ERR_INVALID_URL 0
+ The X-Content-Type-Options header is not set. This could allow the user agent to render the content of the site in a different fashion to the MIME type
+ No CGI Directories found (use '-C all' to force check all possible dirs)
+ Entry '<li><p>Illegal character in hostname; underscores are not ed.</p></li>' in robots.txt returned a non-forbidden or redirect HTTP code (400)
+ "robots.txt" contains 1 entry which should be manually viewed.

Nikto found Squid 4.0.17 on port 8028 and reported a robots.txt file.

But if I try to open it in the browser at http://10.xx.xx.xx:8028/robots.txt I get an error:

ERROR

The requested URL could not be retrieved

How can I view this file?

1 answer:

Answer 0 (score: 1)

Port 8028 is the proxy itself, not an ordinary web server, so you have to send requests through it rather than to it. You can run Nikto with its proxy option, e.g. nikto -h http://example.com/ -useproxy http://10.xx.xx.xx:8028/. This way the scan goes through the proxy and Nikto will find robots.txt. You can also set the proxy address in your browser and then open the URL. In Firefox, set the proxy under Preferences -> Advanced -> Network -> Settings -> Manual proxy configuration.
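For a quick check without Nikto or a browser, the same idea can be sketched with curl's `-x`/`--proxy` option. The address below reuses the masked placeholder from the question; substitute the proxy's real IP, and note that the target URL (`http://example.com/robots.txt`) is only an illustration:

```shell
# Send the request THROUGH the Squid proxy on port 8028 instead of
# connecting to that port as if it were a web server.
# Replace 10.xx.xx.xx with the actual proxy address.
curl -x http://10.xx.xx.xx:8028/ http://example.com/robots.txt

# -I shows only the response headers, useful for checking the
# X-Cache / Via headers that confirm the proxy handled the request.
curl -I -x http://10.xx.xx.xx:8028/ http://example.com/robots.txt
```

Requesting `http://10.xx.xx.xx:8028/robots.txt` directly fails with ERR_INVALID_URL because Squid expects a full absolute URL in proxy requests, which is why the browser needs the manual proxy configuration described above.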
