Linux
How do I convince Nginx to redirect all requests except /robots.txt?
I'm using Nginx and pointing several old domains at a new site.
The first block in the config below already behaves the way I need: it redirects old.domain to new.domain.
In the second block I'm trying to forward every request for oldmediaserver.domain, except /robots.txt, to the home page of new.domain. As things stand, every request gets redirected, including /robots.txt, and I can't work out why.
(The background is that Google has indexed some content from the old domain and I'm trying to remove it from the search results via Webmaster Tools. That may not work, but it isn't what I'm asking about here!)
# Old site to new site config
server {
    listen 80;
    listen [::]:80;
    server_name old.domain www.old.domain;

    rewrite ^ $scheme://www.new.domain$request_uri permanent;
}

# Media server Redirect and Robots directive
server {
    listen 80;
    listen [::]:80;
    server_name oldmediaserver.domain www.oldmediaserver.domain;

    location / {
        rewrite / $scheme://www.new.domain/ permanent;
    }

    location /robots.txt {
        return 200 "User-agent: *\nDisallow: /";
    }

    rewrite ^ $scheme://www.new.domain/ permanent;
}

server {
    listen 80 default_server;
    listen [::]:80 default_server;

    root /var/www/website-name/html;

    # Add index.php to the list if you are using PHP
    index index.php index.html index.htm index.nginx-debian.html;

    server_name www.new.domain;

    location / {
        # First attempt to serve request as file, then
        # as directory, then fall back to displaying a 404.
        try_files $uri $uri/ /index.php?$args;
    }

    # pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
    location ~ \.php$ {
        include snippets/fastcgi-php.conf;

        # With php5-fpm:
        fastcgi_pass unix:/var/run/php5-fpm.sock;
    }

    # include a file for any 301 redirects
    include includes/website-name-redirects;

    location /members/ {
        try_files $uri $uri/ /index.php?$args;
        auth_basic "Members Login";
        auth_basic_user_file /var/www/website-name/html/.htpasswd;

        location ~ \.php$ {
            include snippets/fastcgi-php.conf;
            # With php5-fpm:
            fastcgi_pass unix:/var/run/php5-fpm.sock;
        }
    }

    # !!! IMPORTANT !!! We need to hide the password file from prying eyes
    # This will deny access to any hidden file (beginning with a .period)
    location ~ /\. {
        deny all;
    }
}
Thanks for any light you can shed!
Thanks to gf_ and Drifter104 for their comments. Drifter104's point about location matching prompted me to look into the different matching modifiers, and I eventually landed on the config below.
# Media server Redirect and Robots directive
server {
    listen 80;
    listen [::]:80;
    server_name oldmediaserver.domain www.oldmediaserver.domain;

    location ^~ / {
        rewrite ^ $scheme://www.new.domain/ permanent;
    }

    location ^~ /robots.txt {
        return 200 "User-agent: *\nDisallow: /";
    }
}
I'm still not sure I fully understand why this version works and the other doesn't, so if anyone can shed more light on it, that would be great!
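One likely explanation: in the first attempt, the bare rewrite ^ $scheme://www.new.domain/ permanent; sits directly in the server block, and server-level rewrites run in the server rewrite phase, before Nginx selects a location. Every request, including /robots.txt, was therefore redirected before the location /robots.txt block could ever match. The working config simply has no server-level rewrite, so the longest-prefix location wins and /robots.txt is answered in place; the ^~ modifier only suppresses regex-location checks, which makes no difference here since this server block has no regex locations. Below is a minimal sketch of an equivalent setup (assuming the same host names) using the more common exact-match and return idiom rather than rewrite:

# Media server: redirect everything except /robots.txt to the new site
server {
    listen 80;
    listen [::]:80;
    server_name oldmediaserver.domain www.oldmediaserver.domain;

    # An exact-match location always wins over prefix matches,
    # so robots.txt is served here and never hits the redirect.
    location = /robots.txt {
        default_type text/plain;    # ensure crawlers see plain text
        return 200 "User-agent: *\nDisallow: /\n";
    }

    # Everything else goes to the new site's home page.
    # "return" avoids the regex machinery of "rewrite".
    location / {
        return 301 $scheme://www.new.domain/;
    }
}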