Blocking one domain with a robots.txt

If a blog is served from both mydomain.com and mydomain.blogspot.com, how can I block crawling of one of the two domains with a robots.txt file?

Answers

You cannot do this with a single robots.txt; it can only contain directives for folders/subfolders, not for domains/subdomains. You would need to create separate robots.txt files for mydomain.com and mydomain.blogspot.com.
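A sketch of what those two per-host files might look like, assuming you are able to serve a different robots.txt on each hostname (Blogger itself may not expose this level of control):

```
# robots.txt served at http://mydomain.blogspot.com/robots.txt
# Blocks all crawlers on the duplicate domain.
User-agent: *
Disallow: /

# robots.txt served at http://mydomain.com/robots.txt
# Allows all crawling on the primary domain.
User-agent: *
Disallow:
```

Crawlers fetch robots.txt separately for each hostname, which is why two files can give the two domains different crawl policies even when they serve identical content.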

Another solution would be a permanent (301) redirect from one domain to the other. Is this possible with Blogger?

The best thing to do is to make sure that mydomain.blogspot.com gets redirected to your custom domain mydomain.com. The redirect needs to be a permanent (301) redirect, and this can be achieved with blogs hosted on Blogger. This should help you through it:

http://support.google.com/blogger/bin/static.py?hl=en&ts=1233381&page=ts.cs

And here is another guide, though it shows the previous Blogger interface: http://www.blogbloke.com/redirect-you-blogger-custom-domain-to/

I suggest you use the first link and follow the procedure step by step!
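Once the redirect is configured, you can sanity-check it by requesting the blogspot URL and inspecting the status code and `Location` header. The helper below is my own sketch (the function name and domain names are placeholders, not part of Blogger's setup); it checks a status/`Location` pair without any network access:

```python
# Sketch: verify that a response is a permanent (301) redirect whose
# Location header points at the expected custom domain.
from urllib.parse import urlparse


def is_permanent_redirect(status, location, expected_host):
    """Return True if the response is a 301 and its Location
    header points at expected_host."""
    if status != 301:
        return False
    return urlparse(location).netloc == expected_host


# Feed it the status code and Location header from a real HEAD request
# to http://mydomain.blogspot.com/ (domain is a placeholder):
print(is_permanent_redirect(301, "http://mydomain.com/post", "mydomain.com"))
```

A 302 (temporary) redirect would pass link signals less reliably, which is why the check insists on a 301 specifically.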
