The one exception is that if you create multiple rules for the same user agent

For example, let's say you want to block all search engine spiders other than Google. Here's how to do it:

user-agent: *
disallow: /

user-agent: googlebot
allow: /

You need to know that in the robots.txt file you can specify an unlimited number of user agents. That said, each user agent you specify is independent: if you create rules for one user agent after another, the rules for the first user agent do not apply to the second or third user agent. The one exception is that if you create multiple rules for the same user agent, those rules will be executed together.
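If you want a rough way to check how a given crawler is treated by a robots.txt file like the one above, here is a minimal sketch using Python's standard-library urllib.robotparser (the example.com URLs are hypothetical placeholders):

from urllib.robotparser import RobotFileParser

# The example robots.txt from above: block every crawler except Googlebot.
robots_txt = """\
user-agent: *
disallow: /

user-agent: googlebot
allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot matches its own, more specific block and is allowed everywhere.
print(rp.can_fetch("Googlebot", "https://example.com/some-page"))  # True
# Any other crawler falls back to the user-agent: * block and is blocked.
print(rp.can_fetch("Bingbot", "https://example.com/some-page"))    # False

Keep in mind that urllib.robotparser is a simple rule matcher, so treat it as a quick sanity check rather than an exact model of how Google evaluates robots.txt.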


Important hint: spiders will only follow the instructions under the user-agent declaration that identifies them most precisely. So the robots.txt file above will only exclude search engine crawlers other than Googlebot (and the other types of Google spiders); Google's spiders will ignore the less specific user-agent: * declaration.

Directives: directives are the rules you want the user agent to follow. The commands below are the ones currently supported by Google, and how to use them.

The disallow directive: use this directive to instruct search engines not to access files and pages under a specific path. For example, if you wanted to block all search engines from accessing your blog and all of its posts, your robots.txt file would look like this:

user-agent: *
disallow: /blog

Hint: if you do not give a path after the disallow directive, search engines will ignore it.

The allow directive: use this directive to specify that search engines may access files and pages at a specific path, even within a path blocked by the disallow directive. If you want to block all article pages except one specific article, then robots.txt should look like this:

user-agent: *
disallow: /blog
allow: /blog/allowed-post

In this example, search engines can access /blog/allowed-post, but they cannot access:

/blog/another-post
/blog/yet-another-post
/blog/download-me.pdf

Both Google and Bing support this directive. Hint: like the disallow directive, if you do not declare a path after the allow directive, search engines will ignore it.
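If you want to double-check which of these URLs end up blocked, here is a minimal sketch, again using Python's standard-library urllib.robotparser with hypothetical example.com URLs. One caveat: urllib.robotparser applies the first matching rule in file order, whereas Google picks the most specific (longest) matching path regardless of order, so the allow rule is listed first in this sketch to get the same outcome:

from urllib.robotparser import RobotFileParser

# Same rules as the example above, with the more specific allow rule listed
# first because urllib.robotparser stops at the first rule that matches.
robots_txt = """\
user-agent: *
allow: /blog/allowed-post
disallow: /blog
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://example.com/blog/allowed-post"))     # True  (matched by allow)
print(rp.can_fetch("*", "https://example.com/blog/another-post"))     # False (blocked by disallow: /blog)
print(rp.can_fetch("*", "https://example.com/blog/download-me.pdf"))  # False (blocked by disallow: /blog)
print(rp.can_fetch("*", "https://example.com/contact"))               # True  (no rule matches)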
