
PolicyOptions

Properties

allow?

optional allow: string | string[]

Description

[ At least one allow or disallow entry is required per rule ] Allows indexing of site sections or individual pages.

Example

policy:[{allow:["/"]}]

For path-based URL matching, refer to Google's SYNTAX guide.

Defined in

index.ts:92


cleanParam?

optional cleanParam: string | string[]

Description

[ Optional ] Tells the robot that the page URL contains parameters (such as UTM tags) that should be ignored when indexing it.

Example

# for URLs like:
# www.example2.com/index.php?page=1&sid=2564126ebdec301c607e5df
# www.example2.com/index.php?page=1&sid=974017dcd170d6c4a5d76ae
policy: [
  {
    cleanParam: [
      "sid /index.php",
    ],
  },
]

For additional examples, please consult Yandex’s SYNTAX guide.
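Each cleanParam entry follows Yandex's `<param> <path>` form and maps onto a `Clean-param` directive in the generated robots.txt. A minimal sketch of that mapping, assuming standard directive output (the `renderCleanParam` helper below is illustrative and not part of this package):

```typescript
// Illustrative helper (not part of this package): renders cleanParam
// entries as Yandex "Clean-param" lines. Each entry uses Yandex's
// "<param>[&<param>...] [<path>]" syntax.
function renderCleanParam(cleanParam: string | string[]): string {
  const entries = Array.isArray(cleanParam) ? cleanParam : [cleanParam];
  return entries.map((entry) => `Clean-param: ${entry}`).join("\n");
}

console.log(renderCleanParam(["sid /index.php"]));
// Clean-param: sid /index.php
```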

Defined in

index.ts:143


crawlDelay?

optional crawlDelay: number

Description

[ Optional ] Specifies the minimum interval (in seconds) for the search robot to wait after loading one page before starting to load the next.

Example

policy:[{crawlDelay:5}]

About the Crawl-delay directive.

Defined in

index.ts:120


disallow?

optional disallow: string | string[]

Description

[ At least one disallow or allow entry is required per rule ] Prohibits indexing of site sections or individual pages.

Example

policy: [
  {
    disallow: [
      "/admin",
      "/uploads/1989-08-21/*.jpg$",
    ],
  },
]

For path-based URL matching, refer to Google's SYNTAX guide.

Defined in

index.ts:109
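The disallow example above relies on `*` (matches any sequence of characters) and `$` (anchors the pattern to the end of the URL). A minimal sketch of this matching behavior, assuming Google's documented wildcard rules (the `pathMatches` helper below is illustrative and not part of this package):

```typescript
// Illustrative sketch (not part of this package) of robots.txt path
// matching: "*" matches any character sequence, and a trailing "$"
// anchors the pattern to the end of the URL path. Without "$", a
// pattern only needs to match a prefix of the path.
function pathMatches(pattern: string, path: string): boolean {
  // Escape regex metacharacters, keeping "*" and a trailing "$" special.
  const escaped = pattern
    .replace(/[.+?^{}()|[\]\\]/g, "\\$&")
    .replace(/\*/g, ".*");
  return new RegExp("^" + escaped).test(path);
}

console.log(pathMatches("/admin", "/admin/users")); // true (prefix match)
console.log(pathMatches("/uploads/1989-08-21/*.jpg$", "/uploads/1989-08-21/photo.jpg")); // true
console.log(pathMatches("/uploads/1989-08-21/*.jpg$", "/uploads/1989-08-21/photo.png")); // false
```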


userAgent?

optional userAgent: UsertAgentType | UsertAgentType[]

Description

[ Required ] Indicates the robot to which the rules listed in robots.txt apply.

Example

policy: [
  {
    userAgent: [
      'Googlebot',
      'Applebot',
      'Baiduspider',
      'bingbot',
    ],
    // crawling rule(s) for the bots above
  },
]

For a list of verified bots, refer to DITIG or Cloudflare Radar.

Defined in

index.ts:82
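Putting the properties above together, here is a rough sketch of how a single policy entry could map onto robots.txt directives. The `Policy` interface and `renderPolicy` helper below are illustrative assumptions, not this package's actual renderer; directive names follow the robots.txt conventions referenced in the sections above.

```typescript
// Illustrative shape of one policy entry (mirrors the properties
// documented above; not the package's actual types).
interface Policy {
  userAgent: string | string[];
  allow?: string | string[];
  disallow?: string | string[];
  crawlDelay?: number;
  cleanParam?: string | string[];
}

const asArray = (v?: string | string[]): string[] =>
  v === undefined ? [] : Array.isArray(v) ? [...v] : [v];

// Illustrative renderer: emits one directive line per value.
function renderPolicy(policy: Policy): string {
  const lines: string[] = [];
  for (const ua of asArray(policy.userAgent)) lines.push(`User-agent: ${ua}`);
  for (const p of asArray(policy.allow)) lines.push(`Allow: ${p}`);
  for (const p of asArray(policy.disallow)) lines.push(`Disallow: ${p}`);
  if (policy.crawlDelay !== undefined) lines.push(`Crawl-delay: ${policy.crawlDelay}`);
  for (const c of asArray(policy.cleanParam)) lines.push(`Clean-param: ${c}`);
  return lines.join("\n");
}

console.log(
  renderPolicy({
    userAgent: ["Googlebot", "bingbot"],
    allow: "/",
    disallow: ["/admin"],
    crawlDelay: 5,
  })
);
// User-agent: Googlebot
// User-agent: bingbot
// Allow: /
// Disallow: /admin
// Crawl-delay: 5
```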