Mastering Robots.txt Configuration in Webflow
The robots.txt file is a plain text file placed at the root of a website that tells search engine crawlers how to interact with the site's pages. It specifies which areas of the site may be crawled and which should be excluded, giving you control over search engine access and keeping sensitive content out of search results.
Ensure proper search engine crawling and indexing by setting up your robots.txt file on Webflow. Control access to your website's content effectively.
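As a sketch, a typical robots.txt file looks like this (the paths here are placeholder examples, not real pages on your site):

```txt
# Apply the rules below to all crawlers
User-agent: *
# Block a single page
Disallow: /your-page-name
# Block everything inside a folder
Disallow: /folder-name/
```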
Set up your robots.txt file on Webflow
- Go to your Project Settings
- Open the SEO tab
- In the robots.txt field, add `User-agent: *` if you want all bots to crawl all your pages
- To exclude a specific page, add `Disallow: /your-page-name`
- To exclude an entire folder, add `Disallow: /folder-name/` (matching is by prefix, so the trailing slash limits the rule to that folder's contents)
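Before publishing, the rules can be sanity-checked locally with Python's standard `urllib.robotparser` module. This is a minimal sketch using the placeholder paths from the steps above; note that `robotparser` matches rules by simple prefix and does not implement wildcard patterns, so a trailing slash is used for the folder rule:

```python
from urllib import robotparser

# Hypothetical robots.txt content mirroring the steps above;
# "/your-page-name" and "/folder-name" are placeholder paths.
ROBOTS_TXT = """\
User-agent: *
Disallow: /your-page-name
Disallow: /folder-name/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Excluded paths are reported as not fetchable for any crawler.
print(parser.can_fetch("*", "/your-page-name"))       # False
print(parser.can_fetch("*", "/folder-name/sub-page"))  # False
# Paths not matched by any Disallow rule remain crawlable.
print(parser.can_fetch("*", "/"))                      # True
```

This only checks how the file will be interpreted; it does not contact your live site.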