GD SEO Toolbox

Robots.txt

The robots.txt file is read by search engines and other robots and crawlers when they access your website. Rules specified in this file can be used to block access to some areas of your website. However, it is important to know that this file is advisory only: robots are not required to obey it and may ignore it entirely.
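
For example, a minimal robots.txt for a WordPress site might look like this (the paths and sitemap URL are illustrative, not rules generated by the plugin):

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php
    Sitemap: https://example.com/sitemap.xml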

Robots Rules Editor

The plugin creates a virtual robots.txt file and lets you customize the rules in this file, including rules that target only specific robots or spiders based on their user agent string.
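
To show how crawlers interpret rules scoped to a user agent, here is a small sketch using Python's standard urllib.robotparser; the rules, bot names, and URLs are made-up examples, not output of the plugin:

    from urllib.robotparser import RobotFileParser

    # Hypothetical rules: one group for all robots, one for a specific bot.
    rules = [
        "User-agent: *",
        "Disallow: /private/",
        "",
        "User-agent: BadBot",
        "Disallow: /",
    ]

    rp = RobotFileParser()
    rp.parse(rules)

    # The generic group ("*") applies to crawlers with no group of their own.
    print(rp.can_fetch("SomeCrawler", "https://example.com/page"))       # True
    print(rp.can_fetch("SomeCrawler", "https://example.com/private/x"))  # False

    # BadBot matches its own group and is blocked from the whole site.
    print(rp.can_fetch("BadBot", "https://example.com/page"))            # False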

Based on your website settings and location, the plugin will also test whether the virtual robots.txt file can be used at all: it works only for websites installed at the top level of a domain, and only if there is no physical robots.txt file in the website root folder.
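
A rough sketch of that kind of check, in Python for illustration (the function name and arguments here are hypothetical, not the plugin's actual code):

    from urllib.parse import urlparse
    import os.path

    def virtual_robots_usable(site_url, webroot):
        # Condition 1: the site must live at the top level of its domain,
        # because crawlers only request robots.txt from the domain root.
        at_top_level = urlparse(site_url).path in ("", "/")
        # Condition 2: no physical robots.txt in the website root folder,
        # since a real file on disk is served before any virtual one.
        has_real_file = os.path.isfile(os.path.join(webroot, "robots.txt"))
        return at_top_level and not has_real_file

    # A site installed in a subdirectory cannot use the virtual file.
    print(virtual_robots_usable("https://example.com/blog/", "/var/www/html"))  # False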
