The robots.txt file in SEO acts as a gatekeeper: before any well-behaved bot crawls your website, it first visits the robots.txt file and reads which pages it is allowed to crawl and which it is not. A robots.txt file tells crawlers such as Googlebot which URLs they can access on your website.

Example of a Robots.txt File

You can also view our robots.txt file at this URL: /archive/robots.txt
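The contents of that file are not reproduced here, so the block below is an illustrative sketch of a typical robots.txt layout; the paths and sitemap URL are hypothetical placeholders, not our actual rules.

```
# Rules for every crawler
User-agent: *
# Block crawling of these (hypothetical) sections
Disallow: /admin/
Disallow: /tmp/
# Allow one page inside an otherwise blocked section
Allow: /tmp/report.html
# Location of the XML sitemap (hypothetical URL)
Sitemap: https://www.example.com/sitemap.xml
```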
Components of a Robots.txt File

Let's explain the directives used in the example above:

- User-agent: names the crawler the following rules apply to; * means the rules apply to all crawlers.
- Disallow: lists URL paths the named crawler should not crawl.
- Allow: makes an exception for a path inside an otherwise disallowed section.
- Sitemap: tells crawlers the full URL of your XML sitemap.
For example:
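A minimal sketch of a fully restrictive robots.txt using the same directives (standard syntax, shown for illustration only):

```
# Apply the rules to every crawler
User-agent: *
# Block the entire site
Disallow: /
```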
The block above tells every web crawler not to visit any page of the website. Note: if you want a URL deindexed from Google Search quickly, you can also submit a removal request from your Google Search Console (GSC) account.
Referred: https://www.geeksforgeeks.org