Now, create a 'robots.txt' file in your root directory. Copy the text above and paste it into the file.
The robots.txt file is crucial for your blog: it tells the Google search engine which pages to crawl and which to skip.
You can also choose which search engines to allow and which to block.
Simply create a robots.txt file and add it to your root directory so that you can control how search engines view your website.
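As a rough sketch, a minimal robots.txt for a blog might look like the following (the sitemap URL and the /search path are placeholders; replace them with your own domain and the paths you actually want to block):

```
User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

Here `User-agent: *` applies the rules to all crawlers, `Disallow` blocks a path from being crawled, and the `Sitemap` line points search engines to your sitemap.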
Copyright © 2021 Bloggingos. All rights reserved.