How to enable a custom robots.txt file in Blogger?
Enabling a custom robots.txt file in Blogger is a straightforward process, but it's important to understand the implications before proceeding. Here's how to do it:
1. Open your blog's settings:
- From the Blogger dashboard, select your blog and click Settings in the left-hand menu.
2. Locate the "Crawlers and indexing" section:
- Scroll down until you see the Crawlers and indexing section.
3. Enable the custom robots.txt option:
- Look for the Custom robots.txt toggle switch.
- Turn it ON by clicking on it.
4. Paste your robots.txt code (optional):
- If you already have a prepared robots.txt code, click on the Custom robots.txt link.
- A text box will appear. Paste your code into the box.
- Click Save.
5. Consider the implications:
- Modifying your robots.txt file can affect how search engines crawl and index your Blogger content.
- Ensure your code is accurate and aligns with your SEO goals.
- Use online tools or resources to validate your robots.txt code before implementing it.
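The validation in step 5 can also be done locally before you paste anything into Blogger. Below is a minimal sketch using Python's standard urllib.robotparser; the rules and the example.com URLs are hypothetical placeholders, not your blog's actual settings:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical draft rules to sanity-check before pasting into Blogger.
rules = [
    "User-agent: *",
    "Disallow: /search",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Confirm the rules behave as intended for a generic crawler.
print(parser.can_fetch("*", "https://www.example.com/search?q=label"))  # False: /search is blocked
print(parser.can_fetch("*", "https://www.example.com/p/about.html"))    # True: other pages stay crawlable
```

If a URL you expect to be indexed comes back False here, fix the rules before saving them in Blogger rather than after search engines have re-crawled your site.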
After enabling the toggle, click "Custom robots.txt".
Add this simple piece of code to the "Custom robots.txt" box and click "Save":
User-agent: *
Allow: /
Sitemap: http://<subdomain_name>.<domain_name>.<tld>/sitemap.xml
Example:
User-agent: *
Allow: /
Sitemap: http://www.example.com/sitemap.xml
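To double-check what this example actually permits, the same three lines can be fed to Python's standard urllib.robotparser (its site_maps() method needs Python 3.8+). Here example.com stands in for your blog's address:

```python
from urllib.robotparser import RobotFileParser

# The example rules exactly as they would appear in Blogger's
# "Custom robots.txt" box; example.com is a placeholder domain.
rules = [
    "User-agent: *",
    "Allow: /",
    "Sitemap: http://www.example.com/sitemap.xml",
]

parser = RobotFileParser()
parser.parse(rules)

# Allow: / lets any crawler fetch any page on the site.
print(parser.can_fetch("Googlebot", "http://www.example.com/2024/01/post.html"))  # True

# The Sitemap line is picked up and exposed separately.
print(parser.site_maps())  # ['http://www.example.com/sitemap.xml']
```

This confirms the rules are permissive for all crawlers while still advertising the sitemap, which is exactly what most Blogger sites want.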
Additional points:
- Blogger also offers default robots.txt settings. You can choose to use these settings instead of your custom code.
- If you encounter any issues or have further questions, refer to the official Blogger Help Center or consult with an SEO expert.
Here is a helpful resource for understanding and creating robots.txt files:
- Google Search Console robots.txt tester: https://support.google.com/webmasters/answer/6062598?hl=en
Remember, proceed with caution when modifying your robots.txt file. If you're unsure, consult with an SEO professional for guidance.
I hope this clarifies the process! Let me know if you have any other questions.