Do you know how to create a custom robots.txt file for Blogspot? Every blogger should know about it, especially if you are a search engine optimizer. So, let's see what a custom robots.txt file is and how it works.
Fundamentally, a robots.txt file is a simple text file containing instructions for search engine spiders. A search engine bot, or spider, follows it when crawling a site. By default, crawlers visit the entire accessible site and index it in the search engine, but sometimes a webmaster wishes to include or exclude a single file or a set of files.
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://gazitech.blogspot.com/sitemap.xml
Explanation
This code is divided into three sections. Let's first study each of them, and then we will see how to add a custom robots.txt file to Blogspot blogs.
User-agent: Mediapartners-Google
This line is for the Google AdSense robot and helps it serve better-targeted ads on your blog. Whether you use Google AdSense on your blog or not, simply leave it as it is.
User-agent: *
This applies to all robots, denoted by the asterisk (*). In the default settings, our blog's label page links would be indexed by search crawlers; because of the line below, crawlers will not index those label page links.
Disallow: /search
It means that any link containing the keyword "search" right after the domain name will be ignored. See the example below, the link of a label page named SEO:
http://gazitech.blogspot.com/search/label/SEO
If we remove Disallow: /search from the above code, robots will access our entire blog and crawl and index all of its content and pages.
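As a quick sketch, you can check how these rules behave with Python's standard-library robots.txt parser. The post URL below is a made-up example, not one from this blog:

```python
from urllib.robotparser import RobotFileParser

# The same rules as above, parsed locally instead of fetched from the site.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A normal post URL is allowed for generic crawlers...
print(parser.can_fetch("*", "http://gazitech.blogspot.com/2014/05/my-post.html"))  # → True
# ...but label pages under /search are blocked.
print(parser.can_fetch("*", "http://gazitech.blogspot.com/search/label/SEO"))  # → False
```

This is handy for verifying your rules before saving them in Blogger's settings.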
Disallow a Particular Post
Now suppose we want to exclude a particular post from indexing; then we can add the line below to the code.
Disallow: /yyyy/mm/post-url.html
To make this task easy, simply copy the post URL and remove the blog address from the beginning.
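The "copy the URL and strip the blog address" step can be sketched in Python with the standard library; the helper name and sample URL are hypothetical:

```python
from urllib.parse import urlparse

def disallow_line(post_url: str) -> str:
    """Build a robots.txt Disallow rule from a full post URL by
    keeping only the path (i.e. removing the blog address)."""
    return "Disallow: " + urlparse(post_url).path

print(disallow_line("http://gazitech.blogspot.com/2014/05/my-post.html"))
# → Disallow: /2014/05/my-post.html
```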
Disallow a Particular Page
If you want to disallow a particular page, you can use the same method as above. Simply copy the page URL and remove the blog address from it, which will leave something like this:
Disallow: /p/page-url.html