
Friday, December 30, 2016

How To Add a Custom Robots.txt File in Blogger?





Do you know how to create a custom robots.txt file for Blogspot? Every blogger should know about it, especially if you are a search engine optimizer. So, let’s see what a custom robots.txt file is and how it works.

What is Robots.txt?

Fundamentally, a robots.txt file is a simple text file containing instructions for search engine spiders. A search engine bot or spider uses it when crawling a site. By default, spiders crawl the complete accessible site and index it in the search engine, but sometimes a webmaster wants to include or exclude certain pages or files.

Each blog hosted on Blogger has its own default robots.txt file, which looks something like this:

User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://gazitech.blogspot.com/sitemap.xml

Explanation

This code is divided into three sections. Let’s study each of them first, and then we will see how to add a custom robots.txt file to Blogspot blogs.

User-agent: Mediapartners-Google
This section is for the Google AdSense robots and helps them serve better ads on your blog. Whether you are using Google AdSense on your blog or not, simply leave it as it is.
User-agent: *
This applies to all robots, marked with the asterisk (*). In the default settings, our blog’s label links are restricted from being indexed by search crawlers, which means the web crawlers will not index our label page links because of the code below.
Disallow: /search

That means any link having the keyword "search" right after the domain name will be ignored. See the example below, the link of a label page named SEO:
http://gazitech.blogspot.com/search/label/SEO

If we remove Disallow: /search from the above code, then robots will access our entire blog and crawl and index all of its content and pages.
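
For example, with that line removed, the second section would simply read as follows, telling every crawler that the whole blog, including label pages, may be crawled:

User-agent: *
Allow: /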

Here Allow: / refers to the homepage, which means web crawlers can crawl and index our blog’s homepage.


Disallow Particular Post
Now suppose we want to exclude a particular post from indexing; then we can simply add the line below to the code.

Disallow: /yyyy/mm/post-url.html

To make this task easy, you can simply copy the post URL and remove the blog address from the beginning.
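
For example, if a post lived at http://gazitech.blogspot.com/2016/12/custom-robots-txt.html (a hypothetical URL used only for illustration), removing the blog address from the start would leave the line to add:

Disallow: /2016/12/custom-robots-txt.html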


Disallow Particular Page

If we need to disallow a particular page, we can use the same method as above. Simply copy the page URL and remove the blog address from it, which will leave something like this:
Disallow: /p/page-url.html

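Putting it all together, a custom robots.txt that also blocks one post and one page might look like the sketch below. The two extra Disallow lines are placeholders for illustration; replace them with your own paths and replace gazitech.blogspot.com with your own blog address.

User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Disallow: /2016/12/custom-robots-txt.html
Disallow: /p/contact.html
Allow: /
Sitemap: http://gazitech.blogspot.com/sitemap.xml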



