
Sitemap and Robots.txt

When optimizing a website for search engines, the prime focus is on developing 'search engine friendly' content. But what does the term search engine friendly mean? It means that crawlers should be able to scan and index your website easily and rank it for relevant keywords. But how does a spider know whether a webpage should be crawled or not? The sitemap and the Robots.txt file give directives to search engines, and hence they can be used effectively for SEO.

What is Robots.txt?

Robots.txt is a plain text file uploaded to the root directory of a website (for example, www.example.com/robots.txt). Bots or spiders look for this file first and process it. The file tells bots which pages of the website should be crawled and which should not.


If the Robots.txt file is absent, bots assume that they can crawl and go anywhere on the site. If there are certain parts of the website that you do not want bots to crawl, those directories need to be specified in the Robots.txt file. It can also be used to tell bots where the sitemap of the website is located (an example is shown after the formats below).

To allow indexing of all pages, the format of Robots.txt is –

User-agent: *

Disallow:

 

To disallow indexing of all pages, the format of Robots.txt is –

User-agent: *

Disallow: /

 

To disallow indexing of a specific folder, the format of Robots.txt is –

User-agent: *

Disallow: /folder/
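
To tell search engines where the sitemap is located, a Sitemap line can be added to Robots.txt. For example, assuming the sitemap is published at the site root as sitemap.xml (the domain and file name here are placeholders only) –

User-agent: *

Disallow:

Sitemap: https://www.example.com/sitemap.xml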


Sometimes websites have a print version of their pages that allows users to print information easily, but this results in duplicate content on the website. With Robots.txt, you can prevent bots from accessing such duplicate content, as shown in the example below.
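
For example, if the print pages sit in a /print/ folder (the folder name here is only an illustration), that folder can be disallowed –

User-agent: *

Disallow: /print/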

While handling Robots.txt, it is important to ensure that bots are not blocked from the main content of the website; otherwise it will have a negative influence on your SEO efforts.

What is a Sitemap?

A sitemap is an XML file that informs search engines which pages are available for crawling. It lists the URLs of the pages along with their metadata, including the date of the last update (a minimal example appears after the list below). Sitemaps can be submitted to search engines as well as linked on the website to make navigation easier for visitors. Though submitting a sitemap to a search engine does not guarantee indexing, it has several advantages, including –

 

  • Increase the visibility of the website to search engines
  • Speed up the initial crawling process
  • Make the site accessible to crawlers
  • Give bots a pathway to follow
  • Help search engines associate pages with relevant keywords
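
For reference, a minimal sitemap has the structure shown below; the URL, date and values are placeholders only –

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2014-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>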

 

Robots.txt and Sitemap.xml influence the crawling phase and, through it, the SEO visibility of the site. A sitemap also helps visitors understand and find information on your website.


Company Profile

Maximize the potential of your business on the Internet! Internet marketing is the best way today to drive lots of traffic to your business. Webplanners is a leading SEO company in Melbourne.

For more information, please contact us.
