Tag archives for Robots.txt
How To robots.txt – Control Search Engine Website Access Using robots.txt
Using a robots.txt file and its Disallow and Allow rules, you can control which parts of your site search engines should crawl. It is a convention, or protocol, to prevent cooperating web spiders…
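A minimal sketch of what such a robots.txt file might look like, placed at the site root (the paths and bot name below are hypothetical examples):

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Allow: /admin/public/

# Block one specific (hypothetical) crawler from the whole site
User-agent: BadBot
Disallow: /
```

Cooperating crawlers fetch /robots.txt before crawling and skip the disallowed paths; the file is advisory, not an access-control mechanism.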
How to create HTML and XML Sitemap (Site Map) for your website?
A site map or sitemap is a hierarchical representation of a Web site's content that helps visitors and search engine bots find pages on the site. The Sitemaps protocol allows…
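An XML sitemap following the Sitemaps protocol might look like the sketch below (the URL and values are placeholder examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Each <url> entry lists one page; only <loc> is required, while <lastmod>, <changefreq>, and <priority> are optional hints for search engine bots.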