What is robots.txt?


Robots.txt is a plain-text file that gives instructions to web robots (also known as crawlers or spiders) about which pages or sections of a website they should not crawl. The file is placed in the root directory of a website and lists which directories or pages crawlers may access and which they should avoid. Website owners commonly use it to keep search engines away from parts of a site such as admin areas, duplicate content, or pages that add nothing to search results. By managing what gets crawled, robots.txt helps search engines spend their crawl budget on the pages that matter, which supports overall SEO performance. Note that robots.txt is advisory rather than a security mechanism: well-behaved crawlers honor it, but it does not guarantee a page stays out of search results, since a blocked URL can still be indexed if other sites link to it (a noindex directive is the right tool for keeping a page out of the index).
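As a sketch of what such a file looks like, here is a minimal robots.txt using the standard Robots Exclusion Protocol directives (the domain, the /admin/ path, and the bot name "BadBot" are placeholders for illustration):

```
# Rules for all crawlers
User-agent: *
# Keep crawlers out of the admin area
Disallow: /admin/
# But allow one specific public page under it
Allow: /admin/help.html

# Block one specific bot from the entire site
User-agent: BadBot
Disallow: /

# Point crawlers at the XML sitemap
Sitemap: https://example.com/sitemap.xml
```

Each User-agent group applies to the named crawler; Disallow: / blocks the whole site for that agent, while omitting Disallow rules permits everything.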

