Robots.txt

What is the definition of Robots.txt?

A plain-text file, placed at the root of a website, that tells search engine crawlers which pages and sections of the site they are not allowed to crawl.
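For illustration, a minimal robots.txt might look like the sketch below (the directory names and sitemap URL are hypothetical examples, not taken from any specific site):

    User-agent: *
    Disallow: /admin/
    Disallow: /checkout/

    Sitemap: https://www.example.com/sitemap.xml

Here "User-agent: *" applies the rules to all crawlers, each "Disallow" line blocks a section of the site, and the optional "Sitemap" line points crawlers to the site's sitemap. Note that robots.txt is a request, not an enforcement mechanism: well-behaved crawlers respect it, but it does not secure private content.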
