What is the robots.txt file?
The robots.txt file tells search engine crawlers which files and directories they are allowed to access. Its main purpose is to prevent crawlers from overloading a website with requests.
The problem is that many people treat robots.txt as a security measure: they assume that if a page cannot be found through a search engine, nobody can find it. However, the file can be fetched by anyone with a web browser, so it can point an attacker directly at sensitive files or locations. See below for a sample robots.txt file.
User-agent: *
Allow: /
Disallow: /secret_uploader.php
Disallow: /admin/
Sitemap: http://www.example.com/sitemap.xml
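To see why robots.txt offers no protection, here is a minimal sketch in Python (standard library only; www.example.com is a placeholder host, not a real target) that fetches the file and lists every path the site asked crawlers to avoid:

import urllib.request

# Fetch robots.txt exactly as any browser or script could - no authentication is needed.
url = "http://www.example.com/robots.txt"
with urllib.request.urlopen(url) as response:
    body = response.read().decode("utf-8", errors="replace")

# Print every disallowed path - often the most interesting locations on the site.
for line in body.splitlines():
    if line.strip().lower().startswith("disallow:"):
        print(line.split(":", 1)[1].strip())

The same information is one command away with curl http://www.example.com/robots.txt, which is why nothing listed in the file should be treated as hidden.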
Still need help?
Are you having trouble using arctil? Try reaching out on our Community Forum.
Alternatively, you can contact us through the Contact page.