Looking to find the robots.txt file on your website? You’re in luck - this article will cover where to find your robots.txt file, both on the frontend and backend.
The robots.txt file is a text-based document that tells search engines what they can and cannot crawl on your site. For example, you may want to block crawlers from accessing certain pages or folders of your website (note that blocking crawling is not the same as preventing indexing). You can also use this file to tell the robot: "Please crawl all my pages."
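As a quick illustration, here is what a minimal robots.txt might look like (the folder name is hypothetical):

```
# Block all crawlers from a private folder, but allow everything else
User-agent: *
Disallow: /private/
Allow: /

# Or, to invite crawlers to access every page:
# User-agent: *
# Disallow:
```

Each `User-agent` group applies to the crawlers it names, and `Disallow`/`Allow` lines control which paths those crawlers may request.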
Finding the robots.txt file on the frontend of your website is simple.
All you need to do is type “/robots.txt” at the end of your root domain to pull up the file. For example: https://brandonlazovic.com/robots.txt.
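If you want to check how a given robots.txt file would be interpreted, Python's standard library includes a parser. This sketch feeds it a hypothetical set of rules (the paths are made up for illustration) rather than fetching a live file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents, as you might see at
# https://yourdomain.com/robots.txt
rules = [
    "User-agent: *",
    "Disallow: /private/",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Check whether a generic crawler ("*") may fetch specific URLs
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post.html"))     # True
```

You could also point the parser at a live file with `set_url()` followed by `read()`, which fetches the robots.txt over the network.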
Finding the robots.txt file on the backend of your website, where you can directly change its rules, is a little trickier than viewing it on the frontend. Luckily, we’ll walk through where to find it on the backend for three of the most popular CMSs: WordPress, Squarespace, and Wix.
In most cases, you’ll need a WordPress SEO plugin, such as Yoast SEO or Rank Math, to edit the robots.txt file from the backend of your site.
While you don’t have direct access to edit your robots.txt file on Squarespace, the platform does a relatively decent job of creating one on a webmaster’s behalf. If you do need custom edits, reach out to Squarespace’s customer support for assistance.
Wix is relatively painless when it comes to updating your robots.txt file. Simply go to your site’s dashboard > Marketing & SEO > SEO Tools > Robots.txt File Editor > View File.
In other cases, your robots.txt file may live on your web host, accessible through an FTP client. Simply connect with your FTP client, navigate to your website’s root folder, find the robots.txt file, and edit it as needed.
Be sure to check out our latest search engine optimization guide to learn about SEO best practices for your website.