Edit the robots.txt file using Google Cloud SSH

If you are hosting your WordPress website on Google Cloud, your robots.txt file may be set to a default that disallows all search bots from crawling your site. You will need to edit the robots.txt file on your server so that search engine bots are allowed to crawl your website.

First, check whether your robots.txt file actually needs editing by viewing it at "www.yoursite.com/robots.txt". If it contains the line "Disallow: /", it is blocking all search bots from crawling your website. You need to change "Disallow: /" to "Allow: /" so that bots can crawl your whole site.
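
For reference, a blocking robots.txt usually looks like the first snippet below, and the allow-all version like the second (the User-agent line on your server may differ):

User-agent: *
Disallow: /

User-agent: *
Allow: /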

If you need to change your robots.txt file, first start an SSH session into your Google Compute Engine instance. The simplest way is the browser-based SSH button next to your instance in the Cloud Console.
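
Alternatively, if you have the gcloud CLI installed locally, you can connect from your own terminal; a minimal sketch, where YOUR_INSTANCE_NAME and YOUR_ZONE are placeholders for your instance's actual name and zone:

~$ gcloud compute ssh YOUR_INSTANCE_NAME --zone=YOUR_ZONE

Once connected, open the robots.txt file in the nano editor with the following command: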

~$ sudo nano /var/www/html/robots.txt
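
If you prefer not to edit interactively, the one-line sketch below overwrites the file with the allow-all policy shown earlier, assuming /var/www/html is your WordPress document root as in the command above:

~$ printf 'User-agent: *\nAllow: /\n' | sudo tee /var/www/html/robots.txt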

If you edited the file in nano, press Ctrl+X, then Y, and then Enter to save your changes and exit the editor. You can then close the SSH session.
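
Finally, confirm the change took effect from outside the server, for example with curl (replace www.yoursite.com with your own domain):

~$ curl https://www.yoursite.com/robots.txt

If the output now shows "Allow: /" instead of "Disallow: /", search engine bots are free to crawl your whole site.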