

Sun, Oct 13, 2013 4:59 PM

Closed

How Do I Prevent Duplicate Content By Blocking Google From My Index Page?

Changing the robots.txt file can prevent your site from being crawled by search engines. Do not disallow anything unless you are certain you do not want it to appear in search results.

When building your store with Bigcommerce, there is a slim possibility that your store will appear in a search engine before it is ready to go live. This can lead to customers viewing your store while it is still under construction, and if you are recycling information from an existing store, Google may penalize you for duplicate content. You can update your robots.txt file to prevent search engines from indexing your site before it is ready. However, you must change it back once your site is live.

Modifying Robots.txt

1. In the Bigcommerce control panel, go to Setup & Tools › Advanced Tools › Robots text file.

2. Delete everything after the first Disallow: and enter / as its value, so that the file disallows everything. See the example below.

[Screenshot: robots.txt with everything disallowed]
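For reference, here is a minimal sketch of a robots.txt that disallows everything, as shown in the screenshot above. Your store's file may contain additional lines; only the User-agent and Disallow directives below are needed to block all crawlers from the entire site.

    # Applies to every crawler
    User-agent: *
    # Block every URL on the site
    Disallow: /

The single / after Disallow: tells any crawler matching User-agent: * not to fetch any page on the site.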

3. Click Save when done.

4. To allow Google to index your store again, return to this page and click Revert to Default, then Save.

[Screenshot: the Revert to Default button at the bottom of the page]
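Note that Revert to Default restores Bigcommerce's own default robots.txt, whose exact contents may vary by store. Purely as an illustrative contrast with the blocking file above, a robots.txt that lets crawlers access everything looks like this:

    # Applies to every crawler
    User-agent: *
    # An empty Disallow value blocks nothing
    Disallow: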

