Another thing to consider when dealing with the SEO optimization of your X-Cart store is the robots.txt file.

The robots.txt file is used to give instructions about the site to web robots; its primary function is to block web robots from accessing pages of your site that do not need to be indexed by search engines. Your X-Cart store comes bundled with a robots.txt file that should work for any store installed in the domain's root (top-level directory). However, if your X-Cart store is installed in a subdirectory of the domain root, you will need to take a few additional steps to ensure that your store's robots.txt file actually works (see the instructions further below).

Described in basic terms, when a robot looks for the robots.txt file, it strips the path component from the URL (everything from the first single slash onward) and puts /robots.txt in its place. For example, on www.example.com, robots will expect to find the file at www.example.com/robots.txt. If your store's address is www.example.com, the store's robots.txt file is already at precisely that location, so web robots can access it and nothing needs to be done.
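The mapping described above can be sketched in a few lines of Python. This is only an illustration of how a crawler derives the robots.txt location from a page URL; the function name `robots_txt_url` is ours, not part of any crawler's API:

```python
from urllib.parse import urlsplit, urlunsplit

def robots_txt_url(page_url: str) -> str:
    """Derive the robots.txt URL a crawler would request for a page.

    Crawlers drop the entire path component and look for /robots.txt
    at the root of the same scheme and host.
    """
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

# Even for a page deep inside a subdirectory, the crawler checks the root:
print(robots_txt_url("https://www.example.com/shop/category/page.html"))
# https://www.example.com/robots.txt
```

This is why a robots.txt file sitting in a subdirectory is never consulted: the crawler only ever requests it from the domain root.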


Note: Unlike in 5.4.x, in X-Cart 5.5.x the robots.txt file is located not in the store root folder but in the /public directory. The file, however, is still available to robots at www.example.com/robots.txt.


If your X-Cart store is installed, for example, at www.example.com/shop, the robots.txt file ends up at www.example.com/shop/robots.txt. Web robots will not look for it there, so this needs to be fixed.

So, if your X-Cart store is installed in a subdirectory, move the robots.txt file to the domain root level. If another robots.txt file already exists at that level, copy the instructions from the X-Cart robots.txt file and add them to the root robots.txt file. In either case, you will also need to adjust the paths in the robots.txt file at the domain root level: each path must be prefixed with the store's subdirectory. For example, after moving from www.example.com/shop/robots.txt to www.example.com/robots.txt, the directive Disallow: /Includes/ should be changed to Disallow: /shop/Includes/. Do the same for every path mentioned in robots.txt.
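To make the path adjustment concrete, here is a before/after sketch for a store installed in /shop. The Disallow: /Includes/ line comes from the example above; the other directives are placeholders, so apply the same prefixing to whatever directives your actual X-Cart robots.txt contains:

```
# Before: www.example.com/shop/robots.txt (not seen by robots)
User-agent: *
Disallow: /Includes/
Disallow: /cart.php
```

```
# After: www.example.com/robots.txt (paths prefixed with /shop/)
User-agent: *
Disallow: /shop/Includes/
Disallow: /shop/cart.php
```

Without the /shop/ prefix, the directives at the root would refer to paths like www.example.com/Includes/, which do not exist, and the store's pages would remain unblocked.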
