Another thing to consider when working on the search engine optimization (SEO) of your X-Cart store is the file robots.txt.

The file robots.txt is used to give instructions about the site to web robots, and its primary function is to block web robots from accessing the pages of your site that do not need to be indexed by search engines. Your X-Cart store comes bundled with a robots.txt file that should be good for any store installed in the root (top-level directory) of your domain. If, however, your X-Cart store is installed in a subdirectory off the domain root, you will need to take a few additional steps to ensure that the robots.txt file for your store actually works (see further below for instructions).
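To illustrate the syntax, a minimal robots.txt looks something like the fragment below. This is a generic sketch, not the actual file shipped with X-Cart; the `/Includes/` path is taken from the example later in this article, while the other entries are hypothetical placeholders.

```
# Rules below apply to all crawlers
User-agent: *
# Keep crawlers out of paths that do not need to be indexed
Disallow: /Includes/
Disallow: /some-private-dir/
```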

Described in basic terms, when a robot looks for the file robots.txt, it strips the path component from the URL (everything from the first single slash) and puts "/robots.txt" in its place. For example, for a page at http://www.example.com/page.html, robots will expect to find the robots.txt file at http://www.example.com/robots.txt. If your store is installed at the domain root (e.g. http://www.example.com/), your store's robots.txt file sits at exactly that location, which means web robots can access it, and nothing needs to be done. However, if your X-Cart store is installed in a subdirectory, for example at http://www.example.com/shop/, the bundled robots.txt file ends up at http://www.example.com/shop/robots.txt, where robots will never look for it. That is not OK and needs to be fixed.
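The URL-stripping rule described above can be sketched in a few lines of Python using the standard library's `urllib.parse` (the example URL is a placeholder, not a real store address):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_txt_location(page_url: str) -> str:
    """Return the URL where a crawler looks for robots.txt:
    same scheme and host, path replaced with /robots.txt."""
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

# A page inside a /shop/ subdirectory still maps to the domain root:
print(robots_txt_location("http://www.example.com/shop/product.html"))
# -> http://www.example.com/robots.txt
```

Note that the subdirectory is discarded entirely: this is why a robots.txt file sitting inside /shop/ is invisible to crawlers.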

So, if your store is installed in a subdirectory, you will need to move the file robots.txt up to your domain root or, if there is already another robots.txt file at that level, copy the instructions from the robots.txt file in your X-Cart store directory and add them to the robots.txt file at the domain root. In both cases you will also need to adjust the paths in the domain-root robots.txt file so that they include the subdirectory. For example, after moving the file from http://www.example.com/shop/ to the domain root, the directive Disallow: /Includes/ should be replaced by Disallow: /shop/Includes/. The same should be done for every path mentioned in robots.txt.
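The path adjustment can be summarized with a before/after sketch (the /shop/ subdirectory and the `/Includes/` entry come from the example above; any second entry shown is a hypothetical placeholder):

```
# Before: robots.txt inside the store subdirectory (/shop/robots.txt)
User-agent: *
Disallow: /Includes/
Disallow: /some-private-dir/

# After: robots.txt at the domain root, paths prefixed with /shop/
User-agent: *
Disallow: /shop/Includes/
Disallow: /shop/some-private-dir/
```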
