Another thing to consider when optimizing your X-Cart store for search engines is the file robots.txt.
The file robots.txt is used to give instructions about the site to web robots; its primary function is to block web robots from accessing the pages of your site that do not need to be indexed by search engines. Your X-Cart store comes bundled with a robots.txt file that should work for any store installed in the root (top-level directory) of your domain. However, if your X-Cart store is installed in a subdirectory of the domain root, you will need to take a few additional steps to ensure that the robots.txt file for your store actually works (see the instructions further below).
Described in basic terms, when a robot looks for the file robots.txt, it strips the path component from the URL (everything from the first single slash) and puts “robots.txt” in its place. For example, on www.example.com, robots will expect to find the robots.txt file at www.example.com/robots.txt. If your store’s address is www.example.com, the store’s robots.txt file is located exactly there, which means web robots can access it, and nothing needs to be done. However, if your X-Cart store is installed, for example, at www.example.com/shop, the robots.txt file is located at www.example.com/shop/robots.txt. Robots will never look for it there, so this needs to be fixed.
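The URL-stripping behavior described above can be illustrated with a short Python sketch (the function name is mine, for illustration only; it is not part of any robot's actual code):

```python
from urllib.parse import urlsplit

def robots_url(page_url):
    """Return the robots.txt URL a crawler derives from any page URL."""
    parts = urlsplit(page_url)
    # Everything after the host name (path, query string, fragment)
    # is discarded and replaced with /robots.txt.
    return f"{parts.scheme}://{parts.netloc}/robots.txt"

print(robots_url("http://www.example.com/shop/product.php?id=5"))
# -> http://www.example.com/robots.txt
```

Note that the /shop subdirectory is stripped along with the rest of the path, which is exactly why a robots.txt file sitting inside /shop is never consulted.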
So, for a store installed in a subdirectory, move the robots.txt file to the domain root level, or, if there is already another robots.txt file at that level, copy the instructions from the robots.txt file in your X-Cart store directory and add them to the robots.txt file at the domain root. In both cases you will also need to adjust the paths in the robots.txt file at the domain root level. For example, after moving the file from www.example.com/shop/robots.txt to www.example.com/robots.txt, the directive

Disallow: /Includes/

should be replaced by

Disallow: /shop/Includes/

Do the same for every path mentioned in robots.txt.
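As a sketch of this path adjustment, the small Python helper below prefixes every Disallow path with the store's subdirectory. The helper name is mine, and the /log/ path is an illustrative example rather than a guaranteed entry in X-Cart's actual robots.txt:

```python
def prefix_disallow(robots_text, subdir):
    """Prefix each Disallow path in robots.txt text with a subdirectory."""
    out = []
    for line in robots_text.splitlines():
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path.startswith("/"):
                # /Includes/ becomes /shop/Includes/, and so on.
                line = f"Disallow: /{subdir.strip('/')}{path}"
        out.append(line)
    return "\n".join(out)

original = "User-agent: *\nDisallow: /Includes/\nDisallow: /log/"
print(prefix_disallow(original, "shop"))
# User-agent: *
# Disallow: /shop/Includes/
# Disallow: /shop/log/
```

In practice you can just as easily edit the handful of Disallow lines by hand; the point is simply that every path must be rewritten relative to the domain root, not the store directory.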