Error pages & Rules of your site

This page explains how and why to customize your site's error pages, and describes the important files that define the rules applicable to your website, particularly their relation with Internet search engines.

Custom error pages

By managing custom error pages, your visitors will stay on your website even when they encounter HTTP errors returned by your hosting server.
If you do not handle and customize the pages for HTTP errors, your visitors will only see a blank page containing a short text (generally in English) with the name of the error displayed by your web server. Because your visitors will see nothing else, they may wrongly think that your website is not working or no longer exists.
To avoid that, TOWeb allows you to manage all these HTTP errors by displaying not only your own customized message per error (if you want to) but also by automatically generating your message inside a full page of your website, that is, a page containing the theme, page footer and menu of your website, so your visitors can go directly to any other page of your website when they encounter an error.
We recommend that you activate all the HTTP errors on your site so that they are automatically taken into account by your web server. The most frequently seen errors are HTTP 500 and HTTP 404.

HTTP ERROR 404 (NOT FOUND)
Most people are bound to recognize this one. A 404 error happens when you try to access a resource on a web server (usually a web page) that doesn't exist. Common causes are a broken link, a mistyped URL, or a web page that you have moved, renamed or deleted on your website.

HTTP ERROR 500 (INTERNAL SERVER ERROR)
This general-purpose error message appears when a web server encounters some form of internal error. For example, the web server could be overloaded or need to be restarted by you or your host provider because it is unable to handle all web requests properly. According to Google's search statistics, this problem is more than twice as common as a 404 error.

Please note that HTTP error messages are reported back to the visitor by your web server. If you cannot access a website at all, for example because the server's network (or yours) is down, you will not get an HTTP error back: your connection attempt will simply time out.

File .htaccess

The .htaccess file is a configuration file used by your web server to configure, for instance, access rights, URL redirects or the association of filename extensions with MIME types.
Creating a .htaccess file therefore requires web server administration knowledge and great caution, as an error in it may cause malfunctions on your website or even crash your web server.
Finally, be aware that the content of a .htaccess file may depend on the type of server you are using as well as on any constraints or limitations from your host provider. So we invite you to contact your host provider for more information if you ever have specific rules or requirements to apply to your website or server.
Note: a .htaccess file can also be used to manage custom error messages, but there is no need for that because it is handled automatically if you have enabled the custom error pages for your site in TOWeb.
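For illustration only, here is a minimal sketch of what such directives look like on an Apache web server (the file and page names are hypothetical examples, not files TOWeb creates; when custom error pages are enabled, TOWeb generates the equivalent rules for you):

```apache
# Serve custom pages for the two most common HTTP errors
# (/error-404.html and /error-500.html are example paths)
ErrorDocument 404 /error-404.html
ErrorDocument 500 /error-500.html

# Permanently redirect an old URL to its new location
Redirect 301 /old-page.html /new-page.html

# Associate a filename extension with a MIME type
AddType application/json .json
```

Remember that a syntax error in this file can make your whole site return HTTP 500 errors, which is why editing it by hand calls for caution.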

File robots.txt

A "robots.txt" file tells search engines whether they can access or not certain parts of your site. This file is automatically created and updated by TOWeb in the root directory of your web space. We recommend that you enable the "Automatically add non-public pages and internal scripts of your site to the robots.txt file" so that TOWeb update it automatically at every publication of your site based on its content and the features used in your site.
You do not normally have to do more at this level unless you have specific needs, like for example if you publish sub-sites or if you have created subdirectories & files of your web space that are not managed by TOWeb. In such cases, if you feel that your directories (or files) should not be seen by Internet search engines then you can prevent the exploration of these directories and content adding your own robots.txt file.
Note that if you use the "Automatically add non-public pages and internal scripts of your site to the robots.txt file", TOWeb will use the content of your robots.txt file, but also will append to it all the necessary exclusions of files useless for search engine that TOWeb has generated.
If necessary, you can also find on the website of Google Webmaster Tools robots.txt generator easy to use. Finally, if your site uses sub-domains and you want to prevent exploration of certain pages from a specific sub-domain, then you will need a separate robots.txt file for each sub-domain concerned. For more information about robots.txt files, you should also consult the guide of the Google help center for webmasters on the use of robots.txt files.
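As an illustration, a hand-written robots.txt that blocks all search engine crawlers from two subdirectories not managed by TOWeb could look like this (the directory names and the site address are hypothetical examples):

```
User-agent: *
Disallow: /private/
Disallow: /internal-scripts/

Sitemap: https://www.example.com/sitemap.xml
```

"User-agent: *" applies the rules to all crawlers, each "Disallow" line excludes one path, and the optional "Sitemap" line points crawlers to your sitemap. If TOWeb's automatic option is enabled, its own exclusions are appended to rules like these at each publication.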

