Likewise, some developers put JavaScript files at the top level of the docu-
ment root with the HTML files. I like having client-side scripts in their own
directory because I can restrict access to that directory, banning robots and
people from reading test scripts and other works in progress. If a particular
JavaScript function is needed by more than one page on a site, it can go into
the functions.js file instead of being replicated in the head sections of each
individual page. An example is a function that checks that what the user
entered into a form field is a valid email address.
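A rough sketch of such a function follows; the function name, the pattern, and the email field's id are illustrative, not part of any particular functions.js:

    // Rough email-address check: one "@" and at least one "." in the
    // domain part. A sketch only; real-world address rules are messier.
    function isValidEmail(value) {
      var pattern = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
      return pattern.test(value);
    }

    // Example use: run the check when a form field loses focus.
    // (Assumes a field with id="email"; that id is hypothetical.)
    document.getElementById("email").onblur = function () {
      if (!isValidEmail(this.value)) {
        alert("Please enter a valid email address.");
      }
    };

Because the function lives in one shared script file, every page that links to it gets the same validation behavior without duplicating the code.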
Other Website Files
A number of other files are commonly found in websites. These files have
specific names and relate to various protocols and standards. They include the
per-directory access, robots protocol, favorites icon, and XML sitemap files.
.htaccess
This is the per-directory access file. Most websites use this default name
instead of naming it something else in the web server's configuration set-
tings. The filename begins with a dot to hide it from other users on the same
machine. If this file exists, it contains web server configuration statements that
can override the server's global configuration directives and those in effect for
the individual virtual web host. The new directives in the .htaccess file affect
all activity in the directory it appears in and all subdirectories unless those
subdirectories have their own .htaccess files. Although the subject of web
server configuration is too involved to go into here in any detail, here are some
of the common things that an access file is used for (a sample file follows the list):
• Providing the directives for a password-protected directory
• Redirecting traffic for resources that have been temporarily or
  permanently relocated
• Enabling and configuring automatic directory listings
• Enabling CGI scripts to be run from the directory
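As a sketch, an .htaccess file combining several of these uses might read as follows; the paths, URLs, and file extensions are hypothetical, and the directive syntax shown is Apache's:

    # Password-protect this directory (the .htpasswd path is hypothetical)
    AuthType Basic
    AuthName "Works in Progress"
    AuthUserFile /home/example/.htpasswd
    Require valid-user

    # Permanently redirect a relocated page (hypothetical URLs)
    Redirect permanent /old.html http://www.example.com/new.html

    # Turn on automatic directory listings and allow CGI scripts to run
    Options +Indexes +ExecCGI
    AddHandler cgi-script .cgi .pl

These directives take effect in this directory and its subdirectories without any change to the server's global configuration.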
robots.txt
The Robots Exclusion Protocol file provides the means to limit what search
robots can look for on a website. The file must be called robots.txt and must be
in the top-level document root directory. According to the Robots Exclusion
Protocol, robots must check for the file and obey its directives. For example,
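a robots.txt file like the following sketch (the directory names are hypothetical) tells all cooperating robots to stay out of directories holding work in progress:

    User-agent: *
    Disallow: /scripts/
    Disallow: /test/

The asterisk matches every robot's user-agent name, and each Disallow line gives a path prefix that robots should not request.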