
// return the new URL
return $new_url;
}
We’ll leave deciphering this regular expression to you as a homework exercise.
Summary
We would summarize the entire chapter right here in this paragraph, but that would be duplicate content!
Because we don’t wish to frustrate our readers, we will keep it short.
Ideally, every URL on a web site would lead to a unique page. Unfortunately, this is rarely achievable in practice, so you should simply eliminate as much duplication as possible. Avoid URL parameters that do not effect significant changes in presentation. Where that is not an option — as in the problems posed by breadcrumb navigation — exclude the URLs that yield duplicate content. The two tools in your arsenal for accomplishing this are robots.txt exclusion and meta-exclusion.
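As a brief sketch of the first tool, a robots.txt rule like the following tells spiders not to crawl a duplicate URL pattern. The path shown is hypothetical — substitute whatever breadcrumb-generated paths produce duplicates on your own site:

```
# robots.txt -- exclude a hypothetical breadcrumb-generated duplicate path
User-agent: *
Disallow: /products/by-breadcrumb/
```

Meta-exclusion accomplishes the same end at the page level: emitting `<meta name="robots" content="noindex, follow">` in the `<head>` of a duplicate page asks spiders not to index that copy while still following the links it contains.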
Chapter 5: Duplicate Content

