robots

I've just added a custom 404 error page to our site, as there are quite a
few links in various search engines that point to pages that no longer
exist. I've been reading here a little about robots.txt. I'm not sure
whether I need to set up a robots.txt file to keep search engines from
including the custom 404 page or if I should just not worry about it.
 
Jon Spivey

Hi,
On a Windows server (not sure about *nix), a custom 404 page is often served with a 200 OK status rather than a 404, so search engines will actually see it as a normal page. If you have several pages indexed that are now 404s, I can see the search engines penalising you, because all of those links effectively point to the same page.

Better would be to use a 301 redirect to point the search engine to the correct page location; this will also remove your old URLs from the index over time.
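An easy way to find out which status code your custom error page actually returns is to request a URL you know is gone and inspect the response. Here's a minimal Python sketch (the `status_of` helper and the example URL are just illustrative, not anything on your site):

```python
import urllib.request
import urllib.error

def status_of(url):
    """Return the HTTP status code the server sends for `url`."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            # 200 here means search engines see your error page
            # as a normal page, not as a missing one.
            return resp.status
    except urllib.error.HTTPError as e:
        # A proper 404 (or a 410) is raised as an HTTPError.
        return e.code

# Example (hypothetical URL):
# status_of("http://www.example.com/old-page.asp")
```

If this returns 200 for a page that no longer exists, the server is serving the error page as if it were real content; a real 404 (or a 301 to the replacement page) is what you want the search engines to see.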
 
