301 Redirects & Server Errors

On the web, as in life, the only constant is change. When things need to change, or, in our case, move from one URL to another, there are crucial best practices to observe.

Let's first assume you have a very simple scenario: a single URL that needs to point permanently to another address.

There are multiple ways to accomplish this, but one in particular, the 301 redirect, is preferable for both users and search engines. Serving a 301 indicates to both bots and browsers that the page has moved permanently. Search engines interpret this to mean not only that the page has changed location, but that the content, or an updated version of it, can be found at the new URL. The engines will then carry any link weighting from the original page over to the new URL.
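To make this concrete, here is a minimal sketch of serving a 301, assuming a small Python server built on the standard library's http.server (in practice you would more often configure this in your web server, e.g. Apache or Nginx); the path and destination URL are hypothetical placeholders:

```python
# A minimal sketch of serving a 301 with Python's built-in http.server.
# The path and destination URL below are hypothetical placeholders.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Map moved paths to their new permanent homes.
REDIRECTS = {
    "/old-page": "https://www.example.com/new-page",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            # 301 tells both bots and browsers the move is permanent.
            self.send_response(301)
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), RedirectHandler).serve_forever()
```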

Be aware that when moving a page from one URL to another, the search engines will take some time to discover the 301, recognize it, and credit the new page with the rankings and trust of its predecessor. This process can take longer if your page hasn't changed in a long time and the spiders rarely visit it, or if the new URL doesn't resolve properly.

Other options for redirection, such as 302s (temporary redirects), meta refreshes, or JavaScript, are inferior substitutes, as they typically won't pass rankings and search engine value the way a 301 does.
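If you're unsure which status code a given redirect actually serves, you can check it without following the redirect. A quick sketch using the third-party requests library, with a placeholder URL:

```python
import requests

# Fetch without following the redirect so we see the raw status code.
resp = requests.get("https://www.example.com/old-page", allow_redirects=False)
print(resp.status_code)              # 301 = permanent, 302 = temporary
print(resp.headers.get("Location"))  # where the redirect points
```

Note that a meta refresh or JavaScript redirect would show up here as a plain 200, since the redirection happens in the page content rather than in the HTTP response itself.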

Moving content becomes much more complex when an entire site changes its domain, or when content moves from one domain to another. Because of abuse by spammers, and the resulting suspicion from the search engines, 301s between domains sometimes require more time to be properly spidered and counted.
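For a whole-domain move, the usual practice is to redirect each old URL to its equivalent path on the new domain, rather than sending everything to the new homepage. A minimal sketch in the same vein as above, with a hypothetical new domain:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

NEW_DOMAIN = "https://www.new-domain.example"  # hypothetical destination

class DomainMoveHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Preserve the path so every old URL lands on its counterpart.
        self.send_response(301)
        self.send_header("Location", NEW_DOMAIN + self.path)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), DomainMoveHandler).serve_forever()
```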

Handle Server Errors

Server and hosting issues rarely cause search ranking problems on their own. However, once missed, they can spiral into massive difficulties, and so are worthy of our review. The following server and hosting problems can negatively affect search engine rankings:

Server Timeouts – If a search engine makes a page request that isn't served within the bot's time limit (or that produces a server timeout response), your pages may not make it into the index at all, and will almost certainly rank quite poorly (since no indexable text content was found).
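You can approximate a crawler's time limit by giving your own requests a strict timeout. A small sketch with the requests library (the URL and the five-second budget are placeholders):

```python
import requests

try:
    resp = requests.get("https://www.example.com/", timeout=5)
    print("Served with status", resp.status_code)
except requests.exceptions.Timeout:
    # A page this slow may never make it into the index.
    print("Request timed out; a crawler would likely give up too.")
```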

Slow Response Times – Although not quite as harmful as the server timeouts above, slow response times still present a potential issue. Not only will crawlers be less likely to wait for your pages to load, but surfers and potential linkers may choose to visit and link to other resources because accessing your site has become a chore; a quick way to measure this is sketched below.

Shared IP Addresses – Basic concerns include speed, the potential for having spammy or untrusted neighbors sharing your IP address, and possible concerns about receiving the full benefit of links pointing to your IP address.
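To put a number on the response-time concern above, requests records how long the server took to begin answering. A minimal sketch with a placeholder URL:

```python
import requests

resp = requests.get("https://www.example.com/")
# resp.elapsed measures the time from sending the request until the
# response headers arrive (it does not include downloading the body).
print(f"Response time: {resp.elapsed.total_seconds():.2f}s")
```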

Blocked IP Addresses – As search engines crawl the web, they frequently encounter entire blocks of IP addresses filled with nothing but egregious web spam. Rather than blocking each individual site, engines do sometimes take the extra measure of blocking an IP address or even an IP range. If you're concerned, search for your IP address at MSN/Live using the IP:address query (or SEOmoz's Who Else is Hosted on My IP tool).
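The starting point for those lookups is the IP address your own domain resolves to, which a one-liner from Python's standard library will tell you; the domain here is a placeholder:

```python
import socket

# Resolve the domain to the IP address it is hosted on.
ip = socket.gethostbyname("www.example.com")
print(f"www.example.com resolves to {ip}")
```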

Bot Detection and Handling – Some SysAdmins will go a bit overboard with protection and restrict file access for any single visitor making more than a certain number of requests in a given time frame. This can be disastrous for search engine traffic, as it will consistently limit the spiders' ability to crawl.
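As a hypothetical illustration of the pattern (not any particular server's actual logic), here is a naive per-IP request throttle; tuned too aggressively, it penalizes busy-but-legitimate spiders just as readily as abusive bots:

```python
import time
from collections import defaultdict

WINDOW_SECONDS = 60
MAX_REQUESTS = 30  # an overly strict limit for a busy crawler

_hits = defaultdict(list)  # IP address -> recent request timestamps

def allow_request(ip: str) -> bool:
    """Return False once an IP exceeds the request quota for the window."""
    now = time.monotonic()
    # Keep only timestamps still inside the sliding window.
    _hits[ip] = [t for t in _hits[ip] if now - t < WINDOW_SECONDS]
    if len(_hits[ip]) >= MAX_REQUESTS:
        return False  # a spider tripping this repeatedly gets crawled less
    _hits[ip].append(now)
    return True
```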

Bandwidth & Transfer Limitations – Many servers set limits on the amount of traffic that can run through to the site. This can be potentially disastrous when content on your site becomes very popular and your host shuts off access. Not only are potential linkers prevented from seeing (and thus linking to) your work, but search engines are also cut off from spidering.

Server Geography – This isn't always an issue, but it's good to be aware that search engines do use the location of the web server when determining where a site's content is relevant from a local search perspective. Since local search is a significant portion of many sites' campaigns, and it's estimated that close to 40% of all queries have some local search intent, it's very wise to host in the country (it's not necessary to get more granular) where your content is most relevant.