
7 Common SEO Mistakes to Avoid


When you work in the world of SEO, you get used to seeing the same issues crop up across different sites. In fact, there are some issues you can almost expect to find on a site that has never been optimized or touched by an SEO. Being aware of these common SEO issues is a critical first step toward making sure your website earns high visibility in the search results.

What are these SEO problems, and how are they corrected?

1. Canonicalization

Confused already? It is a long word. Canonicalization refers to consolidating multiple URLs that lead to the same content into one preferred URL. In our canonical URL guide, we shared how users (and the search engines) are sometimes able to access the same page on your website through multiple URLs. For example, without proper canonicalization, a user may be able to use ALL of the URLs below to reach your homepage.

  • http://www.yoursite.com/
  • http://yoursite.com/
  • http://www.yoursite.com
  • http://yoursite.com
  • http://www.yoursite.com/index.html

It’s confusing for a user, and it’s especially confusing to Google, which will view each of these URLs as a distinct page. It is important that you specify the preferred version to Google to help it better understand your site. You do so by adding the rel=“canonical” tag.
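
For example, placing this line in the <head> of each variant tells Google that the www version with the trailing slash is the one to index (the URL here is a placeholder, matching the examples above):

<head>
  <link rel="canonical" href="http://www.yoursite.com/" />
</head>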

Another way to ensure that only one URL is indexed by Google is to have the alternate URLs 301 redirect to the preferred URL. I personally like to use the 301 method, though it is not required. What is important is to use at least one of these methods so Google knows the preferred URL. Why is this so crucial? To Google, multiple URLs leading to the same page create a duplicate content issue (more on this later).
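
As a sketch, here is what that 301 might look like in an Apache .htaccess file, assuming mod_rewrite is enabled and www.yoursite.com is your preferred host (both names are placeholders):

# Send any request for the bare domain to the www version with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]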

2. Meta Tags

Another issue I run into frequently has to do with the formatting of sites’ title tags and meta descriptions. It’s an issue that is easily solved once you understand how title tags and meta descriptions should be written and what purpose each serves.

Your title tag is what will be used to link to your site from the search engine results. It should include important keywords that give an overview of the page’s content. This gives search engines a good idea of what the page is about when it is indexed, and it’s what users will read in the results pages as they decide which URL to click.

When writing a title tag, it is important to keep it under 70 characters so that it is not truncated in the results. Put the most important keyword first, followed by the second and third most important keywords. These should be separated by a hyphen or a pipe.

It is also important to include your brand in the title tag. Whether you put it at the beginning or end (I prefer end), remember not to list any keywords more than once, as there is no benefit and that can look like spam. Every page on your site should have a unique title tag.
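
Put together, a title tag following these guidelines might look like this (the keywords and brand name are hypothetical):

<title>Blue Widgets | Custom Widget Design | YourBrand</title>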

The meta description is what will show underneath the URL in the search engine results, as long as it is an accurate representation of what the page is about. Using the meta description to list a unique selling point that makes you stand out is a great strategy and will increase click-through rates.

Unique meta descriptions must be created for every page of your site and should be approximately 160 characters in length. Write a brief summary of the page so both the search engines and users know what your page is providing.
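
A matching meta description for the hypothetical page above might read something like this:

<meta name="description" content="YourBrand designs custom blue widgets with free shipping and a lifetime guarantee. Browse our full widget catalog online.">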

3. Robots.txt issues

The robots.txt file is an important file on your server because it tells search engine spiders how to engage with your site when indexing your content. A change of just one character in the robots.txt file can cause serious indexing issues, and, unintentionally, you could be blocking your whole site from being indexed. Your site then will not show up in search results for anything – including your own brand’s name.

Here is an example of what your robots.txt would look like if you were to allow all robots to crawl everything on your website:

User-agent: *
Disallow:

If you see a specific robot crawling your site that you no longer want to give access to, adding the following to your robots.txt file will block that specific robot:

User-agent: NameofBot
Disallow: /
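
For comparison, here is the one-character mistake mentioned above: a single slash after Disallow, applied to all user agents, blocks your entire site from being crawled:

User-agent: *
Disallow: /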

If you suspect there may be issues with your robots.txt file, this is a great tutorial to help you get a handle on how the robots.txt file works.

4. 302 redirects instead of 301s

A 302 is a temporary redirect. If you are removing a page for a short period of time to make changes, use a 302, which informs the search engines that the URL being redirected will return at this address. When a URL is permanently removed or changed, you must always use a 301 redirect. Search engines then know that the move is permanent and will associate the content with the new URL.
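
In an Apache .htaccess file the two look nearly identical, which is why the wrong one slips in so easily (the paths here are hypothetical):

# Permanent move: the old page has been replaced for good
Redirect 301 /old-page.html http://www.yoursite.com/new-page.html

# Temporary move: the page will come back at its original address
Redirect 302 /sale.html http://www.yoursite.com/holding-page.html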

5. 404 errors

Having a lot of 404 errors tells search engines that there is a quality issue with your site. Any old or no-longer-used pages that still have links pointing to them should be 301 redirected to the new version of the page, or to a relevant page when a replacement is not available.

Google Webmaster Tools is an extremely valuable resource for checking for 404s. You can find the crawl errors section in your Webmaster Tools in the left sidebar under the “Crawl” dropdown. Once you click on “Crawl Errors,” you will see the “Not found” section. It shows how many pages are not found, with a graph and a list giving each URL, its response code and the date it was detected.
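
You can also spot-check an individual URL from the command line. curl’s -I flag requests just the headers, and the status line tells you whether the page resolves (the URL is a placeholder):

curl -I http://www.yoursite.com/missing-page.html

A dead page will answer with a status line such as HTTP/1.1 404 Not Found, while a properly redirected one will answer with 301 Moved Permanently.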

6. Internal linking

Another big issue is having a poor internal link structure, which can cause many problems both from a user standpoint and with the search engines.

When a search engine crawler goes through your site, it relies on your internal link structure to reach every page so each one can be indexed easily and cleanly. With a poor internal linking structure, there may be pages the crawlers cannot reach because no internal links point the way. Make sure that every page of your site has links going to it.
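
A simple contextual link in your body copy is often all it takes to make a page reachable, for example (the URL and anchor text are hypothetical):

<p>For more detail on picking a preferred URL, see our <a href="/guides/canonical-urls/">canonical URL guide</a>.</p>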

Internal linking is important for other reasons as well. Visitors need to be able to find what they are looking for easily, and good usability can make or break your site. It can also affect your rankings by lowering your bounce rate, since many people will go from one page or post to the next by following internal links – and bounce rate plays a role in how search engines rank your site.

7. Duplicate Content

Duplicate content is having the same content on more than one page of your site. In many cases, duplicate content happens without you even realizing it (one of the causes is the homepage duplication we discussed earlier). When search engines see this content, they do not know which version to index if there are no rel=“canonical” tags, and after a while they will stop showing the content in search results altogether. While the rel=“canonical” tag is important, you should also 301 redirect alternate versions of pages to the preferred, search engine friendly page.

Many people do just one or the other, but Google will sometimes ignore the rel=“canonical” tag if it decides a different page is more relevant. When you use both, you can be much more confident that the pages being indexed and ranked are the ones you prefer.

Duplicate content can also be created because you decided there was not enough time to write unique content for every page of your website, so you copied and pasted content from one page onto another (think of different location pages offering the same services). It may save you time now, but it will cause problems for your site. Copyscape is a great tool for checking the content throughout your site to make sure it is 100% unique.

Concerned about your website’s SEO, or need another set of eyes to review what issues may be affecting your rankings? An SEO audit may be necessary to review your website’s current status – and what can be done to improve it.