Conferences · 11.08.17

Tips and Techniques for an SEO Site Audit #pubcon


Welcome back! It’s time to talk SEO Audits, and how you can DIY. Up on stage to lead us on our journey are Rob Woods and Overit’s own Senior Strategist Jen VanIderstyne. You would have thought Jen would have shared her slides with me beforehand to make this part of my job a little easier but, alas, you would be wrong. I’ll remember this moment, Jen.

Rob is up first.

First, we want to talk about why you should do your own SEO Audit. What are the benefits?

Why Do Your Own SEO Audit?

  • Diagnose issues that might be holding you back
  • Professional audits from great auditors can have a wait list
  • Professional audits can be expensive
  • They help you uncover problems that may be simple to correct
  • They tell you when to get a specialist involved

Limits of a DIY Audit

  • Problems may be very complex
  • Indicate problems but not solutions
  • Solutions may be complex
  • It takes experience to prioritize problems and solutions
  • Generally limited to onsite issues

The Technical Audit

Rob’s going to dive into many of the aspects he looks at when performing SEO Audits. Let’s start.


Analytics

What do the analytics say? Has traffic remained constant or has there been a hit? If there was a hit, why? Did you change URLs or restructure the site? Did you get hit by an algo update? Did you get hit by a manual penalty? He recommends using SEMRush to help diagnose issues.

Rob recommends Google Search Console and Screaming Frog as the two big tools that he uses to do Site Audits. Deep Crawl is another good one. If you’re just doing your own site, you can do Deep Crawl for a month and then cancel it (I’m sure they love that recommendation ;)). Screaming Frog is free for under 500 URLs. He uses it for a ton of stuff. He could not do site audits without these tools.

Analyze your crawling data. Rob says he looks at the ratio of indexed pages to crawled. He also looks for outliers. If all of a sudden Google starts crawling way more pages – why? Did someone change the robots.txt? Is there an explanation? Check it out.

Indexation: Check how many pages are indexed vs. how many pages you think should be indexed. Check Google’s index using the site: command to get a good estimate. It will give you the top 500-600 pages. For smaller sites, that’s a really good way to check which pages Google has in its index. Google Search Console will tell you how many pages are indexed. If there’s a big discrepancy between the site: results and what GSC says, something is off there.

Another thing he does with the site: results is to use a plugin called Linkclump that allows you to copy all of the page URLs so you can paste them into Excel.
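From there, a few lines of Python can diff the two lists. A minimal sketch, assuming you’ve exported your crawl from Screaming Frog (its Internal HTML export uses an “Address” column) and pasted the Linkclump URLs into a plain text file; both file names here are made up:

```python
# Diff the URLs Google returns for a site: query (copied out with
# Linkclump) against a Screaming Frog crawl export, to spot pages that
# were crawled but don't show up in the index sample.
# Both file names are made up -- point them at your own exports.
import csv

def load_crawled(path="internal_html.csv"):
    # Screaming Frog's Internal HTML export puts the URL in "Address"
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Address"].rstrip("/") for row in csv.DictReader(f)}

def load_indexed(path="site_operator_urls.txt"):
    # One URL per line, pasted straight from Linkclump
    with open(path, encoding="utf-8") as f:
        return {line.strip().rstrip("/") for line in f if line.strip()}

missing = load_crawled() - load_indexed()
print(f"{len(missing)} crawled URLs not in the site: sample:")
for url in sorted(missing):
    print(" ", url)
```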

Errors

  • Focus on real ones
  • Prioritize errors with external links
  • Provide a prioritized list with targets
  • Break out any that can be solved with rules
  • Tools: GSC 404 Report, Screaming Frog, Deep Crawl, Majestic

Robots.txt

  • Watch syntax for Googlebot
  • Test using the GSC robots.txt tester (a quick local check is sketched below)
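If you want a sanity check outside of GSC, Python’s standard library ships a robots.txt parser. A minimal sketch; it approximates Google’s matching rather than replicating it, so treat GSC’s tester as the final word, and the URLs are placeholders:

```python
# Sanity-check robots.txt rules for Googlebot with Python's built-in parser.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # swap in your site
rp.read()  # fetches and parses the live file

for url in [
    "https://www.example.com/",
    "https://www.example.com/category/widgets",
    "https://www.example.com/search?q=test",
]:
    print(rp.can_fetch("Googlebot", url), url)
```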

Meta Robots

  • Check for noindex where it shouldn’t be (a quick scan is sketched below)
  • “index, follow” is the default, so declaring it means nothing
  • Filtering out noindexed pages saves time analyzing dup content, titles, etc.
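A rough way to scan a handful of key URLs for stray noindex directives. This sketch assumes the requests and beautifulsoup4 packages, and the URL list is illustrative:

```python
# Scan key URLs for a stray noindex in the meta robots tag.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/",
    "https://www.example.com/products/",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("meta", attrs={"name": "robots"})
    content = (tag.get("content") or "").lower() if tag else "(no meta robots tag)"
    flag = "  <-- check this" if "noindex" in content else ""
    print(f"{url}: {content}{flag}")
```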

Sitemaps

  • Look for pages not indexed
  • Sitemap count doesn’t match the number of indexed pages
  • Non-canonical URLs
  • Crawl the sitemap for broken/non-canonical URLs (sketched below this list)
  • Tools: GSC, Screaming Frog
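Here’s one way that sitemap crawl might look. A sketch, assuming the requests and beautifulsoup4 packages; the sitemap URL is a placeholder, and anything that isn’t a clean 200, or whose canonical tag points elsewhere, gets flagged:

```python
# Crawl the XML sitemap: request every <loc> and flag anything that
# isn't a clean 200, or whose canonical tag points somewhere else.
import requests
import xml.etree.ElementTree as ET
from bs4 import BeautifulSoup

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
xml_text = requests.get("https://www.example.com/sitemap.xml", timeout=10).text
locs = [(el.text or "").strip() for el in ET.fromstring(xml_text).findall(".//sm:loc", NS)]

for url in locs:
    resp = requests.get(url, timeout=10, allow_redirects=False)
    if resp.status_code != 200:  # broken or redirecting -- either way, fix the sitemap
        print(resp.status_code, url)
        continue
    canonical = BeautifulSoup(resp.text, "html.parser").find("link", rel="canonical")
    if canonical and canonical.get("href", "").rstrip("/") != url.rstrip("/"):
        print("non-canonical:", url, "->", canonical["href"])
```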

URL Structure

  • Are your URLs overly complex?
  • Are they being indexed normally?
  • Check for duplicate content created by letter case and trailing slashes (a quick check is sketched below)
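A quick way to surface those case and trailing-slash duplicates from a crawl: normalize every URL and group the collisions. The input file name is an assumption; any one-URL-per-line export works:

```python
# Group crawled URLs by a normalized form (lowercased, trailing slash
# stripped) and print any group with more than one spelling -- each of
# those is a potential duplicate-content pair.
from collections import defaultdict

def normalize(url: str) -> str:
    return url.lower().rstrip("/")

groups = defaultdict(set)
with open("crawled_urls.txt", encoding="utf-8") as f:
    for line in f:
        url = line.strip()
        if url:
            groups[normalize(url)].add(url)

for key, variants in groups.items():
    if len(variants) > 1:
        print(key, "->", sorted(variants))
```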

Link Equity

  • Look at the link equity of each page – is it distributed in a normal fashion?
  • Find “orphan” pages
  • Too many outgoing links per page?
  • Tools: Majestic, ahrefs

High Level Canonical URLs

  • www/non-www versions that don’t redirect, or that 302 instead of 301
  • http / https versions (a redirect spot-check is sketched below)
  • No canonical tag (extra important on the homepage)
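These high-level redirects are easy to spot-check. A sketch, assuming the requests package; the domain and canonical choice are placeholders, and each variant should 301 (not 302) to the one true homepage:

```python
# Spot-check the homepage variants: each should 301 straight to the
# canonical version. (Some servers return relative Location headers,
# so eyeball anything marked CHECK.)
import requests

CANONICAL = "https://www.example.com/"
variants = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
]

for url in variants:
    resp = requests.get(url, timeout=10, allow_redirects=False)
    location = resp.headers.get("Location", "(none)")
    ok = resp.status_code == 301 and location.rstrip("/") == CANONICAL.rstrip("/")
    print(f"{url}: {resp.status_code} -> {location}  {'OK' if ok else 'CHECK'}")
```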

Canonical Link Element

  • Google often ignores it
  • Make sure all canonical signals agree
  • hreflang, sitemaps, internal links

Duplicate Content

  • Internal vs. external
  • Faceted navigation
  • Alternate URL paths
  • URLs not unique enough
  • Tools: GSC dup titles, Deep Crawl, Screaming Frog

Title Tags

  • Are they too long for the SERPs?
  • Look for duplicate and blanks
  • GSC duplicates
  • Screaming Frog: duplicates, long titles, blank titles (a one-pass triage is sketched below)
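If you have a Screaming Frog “Page Titles” export handy, a short script can do the duplicate/long/blank triage in one pass. The file name and the 60-character cutoff are assumptions; Google actually truncates titles by pixel width, not characters:

```python
# Triage a Screaming Frog "Page Titles" export: blanks, over-length
# titles, and duplicates, in a single pass over the CSV.
import csv
from collections import Counter

MAX_LEN = 60  # rough cutoff; Google truncates by pixels, not characters

with open("page_titles.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))  # columns include "Address" and "Title 1"

counts = Counter(row["Title 1"] for row in rows if row["Title 1"])

for row in rows:
    title, url = row["Title 1"], row["Address"]
    if not title:
        print("BLANK    ", url)
    elif len(title) > MAX_LEN:
        print("TOO LONG ", url, f"({len(title)} chars)")
    elif counts[title] > 1:
        print("DUPLICATE", url, repr(title))
```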

Meta Descriptions

  • Influences clickthrough rate
  • Google shows up to 150 characters

Structured Data

  • Check errors
  • Look for opportunities
  • Breadcrumbs
  • Product
  • Reviews/ratings (a JSON-LD inventory is sketched below)
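One way to check for those opportunities is to pull each JSON-LD block off a page and print its @type, so you can see whether Breadcrumb, Product, or Review markup exists at all. A sketch assuming the requests and beautifulsoup4 packages; the URL is a placeholder:

```python
# Inventory a page's JSON-LD: extract each application/ld+json block
# and print its @type.
import json
import requests
from bs4 import BeautifulSoup

html = requests.get("https://www.example.com/some-product", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
    except json.JSONDecodeError:
        print("unparseable JSON-LD block -- worth fixing")
        continue
    for item in (data if isinstance(data, list) else [data]):
        print(item.get("@type", "(no @type)"))
```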

Site Pages

  • Look at each template
  • Focus on home page
  • Home, Category, Subcategory, Product, Blog, Post, etc.
  • Content (text cache), Title Tag, Fetch/Render

AMP

Most sites don’t have AMP. If you think you should, and you go to the AMP section of GSC and see a problem, definitely look into it.

Page Speed

Next up is Jen!

Jen’s a fast talker and I’m still under the influence of a Vegas cold, so I’ll do my best. Jen starts by offering a disclaimer on automated site auditing tools:

  • Do: Use them for a quick overview of areas to look at further
  • Don’t: Rely on their numbers or recommendations at face value. For example, the tool may tell you that you’re doing a great job because your Title Tag is 30 characters. You think, good! But the Title Tag might be something that doesn’t make sense or is completely nonsensical. You have a brain; the tool does not.

External Links

You’re going to be looking at problems, diagnostics, and issues related to general site health. She shows a screenshot of a Panda hit – it’s a drop to nothing. She’d like to think we’re past the point of engaging in risky behaviors, but… we’re not. If you see a sharp drop, you’re going to want to investigate your backlinks. Look for things that could look intentional on your part. That’s often related to quality.

How do you identify the quality of a backlink? She recommends using Majestic, Moz, Ahrefs, etc. These are all really good tools. They’re useful for placing a link somewhere on a quality range. You can spot the things that are really low. When you get into that middle area, you might want to start looking at the domains themselves. She likes to take a look at a domain in SEMRush. If you see its keyword rankings have declined, that generally means Google doesn’t see a lot of value in the site, so it’s probably not of a lot of value for you either.

Jen also recommends looking at the distribution of anchor text: things that are targeted vs. things that are more natural. If there’s a lot of really targeted anchor text, that might be part of your problem (it’s too optimized). If you’re looking at backlinks from a general health perspective, make sure your anchor text is optimized: natural, but optimized.

Look at whether links are growing over time. Are they stagnant, growing, or are you seeing great losses? You also want to look at the diversity of your links. This includes analyzing total backlinks vs. referring domains, and the distribution of your links between the homepage and subpages. Domains have authority, but Google is really ranking individual pages. If you’re not seeing a lot of backlinks to individual pages, that’s something you’ll want to look into.

You also want to look into links going to broken or redirecting pages. That’s link equity coming into your site that you can make better use of. Jen makes a good point that it’s not uncommon for different versions of a homepage (site.com, site.com/) to be splitting your link equity. If you’re linking internally to an alternative (read: wrong) version of your homepage, it’s gonna show up in your analytics and mess up your data.

Internal Links

There are two different kinds of internal links — Navigation links (sitewide, architectural signals) and In Content (contextual, variation) links.

When you’re looking at anchor text for internal links it should be relevant, broader than your navigational links, and conceptually consistent. Don’t use the same anchor text you’re using to point to your homepage as you are to your Contact page. Look and see if important pages have strong internal link signals.


Internal links offer:

  • More context
  • A next logical step for users
  • Opportunity to entice users further into a site

Look for dead ends where no non-navigational links exist, links that aren’t tied to the theme of the current page, or links that aren’t an intentional part of a forward-moving funnel.

Keyword Research – Your Site

Jen says this is the part of the audit where she basically takes up residence in SEMRush.

Focus on relevance, positioning, intent, volume, and trends for the keywords you want to rank for. Getting an understanding of what you rank for will help you identify where you stand and where you need to grow. Also take a look at which pages are ranking. Is it your homepage or is it a subpage? Which subpages have the most visibility because they’re ranking well? These are all things you want to know.

We spend so much time thinking about the messaging and the branding of the homepage. We assume this is how visitors are landing on our site — through the front door. But more and more that’s NOT how they are coming. They’re coming in through the basement and down the chimney. The experience you create within these pages should match that of your homepage.

She looks a lot at internal page competition over keywords. Sometimes two pages aren’t better than one. Consolidate pages when you can.

Jen says that one of her favorite parts of KW research is getting a sense of all the different ways people are searching for a topic. You want to ID the scope of language, topic depth, user intent groups, and guideposts on what people want to learn. When you’re looking at a list of 100+ keywords, it can be overwhelming. She likes to distill redundancies for a clear narrative, which helps you create new content and better content. When doing keyword market research, she looks for longtail phrases to spot opportunities where your content (or content you could create) could rank better than what is currently there because you’re a better expert.

Find keyword inspiration everywhere, including on sites like Quora, Yahoo Answers, Reddit, Amazon, etc., where people are talking about things in their natural language. When it comes to keywords, it’s not just about finding a collection of terms to shove into a page. It’s really a window into how people think and how they look for what they want.

Content Engagement and Targeting

Content should be written for users. Google is looking to serve its users the best results for any given subject, so your goal should be to BE the best result. But we can be mindful of using structure and signals to help search engines find and rank our content.

Jen encourages everyone to look at Content Engagement metrics like Bounce/Exits, Session Duration, and Pages Per Session. You should also be looking at the behavior of your users in segments. Do new users perform differently than repeat users? Information in a silo will only take you so far. Your greatest insights often come from cross-referencing.