Solving complex SEO problems requires a new ‘discovery’ approach

The SMX presentation with Hannah Thorpe and Arsen Rabinovitch reviewed Google’s latest updates along with diagnostics and tools to get your site back on track.

Hannah Thorpe highlights Google changes and the user discovery experience

Hannah starts by sharing a slide with an SEO checklist used by Rand Fishkin in a past episode of Whiteboard Friday.

The problem is that a lot has changed since Rand published that Whiteboard Friday. One of those changes is the advent of neural matching, which, according to Danny Sullivan, is now used for 30% of all queries.

In the example query Hannah shared, a user asks why their TV looks strange, and Google matches the question to the common name for that phenomenon (the “soap opera effect”). This is neural matching in action.

Another major change is that Google is becoming a discovery engine, with the focus shifting from search toward discovery. This is quite literally true in the case of the Google feed. A third significant change is the addition of the topic layer.

In this result, above the knowledge graph, you can see the sub-topics most relevant to the query.

Another significant shift is driven by Google’s increased understanding of entities. This enables Google to do a far better job of finding the content that is most relevant to the user’s query and, in fact, to their actual needs. (Note: my view is that this is not a deprecation of links, but an improved methodology for determining relevance.)

Hannah then provides us with a table comparing the differing natures of search and discovery.

As SEOs, we tend to optimize for search rather than discovery, even though in discovery the entity holds a lot of the value. With this in mind, we need to start thinking about other ways to approach optimization, as suggested by Rand Fishkin.

Hannah then recommends that we attack this new SEO landscape with a three-pronged approach:

  1. Technical Excellence
  2. Well-Structured Content
  3. Being the Best

Use your technical expertise to reevaluate your focus

The need for technical excellence is a given, but there are four common causes of failure for audits:

  1. Sporadic implementation
  2. A changing ecosystem
  3. No change control process
  4. Too many people involved

How do you get better audits? It’s about your focus. Think about these things:

  1. Be actionable
  2. Prioritize time vs. impact
  3. Understand the individuals involved
  4. Quantify everything

Be smart about your structured content

This is about making it easier for users to find what they want. Here’s a smart way to think about formatting your content:

Basically, you want to make the main point (answer the question) in the first four lines of the content, then expand upon it with examples, and finally add further commentary.

The specifics of how you format the content matter too. Some basic ideas on how to do this are:

  1. Avoid large blobs of text – keep paragraphs short
  2. Keep tables clean and avoid large chunks of text in them
  3. Avoid excessive text in bulleted lists, and don’t double space them either
  4. Add structured data markup to your content (a sample sketch follows this list)
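
As an illustration of that last point, here is a minimal sketch of what structured data markup could look like, generated as JSON-LD from Python. The Article type and every value below are placeholders rather than examples from the presentation, so substitute the schema.org type and properties that actually match your content.

```python
import json

# A minimal Article example in JSON-LD; the type, headline, author, date and
# URL are all placeholders -- use the schema.org type that fits your page.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Why does my TV look strange?",
    "author": {"@type": "Person", "name": "Jane Example"},
    "datePublished": "2018-11-01",
    "mainEntityOfPage": "https://www.example.com/soap-opera-effect/",
}

# The output belongs in the page's <head> inside a
# <script type="application/ld+json"> element.
print(json.dumps(article_markup, indent=2))
```

Running the result through Google’s structured data testing tools before publishing helps catch markup errors.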

Be the best in your own space

This is critical to success in today’s world. It doesn’t mean that you need to be the best at everything, but carve out your space and then own it.

This starts with understanding where users are on their journey when they reach your site. What are their needs at the moment they arrive on your web page? Understand all the possible circumstances that bring visitors to your page, and then devise a strategy that addresses the best possible mix of those situations.


Arsen Rabinovitch outlines how to evaluate SEO problems and what tools can get you back on track

This was a big year for Google updates, and those updates prompted changes in traffic, some of them significant, for a lot of sites.

Here’s a short summary of the updates that occurred.


For some sites, the impact of these updates was quite large. You can see an example in this SEMrush chart of one site that suffered two large drops during the year.


Once you have experienced significant traffic loss, it’s time for SEO triage. The first step is to look at the symptoms:

  1. We lost positions
  2. We lost organic traffic
  3. We lost conversions

It’s also helpful to create a timeline, as shown here.
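
As one way to build such a timeline, the sketch below (my own illustration, not something from the presentation) overlays known update dates on a daily organic-traffic export. The CSV file name, column names and update dates are assumptions you would replace with your own data.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical daily organic-traffic export with "date" and "sessions" columns.
traffic = pd.read_csv("organic_sessions.csv", parse_dates=["date"])

# Dates to line up against the traffic curve (placeholders -- substitute the
# update timeline that is relevant to your site).
update_dates = {
    "2018-03-09": "March core update",
    "2018-08-01": "August core update ('Medic')",
    "2018-09-27": "September core update",
}

fig, ax = plt.subplots(figsize=(10, 4))
ax.plot(traffic["date"], traffic["sessions"], label="Organic sessions")

# Draw a dashed vertical line for each known update date.
for date, label in update_dates.items():
    ax.axvline(pd.Timestamp(date), color="red", linestyle="--", alpha=0.6, label=label)

ax.set_xlabel("Date")
ax.set_ylabel("Sessions")
ax.legend(fontsize=8)
fig.tight_layout()
plt.show()
```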


Once this initial research is complete, it’s time to do some diagnostics. It’s helpful to start by ruling out as many potential causes as possible. Some of the things to check for are:

  1. Check for manual actions
  2. Look to see if the symptoms are environmental
  3. Check if the symptoms just started presenting themselves
  4. Determine if the symptoms are chronic


It’s also useful to see what’s being affected: is it the entire site or just some pages? You should also check whether competitors are affected.


Further digging can involve checking which queries lost clicks (or rankings). Third-party tools such as SEMrush can help confirm your findings. Make sure to also check the URL Inspection tool in the new Google Search Console (GSC) to see if Google is reporting any problems there, and check whether the SERPs themselves changed. Finally, review the Index Coverage report and look for URLs that are indexed but not submitted in the sitemap.
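
If you prefer to check which queries lost clicks programmatically rather than in the GSC interface, the Search Console Search Analytics API can return clicks by query for two date ranges so you can diff them. The sketch below is only an illustration of that idea: the property URL, date ranges and service-account file are placeholders, and your credential setup may differ.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Service-account credentials that have been granted access to the Search
# Console property (the file name is a placeholder).
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("webmasters", "v3", credentials=creds)

SITE = "https://www.example.com/"  # hypothetical GSC property


def clicks_by_query(start_date, end_date):
    """Return {query: clicks} for the property over the given date range."""
    body = {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["query"],
        "rowLimit": 5000,
    }
    response = service.searchanalytics().query(siteUrl=SITE, body=body).execute()
    return {row["keys"][0]: row["clicks"] for row in response.get("rows", [])}


# Compare a window before and after a suspected update (dates are placeholders).
before = clicks_by_query("2018-07-01", "2018-07-28")
after = clicks_by_query("2018-08-06", "2018-09-02")

# Rank queries by how many clicks they lost between the two windows.
losses = sorted(
    ((query, before[query] - after.get(query, 0)) for query in before),
    key=lambda item: item[1],
    reverse=True,
)
for query, lost in losses[:20]:
    if lost > 0:
        print(f"{query}: -{lost:.0f} clicks")
```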


Digging deeper into the Coverage report is also a good idea. Some things to look for there include:

  1. Large chunks of pages moving in or out
  2. Blocked by robots.txt
  3. Excluded by ‘noindex’ tag
  4. Crawled – currently not indexed
  5. Alternate page with proper canonical tag

Search Console is truly a goldmine of diagnostics, and here are some more things you can look into:

  1. Check the Crawl Stats report
  2. Check the Blocked Resources report
  3. Check the HTML Improvements report

Another rich area of information is your web server logs. These are files kept by your web server (or hosting company) that contain a ton of information about every search engine crawler and visitor to your website. Things to look for here include (a small log-parsing sketch follows the list):

  1. Weird status codes
  2. Spider traps
  3. Performance issues
  4. Intermittent alternating status codes
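
To give a sense of what digging through raw logs can look like, here is a small sketch that tallies status codes and flags URLs that return more than one status over the period. It assumes a standard Apache/Nginx combined log format and a local file name, which may not match what your host actually provides.

```python
import re
from collections import Counter, defaultdict

# Pattern for a combined-format access log line, e.g.:
# 66.249.66.1 - - [10/Nov/2018:06:25:14 +0000] "GET /page HTTP/1.1" 200 5316 "-" "Googlebot/2.1"
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+)[^"]*" (?P<status>\d{3}) \S+'
)

status_counts = Counter()
statuses_by_path = defaultdict(set)

with open("access.log") as log:  # hypothetical log file name
    for line in log:
        match = LOG_LINE.match(line)
        if not match:
            continue
        status_counts[match.group("status")] += 1
        statuses_by_path[match.group("path")].add(match.group("status"))

# Weird status codes: anything beyond the expected 200s/301s stands out here.
print("Status code totals:", dict(status_counts))

# Intermittent or alternating status codes: the same URL returning more than
# one status over the period deserves a closer look.
for path, statuses in statuses_by_path.items():
    if len(statuses) > 1:
        print(f"{path} returned multiple statuses: {sorted(statuses)}")
```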

Don’t forget to look at your backlinks, as you may spot problems there that impact the entire site. Also use a crawler tool such as Screaming Frog or DeepCrawl to crawl your site: identify and fix all the issues you find, and look for pages your crawler is not reaching and find out why.
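
One simple way to find pages your crawler is not reaching is to compare the URLs in your XML sitemap with the URLs in the crawl export. The sketch below assumes a local sitemap.xml and a Screaming Frog-style CSV export with an "Address" column; adjust the file names and column name for whichever tool you use.

```python
import csv
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# URLs the site says should exist, read from a local copy of the sitemap.
sitemap_urls = {
    loc.text.strip()
    for loc in ET.parse("sitemap.xml").getroot().iter(SITEMAP_NS + "loc")
    if loc.text
}

# URLs the crawler actually reached, read from its CSV export.
with open("internal_all.csv", newline="", encoding="utf-8") as f:
    crawled_urls = {row["Address"] for row in csv.DictReader(f)}

not_crawled = sitemap_urls - crawled_urls
print(f"{len(not_crawled)} sitemap URLs were not reached by the crawl:")
for url in sorted(not_crawled):
    print(" ", url)
```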

Arsen also shared a number of one-slide guides that capture much of the above. You can go check those out in his presentation below.