Kasra Dash

Recover Your Website Traffic After a Core Algorithm Update: A Step-by-Step Guide


There’s nothing worse than waking up to a huge drop in your website traffic after a Google core algorithm update.

Core algorithm updates can feel like a mystery.

That’s especially true if you think you’ve been doing everything right.

It can feel like a punch in the gut when one day Google is boosting your traffic, and the next day, it’s cut in half.

Over the years, I’ve done hundreds of consulting calls with website owners.

They all thought their sites were perfect.

But, after we dug a little deeper, we found issues. Algorithm updates had smashed their sites.

PS. I won’t act like an angel.

I’ve also been hit by core updates.

But, here’s a tip for any website owner: traffic from any source is borrowed.

You can either complain about it on social media or get to work recovering it.

If you've been hit by a core algorithm update, why should we listen to you?

Good question.

You should, because I’ve been through it all: core algorithm updates, the Helpful Content Update, manual actions, and link spam.

And guess what?

I’ve recovered from every one.

Even this site, which you’re reading, got a manual action in March 2024.

See the attached screenshot.

Screenshot: the spam/manual action notice this site received in March 2024.

If you’re listening to an “SEO guru” who has never fixed problems like these themselves, they aren’t a great SEO.

That’s my opinion.

That’s why I’ve decided to write this blog article showing the common mistakes I’ve found on websites.

Your website might have some or all of the problems I mention in this article.

If you want help, feel free to reach out.

I’ve looked at tons of websites over the years.

I’ll keep updating this list over time with new issues I find on consulting calls, plus more info and test data.

That said, let’s get into the common issues I’ve found.

Understanding Core Update Effects

First of all, we want to identify what’s been hit.

Ahrefs can be deceiving.

A site’s traffic may come from just one popular page.

The best way to identify what’s been hit is to use the compare feature in Google Search Console.

The screenshot shows how to compare your website’s traffic before and after an algorithm update.

It features a site impacted by the September 2023 Helpful Content update.

I selected the date range from July 1 to July 31, 2023, and compared it to November 1 to November 30, 2023, in Google Search Console.
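If you’d rather pull the same comparison programmatically, here’s a minimal sketch using the Search Console API. It assumes you already have API access set up; the property URL, key file, and date ranges are placeholders.

```python
# Minimal sketch: pull clicks for two date ranges from the Search Console API
# and compare them per page. Site URL and key file below are placeholders.
from googleapiclient.discovery import build
from google.oauth2 import service_account

SITE = "https://example.com/"  # placeholder property
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder key file
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
gsc = build("searchconsole", "v1", credentials=creds)

def clicks_by_page(start, end):
    """Return {page_url: clicks} for the given date range."""
    body = {"startDate": start, "endDate": end, "dimensions": ["page"], "rowLimit": 25000}
    rows = gsc.searchanalytics().query(siteUrl=SITE, body=body).execute().get("rows", [])
    return {r["keys"][0]: r["clicks"] for r in rows}

before = clicks_by_page("2023-07-01", "2023-07-31")  # pre-update month
after = clicks_by_page("2023-11-01", "2023-11-30")   # post-update month

# Pages with the biggest click losses first
drops = sorted(before, key=lambda p: after.get(p, 0) - before[p])
for page in drops[:20]:
    print(page, before[page], "->", after.get(page, 0))
```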

I prefer monthly reviews.

I understand this can be tough for SEOs, especially with daily traffic drops.

However, I wait to see how a site performs during and after core algorithm updates.

Sometimes, Google makes minor tweaks at the start of an update, then reverts them.

Making big changes to your site during this time can do more harm than good.


Screaming Frog + GA4 + GSC Method

PRO TIP: For a detailed analysis, use Screaming Frog, GA4, and GSC.

You can track clicks, visits, impressions, and page sessions with Screaming Frog.

This method is advanced but time-consuming.
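If you want to rough this analysis out yourself, here’s a minimal pandas sketch that joins a Screaming Frog crawl export with before/after GSC page exports. The file names and column headers (“Address”, “Top pages”, “Clicks”) are assumptions, so match them to your own exports.

```python
# Minimal sketch (pandas): join a Screaming Frog crawl export with before/after
# GSC page exports to see which crawled pages lost clicks. Column names are
# assumptions -- adjust them to your exports.
import pandas as pd

crawl = pd.read_csv("internal_html.csv")      # Screaming Frog export (placeholder)
before = pd.read_csv("gsc_pages_before.csv")  # GSC Pages export, pre-update
after = pd.read_csv("gsc_pages_after.csv")    # GSC Pages export, post-update

merged = (
    crawl[["Address"]]
    .merge(before[["Top pages", "Clicks"]], left_on="Address", right_on="Top pages", how="left")
    .merge(after[["Top pages", "Clicks"]], left_on="Address", right_on="Top pages",
           how="left", suffixes=("_before", "_after"))
    .fillna({"Clicks_before": 0, "Clicks_after": 0})
)
merged["click_change"] = merged["Clicks_after"] - merged["Clicks_before"]
print(merged.sort_values("click_change")
      .head(20)[["Address", "Clicks_before", "Clicks_after", "click_change"]])
```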

Why Sites Get Hit by Core Updates

Here are some reasons why you might have dropped during a core update:

  • Lack of topical authority
  • Topical dilution
  • Lack of domain authority (lack of E-E-A-T signals)
  • Toxic backlinks
  • Poor internal linking structure
  • Poor technical SEO (Crawl Depth, Bloated Site)

Finding Winners After a Core Update

Now that we know which pages on our site were impacted, we want to find out who won after the core algorithm update.

We will then compare their site to ours.

Position history tracking

I like to use Ahrefs’ position history for this:

  1. Go to Ahrefs
  2. Open Keywords Explorer
  3. Enter your keyword
  4. Look at the position history

What I’m looking for here are winners that have jumped up in positions (as a minimum, I’d want to find two winners).

Pro Tip: Use the position history to find other sites affected by the core update. Check for any correlation to your site, like poor content, toxic backlinks, or bad tech.
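If you export the SERP data for a keyword before and after the update (from Ahrefs or any rank tracker), a quick pandas sketch like this can surface the winners and losers. The CSV columns (“domain”, “position”) are assumptions; match them to whatever you export.

```python
# Minimal sketch: compare two exported SERP snapshots (pre/post update) for a
# keyword and list the domains that gained or lost the most positions.
import pandas as pd

pre = pd.read_csv("serp_pre_update.csv")    # placeholder export
post = pd.read_csv("serp_post_update.csv")  # placeholder export

moves = pre.merge(post, on="domain", suffixes=("_pre", "_post"))
moves["change"] = moves["position_pre"] - moves["position_post"]  # positive = moved up

winners = moves.sort_values("change", ascending=False)
print(winners.head(5)[["domain", "position_pre", "position_post", "change"]])  # study at least 2 winners
print(winners.tail(5)[["domain", "position_pre", "position_post", "change"]])  # and the biggest losers
```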

Identifying Patterns Among Winners and Losers

Now that we have a list of websites that were impacted both positively and negatively, we want to analyze the data and work out why some sites went up while others went down.

What data should I focus on when comparing winners and losers in the SERPs?

  • Content Pruning: Remove or merge outdated content to boost site relevance.
  • Disavow Toxic Links: Identify and disavow harmful backlinks to protect authority.
  • Server Log Analysis: Check which pages Google crawls most. Then, optimize links from these.
  • Internal Linking Audit: Improve links to ensure proper authority flow and fix orphaned pages.
  • Entity Optimization: Focus on entities and context to enhance Google’s understanding of content.
  • Crawl Budget Optimization: Manage low-value pages to save crawl budget and improve SEO.
  • Core Web Vitals: Enhance metrics to better user experience and meet standards.
  • Content Updating: Regularly update content to keep it relevant and aligned with search intent.
  • Link Reclamation: Recover broken links and convert mentions into backlinks.

I wish I could say, “Just fix this one thing.” I can’t. Recovering a website requires analyzing many factors.

PRO TIP: Many mistakenly believe Google has a one-size-fits-all approach to rank websites.

The truth is, ranking factors vary by site type: e-commerce, blog, news, or local service.

Don’t blindly follow advice on social media.

A strategy that boosts rankings for one site could harm another.

I often see claims that a specific tactic works across all industries, but that’s just not true.

For example, Google evaluates a porn site differently from a plumbing business.

It might rely more on dwell time for the former.

For the latter, it prioritizes local relevance, reviews, and service-specific content.

Tailoring your site’s recovery plan hinges on grasping these nuances.

A well-crafted strategy accounts for such distinctions, ensuring optimal restoration.

Analysing Content Quality and Relevance Differences

When you’re looking at depth and comprehensiveness, stack your content up against the top players.

See what they’ve got that you don’t—missing sections, extra detail, FAQs, or even videos.

Like, if their “how to bake a cake” post has troubleshooting tips, ingredient swaps, or step-by-step videos and yours doesn’t, that’s where you’re getting beat.

Fill those gaps, and you’ll be in a better spot to nail user intent.

If Forbes can rank with average content, why can’t I?

Forbes isn’t ranking because their content is groundbreaking; it’s all about their authority.

They’ve got a backlink profile that’s massive—think thousands of high-quality links from top sites worldwide.

That kind of link juice means they can get away with content that’s just okay.

For most of us, though, we don’t have that kind of link power to lean on.

So, focusing on in-depth, high-quality content that actually satisfies user intent is what will set you apart.

Assessing Topical Authority and Content Structure

When you’re looking at topical authority and content structure, you’ve got to hit every part of the funnel—TOFU, MOFU, and BOFU.

I get it, a lot of people hate doing TOFU content because it feels like low ROI, but that’s a short-sighted move.

Illustration of the marketing funnel stages: TOFU (Top of Funnel), MOFU (Middle of Funnel), and BOFU (Bottom of Funnel), highlighting the flow from awareness to consideration and decision-making stages.

TOFU’s what builds your brand’s presence and pulls people in at the start.

If you’re only hitting MOFU and BOFU, you’re missing the boat on showing Google you’re the go-to authority across the board.

PRO TIP: Want to build stronger topical authority with ease? Use my free content clustering tool to organize your content into topic clusters. This helps you create more focused, relevant content and improve your site’s internal linking structure, which Google loves. It’s a simple way to enhance both topical authority and user experience.
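To show the general idea behind clustering (not the tool itself), here’s a minimal sketch that groups page titles into topic clusters with TF-IDF and k-means. The titles and the cluster count are placeholders.

```python
# Minimal sketch: cluster page titles into topic groups, then use the clusters
# to plan pillar pages and internal links. Titles/cluster count are placeholders.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

titles = [
    "How to bake a chocolate cake",
    "Cake troubleshooting: why did it sink?",
    "Best flour substitutes for baking",
    "Local SEO checklist for plumbers",
    "How plumbers can get more reviews",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(titles)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for cluster in sorted(set(labels)):
    print(f"Cluster {cluster}:")
    for title, label in zip(titles, labels):
        if label == cluster:
            print("  -", title)
```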

Comparing Backlink Profiles: Quality vs. Quantity

When it comes to backlinks, it’s not just about how many you’ve got—each link’s got its own value, and not all of them are pulling their weight.

Here’s how to break down and score your links:

  • Link Relevance: Is the linking site actually relevant to your niche? A link from a related site hits way harder than one from some random source. Aim for those tight connections.
  • Domain Authority: Yeah, everyone loves DA, but it’s not the only game in town. High DA’s good, but without relevance, it’s just a number.
  • Traffic Value: Links from sites that get real traffic? That’s the sweet spot. You’re not just getting SEO points—you’re getting eyeballs on your content.
  • Link Neighborhood: Look at what’s around your link. If it’s surrounded by junk or spam, that’s not where you want to be. High-quality environments give your links more juice.
  • Anchor Text: Not all anchors are created equal. Natural, branded, or partial matches keep it clean. Go too hard with exact matches, and you’re just asking for trouble.
  • Link Placement: Context matters. In-content links are the gold standard—way better than being stuffed in a footer or sidebar.
  • Site Freshness and Activity: Links from sites that are updated regularly send stronger signals. If the site’s alive, the link’s got more value.
  • LRT Toxicity Score: Use this to spot the bad apples. High toxicity means the link’s doing more damage than good. You want low-toxicity, quality links that lift your site, not sink it.
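To make the scoring concrete, here’s a minimal sketch that turns those factors into a rough per-link score. The weights and thresholds are illustrative assumptions, not a standard formula.

```python
# Minimal sketch: score each backlink on the factors above with simple weights.
# Weights/thresholds are illustrative assumptions, not a standard formula.
def score_link(link: dict) -> float:
    score = 0.0
    score += 3.0 if link["relevant_niche"] else 0.0          # link relevance
    score += min(link["domain_rating"], 90) / 30             # authority, capped
    score += 2.0 if link["monthly_traffic"] > 1000 else 0.0  # real traffic
    score += 1.0 if link["in_content"] else -1.0             # placement
    score -= 2.0 if link["exact_match_anchor"] else 0.0      # over-optimised anchors
    score -= link["toxicity"] / 25                           # e.g. an LRT-style 0-100 score
    return score

links = [
    {"url": "https://nichesite.example/post", "relevant_niche": True, "domain_rating": 55,
     "monthly_traffic": 4000, "in_content": True, "exact_match_anchor": False, "toxicity": 5},
    {"url": "https://spamdir.example/links", "relevant_niche": False, "domain_rating": 20,
     "monthly_traffic": 0, "in_content": False, "exact_match_anchor": True, "toxicity": 80},
]

for link in sorted(links, key=score_link, reverse=True):
    print(f"{score_link(link):5.1f}  {link['url']}")
```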

Bottom line?

It’s about quality over quantity.

Stop chasing every link like it’s a win and start closing that link gap with high-value links that actually push the needle.

PRO TIP: When it comes to disavowing toxic links, I recommend using LRT (Link Research Tools) over SEMrush. SEMrush often flags site-wide footer links, like those to GambleAware required by law for UK casinos, as toxic—when they aren’t. LRT, on the other hand, crawls far more links and offers a more accurate toxicity score by considering foundational links. A link that might be harmful to a new site could be totally fine for an established one with a strong foundation.

Branded Links: Owning Your Brand SERP

When it comes to backlinks, it’s not just about stacking numbers—it’s about where those links are coming from and how they shape your brand.

Branded links are a game-changer that most people overlook.

Big players like IBM don’t dominate just because of links.

They own their brand SERP with mentions from high-authority sites like Wikipedia, news outlets, and forums.

These branded links build credibility in a way that generic backlinks just can’t match.

Branded links don’t just boost your site’s authority; they help you own the conversation around your brand.

You want to control what people see when they Google you.

When you search for IBM, Apple, or other big names, you see their site, a Wikipedia page, news articles, and other good mentions.

That’s brand control at its finest.

PRO TIP: Establishing E-E-A-T goes beyond just having great content—it’s about building a strong online presence across multiple trusted platforms. Getting listed on Crunchbase is a great start, but you should also focus on ranking your social profiles and owning your brand SERP. Platforms like LinkedIn, Wikipedia, Trustpilot, and Glassdoor are trusted by Google and can help boost your authority. The more consistently you appear on credible sites, the stronger your brand’s reputation will be, which directly impacts your E-E-A-T.

But I don’t have IBM money—how can I do this for my brand?

You don’t need a billion-dollar budget to start owning your brand SERP.

Focus on getting your brand talked about in the right places.

Get featured on industry blogs.

Network with relevant sites.

Use platforms like Crunchbase, niche directories, and social profiles.

It’s all about filling that first page with a mix of your site and trusted mentions.

Building branded links is about staying consistent and getting your name out there.

Your brand SERP should scream authority, even if you’re not a big player yet.

Evaluating Technical SEO: Crawlability and Indexation Insights

One of the biggest issues that go unnoticed is how Google actually crawls your site.

You might think just because a page is at crawl depth 1 or 2, Google’s definitely hitting it, right?

Wrong.

It doesn’t work that way.

The image below shows how Google’s crawlers can enter on one page, jump to a couple of others, and ignore the rest—even if they’re one click away.

Diagram showing Google crawler entry and exit points highlighting crawlability issues where certain pages are ignored despite shallow crawl depths, emphasizing the importance of optimizing internal linking and crawl paths for better indexation.

So what’s going on here?

It’s all about how you guide those crawlers through your site.

If you’ve got a bunch of random internal links or poor structure, Google might not even bother crawling some pages, no matter how close they are.

That’s why checking entry and exit points through server log analysis is crucial.

It lets you see which pages are crawled frequently and which ones are ghosted.

Don’t just assume Google’s crawling your best content.

Sometimes, it’s hitting useless pages and bouncing before getting to what really matters.

You need to map out your crawl paths, clean up the mess, and make sure those internal links are doing their job.

The goal is to create a clear, efficient crawl path that leads Googlebot to your high-priority pages and keeps it from wasting time on dead ends.

If you’re not auditing this stuff regularly, you’re flying blind.

Check your crawl paths, optimize your internal linking, and guide Google exactly where you want it to go.

Why isn’t Google crawling all my important pages?

If Google isn’t crawling your important pages, it may be due to poor internal linking, crawl budget issues, or technical barriers like noindex tags, blocked resources, or slow load times.

Analyzing your server logs can help identify which pages are being ignored and why.

How can I find out which pages Google is crawling and which it’s ignoring?

You can use tools like JetOctopus or the Screaming Frog Log File Analyser to analyze server logs.

These tools, along with other SEO crawlers, show Googlebot activity on your site.

They reveal entry and exit points, crawl frequency, and ignored pages.
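If you want to do this by hand, here’s a minimal sketch that counts Googlebot hits per URL from a combined-format access log. The log path and format are assumptions, and for anything serious you should verify Googlebot by reverse DNS rather than user-agent alone.

```python
# Minimal sketch: count Googlebot hits per URL from a combined-format access log
# to see which pages get crawled and which are ignored. Log path is a placeholder.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # placeholder
pattern = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+" \d{3} \S+ "[^"]*" "([^"]*)"')

hits = Counter()
with open(LOG_PATH) as f:
    for line in f:
        m = pattern.search(line)
        if m and "Googlebot" in m.group(2):  # group(2) is the user agent
            hits[m.group(1)] += 1

print("Most-crawled URLs:")
for url, count in hits.most_common(20):
    print(f"{count:6d}  {url}")
```

Compare the crawled URLs against your sitemap or crawl export: anything important that never shows up in the log is being ghosted.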

How does click depth affect Google’s ability to crawl and index pages, and how should I structure my site?

Click depth, the number of clicks from the homepage to a page, directly impacts crawlability and indexation.

Important pages should be at a shallow click depth (1-2 clicks) to signal their priority to Google.

For example, your main service pages or popular content should be easy to access from the homepage or main menu.

Less critical pages, like old blog posts, can be deeper.

This site structure improves user experience.

It also helps Google crawl your best content.
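Here’s a minimal sketch of how you’d compute click depth with a breadth-first search over your internal link graph. The link map is a toy placeholder you’d normally build from a Screaming Frog or Sitebulb crawl export.

```python
# Minimal sketch: compute click depth from the homepage with BFS over the
# internal link graph. The link map is a toy example.
from collections import deque

links = {
    "/": ["/services", "/blog", "/about"],
    "/services": ["/services/emergency-plumbing"],
    "/blog": ["/blog/old-post-2019"],
    "/about": [],
    "/services/emergency-plumbing": [],
    "/blog/old-post-2019": [],
}

depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in links.get(page, []):
        if target not in depth:
            depth[target] = depth[page] + 1
            queue.append(target)

for page, d in sorted(depth.items(), key=lambda x: x[1]):
    flag = "  <-- consider surfacing" if d > 2 else ""
    print(f"depth {d}: {page}{flag}")
```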

Understanding Internal Linking Strategies Among Competitors

Internal linking isn’t just about connecting pages—it’s about showing Google what’s important.

Tools like Screaming Frog and Sitebulb can give you the inside scoop on how your competitors are doing it:

  • Unique In Links in Screaming Frog: This feature shows how many internal links point to each page. It’s a quick way to see which pages are getting all the juice and which are getting ignored.
  • Simulated PageRank in Sitebulb: Sitebulb takes it up a notch. It assigns a simulated PageRank score to each page. This lets you see how your competitors are pushing power around their site.
  • Pushing Power to Key Pages: The more internal links a page gets, the more Google knows it’s important. This isn’t just about clicks—it’s about directing the flow of authority to the pages that matter most.
  • Link Sculpting Throwback: Remember when SEOs boosted pages by directing link juice? Although Google’s rules have changed, you can still manage internal authority by linking strategically.

Diagram showing internal linking and PageRank distribution between web pages, illustrating how link equity flows to increase page authority through strategic linking.

Use these insights to see what works for your competitors. Then, tweak your internal linking to show Google your top-priority pages.
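If you don’t have Sitebulb, you can approximate the same idea by running PageRank over your internal link graph with networkx. This is similar in spirit to a simulated score, not the same calculation, and the edges below are toy placeholders.

```python
# Minimal sketch: approximate internal authority flow by running PageRank over
# the internal link graph. Edges are toy placeholders from a crawl export.
import networkx as nx

graph = nx.DiGraph()
graph.add_edges_from([
    ("/", "/services"), ("/", "/blog"), ("/", "/about"),
    ("/blog", "/blog/post-a"), ("/blog", "/blog/post-b"),
    ("/blog/post-a", "/services"), ("/blog/post-b", "/services"),
])

scores = nx.pagerank(graph, alpha=0.85)
for page, score in sorted(scores.items(), key=lambda x: x[1], reverse=True):
    print(f"{score:.3f}  {page}")
```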

Cost of Information Retrieval

Even if two sites have the exact same content and backlinks, Google isn’t just looking at those factors. One crucial element that’s often overlooked is how efficiently Google can crawl and index your site. If you’re wasting Google’s resources by making it crawl irrelevant or low-value pages, you’re setting yourself up for penalties.

The Cost of Crawling: $1 Per Page Example

Let’s say it hypothetically costs Google $1 to crawl and index each page (it’s much cheaper, but let’s roll with it).

If your site is packed with thin, duplicate, or outdated content, you’re making Google waste its crawl budget.

Imagine hundreds or thousands of low-value pages costing Google real money to crawl—resources it doesn’t want to waste.

Comparison of two websites showing the cost of crawling and indexing high-quality vs. low-quality pages. Website 1 has only high-quality pages with a lower crawl cost of $6, while Website 2 includes low-quality pages, increasing Google's crawl cost to $10.

Massive shout out to Koray Tugberk for helping me understand this (here is a much more detailed guide on information retrieval).

The Consequences of Wasting Resources

When Google sees a site forcing it to crawl irrelevant or poorly optimized pages, you’re likely to face penalties.

You might end up with a thin content penalty or, in more severe cases, a spam penalty.

Even if your content and backlinks are solid, wasting Google’s resources by crawling junk pages can hurt your rankings.

Example: Two Identical Sites, Different Outcomes

Take a look at the image above. Website 1 has 6 high-quality pages, each of which is relevant and efficiently crawled by Google.

The total crawl cost is $6, as Googlebot doesn’t waste time on low-value pages.

On the other hand, Website 2 also has high-quality pages, but it’s cluttered with irrelevant or thin content.

As a result, it costs Google more—$10 to crawl and index—due to the extra junk pages.

Over time, Website 2 wastes Google’s crawl budget. It gets penalized and drops in rank.

Website 1 stays efficient and thrives in search rankings.
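Here’s the same arithmetic as a tiny sketch, so you can plug in your own page counts. The $1-per-page figure is purely illustrative.

```python
# Minimal sketch: estimate total and wasted crawl cost under the hypothetical
# $1-per-page figure used in the example above.
COST_PER_PAGE = 1.00

def crawl_cost(high_value_pages: int, low_value_pages: int) -> tuple[float, float]:
    total = (high_value_pages + low_value_pages) * COST_PER_PAGE
    wasted = low_value_pages * COST_PER_PAGE
    return total, wasted

for name, high, low in [("Website 1", 6, 0), ("Website 2", 6, 4)]:
    total, wasted = crawl_cost(high, low)
    print(f"{name}: total crawl cost ${total:.0f}, wasted ${wasted:.0f} ({wasted / total:.0%})")
```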

Bottom Line: Optimize or Pay the Price

It’s not just about having great content and backlinks—it’s about not wasting Google’s resources.

Just like in the image, Website 2 is penalized because it costs Google extra resources to crawl through unnecessary or thin content.

Clean up those low-value pages, optimize your crawl paths, and save Google money.

If you don’t, Google will penalize you for making it work harder than necessary.

Topical Coverage vs. Topical Authority

Many websites focus on topical coverage. They produce a lot of content on a broad topic. But that doesn’t always mean they have topical authority.

What’s the difference?

  • Topical Coverage: This method divides content into many subtopics. Often, it results in shallow or scattered content. For example, you might have 50 brief articles on SEO tips. Yet, none are detailed enough to offer real value or to rank well in searches.
  • Topical Authority: This concept focuses on depth and relevance. Rather than writing 50 shallow pieces, create fewer, detailed articles. These should cover key areas thoroughly. This approach establishes you as an authority in your niche. Consequently, Google rewards you.

Building Topical Authority:

  • Consolidate: Combine fragmented articles into stronger, more comprehensive resources.
  • Create Pillar Content: Focus on fewer, high-value pieces that can serve as cornerstone content for your niche.
  • Interlink: Boost your authority by linking related pieces. Create a web of high-quality, relevant content.

What Should You Focus On?

In summary, the goal is to shift from broad coverage to expert content.

To move from topical coverage to topical authority, you need to cut, combine, and deepen your content.

Pruning Low-Value Content

One of the most critical and often overlooked aspects of content management is content pruning.

It’s not just about updating pages or posting new content. It’s about finding which pages are valuable and which are hurting your site.

Content pruning flowchart guiding users through a decision-making process for evaluating, optimizing, or removing website pages based on relevance, traffic, backlinks, and content quality.

The yes/no flowchart above provides a clear, step-by-step guide that helps you decide whether a page should be pruned or optimized.

It starts with key questions like, “Is the content outdated or irrelevant?” and then branches based on your answers.

Each path leads you to a decision about the content’s value, based on its history, backlinks, performance, and quality.

Why Content Pruning Matters

As your site grows, outdated, irrelevant, or thin content tends to accumulate.

Keeping these pages online wastes crawl budget and weakens your site’s topical focus, diluting your authority.

Regularly auditing your content helps to keep your site clean and relevant.

Refer to the flowchart to guide you through the following steps:

  • Evaluate the Content: Start by assessing whether the content is still relevant. If it’s outdated or no longer useful, the yes/no flowchart helps you decide whether to update, merge, or remove it based on the page’s value and performance.
  • Decide Whether to Optimize or Prune: If a page still holds value, it might just need a refresh—update with new data, improve SEO, or expand the content. The flowchart will help you determine which pages are worth keeping and which should be pruned or redirected.
  • Redirect or Delete: The flowchart offers two options when removing a page. First, you can set up a 301 redirect to a similar or updated page. Alternatively, if the page is worthless, you can choose a 410 deletion. This approach aids in making informed decisions about content.

How to Use the Flowchart for Efficient Pruning:

  • Low-Traffic Pages: For pages with little traffic, use the yes/no diagram to decide if you should update, redirect, or remove the content.
  • Outdated or Irrelevant Pages: For outdated pages, the diagram helps you decide whether to improve, merge, or delete the page.
  • Thin Content: For content lacking depth, the flowchart shows whether to expand or prune it to maintain your site’s authority.
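Here’s a minimal sketch of that decision logic as a function. The thresholds (referring domains, traffic) are illustrative assumptions, so tune them to your own site.

```python
# Minimal sketch of the pruning decision logic: keep, update, redirect, or delete.
# Thresholds are illustrative assumptions.
def pruning_decision(page: dict) -> str:
    if not page["outdated_or_irrelevant"]:
        return "keep"                       # still relevant: leave it alone
    if page["referring_domains"] > 0:
        return "update or 301-redirect"     # preserve link value
    if page["monthly_organic_traffic"] > 50:
        return "update / expand"            # traffic says it's worth a refresh
    if page["thin_content"]:
        return "410 delete or merge"        # no value left
    return "review manually"

pages = [
    {"url": "/blog/seo-tips-2018", "outdated_or_irrelevant": True,
     "referring_domains": 12, "monthly_organic_traffic": 5, "thin_content": False},
    {"url": "/blog/press-release-2016", "outdated_or_irrelevant": True,
     "referring_domains": 0, "monthly_organic_traffic": 0, "thin_content": True},
]

for page in pages:
    print(page["url"], "->", pruning_decision(page))
```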

Keep It Clean and Focused

Using the content pruning flowchart helps you check your site’s health.

Removing old, duplicate, or irrelevant content makes your site better. This attracts both users and search engines.

Will deleting content hurt my SEO rankings?

Content pruning can boost your rankings. It removes low-quality pages that hurt your authority. Use 301 redirects to keep link value.

Can I prune content that has backlinks?

Be careful with pages that have strong backlinks. Instead of deleting, update or redirect them to maintain link value.

How can I tell if a page is considered “thin content” by Google?

Thin content is shallow, lacks detail, or offers little value. Google sees pages with few words, poor intent match, or less detail than competitors as thin.
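As a rough first pass, you can flag candidates from a crawl export by word count. The 300-word cut-off and the “Word Count” column name are assumptions, and depth relative to competitors matters more than any fixed number.

```python
# Minimal sketch: flag potentially thin pages from a crawl export by word count.
# The threshold and column names are assumptions -- adjust to your export.
import pandas as pd

crawl = pd.read_csv("internal_html.csv")  # placeholder Screaming Frog export
thin = crawl[crawl["Word Count"] < 300]
print(thin[["Address", "Word Count"]].sort_values("Word Count").head(20))
```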

Ongoing Updates and How to Get Help

I’ll be keeping this article updated after every core update to ensure you’re always equipped with the latest strategies for recovery.

As SEO evolves, so should your approach.

If your site’s been hit and you’re looking for tailored advice, feel free to book a consultation with me.

I’ve helped many businesses recover and can guide you through the process.

The most important step?

Take action.

Don’t wait—start working on your recovery now.
