Site Key Facts
May 25, 2022: Site lost 70% of its traffic
June 26, 2022: Site appeared to recover
October 19, 2022: Google Spam Update, 70% of site traffic lost AGAIN
All major keywords dropped out of the top 100
February 19, 2023: Site recovered and gained 10% more traffic
Site averages about 310K visitors per month post-update
Site has over 400 pieces of content, much of which had to be edited individually
Site continually spiked and dropped out of the rankings as we worked
Many of the content changes were tedious and easy to get wrong
Site had multiple issues that needed to be addressed
Google Algo Updates
I was in contact with this website owner around the time the site appeared to be affected by the May 2022 core update and lost 70% of its traffic.
At the time, I recommended the client hold back from making any major changes to the site and wait for the recovery algorithm to take effect. The client was patient and worked on a few items in the meantime: fixing Core Web Vitals and improving the commercial-to-informational content ratio.
The client let me know that on June 26, 2022, the site had fully recovered and all traffic and revenue had returned.
However, the celebration was short-lived: the website was struck by another update around October 19, 2022, resulting in a 70% drop in traffic and a corresponding loss of revenue. The situation required immediate action, and on October 26, 2022, my team and I set out to revive the website. We ultimately exceeded its previous traffic levels by 10%.
Before the first Google algorithm update hit, the site was earning $8,000 – $10,000/month on average. After recovering from that first update, it averaged $19,000 a month, and it is on track to earn $21,000 a month following the recovery documented in this case study.
The following account details the rigorous efforts and how we succeeded in restoring the website to its former glory and setting new records. Let’s dive in!
Image Key Facts
Core Update: May 2022
June 26, 2022 Site Fully Recovers
October 19, 2022 Spam Update
October 26, 2022 Work Began
Changes to Site Completed February 9, 2023
The initial step in the process of website recovery entails conducting a thorough and comprehensive audit. The primary objective is to identify and resolve any technical issues that may have contributed to the traffic drop. It is crucial to determine whether there were any substantial changes made to the site prior to the decline in traffic.
For this purpose, I prefer to use Ahrefs to conduct a detailed audit of the website. The Ahrefs report indicated no significant technical problems that could account for a substantial drop in the website’s rankings. In other words, no technical issue or change to the website explained a traffic loss of the magnitude shown in the following two images:
The next step in the recovery process involved running a comprehensive audit with Sitebulb, focusing on the internal linking patterns of the website.
Using this audit, I was able to focus on analyzing the interlinking structure of the website, particularly with regard to the use of anchors and the anchor text ratio. However, after conducting a thorough review, it was determined that the interlinking of the site was not contributing to the traffic drop in any significant way. This site did not have a large percentage of exact match keyword anchors or aggressive internal linking.
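To illustrate the kind of check described here, the following is a minimal sketch (not the actual Sitebulb report) of classifying internal link anchors and computing the share of exact-match keyword anchors. The anchor list and keyword are hypothetical examples.

```python
from collections import Counter

def anchor_ratio(anchors, exact_keywords):
    """Classify anchor texts and return the percentage of each class.

    exact: anchor text equals a target keyword phrase
    partial: anchor text contains a target keyword phrase
    branded_or_generic: everything else ("read more", brand name, etc.)
    """
    counts = Counter()
    for text in anchors:
        t = text.strip().lower()
        if t in exact_keywords:
            counts["exact"] += 1
        elif any(k in t for k in exact_keywords):
            counts["partial"] += 1
        else:
            counts["branded_or_generic"] += 1
    total = sum(counts.values())
    if total == 0:
        return {}
    return {k: round(v / total * 100, 1) for k, v in counts.items()}

# Hypothetical anchors pulled from a crawl export
anchors = ["best running shoes", "read more", "our guide",
           "best running shoes for women", "home"]
print(anchor_ratio(anchors, {"best running shoes"}))
```

A site like the one in this case study, without aggressive internal linking, would show a low exact-match percentage here.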
During this particular phase, it is essential to ensure that the search intent has not changed, that similar competitor sites are still performing well, and that Google still favors this type of site in its search results.
One of the easiest ways to assess the SERPs is to search for the top terms that the top pages previously ranked for in the search engine results. By doing so, one can answer several questions, such as:
• Are similar competitor sites still performing well?
• Are similar sites still being shown in the SERPs for the top terms?
• How have the SERPs changed?
• Has the search intent remained the same?
• Does the keyword intent align with the search intent? (i.e., informational, navigational, transactional, commercial)
In this case, the analysis revealed that competitor sites with similar characteristics were still ranking well, and the SERPs showed comparable websites. Furthermore, there had been no changes in the search intent, and the keyword intent aligned with the search intent. Therefore, the results indicated that no changes had occurred in the SERPs, and the website could continue to target the same audience and keywords as before.
I conducted an assessment of the two leading competitors who have demonstrated strong performance in terms of the website’s primary search terms. This involved examining over 20 keywords in order to establish a basis for comparing the website’s content with that of its primary competitors, who continue to perform well in the search engine results pages (SERPs).
After careful evaluation, it became apparent that the content produced by the competitors was subpar in comparison to that of the client’s website. Specifically, the content impacted by the update was found to be of higher quality than that of the competitors.
The client’s content was well researched and in-depth, it had proper topical relevance and major entities were used. This means that there was not a lack of entities or topical relevance causing the site’s loss in traffic.
In this phase, we examine the actual optimization of the content. I like to use Surfer SEO for this. I took a sample of 20 URLs across the site that used to perform well but whose keywords had all dropped out of the top 100.
All URLs had the same high optimization score, as you can see here:
The client’s competitors had optimization scores that looked like this:
Regarding optimization, it is apparent that the client’s website is more effectively optimized, with more substantial content, than that of their competitors. Two accompanying images provide an overview of the pattern seen when evaluating the remaining URLs for optimization.
Google has changed in recent years, with algorithm updates increasingly targeting on-page SEO. As such, it is critical to exercise caution with the placement of primary keywords throughout the content so as to avoid over-optimization.
For instance, it is advisable to include the primary keyword in heading 1 and a few heading 2s while refraining from incorporating the same keyword into headings 3-6 or any other types of headings employed. This shift in emphasis occurred a few years ago during the Google algo updates in 2021.
Previously, it was common practice to incorporate keywords throughout an article, including in multiple headings. Since these updates began, however, it is crucial to avoid over-optimizing keywords in headings or other sections of the website. On-page SEO has changed, and it’s important that we change with it.
It is important to keep in mind that the very tools designed to optimize content may inadvertently lead to over-optimization, underscoring the need for clearly defined parameters and basic rules, such as those expounded upon in this case study. By adhering to these guidelines, it is possible to avoid over-optimization of headings, meta-titles, and other elements in the content.
This website had exact-match keywords in multiple headings: the H1, multiple H2s, and multiple H3s and H4s. This is a common trigger of the Google Spam update, especially when combined with the backlink issues discussed below. But don’t take my word for it; wait until we test our theories and show you the results.
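The heading rule above (primary keyword in the H1 and at most a couple of H2s, never H3–H6) can be checked programmatically. Here is a minimal sketch using only Python’s standard library; the HTML snippet, keyword, and the `max_h2` threshold are hypothetical illustrations, not an exact replication of any tool used in this case study.

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collect headings whose text contains an exact-match keyword."""

    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self.current = None      # heading tag we are currently inside, if any
        self.hits = []           # (tag, heading text) pairs containing the keyword

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.current = tag

    def handle_endtag(self, tag):
        if tag == self.current:
            self.current = None

    def handle_data(self, data):
        if self.current and self.keyword in data.lower():
            self.hits.append((self.current, data.strip()))

def over_optimized(html, keyword, max_h2=2):
    """Flag pages where the keyword appears in too many H2s or any H3-H6."""
    audit = HeadingAudit(keyword)
    audit.feed(html)
    h2_hits = sum(1 for tag, _ in audit.hits if tag == "h2")
    deep_hits = [h for h in audit.hits if h[0] in ("h3", "h4", "h5", "h6")]
    return h2_hits > max_h2 or bool(deep_hits)

# Hypothetical page showing the pattern found on this site: keyword repeated into an H3
html = "<h1>best blender</h1><h2>why buy a best blender</h2><h3>best blender faq</h3>"
print(over_optimized(html, "best blender"))  # True: keyword appears in an H3
```

Running a check like this across all 400 articles makes the tedious, error-prone manual review described in the key facts far more manageable.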
It is often tempting to take a quick glance at competitor sites and attempt to replicate their meta titles, potentially leading to inadvertent over-optimization of the title. However, before incorporating an exact-match keyword twice in the meta title, it is essential to evaluate the comparability of competitors’ sites to one’s own. Key considerations include whether the competitors’ sites possess a branded or keyword-based domain and whether their content is similar or under-optimized in comparison. The key is to find a competitor’s domain and site that is like yours so you are not comparing apples to oranges here.
It is critical to keep in mind that each website is unique, and various factors must be considered before arriving at a final decision. Optimization tools can assist with constructing meta titles, but exercise caution, as over-optimization can occur. Examine all of the factors mentioned above and make sure you are not overusing a keyword phrase before placing your main phrase in a meta title more than once. The tools discussed in this case study can help you make that determination, but you also need your own experience and research as an SEO to make a good decision.
There were several meta titles that needed to be adjusted on this website, but not many. This was a minor issue found only on select posts.
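A quick way to screen meta titles for the overuse described above is a simple count of the exact-match phrase. This is a hypothetical sketch; the title, keyword, and `max_uses` threshold are illustrative.

```python
def title_overuses_keyword(meta_title, keyword, max_uses=1):
    """Flag a meta title that repeats the exact-match phrase more than max_uses times."""
    return meta_title.lower().count(keyword.lower()) > max_uses

# Hypothetical meta title copying a competitor's double-keyword pattern
print(title_overuses_keyword(
    "Best Blender 2023: The Best Blender for Smoothies", "best blender"))  # True
```

Titles flagged by a check like this are candidates for review, not automatic rewrites; as noted above, the final call depends on how comparable the competitors really are.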
Remember when we used to make sure the keyword appeared in the content at a 2–3% density? Those days are over. Actual use of your main keywords has decreased due to the focus on semantic SEO, which places greater emphasis on factors beyond keyword density. If you need a visual of how semantic SEO differs from keyword SEO, you can check out my video here: https://youtu.be/xzHAR4tU56Y
The focus has shifted from how many times you use your keyword to ensuring you have the keyword in the right places and not over-optimized. Interestingly, some articles that attain top rankings do not even contain the primary keyword in their main content, highlighting the evolving nature of the SEO strategy.
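For completeness, here is a minimal sketch of the old-style density calculation, useful today mainly to confirm you are NOT overusing a phrase rather than to hit a target. The sample text and keyword are hypothetical.

```python
import re

def keyword_stats(text, keyword):
    """Count exact-phrase occurrences and density (occurrences / total words)."""
    words = re.findall(r"[a-z']+", text.lower())
    kw = keyword.lower().split()
    n = len(kw)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    density = round(hits / len(words) * 100, 2) if words else 0.0
    return {"occurrences": hits, "density_pct": density}

# Hypothetical snippet of article text
text = ("Choosing a blender comes down to jar size, wattage, and noise. "
        "Our best blender pick balances all three without breaking the budget.")
print(keyword_stats(text, "best blender"))
```

As the paragraph above notes, a low number here is no longer a problem; some top-ranking articles score zero occurrences of the primary keyword.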
Alt tags are one place you do not want to repeat your exact-match keyword. Alt tags provide the smallest rankings boost of any on-page element, so it is better to place exact-match keywords where they deliver the most: headings, meta titles, and main content. If you must use your exact keyword in an alt tag, do it no more than once; ideally, avoid it altogether.
Remember, on-page SEO is evaluated as a whole, with all of the elements combined, and overusing a keyword can actually harm your rankings.
My favorite tool to check a site’s optimization is On Page AI.
The tool will look for over-optimization in the title, headings, main text, and images, as well as styling.
Here is an image of the client’s site that corroborated what my eyes were already seeing:
The headings of this website were over-optimized when you combine all of the on-page factors together.
During this process, we realized that the website had articles written by AI. Of the 400 articles, about 60 (roughly 15%) were found to be AI-written.
Google has stated that it doesn’t care whether you use AI or a human writer, and having AI content on a site will not by itself cause it to be hit by a Google algorithm update. However, if your AI content is not structured as discussed in this case study, those structural problems can cause your site to be hit by an update, or increase the likelihood. So whether you use AI or a writer, you must pay attention to all of the guidelines discussed here.
My team and I focused on the AI articles the same way we focused on all of the other articles mentioned above.
External and Internal EEAT Factors
The client’s website lacked some external and internal EEAT factors. We updated the site so that it had individual author pages fully describing each author’s expertise and background. External work included making sure the authors listed this website in their bios on relevant social media platforms. This part of the process was done to help future-proof the website against future Google updates that rely on strong EEAT signals.
For more information on the exact process, or if you have fake authors, you can watch the video I have provided on EEAT factors: https://youtu.be/lWBPM7lNoUU
There is a strong correlation between on-page SEO and off-page SEO when dealing with the Google Spam update. Just as we checked for over-optimization on the website, you now need to check for over-optimization off-site. We ran a full backlink audit specifically targeting the types of backlinks as well as the anchors used.
You want to make sure the majority of your backlinks are branded and/or not keyword-related. We found a large number of spam backlinks coming in, as the site is often targeted by competitors with negative SEO. We also found a large number of backlinks that were not topically relevant to the site. Guest posts and highly authoritative backlinks from sources like HARO should remain a small percentage of your overall backlink profile. A disavow file was created and submitted to target all negative-SEO and non-relevant backlinks. It generally takes up to two weeks for a disavow file to take effect.
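Disavow files follow a simple text format: `#` comment lines, `domain:` lines to disavow an entire domain, and bare URLs for individual pages. Here is a minimal sketch that assembles one from an audit’s output; the domains and URL are hypothetical.

```python
def build_disavow(spam_domains, spam_urls):
    """Build a disavow file body: comments, domain: lines, then individual URLs."""
    lines = ["# Disavow file generated from backlink audit"]
    lines += [f"domain:{d}" for d in sorted(spam_domains)]
    lines += sorted(spam_urls)
    return "\n".join(lines) + "\n"

# Hypothetical negative-SEO domains and a one-off spam URL flagged in the audit
text = build_disavow({"spam-farm.example", "pbn-links.example"},
                     {"https://blog.example/old-spam-post"})
print(text)
```

The resulting file is uploaded through Google Search Console’s disavow links tool; as noted above, allow up to two weeks for it to take effect.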
After all of the audits and research, I concluded that the main issues to address were on-page over-optimization in various areas and backlinks. We used a systemized approach to test our theories: we created and uploaded the disavow file, then de-optimized the website in three phases across 20 URLs that had been badly affected by the Google algorithm update.
I tracked keywords and traffic to those specific pages after all changes were made. Below are images of what we saw on all 20 URLs that were updated:
Once we saw positive changes on all 20 URLs, we quickly rolled out the finalized changes across the entire site. We then made sure all of the pages/posts were crawled and waited for the final results.
The following images will speak for themselves:
The website averaged around 160,000 impressions a day before the algorithm update. After our changes, it is now averaging over 340,000 impressions a day.
My team and I have worked on reviving many sites that were hit by various Google algorithm updates. There are many similarities across the updates, and no matter which one your website was hit with, we have the answer. We have developed a systemized approach to identifying the common problems found on websites targeted by these updates, and we are able not only to revive websites but also to future-proof them so they are less likely to be hit by the next update.
I’ve developed a systematic approach that shows you how to recognize which Google algorithm update has impacted your website, pinpoint the problems, and devise a strategy to remedy them so your website can regain revenue and increase traffic. Even if your website has never been hit by an algorithm update, this approach will help you future-proof your site and raise its quality.
To Contact Marie Ysais for a website consult: [email protected]