The timing of your rankings tanking on/around 10/19's update looks like a strong correlation. Your GSC data, with congruent drops in impressions and clicks, doesn't immediately look like KWs you were top 3 for suddenly dropping further down the page-- otherwise impressions would be similar and only AP/CTR would be tanking.
If you're using GA and your traffic doesn't have tons of seasonality, look at the landing page report month over month and see which pages have had the largest drop (can also do this in GSC). Then check in GSC and see what those pages' APs have done over the same period.
What types of links is GSC showing? Google says they don't count bad links, but if they make up a large proportion of your site's backlinks, SpamBrain (their spam AI-- publicly announced more frequently) may bucket your site that way.
SEMrush, ahrefs, Majestic, Moz OSE are all decent for checking who is linking (and maybe disavowing), but LinkResearchTools is probably going to focus more on identifying suspicious backlinks, if that is in fact what is impacting your site.
Looking for correlation with any Google change must always be the last thing to check for. If you look for correlation, you will always find some. There was a domain change and URL change with a suboptimal old-to-new migration.
This would probably be captured when looking at GA in the landing page report with MoM traffic going to 0 for once high value pages. If that is the case, a 301 from old path(s) to new is the simplest solution to possibly restore most of the traffic? From what I've seen, lots of the SO scraper sites got crushed late last year, so if the top traffic driving pages historically have not had URLs changed, then maybe investigate that further. If they had their URLs changed, just redirect the old URLs and hope for the best.
@OP, if there were in fact high traffic pages that got renamed, do a CSV/Excel export in GA/GSC/sitemap/etc., map the old URLs to their new ones, implement 301s, and wait to see if traffic starts to return.
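If the site is Docusaurus (as it appears to be), one low-effort way to wire up that old-to-new mapping is the client-redirects plugin. A minimal sketch, assuming @docusaurus/plugin-client-redirects and made-up paths; note these are client-side redirects rather than true server 301s, so for proper 301s you'd add equivalent rules at the hosting layer (Netlify/Cloudflare/nginx):

```ts
// docusaurus.config.ts -- sketch only; with a JS config the shape is the same.
// The old/new paths below are placeholders, not the site's real URLs.
import type { Config } from '@docusaurus/types';

const config: Config = {
  title: 'My blog',
  url: 'https://example.com',
  baseUrl: '/',
  plugins: [
    [
      '@docusaurus/plugin-client-redirects',
      {
        redirects: [
          // one entry per renamed URL, mapped from the GA/GSC/sitemap export
          { from: '/old-post-slug', to: '/blog/new-post-slug' },
          { from: '/another-old-path', to: '/blog/another-new-path' },
        ],
      },
    ],
  ],
};

export default config;
```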
Generally, if you make good changes, you'll see a short-term dip before a long-term increase. The challenge is differentiating between bad changes that have caused long-term harm, vs. good changes that are experiencing the natural short-term dip.
A website may build good reputation with a search engine which has ranking benefits but at the core of search engine ranking is individual pages: you must protect your well ranking pages at all costs. That means never break any links. If you break links, your pages will drop out of search engines, and it can take years to rebuild the lost reputation for those pages. The model for thinking about SEO should be page-based, not website-based.
The good news is that if your website was able to rank well once, it'll rank well again, because as much as we might tinker with the structure of a website, what matters most is the content, and you're still distributing that same content... you just might have set yourself back a year by breaking a bunch of links. Easy mistake, c'est la vie.
The ideal strategy for experimenting with SEO is to do it page by page, experiment with different changes on different pages and measure the impact. Don't make wholesale changes to the structure of every page until you're confident that you're doing something that works.
As others have said, this strikes me as a multi-faceted problem.
It's possible Google devalues those sharing (even inadvertently!) GA tags in the rankings, although I don't think there have been any public proclamations from Google on that. But if that were the sole culprit, only your GA instance would reflect that. The fact that you're losing real traffic (as reflected in what the 3rd party tools are telling you) makes me think that's probably not the case, or at least not the only thing that's happening.
Not implementing re-directs would also definitely be a culprit. But if it's just images you failed to re-direct, that's likely not the main thing either unless you were getting a majority of your traffic from Google Images.
Since this happened post-Core Update, I would want to know two things:
1. What keywords dropped and what replaced you
2. Whether the drop was site-wide or isolated to individual categories or groups of pages.
Regarding #1: Was what replaced you a big, high authority publisher? Or was the content simply more comprehensive or otherwise a better match for the user's intent? Very likely could be E-E-A-T-related, in terms of the algos determining that you don't have the authority/expertise to rank for what you were ranking for previously.
Investigate those possibilities first and you should be able to better map out a plan for re-gaining that traffic.
Good comment! I’d add in one more question so we can put the GA tag issue to rest.
Does the writer have access to Google Search Central? They rename this product every six minutes. It used to be Google Webmaster Tools and it’s likely called something new now. Are there any manual actions or security alerts in there??
If Google performs a manual action or picks up certain types of security problems, they will tank traffic. But they usually report to give webmasters the chance to fix it.
I’d avoid ahrefs and the like and start right with google tools.
> As I've mentioned, I broke links by not implementing redirects.
Didn't implement redirects for what? The prior part of the article doesn't even mention you changed any page URLs.
But if you did, that is most definitely going to have caused problems.
Think about it - the pages are no longer there, so they will get dropped from the search index, so they won't appear in search results, resulting in fewer impressions/opportunities for click-through, and you will receive fewer visitors as a consequence.
You effectively delisted your site's content. The existing inbound links will now point to non-existent pages, so they carry no "authority" to your site. The content that is on the new URLs has no inherent "authority", so it's not being ranked as well as it was on the old URLs. The search engines will rank it lower (e.g. page 10 instead of page 2), so fewer impressions and therefore fewer visitors.
You will likely be able to (partially) fix this with redirects - as sites that still link will still bring authority. But you will possibly have lost any history-based authority (that which comes from being a reliable and consistent URL), which may not be able to be retrieved.
I would recommend going through old access logs to find what your old URLs were, and redirect them to the new URLs for that same content (not a wildcard redirect to homepage!)
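If the old URL structure is hard to reconstruct from memory, the logs themselves are a decent source. A rough sketch for pulling the most-requested old paths out of a combined-format access log (the file name, regex, and filters are assumptions to adapt):

```ts
// find-old-urls.ts -- sketch: list candidate old URLs from ./access.log
// (assumes Apache/nginx combined log format; run with ts-node/tsx on Node).
import { readFileSync } from 'node:fs';

// e.g. 1.2.3.4 - - [10/Oct/2022:13:55:36 +0000] "GET /old/path HTTP/1.1" 200 2326 ...
const LINE = /"(?:GET|HEAD) (\S+) HTTP\/[\d.]+" (\d{3})/;

const counts = new Map<string, number>();

for (const line of readFileSync('access.log', 'utf8').split('\n')) {
  const match = LINE.exec(line);
  if (!match) continue;
  const [, path, status] = match;
  // Only successful page requests are interesting as redirect candidates
  if (status !== '200' || /\.(css|js|png|jpe?g|svg|ico|woff2?)$/.test(path)) continue;
  counts.set(path, (counts.get(path) ?? 0) + 1);
}

// Most-requested old paths first -- redirect these as a priority
[...counts.entries()]
  .sort((a, b) => b[1] - a[1])
  .slice(0, 50)
  .forEach(([path, hits]) => console.log(`${hits}\t${path}`));
```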
Which pages or queries in GSC specifically lost traffic? Was there a handful of keywords that drove most of your traffic that you’ve lost? Or is it down across the board for all articles?
The people that mirror your content would have replaced you. Try searching for a few page titles in quotes; you will easily find the culprits then. The problem is that they will now have "older" content than you, so search engines may think you are copying content and penalise accordingly.
It’s a bit annoying but you should be able to see in GSC what the ranking change was for particular URLs and queries by comparing before October and after October. You should also be able to see if the issue is whether the pages dropped in ranking or if some articles became de-indexed entirely. Take one of the page URLs that has dropped in traffic and make sure it’s still being indexed.
If you still need help and are really stuck ping me and I will try to help you sort this out quickly. Reply to this comment if you still need the help with an easy way to reach you via email. (Not asking for anything in return)
Johnny is correct and this most likely has to do with the 404 backlinks and accidental misuse of Google Analytics.
I recently did a comprehensive SEO audit for a web3 brand that saw a similar drastic drop in traffic. Here are some other things to keep in mind to rank better on Google.
1. One H1 per page
SEO experts agree the best practice is to use one H1 per page. Headers help keep content structured for both Google and readers. If there are multiple H1s, convert them into proper H2-H4s to improve the content's hierarchy.
2. High-quality content >1000 words
According to Yoast, chances of ranking in Google are higher if you write high-quality posts of 1000 words or more. If you have some sparse content that is less than this, add a few more detailed sections.
3. Google PageSpeed score >90
Google PageSpeed says that improving performance increases rankings and user experience. The ideal Google PageSpeed score is >90. Make sure your pages take less than 2s to load, and ideally less than 500ms. Reduce JS and additional requests on important pages.
4. Add title tags and meta descriptions with primary keyword
Google recommends matching H1 tags to title tags to prevent inaccurate article titles from showing up on search results. It's best to include the primary keyword in the meta description too (see the sketch after this list).
5. Improve primary blog page with multiple sections
Your blog's primary page is your chance to showcase your best content, not just an archive of your latest posts. Separate your posts into sections like best, classics, and latest. Add an H1 to make it clear to the audience what the blog is about, and an H2 subheadline to clarify further.
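Regarding tip 4, here's a minimal sketch of setting the title tag and meta description on a Docusaurus/React page via @docusaurus/Head. The page name and text are placeholders; for ordinary blog posts the title/description front matter fields achieve the same thing:

```tsx
// src/pages/migration-guide.tsx -- hypothetical standalone page, sketch only
import React from 'react';
import Head from '@docusaurus/Head';
import Layout from '@theme/Layout';

export default function MigrationGuidePage(): JSX.Element {
  return (
    <Layout>
      <Head>
        {/* Title tag: lead with the primary keyword and mirror the on-page H1 */}
        <title>Docusaurus Migration Guide: Avoiding SEO Pitfalls</title>
        {/* Meta description: include the primary keyword once, naturally */}
        <meta
          name="description"
          content="A Docusaurus migration guide covering redirects, sitemaps and the SEO pitfalls to avoid."
        />
      </Head>
      <main>
        <h1>Docusaurus Migration Guide: Avoiding SEO Pitfalls</h1>
      </main>
    </Layout>
  );
}
```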
By the way Johnny, you can see the full SEO audit on my Twitter[0].
If you're open to it, I'd love to do a full SEO audit for your blog. Let's get those numbers back to their original state. Please DM me on Twitter.
P.S. - I worked with a software company[1] to build out their docs using Docusaurus so I'm familiar with how it works.
Ironically, this is exactly what Google recommends you do in most cases. This isn’t empty, feel-good marketing either. It’s in their best interest for you to do so.
Your site's performance, as reflected in its PageSpeed Insights score, is showing 45. This is unacceptable for a blog; it should be close to 100 for text-based pages like you have. Does the site contain scripts, deep dependency trees, or analytics?
Could this have changed during the performance drop?
No - it's because it's a Docusaurus site and so runs React / JavaScript. It was like this before the drop and so is unlikely to be the cause. It's been Docusaurus for about 2 years now
Google claims they don’t use GA as a ranking signal. Though in your case it could be interpreted as mischief; maybe something is going on there. You might try recreating a GA4 instance and starting fresh. It might be the least painful road to redemption if that’s the issue.
Without knowing much about Docusaurus: if the upgrade touched the pages’ markup in some big way, something like losing h1 tags on the pages would hurt.
The most likely culprit is a ranking algorithm change. I recall there was one around October that wrecked more than a few sites.
The good news is you probably didn’t wreck your SEO. The bad news is you probably can’t do anything about it.
Yes, true. I've noticed, though, that more often than not HN's deletion of "How"/"Why" makes the title better when it comes to my submissions. I'd estimate 95% of the time I let the HN-auto-edited version stand.
Yup - I am not quite sure! But I reckon it's fairly likely that the post contains the "how" of how I ruined my SEO. The hope is that a knowledgeable someone can say what I did wrong.
I don't think sharing your Google analytics tag was the cause. If Google de-ranked unrelated sites using the same Google analytics id, we would see black-hat SEOs using that against their competitors. This ID isn't a secret.
Of the things on the list, the redirects jumped out to me as the most likely culprit. Second guess would be the algorithm change. You could look at individual pages before & after - mainly the top performing few, looking at what queries they ranked for before and now, and what their average position was. Hopefully the fix to the redirects will kick in soon!
I gave up on following the Google SEO bandwagon years ago. I write for people and most definitely NOT for Google. I get around 800 real users a month on my site across about 20 countries - the top 3 being the U.S., U.K., and for some reason China. I get maybe 20 or so emails asking questions or just saying thanks, and they have to track me down to send those emails; there's no "email me" link on the site. If you want to "monetize" your site then SEO away; otherwise set up a Google site owner/developer account for the "advice" (page not mobile friendly, for example) but ignore the Googley crap unless you like running on THEIR treadmill.
With your approach I'm sure you'll get back to normal traffic very soon.
I think your conclusion is right. It's not one thing that caused the drop but a combination of things.
Let's forget one sec about the traffic you had before and focus on the current situation.
- Page speed: It's one of the most important ranking factors. You don't have to get a 100 score, but passing the Core Web Vitals assessment and having a higher score on mobile is recommended.
- Add a robots.txt. You have plenty of pages on the site; it might be good to make sure only the things you want indexed are getting crawled.
https://i.imgur.com/ONSiQjQ.png
- Add bio (one liner) here: https://i.imgur.com/SXbWVwU.jpg
Great smile. The bio should show your readers your expertise in the topic. If you want to take it to the next level, Quora is a great inspiration: in every Quora answer you can write a different one-liner/byline.
- Internal linking: I love your blog archive. It's a great idea. Try to make more strategic articles stronger. More internal links to the article will signal search engines this page is indeed important.
- Add Privacy policy & Terms of use pages
I can help you with analyzing the traffic drop if I can understand which pages were your largest traffic generators.
Let's try to understand what happened to those pages, and how it's possible to make them stronger.
I worked with several large companies who had traffic drops after migration. Sometimes companies spend so much time and effort looking for the reason for the drop instead of just focusing on recovering.
I'm up for analyzing the drop, but if the reason is not obvious (a Google penalty, for example), I recommend focusing on the future rather than the past.
I would love to help you (As a case study) with analyzing what caused the issue, and to give tips on how to move forward and get more traffic.
> It's not one thing that cause the drop but a combination of things
Let me propose a bold generalization based on observations of sites of all sizes wrestling with Google over the last 20 years, and to which there are certainly exceptions (this case may be one):
It’s always one thing.
Of course, there are always a multitude of improvements that can be made to nibble around the edges and get incremental improvements in various metrics. And a gradual change in traffic may have several simultaneous sources.
But when the effect is an identifiable precipitous drop in search traffic, almost all the time, it turns out there was a single reason, whether a change Google made or a change the site made.
This may be a totally unhelpful observation, and feel free to ignore it. But in these situations I’ve come to find it’s more fruitful to look for A Cause than to approach it as “a little here, a little there.”
Honestly, the thing about other sites using your GA tag seems like the biggest red flag/potential for some Google algorithm to have actively started penalizing you. The timing doesn’t quite track, but maybe some interaction between that and the spam update?
Generally that's a good thing, but it's not good for title images that appear "above the fold". I wonder if there's something we could do to determine whether an image is "above the fold" and so not apply lazy. Something to ponder.
> Try to make more strategic articles stronger.
Not sure I follow your meaning here, do you literally mean applying bold to certain articles? Or something else?
It dropped to 165 keywords now. That's an insane drop, but I wonder how much traffic these thousands of keywords brought you, since they weren't ranked too high.
I would like to dive into the keywords that ranked 4-10 on Google (50 keywords), and the keywords that ranked 1-3 (15 keywords).
By understanding in GA which pages brought you the most traffic/ranked the highest, you can make them stronger by adding more internal links to these pages. Look at it as a kind of first aid ;)
EDIT:
A few tips:
- The 'Recent posts' widget is great. It gets your articles indexed faster. What about adding a widget below for 'Popular articles'? Place 5-10 articles there.
Tell me more about the timestamp. Where does that data get derived from? I thought that should be in great shape, since I set lastmod in my sitemap based on git commit date.
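(For anyone curious, the git-based lastmod derivation boils down to something like this per file -- a sketch only, and the path is just an example:)

```ts
// lastmod-from-git.ts -- derive a sitemap <lastmod> value from git history
import { execFileSync } from 'node:child_process';

/** ISO 8601 date of the last commit that touched the given file. */
function lastCommitDate(filePath: string): string | undefined {
  const out = execFileSync(
    'git',
    ['log', '-1', '--format=%cI', '--', filePath],
    { encoding: 'utf8' },
  ).trim();
  return out || undefined; // undefined for files not yet committed
}

// Example: feed this into whatever builds the sitemap's <lastmod> entries
console.log(lastCommitDate('blog/2022-10-10-some-post.md'));
```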
Yeah - primitive but probably reasonable. Would be happy to experiment with that. I've also been pondering things we could do around open graph images as well, would be nice to use image CDNs like Cloudinary for open graph images in the same way we can for blog images https://johnnyreilly.com/2022/12/26/docusaurus-image-cloudin...
Changing all images to be lazy loaded is not a good idea for SEO reasons (and perhaps not a good idea for regular usage of the site either). If it were, browsers would just do it automatically. Using hints on your images is only useful if you are doing it strategically, which means only lazy loading images that are offscreen on the initial page load. Otherwise you are not actually giving useful hints to the browser on how to load your page in the correct manner, and thus you are just better off letting the browser use whatever internal logic it has to decide how and when to load the assets.
Basically you can think about the optimal, minimal set of resources to render your page being:
- HTML
- Required CSS
- Above the fold images
Then at that point, download all of the other things to make the page work in the way you want (javascript, etc). Anything else is delaying the initial page-load. Because your images are lazy loaded, your page load looks like the following:
- HTML
- Required CSS
- Some above the fold images (profile.jpg, etc)
- Massive, >300kb blobs of javascript (runtime-main.js, main.js)
- Above the fold images in the post
This is not good. It doesn't make sense for your page to download 300kb of javascript before downloading the 18kb image that is above the fold. Now you can partially solve this problem by making the javascript asynchronous, but that still is just another band-aid on the problem, as then the javascript and above the fold images would download concurrently, which is still not optimal.
What you want to do is have above the fold post images be loaded eagerly (the default), and then lazy load ones that are lower on the page. If you aren't going to do that, you probably are better off just not having the images being lazy loaded at all, especially if your page includes 300kb of javascript which is likely going to be much larger than the combined size of all the images on the page.
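To make that split concrete, here's roughly what "eager above the fold, lazy below" could look like as a small TSX component. The component and the aboveTheFold prop are made-up names for illustration, not a Docusaurus API:

```tsx
// PostImage.tsx -- sketch: the author of a post (or the layout) decides which
// images are above the fold, since the browser can't know that at build time.
import React from 'react';

type PostImageProps = {
  src: string;
  alt: string;
  aboveTheFold?: boolean;
};

export default function PostImage({ src, alt, aboveTheFold = false }: PostImageProps): JSX.Element {
  return (
    <img
      src={src}
      alt={alt}
      // 'eager' is the browser default; being explicit documents the intent
      loading={aboveTheFold ? 'eager' : 'lazy'}
    />
  );
}

// Usage:
//   <PostImage src="/img/hero.webp" alt="Hero" aboveTheFold />
//   <PostImage src="/img/diagram-further-down.png" alt="Diagram" />
```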
> You don't have to get a 100 score, but passing the Core Web Vitals assessment and having a higher score on mobile is recommended
Note that they don't have a CWV score yet due to low traffic. But a 39 performance score from the simulated Lighthouse is often more than enough for a passing grade. That is: if a Moto G4 can do OK, your normal users will likely do great.
For instance, a site I made[0] has a 22 from Lighthouse, but a passing CWV grade, so further improvement to the LCP, FID, and CLS would confer no direct Google SEO benefit.[1] (But it may help things like bounce rate, which may confer second-order benefits)
[1] "For example, a page with an LCP of 1750 ms (better than the “good” LCP guidance) and another one with 2500 ms (at the “good” guidance) would not be distinguished on the basis of the LCP signal" – https://support.google.com/webmasters/thread/104436075/core-...
- The drop might be related to a Google algorithm update. This link can help with understanding whether an update is in correlation with your traffic drop:
Of the possibilities you listed in the article, my money is on the failure to do the redirects. Luckily, it may not be too late to fix that, if you haven't already.
If you're interested, email me at tobes@longtaildragon.com, and I'll pull a report of every URL that needs a redirect.
Seems like a lot of majoring in the minor here. After such an event, someone once told me to go fishing for a week and come back to focus on what really matters. It was helpful advice.
Scott’s Cheap Flights just rebranded to Going. Would there be a good way for him to forecast the SEO impact of that change (which seems on the surface to be probably large)?
The other issue you may need to pay attention to is your backlink profile. I ran a quick audit; 19% of your ahrefs links are bad. You probably need to disavow them.
If you do not do this correctly, a traffic pattern like yours must be expected.
If you do this now, you might be able to recoup about 50% of your old traffic. Might.
Additionally, you have massive self-made internal duplicate content (tag pages show full pages) and soft 404s (non-existing pages, e.g. https://johnnyreilly.com/page/fake-url-for-soft-404-error-ch... trigger HTTP 302 redirects and then even more unexpected behaviour).
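If you want to spot-check the soft-404/302 behaviour yourself, a small script is enough. A sketch assuming Node 18+ (global fetch); the URL is just an example nonsense path -- a healthy setup should answer it with a plain 404:

```ts
// check-soft-404.ts -- print the status (and any redirect target) for a URL
// that should not exist; a 200 or a 30x chain here is a soft-404 smell.
async function checkStatus(url: string): Promise<void> {
  // redirect: 'manual' so we see the 30x response instead of following it
  const res = await fetch(url, { redirect: 'manual' });
  const location = res.headers.get('location') ?? '';
  console.log(`${res.status} ${url}${location ? ` -> ${location}` : ''}`);
}

await checkStatus('https://example.com/page/this-path-should-not-exist');
```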
I agree with the sentiment, but if a personal blog tracks users and sends data on them to someone else, as with GA, then it absolutely needs a privacy policy.
I had a blog and regularly commented on dozens of blogs from 2005-2015. Just about all of them used Google Analytics and in the early days many, including mine, were hosted on Google's Blogspot.
There wasn't any legalese on any of the smaller/medium sized ones and I really haven't once regretted that there wasn't.
Sure, and law varies greatly by region. China, North Korea, and to a degree, Russia have been extremely restrictive for a long time, western Europe has recently become more so and many other places are still relatively free.
My "why" question was intended from more of an ethical first principles standpoint than a legalistic one.
I know that some jurisdictions bury pretty much any online activity in mountains of regulatory requirements. That's not novel or interesting. As I said in the ancestor comment, it's "a sad state of affairs when even personal blogs are expected to be laden with legal documents".
It's not that hard to write "we keep none of your personal data" as your privacy policy. If you want to track me on a personal blog, you'd better get your legal story straight...
It's only an ads thing. Facebook, Google, etc., all require those pages to be on a website to run ads. They say nothing about personal websites, but the EU probably does once you start using GA.
My Facebook ad campaigns were repeatedly rejected until I made that one change. And they never said why it was being rejected. It was only that change that fixed it.
Unlikely to be a source but a good benchmark is to have this stuff in place because
a) It takes almost no effort,
b) You don't know if it makes a difference or not, and
c) There is pretty much zero chance it's going to have a negative impact.
I had a new website that had minimal text content. At first the most popular Google keywords were from privacy and terms! Had to exclude those pages from indexing (noindex).
Also intrigued as to whether this could be a cause. I'm certainly not opposed to adding one, but I'd be surprised if it was necessary. But I'm here to learn!