How I ruined my SEO (johnnyreilly.com)
125 points by johnny_reilly on Jan 15, 2023 | 92 comments



https://developers.google.com/search/updates/ranking

Your rankings tanking on/around the 10/19 update looks like a strong correlation. Your GSC data, with congruent drops in impressions and clicks, doesn't immediately look like keywords you were top 3 for suddenly dropping further down the page -- else impressions would be similar but average position/CTR would be tanking.

If you're using GA and your traffic doesn't have tons of seasonality, look at the landing page report month over month and see which pages have had the largest drop (can also do this in GSC). Then check in GSC and see what those pages' APs have done over the same period.

What types of links is GSC showing? Google says they don't count bad links, but if they make up a large proportion of your site's backlinks, SpamBrain (their spam-detection AI, which they've been announcing publicly more often) may bucket your site that way.

SEMrush, ahrefs, Majestic, Moz OSE are all decent for checking who is linking (and maybe disavowing), but LinkResearchTools is probably going to focus more on identifying suspicious backlinks, if that is in fact what is impacting your site.


looking for correlation with any google change should always be the last thing to check. if you look for correlation, you will always find some. there was a domain change and a URL change with a suboptimal old-to-new migration.


This would probably be captured when looking at GA in the landing page report with MoM traffic going to 0 for once high value pages. If that is the case, a 301 from old path(s) to new is the simplest solution to possibly restore most of the traffic? From what I've seen, lots of the SO scraper sites got crushed late last year, so if the top traffic driving pages historically have not had URLs changed, then maybe investigate that further. If they had their URLs changed, just redirect the old URLs and hope for the best.

@OP, if there were in fact high-traffic pages that got renamed, do a CSV/Excel export in GA/GSC/sitemap/etc., map the old URLs to their new ones, implement 301s, and wait and see if traffic starts to return.
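
For example, a rough sketch of the mapping step (the CSV name, column order, and "_redirects"-style output here are assumptions -- adapt it to however your host expresses 301s):

  // map-redirects.ts - sketch: turn an exported "old URL,new URL" CSV into 301 rules
  // assumes full URLs (or paths) in the CSV and a Netlify-style "_redirects" output
  import { readFileSync, writeFileSync } from "node:fs";

  const toPath = (u: string) => (u.startsWith("http") ? new URL(u).pathname : u);

  const rules = readFileSync("url-mapping.csv", "utf8")
    .split("\n")
    .map((line) => line.trim())
    .filter(Boolean)
    .map((line) => line.split(",").map((cell) => cell.trim()))
    .filter(([oldUrl, newUrl]) => oldUrl && newUrl && toPath(oldUrl) !== toPath(newUrl))
    .map(([oldUrl, newUrl]) => `${toPath(oldUrl)} ${toPath(newUrl)} 301`);

  writeFileSync("_redirects", rules.join("\n") + "\n");
  console.log(`wrote ${rules.length} redirect rules`);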


Generally, if you make good changes, you'll see a short-term dip before a long-term increase. The challenge is differentiating between bad changes that have caused long-term harm, vs. good changes that are experiencing the natural short-term dip.

A website may build a good reputation with a search engine, which has ranking benefits, but at the core of search engine ranking are individual pages: you must protect your well-ranking pages at all costs. That means never break any links. If you break links, your pages will drop out of search engines, and it can take years to rebuild the lost reputation for those pages. The model for thinking about SEO should be page-based, not website-based.

The good news is that if your website was able to rank well once, it'll rank well again, because as much as we might tinker with the structure of a website, what matters most is the content, and you're still distributing that same content... you just might have set yourself back a year by breaking a bunch of links. Easy mistake, c'est la vie.

The ideal strategy for experimenting with SEO is to do it page by page, experiment with different changes on different pages and measure the impact. Don't make wholesale changes to the structure of every page until you're confident that you're doing something that works.


As others have said, this strikes me as a multi-faceted problem.

It's possible Google devalues sites sharing (even inadvertently!) GA tags in the rankings, although I don't think there have been any public proclamations from Google on that. But if that were the sole culprit, only your GA instance would reflect it. The fact that you're losing real traffic (as reflected in what the third-party tools are telling you) makes me think that's probably not the case, or at least not the only thing that's happening.

Not implementing redirects would also definitely be a culprit. But if it's just images you failed to redirect, that's likely not the main thing either, unless you were getting a majority of your traffic from Google Images.

Since this happened post-Core Update, I would want to know two things:

1. What keywords dropped and what replaced you.

2. Whether the drop was site-wide or isolated to individual categories or groups of pages.

Regarding #1: Was what replaced you a big, high authority publisher? Or was the content simply more comprehensive or otherwise a better match for the user's intent? Very likely could be E-E-A-T-related, in terms of the algos determining that you don't have the authority/expertise to rank for what you were ranking for previously.

Investigate those possibilities first and you should be able to better map out a plan for re-gaining that traffic.


Good comment! I’d add in one more question so we can put the GA tag issue to rest.

Does the writer have access to Google Search Central? They rename this product every six minutes. It used to be Google Webmaster Tools and it’s likely called something new now. Are there any manual actions or security alerts in there??

If Google performs a manual action or picks up certain types of security problems, they will tank traffic. But they usually report it, to give webmasters the chance to fix it.

I’d avoid ahrefs and the like and start right with google tools.


No manual actions / security alerts, I'm happy to say. Good callout.


What about duplicate content? Have you tried searching direct quotes from your writing?


That’s good. Any noteworthy increase in error codes?? Perhaps a mass of 404s??

I’m specifically wondering about how you deal with redirects and what you tell the robots.


My site is definitely page based rather than image based.

I'll confess to complete ignorance when it comes to keywords. I'm a total novice in this area.

I still get traffic, but for things like a webpack define plugin post I wrote in 2017 and for 2 CSharp posts. "eslint for c#" and "build props"

All my Azure / TypeScript / React / Bicep traffic dried up. No idea what replaced me.

For what it's worth, with no false modesty, my content is pretty good. It's the sort of thing I'd hope to find when I Google.


> As I've mentioned, I broke links by not implementing redirects.

Didn't implement redirects for what? The prior part of the article doesn't even mention that you changed any page URLs.

But if you did, that is most definitely going to have caused problems.

Think about it - the pages are no longer there, so they will get dropped from the search index, so they won't appear in search results, resulting in fewer impressions/opportunities for click-through, and you will receive fewer visitors as a consequence.

You effectively delisted your site's content. The existing inbound links will now point to non-existent pages, so they carry no "authority" to your site. The content that is on the new URLs has no inherent "authority", so it's not being ranked as well as it was on the old URLs. The search engines will rank it lower (e.g. page 10 instead of page 2), so fewer impressions and therefore fewer visitors.

You will likely be able to (partially) fix this with redirects - sites that still link to you will still bring authority. But you will possibly have lost any history-based authority (that which comes from being a reliable and consistent URL), which may not be recoverable.

I would recommend going through old access logs to find what your old URLs were, and redirect them to the new URLs for that same content (not a wildcard redirect to homepage!)
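
If it helps, a rough sketch of that extraction (assumes combined-format access logs and Node; the file name and regex are illustrative):

  // old-urls-from-logs.ts - sketch: list distinct paths that used to return 200
  // assumes combined/common log format lines like: ... "GET /some/path HTTP/1.1" 200 ...
  import { createReadStream } from "node:fs";
  import { createInterface } from "node:readline";

  const paths = new Set<string>();
  const rl = createInterface({ input: createReadStream("access.log") });

  rl.on("line", (line) => {
    const match = line.match(/"(?:GET|HEAD) ([^ ?"]+)[^"]*" (\d{3})/);
    if (match && match[2] === "200") paths.add(match[1]); // only paths that actually resolved
  });

  rl.on("close", () => {
    for (const p of [...paths].sort()) console.log(p); // feed these into your redirect map
  });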


> I would recommend going through old access logs to find what your old URLs were, and redirect them to the new URLs for that same content (not a wildcard redirect to homepage!)

this is great advice and I've done that as much as possible. Herewith a giant file of redirects! https://github.com/johnnyreilly/blog.johnnyreilly.com/blob/m...


Which pages or queries in GSC specifically lost traffic? Was there a handful of keywords that drove most of your traffic that you’ve lost? Or is it down across the board for all articles?


I still get traffic, but for things like a webpack define plugin post I wrote in 2017 and for 2 CSharp posts. "eslint for c#" and "build props"

All my Azure / TypeScript / React / Bicep traffic dried up. No idea what replaced me.


The people that mirror your content would have replaced you. Try searching for a few page titles in quotes; you will easily find the culprits that way. The problem is that they will now have "older" content than you, so search engines may think you are copying content and penalise you accordingly.

EDIT:

Here's a random example from a title taken from your blog archive: https://laptrinhx.com/using-bootstrap-tooltips-to-display-jq...

Here's another from some recent content: https://blog.logrocket.com/docusaurus-using-fontaine-reduce-...

Both mark themselves as canonical (the original source of the content) in the head.


Fascinating - in the case of LogRocket, they are the canonical and my blog reflects that too. The other is clearly not...


It’s a bit annoying but you should be able to see in GSC what the ranking change was for particular URLs and queries by comparing before October and after October. You should also be able to see if the issue is whether the pages dropped in ranking or if some articles became de-indexed entirely. Take one of the page URLs that has dropped in traffic and make sure it’s still being indexed.


At the point of the observed drop I hadn't changed any page URLs, no. Apologies for the lack of clarity.

I had changed image URLs when they migrated from being PNGs to WebPs. Following that there was the SEO drop off (within 6 weeks or so)


If you still need help and are really stuck, ping me and I will try to help you sort this out quickly. Reply to this comment with an easy way to reach you via email if you still need the help. (Not asking for anything in return.)


Johnny is correct and this most likely has to do with the 404 backlinks and accidental misuse of Google Analytics.

I recently did a comprehensive SEO audit for a web3 brand that saw a similar drastic drop in traffic. Here are some other things to keep in mind to rank better on Google.

  1. One H1 per page
SEO experts agree the best practice is to use one H1 per page. Headers help keep content structured for both Google and readers. If there are multiple H1s, convert them into proper H2-H4s to improve the content's hierarchy.

  2. High-quality content >1000 words
According to Yoast, the chances of ranking in Google are higher if you write high-quality posts of 1000 words or more. If you have some sparse content that falls short of this, add a few more detailed sections.

  3. Google PageSpeed score >90
Google PageSpeed says that improving performance increases rankings and user experience. The ideal Google PageSpeed score is >90. Make sure your pages take less than 2s to load, and ideally less than 500ms. Reduce JS and additional requests on important pages.

  4. Add title tags and meta descriptions with primary keyword
Google recommends matching H1 tags to title tags to prevent inaccurate article titles from showing up in search results. It's also best to include the primary keyword in the meta description.

  5. Improve primary blog page with multiple sections
Your blog's primary page is your chance to showcase your best content, not just an archive of your latest posts. Separate your posts into sections like best, classics, and latest. Add an H1 to make it clear to the audience what the blog is about, and an H2 subheadline to clarify further.

By the way Johnny, you can see the full SEO audit on my Twitter[0].

If you're open to it, I'd love to do a full SEO audit for your blog. Let's get those numbers back to their original state. Please DM me on Twitter.

P.S. - I worked with a software company[1] to build out their docs using Docusaurus so I'm familiar with how it works.

[0]: https://twitter.com/dericksozo/status/1613171898430488578

[1]: https://medium.com/solidstateso/shortcat-documentation-case-...


One H1 per page is a myth dispelled directly by John Mueller at Google[0].

[0] https://www.searchenginejournal.com/h1-headings-for-google/4...


Thoughts on AI driven content and SEO?


Don't do it. Easily detected as plagiarism.


Screw Google. Optimize your website for real humans. Then advertise it outside of Google, for example: have a link to your website in your profile.


Ironically, this is exactly what Google recommends you do in most cases. This isn’t empty, feel-good marketing either. It’s in their best interest for you to do so.


Which profile? I link to it in my GitHub / Mastodon / Twitter profiles for instance


I think they mean on HN.


Your site's performance, as reflected in its PageSpeed Insights score, is showing 45. This is unacceptable for a blog; it should be close to 100 for text-based pages like yours. Does the site contain scripts, deep dependency trees, or analytics?

Could this have changed during the performance drop?


No - it's because it's a Docusaurus site and so runs React / JavaScript. It was like this before the drop and so is unlikely to be the cause. It's been Docusaurus for about 2 years now


Whilst it still should be fixed, Google isn't tanking a site for its PageSpeed score.


Depends on how much better the speed of competitors is for a given query, and thus the relative severity.


Google claims they don't use GA as a ranking signal. Though in your case it could be interpreted as mischief, so maybe something is going on there. You might try recreating a GA4 instance and starting fresh. It might be the least painful road to redemption if that's the issue.

Without knowing much about Docusaurus: if the upgrade touched the pages' markup in some big way, something like not having h1 tags on the pages would hurt.

The most likely culprit is a ranking algorithm change. I recall there was one around October that wrecked more than a few sites.

The good news is you probably didn’t wreck your SEO. The bad news is you probably can’t do anything about it.


Confusing post for me.

Ad redirects? are we talking page URL redirects or just image URL redirects?

the GSC pattern looks a lot like page URLs changed. did they?

I would recommend this kind of analysis: https://www.fullstackoptimization.com/a/google-update

so basically

- did the page URLs change?

- 4 SEO tests

- and then winner/loser pages analysis.

Update: 4 SEO tests are better outlined here https://www.fullstackoptimization.com/a/seo-basics


There's no ads.

Over time I've changed page URLs, partly as a migration from Blogger to Docusaurus, partly due to just changing page URLs without thinking about it.

I've changed image URLs by switching from PNG to WebP and back again.

Thanks for the link will check it out.


that was the latin "ad" aka "about"

the mobile friendly test fails, see https://search.google.com/test/mobile-friendly/result?id=SAu...

and google can not render the page correctly

also test with GSC page inspect -> render page -> see screenshot

if this correlates with the software update (and the page URLs did not change), then this is the strongest contender.

check the crawling metrics in GSC (google search console)

js, css average load time should be max 200ms, or it is some other asset delivery issue

html not more than 400ms average response time

note: i am the author of "understanding seo" https://gumroad.com/l/understanding-seo/hacker-news (think seo for hackers) feel free to mail me, contact in profile

everything points either to a page URL change or a 4 SEO tests issue - from my point of view

written on a mobile in the metro, so can not check in detail right now


> that was the latin "ad" aka "about"

Latin "ad" is more like "to" or "toward", but I've never seen it used to mean "about". FWIW.

You might be aiming for "ab", with which autocorrect will likely refuse to cooperate.


ok, new learnings, "ad" in the meaning of "about" is not used like this in english.

in german the latin "ad" is used as "about that point" and is even part of the Duden (basically the official guide to the german language)

https://www.duden.de/rechtschreibung/ad

did not know that it's more of a german thing.


additionally misconfigured /tags/

i.e.: https://johnnyreilly.com/tags/authorisation shows the full content of the first post; these are also the pages which get indexed, see https://www.google.com/search?q=site%3Ahttps%3A%2F%2Fjohnnyr...

a.k.a. massive self made internal duplicate content


Title:

> How I ruined my SEO

Conclusion:

> I'm not quite sure


Usually Hacker News strips out "How" (and "Why") at the start of article titles to prevent this. I'm not sure why there are exceptions sometimes.


Oh I didn't realise it did that! I assumed I'd copied and pasted incorrectly when I submitted it and added the "How" back after the fact.


You can edit it back in after submission.


Yes, true. I've noticed, though, that more often than not HN's deletion of "How"/"Why" makes the title better when it comes to my submissions. I'd estimate 95% of the time I let the HN-auto-edited version stand.


Yea he should of phrased the title as a question. As his last paragraph goes:

> I'm hoping someone will read this and tell me what I did wrong.


Spelling nazi hat on

> Yea he should of phrased the title as a question.

... he should have phrased ...

Spelling nazi hat off


That's the grammar Nazi hat.

Pedant hat off.



Grammar looks fine to me. "Have" is just misspelled :)

And no, Godwin's is about comparing something in the topic to Nazis, not self identifying as a "XXX Nazi".


Yup - I am not quite sure! But I reckon it's fairly likely that the post contains the "how" of how I ruined my SEO. The hope is that a knowledgeable someone can say what I did wrong.


Title suggestion: "How did I ruin my SEO?"


It's your redirects causing duplicate content and/or indexing issues.

Google's index of your old site (excluding tags): https://www.google.com/search?q=site%3Ablog.johnnyreilly.com...

Shows your blog posts were indexed with no trailing slash: https://blog.johnnyreilly.com/page/155

Your current 301 sends this to: https://johnnyreilly.com/page/155

Which doesn't have a canonical tag, or 301 redirect, to the correct blog post: https://johnnyreilly.com/2017/05/20/typescript-spare-rod-spo...
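
If redirecting those paginated URLs straight to the right posts is awkward, even a canonical tag would help. A minimal sketch, assuming a Docusaurus/React page component and @docusaurus/Head (the target URL below is just a placeholder for the real post):

  // sketch: point a legacy /page/N route at the real post with rel=canonical
  // assumes a Docusaurus/React page and @docusaurus/Head; the href is a placeholder
  import React from "react";
  import Head from "@docusaurus/Head";

  export default function LegacyPage(): JSX.Element {
    return (
      <>
        <Head>
          <link rel="canonical" href="https://johnnyreilly.com/path-to-the-real-post" />
        </Head>
        {/* ...existing page content... */}
      </>
    );
  }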

Once that's fixed, I would also jump into GSC and run their Change of Address tool (https://support.google.com/webmasters/answer/9370220?hl=en).

LMK if you need a hand with this.


I don't think sharing your Google analytics tag was the cause. If Google de-ranked unrelated sites using the same Google analytics id, we would see black-hat SEOs using that against their competitors. This ID isn't a secret.


Of the things on the list, the redirects jumped out to me as the most likely culprit. Second guess would be the algorithm change. You could look at individual pages before & after - mainly the top performing few, looking at what queries they ranked for before and now, and what their average position was. Hopefully the fix to the redirects will kick in soon!


I gave up on following the Google SEO bandwagon years ago. I write for people and most definitely NOT for Google. I get around 800 real users a month on my site across about 20 countries - the top 3 being the U.S., U.K. and, for some reason, China. I get maybe 20 or so emails asking questions or just saying thanks, and they have to track me down to send them: there's no "email me" link on the site. If you want to "monetize" your site then SEO away; otherwise set up a Google site owner/developer account for the "advice" ("page not mobile friendly", for example) but ignore the Googley crap unless you like running on THEIR treadmill.


With your approach I'm sure you'll get back to normal traffic very soon.

I think your conclusion is right. It's not one thing that caused the drop but a combination of things.

Let's forget one sec about the traffic you had before and focus on the current situation.

- Page speed: It's one of the most important ranking factors. You don't have to get a 100 score, but passing the Core Web Vitals assessment and having a higher score on mobile is recommended.

https://pagespeed.web.dev/report?url=https%3A%2F%2Fjohnnyrei...

A cool trick to improve the result fast is by removing the lazy load effect from the LCP: https://i.imgur.com/rOOWm91.png

- Add a robots.txt: You have plenty of pages on the site, so it might be good to make sure only the things you want indexed are getting crawled. https://i.imgur.com/ONSiQjQ.png

- Add a bio (one-liner) here: https://i.imgur.com/SXbWVwU.jpg Great smile. The bio should show your readers your expertise in the topic. If you want to take it to the next level, Quora is a great inspiration: in every Quora answer you can write a different one-liner/byline.

- Internal linking: I love your blog archive. It's a great idea. Try to make your more strategic articles stronger. More internal links to an article will signal to search engines that the page is indeed important.

- Add Privacy policy & Terms of use pages

I can help you with analyzing the traffic drop if I can understand which pages were your largest traffic generators. Let's try to understand what happened to those pages, and how it's possible to make them stronger.

I have worked with several large companies who had traffic drops after a migration. Sometimes companies spend so much time and effort looking for the reason for the drop instead of just focusing on recovering.

I'm up for analyzing the drop, but if the reason is not obvious (a Google penalty, for example), I recommend focusing on the future rather than the past.

I would love to help you (As a case study) with analyzing what caused the issue, and to give tips on how to move forward and get more traffic.

Edit: We wrote a SaaS SEO guide that covers several related topics: https://growtika.com/saas-seo/


> It's not one thing that caused the drop but a combination of things

Let me propose a bold generalization based on observations of sites of all sizes wrestling with Google over the last 20 years, and to which there are certainly exceptions (this case may be one):

It’s always one thing.

Of course, there are always a multitude of improvements that can be made to nibble around the edges and get incremental improvements in various metrics. And a gradual change in traffic may have several simultaneous sources.

But when the effect is an identifiable precipitous drop in search traffic, almost all the time, it turns out there was a single reason, whether a change Google made or a change the site made.

This may be a totally unhelpful observation, and feel free to ignore it. But in these situations I’ve come to find it’s more fruitful to look for A Cause than to approach it as “a little here, a little there.”


I've listed all the causes I can think of. Does any strike you as "the one"?


Honestly, the thing about other sites using your GA tag seems like the biggest red flag/potential for some Google algorithm to have actively started penalizing you. The timing doesn’t quite track, but maybe some interaction between that and the spam update?


Thanks for the tips!

> A cool trick to improve the result fast is by removing the lazy load effect from the LCP: https://i.imgur.com/rOOWm91.png

Haha, ironically I'm responsible for wholesale lazy loading of images in Docusaurus:

https://github.com/facebook/docusaurus/pull/6598

Generally that's a good thing, but it's not good for title images that appear "above the fold". I wonder if there's something we could do to determine whether an image is "above the fold" and so not apply lazy loading to it. Something to ponder.

> Try to make more strategic articles stronger.

Not sure I follow your meaning here, do you literally mean applying bold to certain articles? Or something else?


That's cool. Just for fun, try removing the lazy load effect from the LCP; let's see how it affects the score.

As for the strategic articles, I mean that you should increase the internal links to certain articles you think can rank higher.

In October you had 3300+ keywords that ranked 11-100 on Google:

https://i.imgur.com/ZGU0Ezk.png

It has dropped to 165 keywords now. That's an insane drop, but I wonder how much traffic those thousands of keywords brought you, since they weren't ranked very high.

I would like to dive into the keywords that ranked 4-10 on Google (50 keywords), and the keywords that ranked 1-3 (15 keywords).

By understanding in GA which pages brought you the most traffic/ranked the highest, you can make them stronger by adding more internal links. Look at it as a kind of first aid ;)

EDIT:

A few tips:

- The 'Recent posts' widget is great. It gets your articles indexed faster. What about adding a widget below it for 'Popular articles'? Place 5-10 articles there.

https://i.imgur.com/MBquX6M.jpg

- Make sure you implement timestamps on your articles the right way:

https://i.imgur.com/b5usnJJ.png

https://i.imgur.com/0vpXdyE.png


Tell me more about the timestamp. Where does that data get derived from? I thought that should be in great shape, since I set lastmod in my sitemap based on git commit date:

https://johnnyreilly.com/2022/11/25/adding-lastmod-to-sitema...
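
In case it's useful context for others, the git side of that is roughly this (a sketch only -- the file path is made up, and note that a shallow CI checkout will report the clone date rather than the real commit date):

  // sketch: derive <lastmod> for a page from its last git commit date
  // illustrative only - the file path is made up; not necessarily what the linked post does
  import { execSync } from "node:child_process";

  function lastModifiedFromGit(filePath: string): string | undefined {
    // %cI = committer date in strict ISO 8601, which is suitable for <lastmod>
    const out = execSync(`git log -1 --format=%cI -- "${filePath}"`, { encoding: "utf8" }).trim();
    return out || undefined; // empty string when the file has no history
  }

  console.log(lastModifiedFromGit("blog/some-post/index.md"));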


We could apply lazy loading only after the 2nd image maybe
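
Roughly this sort of thing, as a sketch -- a hypothetical rehype pass, not how Docusaurus actually wires it up today:

  // sketch: mark only images after the first couple as lazy (hypothetical rehype pass)
  import { visit } from "unist-util-visit";
  import type { Root, Element } from "hast";

  const EAGER_COUNT = 2; // assume the first couple of images are "above the fold"

  export default function rehypeLazyAfterFirstImages() {
    return (tree: Root) => {
      let seen = 0;
      visit(tree, "element", (node: Element) => {
        if (node.tagName !== "img") return;
        seen += 1;
        node.properties = {
          ...node.properties,
          loading: seen <= EAGER_COUNT ? "eager" : "lazy",
        };
      });
    };
  }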


Yeah - primitive but probably reasonable. Would be happy to experiment with that. I've also been pondering things we could do around Open Graph images; it would be nice to use image CDNs like Cloudinary for Open Graph images in the same way we can for blog images: https://johnnyreilly.com/2022/12/26/docusaurus-image-cloudin...


Changing all images to be lazy loaded is not a good idea for SEO reasons (and perhaps not a good idea for regular usage of the site either). If it were, browsers would just do it automatically. Using hints on your images is only useful if you are doing it strategically, which means only lazy loading images that are offscreen on the initial page load. Otherwise you are not actually giving useful hints to the browser on how to load your page in the correct manner, and thus you are just better off letting the browser use whatever internal logic it has to decide how and when to load the assets.

Basically you can think about the optimal, minimal set of resources to render your page being:

- HTML

- Required CSS

- Above the fold images

Then at that point, download all of the other things to make the page work in the way you want (javascript, etc). Anything else is delaying the initial page-load. Because your images are lazy loaded, your page load looks like the following:

- HTML

- Required CSS

- Some above the fold images (profile.jpg, etc)

- Massive, >300kb blobs of javascript (runtime-main.js, main.js)

- Above the fold images in the post

This is not good. It doesn't make sense for your page to download 300kb of javascript before downloading the 18kb image that is above the fold. Now you can partially solve this problem by making the javascript asynchronous, but that still is just another band-aid on the problem, as then the javascript and above the fold images would download concurrently, which is still not optimal.

What you want to do is have above the fold post images be loaded eagerly (the default), and then lazy load ones that are lower on the page. If you aren't going to do that, you probably are better off just not having the images being lazy loaded at all, especially if your page includes 300kb of javascript which is likely going to be much larger than the combined size of all the images on the page.


> You don't have to get 100 score, but passing the core web vitals score and having higher score on mobile is recommended

Note that they don't have a CWV score yet due to low traffic. But a 39 performance score from the simulated Lighthouse is often more than enough for a passing grade. That is: if a Moto G4 can do OK, your normal users will likely do great.

For instance, a site I made[0] has a 22 from Lighthouse, but a passing CWV grade, so further improvement to the LCP, FID, and CLS would confer no direct Google SEO benefit.[1] (But it may help things like bounce rate, which may confer second-order benefits)

> by removing the lazy load effect from the LCP

Indeed. Even better, making it high priority instead of normal: https://addyosmani.com/blog/fetch-priority/

[0] https://i.imgur.com/TGD1sj2.png

[1] "For example, a page with an LCP of 1750 ms (better than the “good” LCP guidance) and another one with 2500 ms (at the “good” guidance) would not be distinguished on the basis of the LCP signal" – https://support.google.com/webmasters/thread/104436075/core-...


This is very helpful, thank you. Raised a ticket against Docusaurus to track this: https://github.com/facebook/docusaurus/issues/8552 will experiment


Edit II:

Few more tips:

- The drop might be related to a Google algorithm update. This link can help with understanding whether the traffic drop correlates with an update:

https://ahrefs.com/google-algorithm-updates#october-2022-spa...

- Google Search Console: Check if there was any penalty. More info here:

https://searchengineland.com/google-penalties-manual-actions...

EDIT III: This thread might be useful

You can find more related threads from people who suffered a traffic drop after the Google algorithm update (October 2022):

https://www.google.com/search?q=site%3Ahttps%3A%2F%2Fsupport...


Yeah I'd love to get your help! My email address is johnny_reilly at hotmail dot com or you can DM me here: https://twitter.com/johnny_reilly or here https://fosstodon.org/@johnny_reilly


> Add a robots.txt

Likely not the cause but not having a robots.txt can cause weird crawling problems.

Since it takes two minutes to set up, why not. And here's a tip: don't try to be clever, keep it simple:

  User-agent: *
  Disallow:


Is there really no way to rotate your GA tag without creating a new property in Google Analytics? That seems like a glaring omission.


Of the possibilities you listed in the article, my money is on the failure to do the redirects. Luckily, it may not be too late to fix that, if you haven't already.

If you're interested, email me at tobes@longtaildragon.com, and I'll pull a report of every URL that needs a redirect.


Seems like a lot of majoring in the minor here. After such an event, someone once told me to go fishing for a week and come back to focus on what really matters. It was helpful advice.


Scott’s Cheap Flights just rebranded to Going. Would there be a good way for him to forecast the SEO impact of that change (which seems on the surface to be probably large)?


The other issue you may need to pay attention to is your backlink profile. I ran a quick audit: 19% of your ahrefs links are bad. You probably need to disavow them.


Could Google Search Console site verification via Google Analytics have played a role?

So all the new pages that use your tag get linked to your Google Search Console property.


ok, as I was curious

this was your old site blog.johnnyreilly.com (note: hosted on the blog subdomain) https://www.google.com/search?q=site%3Ablog.johnnyreilly.com... https://web.archive.org/web/20230000000000*/https://blog.joh... archive.org picks it up until december 2022

this is your current site https://johnnyreilly.com/ (note: no subdomain) https://www.google.com/search?q=site%3Ajohnnyreilly.com&pws=... archive.org seems to pick it up late december 2022, start of 2023 https://web.archive.org/web/20230000000000*/https://johnnyre...

between old pages and new pages you have a redirect chain https://blog.johnnyreilly.com/page/201/ -> 301 -> https://johnnyreilly.com/page/201/ -> https://johnnyreilly.com/page/201 final destination (which does not seem to be the correct new URL)

you had a domain move, a URL change and a suboptimal old-to-new migration. if you do not want to lose traffic with URL changes, follow this spec to the letter: https://developers.google.com/search/docs/crawling-indexing/...

if you do not do this correctly a traffic pattern like yours must be expected.

if you do this now you might be able to recoup about 50% of your old traffic, might

additionally you have massive self-made internal duplicate content (tag pages show full posts), and soft 404s (non-existing pages, i.e. https://johnnyreilly.com/page/fake-url-for-soft-404-error-ch... trigger HTTP 302 redirects and then even more behaviour)
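
a quick way to see the chains (sketch; assumes node 18+ for built-in fetch):

  // sketch: follow redirects hop by hop and flag chains (assumes Node 18+ built-in fetch)
  async function traceRedirects(url: string): Promise<string[]> {
    const hops = [url];
    let current = url;
    for (let i = 0; i < 10; i++) {
      const res = await fetch(current, { method: "HEAD", redirect: "manual" });
      const location = res.headers.get("location");
      if (res.status < 300 || res.status >= 400 || !location) break;
      current = new URL(location, current).toString(); // resolve relative Location headers
      hops.push(current);
    }
    return hops;
  }

  traceRedirects("https://blog.johnnyreilly.com/page/201/").then((hops) => {
    console.log(hops.join(" -> "));
    if (hops.length > 2) console.log("chain: point the first URL straight at the final one");
  });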


yeah I'm hoping that my redirect story is now quite good - see dynamic redirect code here:

https://github.com/johnnyreilly/blog.johnnyreilly.com/blob/m...

as to the duplicate content, Docusaurus generates /tags/ and /pages/ content by default, which I strip from my sitemap manually.

Thanks for the links!


Does anyone have a guideline for seo? Is doing the basics enough or is there something that creates magic?


No privacy policy and no terms of service might do it


What a sad state of affairs when even personal blogs are expected to be laden with legal documents.


I agree with the sentiment, but if a personal blog tracks users and sends data on them to someone else, as with GA, then it absolutely needs a privacy policy.


Why?

I had a blog and regularly commented on dozens of blogs from 2005-2015. Just about all of them used Google Analytics and in the early days many, including mine, were hosted on Google's Blogspot.

There wasn't any legalese on any of the smaller/medium sized ones and I really haven't once regretted that there wasn't.


Law.


Sure, and law varies greatly by region. China, North Korea, and to a degree, Russia have been extremely restrictive for a long time, western Europe has recently become more so and many other places are still relatively free.

My "why" question was intended from more of an ethical first principles standpoint than a legalistic one.

I know that some jurisdictions bury pretty much any online activity in mountains of regulatory requirements. That's not novel or interesting. As I said in the ancestor comment, it's "a sad state of affairs when even personal blogs are expected to be laden with legal documents".


It's not that hard to write "we keep none of your personal data" as your privacy policy. If you want to track me on a personal blog, you'd better get your legal story straight...


That's the first time I'm hearing of suggestions that Google will downrank sites without it. Do you have a source?


It's only an ads thing. Facebook, Google, etc., all require those pages to be on a website to run ads. They say nothing about personal websites, but the EU probably does once you start using GA.


My Facebook ad campaigns were repeatedly rejected until I made that one change. And they never said why it was being rejected. It was only that change that fixed it.


Unlikely to be a source, but a good benchmark is to have this stuff in place because a) it takes almost no effort, b) you don't know if it makes a difference or not, and c) there is pretty much zero chance it will have a negative impact.


Definitely need a source.

I had a new website that had minimal text content. At first the most popular Google keywords were from privacy and terms! Had to change those pages to no robot.


Also intrigued as to whether this could be a cause. I'm certainly not opposed to adding one, but I'd be surprised if it was necessary. But I'm here to learn!


This is definitely not a thing.



