Hacker News | Growtika's comments

Another interesting piece of research from SparkToro. This lines up with what we're seeing working with SaaS companies. A few patterns stand out:

- Informational traffic is dropping since people can grab answers instantly from AI Overviews or LLMs without visiting the site.

- Traffic sources are more diverse now. Long-tail queries are showing up again, and discovery happens across more places than just Google.

- SEO hasn't died; it's shifted. It's less about search engine optimization and more about "search everywhere" optimization. That makes strategy tougher, but also more interesting.

- The traffic that does land tends to be higher intent. Fewer drive-by visitors, more people ready to take action.

So while there was fear that AI would gut search and kill SEO, the effect has been different. It hasn't removed the work; it has changed the shape of it. And in practice, that means more strategy, more experimentation, and actually more demand for the kind of work we do.


The first time I heard about Docusaurus was when someone mentioned on Hacker News that they had ruined their SEO:

https://news.ycombinator.com/item?id=34389421

I helped him recover from a Google SEO penalty, and he documented his journey on his blog:

https://johnnyreilly.com/how-we-fixed-my-seo

This might be helpful for those using Docusaurus, as some default settings, like pagination and tag pages, can generate thousands of non-helpful pages. These issues can be easily fixed with noindex tags and a sitemap/structure cleanup.
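
If you want a concrete starting point, here's a minimal sketch of the noindex part: a small component you could render from a swizzled paginated list or tag page. It assumes Docusaurus's @docusaurus/Head helper (a react-helmet wrapper); the component name and its page prop are illustrative, not a built-in option.

  import React from 'react';
  import Head from '@docusaurus/Head';

  // Hypothetical wrapper: render it from a swizzled blog-list or tag page and
  // pass the current page number. Pages after the first get a noindex hint,
  // while their links can still be followed.
  export default function NoIndexPagination({page}: {page: number}) {
    if (page <= 1) {
      return null; // keep the first page of a list indexable
    }
    return (
      <Head>
        <meta name="robots" content="noindex, follow" />
      </Head>
    );
  }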

Overall, I think Docusaurus is great. It's clean, flexible, and the community is very responsive, so it's constantly improving at a fast pace.


This post in particular helped me fix SEO as well for https://appraisalinbox.com (the design still needs a lot of love).

These days I'd probably start off using Astro for a static site. They've got a docs starter, too.


Aww man, I hate clicking on a result on Google and being taken to a pagination page where the content I'm looking for is no longer there.


I like it too, but it doesn't have auto-docs based on docstrings, which Material for MkDocs has:

https://github.com/mkdocstrings/mkdocstrings


Oh, Reilly? He should write a book. :)


Some potentially interesting information I've just checked that might give a new perspective:

From an SEO standpoint, Quora is currently at one of its peak levels.

- It has reached 267 million organic visitors per month: https://i.imgur.com/899g3Jm.png

- Quora's organic traffic is quite diverse: https://i.imgur.com/JfbYKH1.png

Google's helpful content update has increased traffic for sites like Quora, Stack Overflow, and Reddit. In many instances, I believe this boost lacks justification.

Quora's user experience is not intuitive. They removed the signup wall that had hurt their bounce rate and caused millions of organic visitors to leave the site shortly after clicking on a Quora result in a search engine, but reading an answer on Quora still feels overwhelmingly disorganized.


About a year ago, John shared a post on HN about how he ruined his SEO. There were loads of helpful tips in the thread: https://news.ycombinator.com/item?id=34389421

We added our two cents as well: https://news.ycombinator.com/item?id=34390189 and decided to keep in touch with John after the thread lost its initial hype to make sure his site recovered.

This article is about the different steps John took with our advice to fix the issue.


The answer is monitor & adapt. If Google has indeed increased the price of ads, you can choose to be upset with them or to see it as a signal for your company to diversify the methods through which you drive traffic, leads, sales, and brand awareness. I'm into SEO; a few months ago, with the rise of AI, it felt like the SEO industry was about to end. The ranking signals of today might not be as useful tomorrow. The same goes for your PPC budget.


There really aren’t many other options. If Google raises prices 10x, then your budget will be 10x less effective, but it will still probably be much more effective than if you were to advertise on, say, Bing or Yahoo instead.


>The answer is monitor & adapt.

And why isn't the answer also to take Google to your state attorney for fraudulent and anti-competitive practices?


With your approach I'm sure you'll get back to normal traffic very soon.

I think your conclusion is right. It's not one thing that caused the drop but a combination of things.

Let's forget for a sec about the traffic you had before and focus on the current situation.

- Page speed: It's one of the most important ranking factors. You don't have to get a 100 score, but passing the Core Web Vitals assessment and having a higher score on mobile is recommended.

https://pagespeed.web.dev/report?url=https%3A%2F%2Fjohnnyrei...

A cool trick to improve the result fast is by removing the lazy load effect from the LCP: https://i.imgur.com/rOOWm91.png

- Add a robots.txt. You have plenty of pages on the site, so it might be good to make sure only the things you want indexed are getting crawled. https://i.imgur.com/ONSiQjQ.png

- Add a bio (one-liner) here: https://i.imgur.com/SXbWVwU.jpg Great smile. The bio should show your readers your expertise in the topic. If you want to take it to the next level, Quora is a great inspiration: in every Quora answer you can write a different one-liner/byline.

- Internal linking: I love your blog archive. It's a great idea. Try to make more strategic articles stronger. More internal links to an article will signal to search engines that the page is indeed important.

- Add Privacy policy & Terms of use pages

I can help you analyze the traffic drop once I understand which pages were your largest traffic generators. Let's try to understand what happened to those pages and how it's possible to make them stronger.

I worked with several large companies that had traffic drops after migration. Sometimes companies spend so much time and effort looking for the reason for the drop instead of just focusing on recovering.

I'm up for analyzing the drop, but if the reason is not obvious (a Google penalty, for example), I recommend focusing on the future rather than the past.

I would love to help you (as a case study) analyze what caused the issue and give tips on how to move forward and get more traffic.

Edit: We wrote a SaaS SEO guide that covers several related topics: https://growtika.com/saas-seo/


> It's not one thing that caused the drop but a combination of things

Let me propose a bold generalization based on observations of sites of all sizes wrestling with Google over the last 20 years, and to which there are certainly exceptions (this case may be one):

It’s always one thing.

Of course, there are always a multitude of improvements that can be made to nibble around the edges and get incremental improvements in various metrics. And a gradual change in traffic may have several simultaneous sources.

But when the effect is an identifiable precipitous drop in search traffic, almost all the time, it turns out there was a single reason, whether a change Google made or a change the site made.

This may be a totally unhelpful observation, and feel free to ignore it. But in these situations I’ve come to find it’s more fruitful to look for A Cause than to approach it as “a little here, a little there.”


I've listed all the causes I can think of. Does any strike you as "the one"?


Honestly, the thing about other sites using your GA tag seems like the biggest red flag/potential for some Google algorithm to have actively started penalizing you. The timing doesn’t quite track, but maybe some interaction between that and the spam update?


Thanks for the tips!

> A cool trick to improve the result fast is by removing the lazy load effect from the LCP: https://i.imgur.com/rOOWm91.png

Haha, ironically I'm responsible for wholesale lazy loading of images in Docusaurus:

https://github.com/facebook/docusaurus/pull/6598

Generally that's a good thing, but it's not good for title images that appear "above the fold". I wonder if there's something we could do to determine whether an image is "above the fold" and so not apply lazy loading. Something to ponder.

> Try to make more strategic articles stronger.

Not sure I follow your meaning here, do you literally mean applying bold to certain articles? Or something else?


That's cool. Just for fun, try removing the lazy load effect from the LCP and let's see how it affects the score.

As for the strategic articles, I mean that you should increase the internal links to certain articles you think can rank higher.

In October you had 3,300+ keywords that ranked 11-100 on Google:

https://i.imgur.com/ZGU0Ezk.png

It has dropped to 165 keywords now. That's an insane drop, but I wonder how much traffic those thousands of keywords brought you, since they weren't ranked very high.

I would like to dive into the keywords that ranked 4-10 on Google (50 keywords), and the keywords that ranked 1-3 (15 keywords).

By seeing in GA which pages brought you the most traffic/ranked the highest, you can make them stronger by adding more internal links to those pages. Look at it as a kind of first aid ;)

EDIT:

A few tips:

- The 'Recent posts' widget is great. It gets your articles indexed faster. What about adding a 'Popular articles' widget below it? Place 5-10 articles there.

https://i.imgur.com/MBquX6M.jpg

- Make sure you implement timestamps on your articles the right way

https://i.imgur.com/b5usnJJ.png

https://i.imgur.com/0vpXdyE.png


Tell me more about the timestamp. Where does that data get derived from? I thought that should be in great shape, since I set lastmod in my sitemap based on the git commit date:

https://johnnyreilly.com/2022/11/25/adding-lastmod-to-sitema...
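
Roughly, the idea is to ask git for the last commit that touched each file and write that date as the lastmod value. A minimal sketch (the helper name is illustrative, and it assumes the build runs inside the git checkout):

  import {execSync} from 'node:child_process';

  // Derive a sitemap lastmod value from the last commit that touched a file.
  export function gitLastModified(filePath: string): string {
    // %cI prints the committer date in strict ISO 8601, which sitemaps accept
    const iso = execSync(`git log -1 --format=%cI -- "${filePath}"`, {
      encoding: 'utf8',
    }).trim();
    // Fall back to "now" for files that haven't been committed yet
    return iso || new Date().toISOString();
  }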


We could apply lazy loading only after the 2nd image maybe


Yeah - primitive but probably reasonable. Would be happy to experiment with that. I've also been pondering things we could do around open graph images; it would be nice to use image CDNs like Cloudinary for open graph images in the same way we can for blog images: https://johnnyreilly.com/2022/12/26/docusaurus-image-cloudin...


Changing all images to be lazy loaded is not a good idea for SEO reasons (and perhaps not a good idea for regular usage of the site either). If it were, browsers would just do it automatically. Using hints on your images is only useful if you are doing it strategically, which means only lazy loading images that are offscreen on the initial page load. Otherwise you are not actually giving useful hints to the browser on how to load your page in the correct manner, and thus you are just better off letting the browser use whatever internal logic it has to decide how and when to load the assets.

Basically you can think about the optimal, minimal set of resources to render your page being:

- HTML

- Required CSS

- Above the fold images

Then at that point, download all of the other things to make the page work in the way you want (javascript, etc). Anything else is delaying the initial page-load. Because your images are lazy loaded, your page load looks like the following:

- HTML

- Required CSS

- Some above the fold images (profile.jpg, etc)

- Massive, >300kb blobs of javascript (runtime-main.js, main.js)

- Above the fold images in the post

This is not good. It doesn't make sense for your page to download 300kb of javascript before downloading the 18kb image that is above the fold. Now you can partially solve this problem by making the javascript asynchronous, but that still is just another band-aid on the problem, as then the javascript and above the fold images would download concurrently, which is still not optimal.

What you want to do is have above the fold post images be loaded eagerly (the default), and then lazy load ones that are lower on the page. If you aren't going to do that, you probably are better off just not having the images being lazy loaded at all, especially if your page includes 300kb of javascript which is likely going to be much larger than the combined size of all the images on the page.
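
To make that concrete, here's a minimal sketch of the idea as a React image wrapper: images near the top of the page load eagerly, everything further down gets the lazy hint. The component, its index prop, and the cut-off constant are illustrative, not Docusaurus's actual implementation.

  import React from 'react';

  // Illustrative cut-off: treat the first couple of images on a page as above the fold.
  const EAGER_IMAGE_COUNT = 2;

  // Hypothetical wrapper: the caller passes the image's position on the page.
  export function SmartImage({
    index,
    ...imgProps
  }: {index: number} & React.ImgHTMLAttributes<HTMLImageElement>) {
    const loading = index < EAGER_IMAGE_COUNT ? 'eager' : 'lazy';
    return <img loading={loading} {...imgProps} />;
  }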


> You don't have to get a 100 score, but passing the Core Web Vitals assessment and having a higher score on mobile is recommended

Note that they don't have a CWV score yet due to low traffic. But a 39 performance score from the simulated Lighthouse is often more than enough for a passing grade. That is: if a Moto G4 can do OK, your normal users will likely do great.

For instance, a site I made[0] has a 22 from Lighthouse, but a passing CWV grade, so further improvement to the LCP, FID, and CLS would confer no direct Google SEO benefit.[1] (But it may help things like bounce rate, which may confer second-order benefits)

> by removing the lazy load effect from the LCP

Indeed. Even better, making it high priority instead of normal: https://addyosmani.com/blog/fetch-priority/

[0] https://i.imgur.com/TGD1sj2.png

[1] "For example, a page with an LCP of 1750 ms (better than the “good” LCP guidance) and another one with 2500 ms (at the “good” guidance) would not be distinguished on the basis of the LCP signal" – https://support.google.com/webmasters/thread/104436075/core-...


This is very helpful, thank you. Raised a ticket against Docusaurus to track this: https://github.com/facebook/docusaurus/issues/8552 - will experiment.


Edit II:

A few more tips:

- The drop might be related to a Google algorithm update. This link can help you check whether an update correlates with your traffic drop:

https://ahrefs.com/google-algorithm-updates#october-2022-spa...

- Google Search Console: check if there was any penalty. More info here:

https://searchengineland.com/google-penalties-manual-actions...

EDIT III: This thread might be useful

You can find more related threads from people who suffered traffic drops after the Google algorithm update (October 2022):

https://www.google.com/search?q=site%3Ahttps%3A%2F%2Fsupport...


Yeah I'd love to get your help! My email address is johnny_reilly at hotmail dot com or you can DM me here: https://twitter.com/johnny_reilly or here https://fosstodon.org/@johnny_reilly


> Add a robots.txt

Likely not the cause, but not having a robots.txt can cause weird crawling problems.

Since it takes two minutes to set up, why not. And here's a tip: don't try to be clever, keep it simple:

  User-agent: *
  Disallow:


Great article. Made me search for cool stats about manholes. Apparently there's a site dedicated only to manhole covers from around the world:

https://manhole.co.il/


Really cool project, bookmarked.

For those interested, for Bucharest there's this [1] blog-post with a few photos of some old manholes. In here [2] there's a list of links from the same project to posts that present some old stuff from and around Bucharest (the text is in Romanian, but there are lots and lots of photos that are self-explanatory).

[1] https://www.simplybucharest.ro/?p=36671

[2] https://www.simplybucharest.ro/?page_id=12112


I've been taking pictures of manhole covers for around 10 years now. Well, manhole covers, drainage gates, square access covers, and so on.

Locally, the challenge is an interesting, somewhat artistic, usually (but not always) urban picture. I've already taken a picture of most designs I see, after all. In any other city, I'm catching the design first. (Tromsø, Norway has a reindeer!)

I'm not all that surprised there is a website to catalogue them and am always kind of happy to be reminded that others have the same fascination.


Made me think of this Berlin based artist: https://raubdruckerin.de/

She uses manhole covers to create prints. "Raubdruck" means "pirated edition", but literally it's "stolen print".


Wasn't familiar with the book until I came across this Reddit thread a few years ago.

"The author of the Anarchist Cookbook, an infamous instructional book on homemade explosives, weapons, and drugs, regretted its publication. He attempted to have it removed several times, only to be thwarted by the publisher Lyle Stuart."

https://www.reddit.com/r/todayilearned/comments/b2b3m7/til_t...


The original author went as far as to publish a "second edition" more recently. This second edition is literally a cookbook: it's geared towards the mutual aid scene, and the recipes are ones you can make for hundreds of people.

We actually used it when starting a new Food Not Bombs chapter locally, and most activists I talked to seemed to mainly be familiar with the follow-up. So I guess it had some level of success in covering up the original.


Do you have a reference? I'd love to see a literal anarchist cookbook, but a bit of searching turned up nothing.


https://www.foodnotbombs.net/a.%20Anarchist%20Cookbook%20int...

Ah, I guess it's more specifically a Food Not Bombs thing. Also, it wasn't mostly written by the original author, but the original author provides a foreword (page v) and obviously gives his blessing for this work to use the name.

I guess it being Food Not Bombs specific explains why most people I knew only knew of this version XD


Nice idea!

The '10 years ago' is interesting. Could be fun if you also added 3/5/7/10 year tabs; it's always fun reading content from back then. The 'About' and 'Privacy' buttons are not working.

Consider adding an 'Archive' section where we could quickly see older newsletter issues.


Click the "Want more?" link at the end of the list to see older entries.


Thanks! It wasn't so intuitive. Adding 'Recent Issues' or 'Archive' to the bottom menu could help with navigation.


I have a lot of respect for Bob Dylan and his amazing music. My grandpa always loved to tell me the story of how the ice makers protested against the arrival of freezers.

I guess life and innovation just keep moving, like a rolling stone.

