
In my experience, the forum experience is far from dead, but it's effectively impossible to surface in a search engine - any search engine - unless you know the name of the forum.

Oh, and the content must also be "fresh". If the content isn't "fresh" (which most of the best forum/blog posts are not), no engine surfaces it anymore. I can search for a specific blog post using a verbatim quote, but the result (if it exists) is buried under 10+ pages of "fresher" content, no matter how disconnected that content may be from the search.



The forum experience is dying. I spent about 4 years of my time in-between Google stints working on a searchable feed for forum sites. Finally gave it up when I realized the extent to which the forum scene had died and moved to Reddit & Facebook while I was working on the project.

The root problem is that attention has gone from abundant to scarce, and people already have their habits. That makes it really hard to build a new forum site and attract an audience that's willing to type your URL in every day (and if they don't visit daily, forget about building a viable community). Forum hosts like Facebook and Reddit don't have this problem - you can view your Buy Nothing Group and Moms of Springfield posts interspersed with your feed of friends, or your r/factorio content interspersed with a steady stream of r/AskReddit.

There are also emerging technological barriers. If you don't sign up for CloudFlare, as a new website, you're going to get hosed - but at the same time, CloudFlare makes it basically impossible for any new search engine other than Google to spider the site. Ditto security patches and keeping software up-to-date. Most people don't want to deal with sysadmin stuff at all, particularly if they're trying to build a community as a hobby. So that pushes people further toward hosted solutions with a turn-key secure software stack, which is Facebook and Reddit.


> The root problem is that attention has gone from abundant to scarce

I don't think that's necessarily true.

I think the root problem is that running & using a forum is too difficult. That is why centralized forums (like you mentioned, reddit and facebook) that handle it for you won out against decentralized forums run by forum members.

Even before facebook/reddit/etc, forums tended to live or die by the individual effort of one passionate sysadmin dealing with all the hosting, updates, accounts, and spam until they got fed up and the forum closed because they couldn't find someone else to take the keys.


One of my favorite niche forums is https://archboston.com. It has years of deep-dive discussion from passionate users about the history and progress of Boston area infrastructure and real estate development projects.

For a while there the site was up but not allowing new accounts to be created -- someone was paying the hosting bills but didn't have time to do any admin tasks. Thankfully, someone else stepped up and people post new stuff every day (albeit with banner ads at the top of each page now, which is honestly not too bad).

I'm happy, but it could have gone poof so easily.


It's not so much that running a forum is difficult.

But anyone who launches a forum today is competing with the large, metaforum platforms like Reddit and Facebook and Discord.

It's just too difficult to assemble a userbase.


> The forum experience is dying.

Perhaps the experience is dying, but the wealth of curated information in forums is still there, is still incredibly valuable, and in some cases is still being added to. Here's one example I used extensively recently; it was sent to me by a colleague, since I never could have found it via a search engine.

https://gearspace.com/board/studio-building-acoustics/610173...


> but at the same time, CloudFlare makes it basically impossible for any new search engine other than Google to spider the site

I hadn't heard about this, can anyone supply a link for more context?

Is there anything a Cloudflare customer can do to "opt in" to being scraped by other search engine bots?


Aside from prioritizing ads, I actually think the root of the problem is sort of the opposite: information has simply gone from scarce to too abundant. Finding information you need is like trying to find a needle in a haystack. To solve this problem, search engines like Google Search came into being. Initially, having multiple search engines caused a problem in itself: if you have 700 search engines, which search engine do you use? So the industry naturally ended up coalescing into a near monopoly, as having 1 (or a few) engine(s) to use is simpler than having 700. However, the root of the problem is still growing. Now it's not so much "which search engine to use?" but more "what's the precise combination of text to feed the search engine needed to find my needle?" And as more information gets created, the problem just gets worse: your information gets drowned in the sea of other information and thus gets harder to search for...

Amusingly, appending "reddit" to your search is like a pseudo search engine in itself, instructing the search engine to act as a search engine of only a specific domain of information. Almost like we're back to having multiple search engines, and no one knows which one to use...
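That "pseudo search engine" trick can be made explicit with query operators. A minimal sketch in Python (the helper names are made up; `site:` and Google's `before:` date operator are the real parts):

```python
from urllib.parse import quote_plus

def scoped_query(terms, domain=None, before=None):
    """Build a query string using common search-engine operators.

    `site:` restricts results to one domain; `before:` (a Google
    operator, YYYY-MM-DD) caps the result date to fight freshness bias.
    Both are just appended as plain text to the query.
    """
    parts = [terms]
    if domain:
        parts.append(f"site:{domain}")
    if before:
        parts.append(f"before:{before}")
    return " ".join(parts)

def search_url(query):
    # Any engine's query endpoint works; Google's shown for illustration.
    return "https://www.google.com/search?q=" + quote_plus(query)

print(scoped_query("factorio train signals", domain="reddit.com"))
# factorio train signals site:reddit.com
```

Which is really the point: the operator turns a general engine into a per-domain one, so "append reddit" is just the lazy spelling of `site:reddit.com`.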


> information has simply gone from scarce to too abundant

no, the signal to noise ratio has gone from sufficient to far too low


> hard to build a new forum site and attract an audience that's willing to type your URL in every day

This is exactly the problem that RSS has been solving since it was created over 20 years ago.

If you have this problem, it means you do not have granular-enough RSS feeds. Per-discussion-thread at minimum.
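A per-thread feed doesn't need much: RSS 2.0 is just a channel with items. A sketch using only the Python standard library (the forum, thread, and post data are invented for illustration):

```python
import xml.etree.ElementTree as ET

def thread_feed(thread_title, thread_url, posts):
    """Render one discussion thread as a minimal RSS 2.0 feed.

    Each post dict needs 'title', 'link', and 'date' (an RFC 822
    date string, as RSS expects for pubDate).
    """
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = thread_title
    ET.SubElement(channel, "link").text = thread_url
    ET.SubElement(channel, "description").text = f"New posts in: {thread_title}"
    for post in posts:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = post["title"]
        ET.SubElement(item, "link").text = post["link"]
        ET.SubElement(item, "pubDate").text = post["date"]
    return ET.tostring(rss, encoding="unicode")

feed = thread_feed(
    "Acoustic treatment build log",
    "https://example-forum.net/t/42",
    [{"title": "Re: bass traps",
      "link": "https://example-forum.net/t/42#p7",
      "date": "Mon, 06 Jan 2025 12:00:00 GMT"}],
)
```

Serve that at a stable URL per thread and any feed reader becomes the "type your URL in every day" habit, without the user doing the typing.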


There are a few niche ones out there that are even growing a bit! For example, for medium format cameras the largest groups are not on reddit.


> the forum experience is far from dead

If you find a forum for a given subject, it is almost always an authoritative source filled with experts. This is especially true in engineering disciplines.

It's unfortunate that Reddit and social media took over and led to their decline, because it's a suboptimal setup in so many ways.

- Reddit in the large is a high noise, low signal monetization chamber. Some subreddits have good moderation, but that doesn't stop the spill over and drama.

- You can't assume much about any given Redditor, and you won't typically form relationships or associations with them. It's pretty much pseudonymous.

- Reddit doesn't focus on authorship. It doesn't allow inclusion of images, media, or carefully formatted responses in threads.

- Reddit corporate is the authority and owner of all content. They can change the rules at any time, and that's a fragile and authoritarian setup for human discourse.

- Reddit corporate is constantly changing the UI and engaging in dark patterns to earn more money. This flies in the face of usability.

Forums should make a comeback. It would be better if each community had real owners and stakeholders that had skin in the game rather than a generic social media overlord that is optimizing for higher order criteria that sometimes conflict with that of the community.

But forums have problems too. They should be easier to host, frictionless to join, easy to discover, and longer lived.

Another way to think of this: every major subreddit is a community (or startup) of its own and could potentially be peeled off and grown. You'd have to overcome the lack of built-in community membership and discovery, but if you can meet needs better (better tools for organizing recipes, community events, engineering photoblogs, etc.), then you might be able to beat them. Reddit can't build everything, just like Facebook couldn't.


This is depressing. Good information is useful for far longer than a carton of milk in your fridge! And a lot of that new "milk" is apparently made of chalk and bilge-water.


Are people still drinking cow milk? Oat milk all the way.


Yes, of course they are. The alternative milk market is growing fast, but most people still go for cow milk if they want milk. That's why grocery stores still devote a ton of space to cow milk.


I guess it depends on where you are. Here in Sweden it’s about 50/50 (if you take into account the non-refrigerated milks).


I'm relatively well informed but I always just assumed oat "milk" was an inferior substitute marketed to the actively or wannabe lactose intolerant. [EDIT: and vegans of course.] (No idea if "wannabe lactose intolerant" is really a thing, but i'm thinking of the way gluten sensitivity became a faddish self-diagnosis for a while.)

I still don't drink the stuff, but it's only dawned on me in the last year that there are other reasons, such as environmental concerns or ... actually, I'm not sure. Opening two tabs to Google "oat milk why" and "oat milk why site:reddit.com" now, which conveniently makes this relevant to TFA :)


I use cow milk in my coffee and almond milk in my bland cereal of choice. I'm not lactose intolerant, but too much milk definitely feels "heavy". The almond milk plus some sultanas substitutes nicely and is very cheap. Mostly it's not for the taste; it just makes breakfast easy and efficient so I can focus on fancy stuff later in the day.


Sweden was the 4th highest consumer of milk per capita in 2013. Unfortunately that seems to be the most recent data.

https://en.wikipedia.org/wiki/List_of_countries_by_milk_cons...


Here (mid-tier city in the US) it's more like 80/20 or 90/10 at any normal grocery store, and I suspect there's higher product turnover for dairy, so the actual sales figures favor dairy more than that suggests.


Non-refrigerated milk can still be from cows.


Of course it depends on where in Sweden you are. The supermarket in the more affluent part of a large city where I live is about 50/50. Out in the 'sticks' where my parents live it's much closer to 80/20.


The entire information ecosystem has internalized a bias toward "freshness." It's even really strong in software. Evidently code is more valid and correct if it has recent GitHub commits.


Almost no software just works if left unattended for years. If it's a library, it will likely not work with the latest versions of everything else. Your bug reports will go unattended.

People also have a lot more tolerance for missing features or issues if they see it improving regularly. While getting something unsatisfactory as the last and final version is not nearly as acceptable.


I have built Unix/Linux stuff from the 1990s with no code changes. Programs that do well defined things generally have a long shelf life. Even X stuff often works, though it can look bad on modern displays.

Math kernels, codecs, and so forth can more or less live forever.


Software is the most toxic environment possible for freshness bias. Every month it seems like there's a new framework/ecosystem/whatever and everyone's migrating to it and you're behind the times if you're not using it.


This is only if you bother chasing the absolute freshest trend. Things like React, Ruby on Rails, Node.js, etc. have been the standard and most popular tools for close to 10 years now and aren't going anywhere.


I guess this is why a lot of sites have now removed dates from their articles.


Google allows for searches within ranges of dates.


Anyone know the origin of the fresh rule, and the purpose? It makes sense in some niches but in others it is so obviously bad I wonder why Google added it


There was a big news event once (forgot what) and Google was only showing aged pages at the time. So they started prioritizing freshness.


Google had a button to search only forums. The monopoly shows.


So frustrating trying to find background information on a big current event. Google will aggressively show the same news articles over and over.


I feel this would be solved if the search engine weighted results based on whether users trust the domains.


That only works if you validate users, otherwise the trust database becomes 99% the result of the actions of SEO firms.


Doesn't Cloudflare do this? (Extremely extensively, imo - just about every single click is a captcha of some type sometimes.) As much as HN despises blockchains because of crypto (no, that's not the only use of that tech), it does seem like verification of websites could be a decentralized consensus - along with DNS as well - and user verification could be a portion of the consensus on website quality. This could even help make self-hosted search engines more accessible.


Hmm, or you outsource the trust problem to your users. Let them select which domains to trust or who to trust to weight trustworthiness of domains for you.

Imagine that user A can create a list of domains with trustworthiness score. User B can then use that list by going to user-a-awesome-curation.koogle.com?q=nice+shoes

It might create filter bubbles but it would be transparent filter bubbles. You could even wikify/open source the curation.
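The mechanics of that curated-list idea are simple enough to sketch. Assuming a search backend that already returns (domain, relevance) pairs, reranking against a user-published trust list could look like this (all names and scores here are hypothetical):

```python
def rerank(results, trust, default_trust=0.5):
    """Re-order (domain, relevance) pairs by relevance x curated trust.

    `trust` is a user-curated mapping of domains to scores in [0, 1];
    domains the curator hasn't rated get a neutral default, so an
    unknown site is neither boosted nor buried.
    """
    def score(pair):
        domain, relevance = pair
        return relevance * trust.get(domain, default_trust)
    return sorted(results, key=score, reverse=True)

# Hypothetical curation list published by "user A":
curation = {"archboston.com": 0.9, "content-farm.example": 0.1}

results = [("content-farm.example", 0.95), ("archboston.com", 0.80)]
ranked = rerank(results, curation)
# archboston.com first: 0.80 * 0.9 = 0.72 beats 0.95 * 0.1 = 0.095
```

Swapping in a different curator's `trust` dict is exactly the `user-a-awesome-curation` idea above: the filter bubble is whatever list you explicitly chose, and the list itself can be forked and diffed.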


A lot of this could be solved if we could signal intent or context before searching. But that would require that you know how to use the tool which is a gargantuan user barrier from Google's point of view. Meanwhile, we have to hack about trying to signal context.


pretty much what we're doing at https://breezethat.com, a topic search engine

- we currently curate domains / pages internally based on trust

- we'll be adding an open low-code way for others to vote / moderate


The forum experience has effectively been totally replaced by either Discord or subreddits, or any other kind of self-moderated social media group you can think of.

It's a plus and a minus in a lot of ways, but the biggest con is that it's just straight-up impossible to search a Discord log effectively.



