Google+ is a great example of the point that once a community's platform gets shut down, it's often tough to find another place to meet, and some people don't survive the transition.
Maintaining federation of USENET was a massive effort. I used to run an NNTP server, and spent way too much time ensuring we had redundant feeds and kept up with the volume, and on top of that handling spam. It worked well for what it was at the time, but it was nowhere near an ideal federated platform.
How has Mastodon improved on this, though? It seems the same issues are in place -- difficult to administer technically (this post) and hard to deal with spam (I've heard this before, but don't have a link on hand unfortunately). This is a genuine question -- I wasn't around for USENET, so maybe this is a "quantity of difference becomes quality of difference" issue where the effort of maintaining it back then was just way harder than it is now.
I deployed and have been maintaining a very small Mastodon server (~10 active users) for several years now, and I can say that at my scale it is not difficult (for someone with modest technical abilities but no professional sysadmin experience). Sure, if your instance is as large as Mastodon.technology and you are the only admin doing it as a hobby/side gig, then things can get rough.
Regarding spam, there is some (seemingly inevitable in any community involving humans). But frankly, Mastodon has focused heavily on moderation tools right from the beginning. Figuratively speaking, no expense has been spared to make it easy and convenient to block bad actors (or whole instances) from your account (or your whole instance).
The biggest thing is that expectations are different. Mastodon itself doesn't change much (or rather ActivityPub doesn't - there are many ActivityPub implementations), though there's at least a less manual way of handling federation than having to e-mail people and get added to their configs by hand.
With USENET people expected to see every post in the newsgroups they were subscribed to, and a lot of people would complain loudly if anything was missing. With a network like this nobody expects to read everything. At the same time, receiving everything is still easier - if you federate with a couple of major endpoints you get most stuff. There will be challenges with this as the network grows, but back then I spent too much time ensuring we got messages from newsgroup X within Y hours (less was unfeasible, as there were still sites exchanging on a schedule via dialup).
With respect to spam, once you accept that you're not likely to see everything, things get a lot simpler and you can apply much more aggressive filtering.
Personally I'm working on my own ActivityPub implementation. One thing I used very effectively on Twitter in the past, via the Twitter API, was a simple Bayesian network for ranking posts to surface the interesting stuff; there's lots of room to apply more sophisticated machine learning there too.
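To make that concrete, here's a minimal sketch of the kind of ranking I mean, reading "simple Bayesian network" as a plain naive Bayes scorer over post text. The training examples, class names, and everything else below are made up for illustration, not code from my actual implementation:

    # Toy naive Bayes ranker: score posts by log-odds of "interesting" vs "boring".
    # All training data, labels, and names here are illustrative only.
    import math
    import re
    from collections import Counter

    def tokenize(text):
        return re.findall(r"[a-z0-9']+", text.lower())

    class BayesRanker:
        def __init__(self):
            self.words = {"interesting": Counter(), "boring": Counter()}
            self.docs = {"interesting": 0, "boring": 0}

        def train(self, text, label):
            self.docs[label] += 1
            self.words[label].update(tokenize(text))

        def score(self, text):
            # Log prior plus per-word log likelihood ratio, with Laplace smoothing.
            vocab = len(set(self.words["interesting"]) | set(self.words["boring"]))
            result = math.log((self.docs["interesting"] + 1) / (self.docs["boring"] + 1))
            for word in tokenize(text):
                p_int = ((self.words["interesting"][word] + 1)
                         / (sum(self.words["interesting"].values()) + vocab))
                p_bor = ((self.words["boring"][word] + 1)
                         / (sum(self.words["boring"].values()) + vocab))
                result += math.log(p_int / p_bor)
            return result

    ranker = BayesRanker()
    ranker.train("deep dive into activitypub federation internals", "interesting")
    ranker.train("win a free iphone, click here now", "boring")

    timeline = ["notes on running a small fediverse instance",
                "click here now for a free prize"]
    for post in sorted(timeline, key=ranker.score, reverse=True):
        print(round(ranker.score(post), 2), post)

The same idea would carry over to ActivityPub: score incoming posts from federated instances and sort the home timeline by score instead of strict chronology.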
Maintaining anything takes effort, so who's gonna do it for free?
Maybe there could be a variant of NNTP which allowed some advertising posts mixed in. Then let the maintainers keep the proceeds from those?
Usenet is still here. Smaller than it used to be, but still here. The average English-speaking person probably doesn't even realize how big it is: there's a healthy German-speaking userbase, a lot of people from Italy, and even some Finnish groups still have life in them.
The main problem here is that contact information is lost. If there's one problem that distributed blockchain technology would be the better solution for, it's a durable collection of self-managed identifiers and groups of identifiers.