The web is peer to peer if everyone just runs a static webserver. Back when Opera was still Opera, they integrated a webserver into the browser (Opera Unite, with a naming system/proxy run through Opera). If that had taken off, the human (though not the commercial) web would be a very different place. It still can be.
I host a static nginx webserver from home and just use a text editor to look at my log of POSTs to participate in p2p web techs like the IndieWeb's Webmention.
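For the curious, here is a sketch of what "reading the log of POSTs" can look like, assuming nginx's default "combined" log format. The `/webmention` endpoint path and the sample log lines are made up for illustration:

```python
import re

# Assumes nginx's default "combined" access log format.
LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d+)'
)

def webmention_posts(log_lines, endpoint="/webmention"):
    """Yield (time, ip, path) for POSTs to the webmention endpoint."""
    for line in log_lines:
        m = LINE.match(line)
        if m and m.group("method") == "POST" and m.group("path").startswith(endpoint):
            yield m.group("time"), m.group("ip"), m.group("path")

# Made-up sample lines in the combined format:
sample = [
    '203.0.113.7 - - [01/May/2024:12:00:00 +0000] "POST /webmention HTTP/1.1" 202 0',
    '203.0.113.8 - - [01/May/2024:12:00:05 +0000] "GET /index.html HTTP/1.1" 200 1024',
]
for time, ip, path in webmention_posts(sample):
    print(ip, path)  # prints "203.0.113.7 /webmention"
```

In practice you would feed it the real access log, e.g. `webmention_posts(open("/var/log/nginx/access.log"))`.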
IPv6 and phones (many people never let their phone run so flat that it dies, and when it does, it's soon recharged) ought to mean that every person in the world can host their own website effortlessly. The tech industry (telcos & Google/Apple) makes this basically impossible, so we (the tech industry) are the ones to blame IMHO.
Android/iOS, the IPv4 shortage, technical hurdles... yeah, the list abounds. You don't even need a phone; $10 computers exist. It's mostly an accessibility issue, both of tech and of IP.
The big reason to ditch the modern web entirely, though, is DNS: you pay a centralized registrar thousands of dollars for the rights to a name. If that's not centralization, I don't know what is.
Truly decentralized would mean anyone can host google.com; the only difference is which certs you pick to trust. Probably by consulting your regional office/town hall, or similar.
Do you think it's worse than using an application which runs untrusted remote code by design? A static webserver is incomparably more secure and less of an attack surface than simply running a modern browser with JS enabled.
I'm more hybrid-oriented myself. The web is short on federated APIs, though I think that's slowly growing. But in many ways pingbacks are still the state of the art; not much has advanced.
The other thing we need is trust/take-it-with-us models: servers signing their content, and local rather than distributed blockchains.
Federated + pingback + logs. Paul Frazee had some good threads a while back on kind of replacing distributed consensus blockchains with local contract logs, where we could inter-transact but check to see whether someone's really being honest and doing the things they said / following their interface's contracts... ugh, where was it.
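A toy sketch of that "local log" idea: a hash-chained, append-only log that a counterparty can audit without any global consensus. Each entry commits to the previous one, so a rewritten entry breaks every hash after it. The entries below are made up:

```python
import hashlib
import json

def append(log, entry):
    """Append an entry whose hash covers the previous entry's hash."""
    prev = log[-1]["hash"] if log else "0" * 64
    body = json.dumps(entry, sort_keys=True)
    h = hashlib.sha256((prev + body).encode()).hexdigest()
    log.append({"prev": prev, "entry": entry, "hash": h})
    return log

def verify(log):
    """Recompute the chain; any rewritten entry breaks every later hash."""
    prev = "0" * 64
    for rec in log:
        body = json.dumps(rec["entry"], sort_keys=True)
        expected = hashlib.sha256((prev + body).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != expected:
            return False
        prev = rec["hash"]
    return True

log = []
append(log, {"pay": 5, "to": "alice"})
append(log, {"pay": 2, "to": "bob"})
print(verify(log))            # True
log[0]["entry"]["pay"] = 500  # tamper with history
print(verify(log))            # False
```

This only proves a log is internally consistent; honesty about the *contents* still needs the counterparties (or their peers) to cross-check each other's logs, which is roughly the hybrid being described.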
Could well be wrong, but I don't see consistent hash rings or other open p2p content stores as likely to ever scale storage or availability sufficiently.
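For reference, the structure being doubted here: a consistent hash ring assigns each key to the nearest node clockwise on a hash circle, so adding or removing a node only remaps a fraction of the keys. A minimal stdlib sketch; the peer names are placeholders:

```python
import bisect
import hashlib

class HashRing:
    """Minimal consistent-hash ring: keys map to the nearest node clockwise."""

    def __init__(self, nodes, vnodes=50):
        # Each node gets several virtual points on the ring to even out load.
        self.ring = sorted(
            (self._h(f"{n}#{i}"), n) for n in nodes for i in range(vnodes)
        )
        self.points = [p for p, _ in self.ring]

    @staticmethod
    def _h(s):
        return int(hashlib.sha256(s.encode()).hexdigest(), 16)

    def node_for(self, key):
        """First ring point at or after the key's hash, wrapping around."""
        i = bisect.bisect(self.points, self._h(key)) % len(self.ring)
        return self.ring[i][1]

ring = HashRing(["peer-a", "peer-b", "peer-c"])
print(ring.node_for("example.com/index.html"))
```

The placement math is easy; the skepticism above is about whether volunteer peers can provide enough storage and uptime, which no amount of ring structure fixes.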
Imagine going back to the 80s and 90s and telling people "the web isn't peer to peer! We have to replace it!"
It would be extremely confusing. Didn't we invent HTTP and IP and the web to let computers shoot messages to each other directly? What happened to that?
There are many people advocating to keep using network address translation, refusing to let go of this workaround for a problem that IPv6 solved long ago.
Inevitably they either think that NAT == firewall, or that carrier-grade NAT is needed for their privacy. (I mentally translate that last word to "illegal activity".)
Because the whole point of NAT is that there are multiple machines behind it, so if you spin up a local web server on port 443 or whatever, by default it'll be invisible to the internet. Not just firewalled off, but not even on a routable address.
Most NATs do allow a hole to be punched, but it's a manual step that most users aren't even aware is an option. The process is unique to each model of NAT device, and there are hundreds if not thousands.
Because of address sharing, there can be only one port 80 or one port 443 per NAT. Of course, SNI and similar technologies can work around this, but this is also a relatively high hurdle to get over.
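To make that workaround concrete: with name-based dispatch, one listener serves many sites, selected by the hostname the client asked for. This sketch uses the plaintext HTTP/1.1 Host header; for HTTPS the same selection happens earlier via the TLS SNI extension. Hostnames and paths are made up:

```python
# Toy name-based virtual hosting: one shared port, many sites.
SITES = {
    "alice.example": "/srv/alice",
    "bob.example": "/srv/bob",
}

def docroot_for(host_header, default="/srv/default"):
    """Pick a document root from the HTTP/1.1 Host header (port stripped)."""
    host = host_header.split(":")[0].lower()
    return SITES.get(host, default)

print(docroot_for("alice.example:8080"))  # /srv/alice
```

This is what nginx `server_name` blocks do internally, and it's why a single NATed port 443 can still serve several households' sites, provided everyone coordinates on one box.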
This is difficult enough at home, but in corporate environments it's no longer doable at all. Gone are the days of some early adopter running a webserver on a box under his desk for a decade! I mean, you can, but between the firewalls and the NAT(s) in the way, there is little chance it'll be accessible even internally, let alone externally.
There are other issues too:

1. For technical reasons, typical ISPs provide much more downstream bandwidth than upstream bandwidth.

2. Residential-grade public IP addresses change semi-regularly.

3. Some ISPs are now so low on IP addresses that there is a second layer of carrier-grade NAT in front of your home WiFi NAT.

4. Home users have no chance of defending against even a DoS attack, let alone a DDoS attack, which would then also take out their personal Internet connection, not just their web site.

5. Most people run Windows at home, which would be fine except that the consumer editions limit IIS to 10 concurrent requests.

Etc...
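On the changing-address point: self-hosters usually paper over dynamic IPs with a dynamic-DNS client that re-registers the hostname whenever the address changes. A sketch of building a dyndns2-style update request; the endpoint URL here is hypothetical (real providers each document their own), and authentication is omitted:

```python
from urllib.parse import urlencode

def ddns_update_url(hostname, ip, base="https://ddns.example/nic/update"):
    """Build a dyndns2-style update URL (base endpoint is a placeholder)."""
    return base + "?" + urlencode({"hostname": hostname, "myip": ip})

print(ddns_update_url("myhome.example", "203.0.113.7"))
# https://ddns.example/nic/update?hostname=myhome.example&myip=203.0.113.7
```

A cron job fetching this URL every few minutes is the usual shape of the fix, which is itself a hint at how much plumbing "just host it at home" really involves.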
Meanwhile professional, centralised web hosting can be had for $5/month, little more than a cup of coffee.
You could have a retort for every point, but fundamentally self-hosting makes no sense. For example, you could say that people should just use a separate machine running Linux instead of Windows to host their site. Okay... what's the cheapest decent turnkey machine you can buy? $200? That's the same cost as 40 months of hosting on a cheap provider, and you still have to worry about your dynamic public IP, NAT punching, certificate management, HTTP security headers, NGINX configuration, etc...
Check the apps being built by Synonym and Holepunch:
1. https://keet.io (a p2p chat and file transfer app). DHT is used to map your public key -> IP address. Currently in open Alpha. How is this better than Jitsi? Jitsi Meet uses a server to coordinate and merge WebRTC video streams, but it is fully E2E encrypted and decentralized as you can run your own instance. Much like Nostr versus other truly P2P alternatives.
2. https://www.impervious.ai (a browser with built-in P2P functions). In closed Beta at the moment.
3. Holepunch is a platform for creating apps that don’t use any servers whatsoever. See also: https://www.bloomberg.com/news/newsletters/2022-07-28/tether-s-holepunch-takes-on-big-brother
4. Synonym (pun on "nym", maybe?) is also trying to solve identity: https://bitcoinmagazine.com/business/synonym-launches-architecture-for-self-sovereign-economy-around-bitcoin
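To make item 1's "public key -> IP address" mapping concrete: Kademlia-style DHTs decide which peers are responsible for a key by XOR distance between IDs. A toy sketch; the peer names and key are made up, and real DHTs derive IDs by hashing public keys:

```python
import hashlib

def node_id(name: str) -> int:
    """Toy 160-bit IDs (real DHTs hash a public key into the ID space)."""
    return int(hashlib.sha1(name.encode()).hexdigest(), 16)

def closest(peers, key_id, k=2):
    """Kademlia-style lookup: the k peers XOR-closest to the key's ID."""
    return sorted(peers, key=lambda p: node_id(p) ^ key_id)[:k]

peers = ["peer-a", "peer-b", "peer-c", "peer-d"]
key = node_id("some-public-key")
print(closest(peers, key))
```

The real protocol iterates: each queried peer returns peers it knows that are even closer, until the lookup converges on the nodes actually storing the key's record (here, an IP address).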
I don't really get this flavor of article; the web is kind of the poster child of a decentralized application. Talking about "the challenge of a decentralized web" is kind of like talking about the challenge of water not being wet enough.
They're arguing against "big water", and shouting that we should all have local wells.
Well... umm... okay, if that makes sense. But it doesn't for most people most of the time.
People as a rule like centralised services, because they can innovate and iterate faster. Centralised services have efficiencies of scale, round-the-clock support, and other perks that decentralised systems generally cannot afford to replicate many times over, at least once for each instance.
The idea of web3 isn't that everybody runs their own server. It's that everybody uses a shared source of truth, the blockchain, that is not centrally owned. So for that data, you are not locked into any platform even if you never self-host, which most people don't.
The challenge here seems to be to resist the power law (pun semi-intended). It's not about water being too wet, it's about fish in the water being vastly different sizes, and a giant shark or two concentrating a biomass worth a million small fish.
The "truly decentralized" web, under this angle would only contain small fish. But the very success of the web means that giants will emerge.
The first step to all of this is, unfortunately, socio-political: we need to break up the ISP monopolies. Once you have viable competition in the ISP space, most of the problems preventing peer-to-peer go away (lack of upload bandwidth, NAT restrictions, etc.)
At least in my country there is no ISP monopoly (I can choose between ~8 providers at home). You're right that we have plenty of upload bandwidth, but the internet looks the same as everywhere else.
Those who truly believe in a decentralized web should focus on creating routers with web servers in them that are easy enough to use that everyone can host their content.
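The server side of such a router could start out tiny. As a rough sense of scale, here is a static file server using only Python's standard library; the directory path is a placeholder:

```python
# A static site server in a few lines of stdlib Python -- roughly the
# feature set a router-embedded server would need to start with.
from functools import partial
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

def serve(directory, port=8080):
    """Serve the given directory over HTTP until interrupted."""
    handler = partial(SimpleHTTPRequestHandler, directory=directory)
    with ThreadingHTTPServer(("", port), handler) as httpd:
        print(f"Serving {directory} on port {port}")
        httpd.serve_forever()

# serve("/srv/www")  # uncomment to run; blocks until interrupted
```

The hard parts the router vendor would still have to solve are exactly the ones upthread: a reachable address, a name, and a cert, not the HTTP serving itself.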
I don't know if hardware is the primary issue today. There are cheap computers on the market, and you can even use an old smartphone as a web server if it comes to it[0]. It's not a lack of software either; even if recent, all the Fediverse services, Matrix and more are available and designed to run in a very decentralized fashion. (These being clones of centralized services; there are many past protocols and programs that are more than capable.)
I personally think it's just a matter of cultural practice. A large fraction of the world got on the web after the web-as-publishing-platform had been privatized. If it had been the norm back in the 00s that every household, town and school first bought a box to function as the home server for its members (which doesn't seem that far-fetched imo, if Usenet and IRC had been a little more accessible...), there's no reason that in this alternate history most of the Facebook groups, Discord servers or subreddits wouldn't have had their own domain name, or something like that. There will always be space for public soapboxes like Twitter and algorithmic feeds, but that's only a fraction of how social networking is done. I like to think that as the public consciousness wakes to the harms of a monopolized publication platform that's always looking to grow its stock value, we'll see the federated web return to being the norm.