Subvert network effects and encourage adversarial interoperability (promarket.org)
117 points by walterbell on April 18, 2022 | 45 comments


This reminds me of another writer who's been writing about open and closed systems: Tim Wu.

In his book The Master Switch[0], he makes a similar argument, though his case centered on open access and net neutrality; the overall message and the patterns he describes are very similar to what Doctorow is talking about here. The sum of his argument is that open systems allow for periods of intense innovation, experimentation, and ultimately new markets; eventually, when those open systems start to get dominated by a few players, they tend to close and become entrenched, until they stagnate and the cycle opens up again. He also suggests, in the conclusion of his book, that if society encourages base systems to remain open, you don't have to sacrifice the innovation and growth that open systems bring, thereby encouraging innovation across all affected sectors.

Think about it like this: large companies were once small, and when they were small, they built their businesses on reverse engineering proprietary protocols and using open access systems. They were able to innovate in these spaces while maintaining that innovation catalyst by interoperating with common systems (e.g. TCP/IP, HTTP) and reverse engineering proprietary protocols. In the case of Apple, for instance, one of the primary motivations for investing in AppleWorks[1] was that they actively reverse engineered the Microsoft Office formats, which afforded interoperability with PCs. Arguably, without open access systems and the ability to reverse engineer these formats, Apple would not be where it is today. Now, however, legislation and legal precedent have made it, at best, dubiously legal for a competitor to do just that.

[0]: He gave a talk at Stanford based on the ideas of the book, centered around net neutrality and open protocols; you can see that talk on YouTube: https://www.youtube.com/watch?v=ij76dh_340w

[1]: https://en.wikipedia.org/wiki/AppleWorks


It's not entirely a coincidence that Tim and I sound alike on this subject! We grew up together at the same hippie alternative school in Toronto and then both came under the influence of Larry Lessig in our adult lives.

Tim and I got a chance to catch up at a Brussels antitrust conference last month and it was great to go over all the ways that our backgrounds contributed to our work in adulthood:

https://www.cra-brusselsconference.com/home/Programme


Wow hello famous person.

This weekend I got back to doing a little bit of work on my TerseNet concept, inspired by one of your articles "https://www.eff.org/deeplinks/2019/07/adblocking-how-about-n..." and things like the Gemini Protocol, with the idea of combating monopolies by making protocols simple enough for the average programmer or small team to implement.

https://github.com/runvnc/tersenet -- still just some ideas, not an actual prototype.


Exciting!


That's amazing, such a small world! I've been following Tim Wu since I first heard of him way back when he was a guest on an early video series produced by The Verge (the series, now unfortunately defunct, was from the Josh Topolsky days). I had no idea you two were connected in any way.

He's an amazing writer on the subject of information systems and broader society, in my opinion. I was drawn to your work via Factually! of all things.


He really is a terrific writer.

And Adam is tremendous; I just binge-listened to so much of the Factually back-catalog!


Going to take a moment to mention - though I shared a disagreement elsewhere in the comments, I respect the work you've been doing, Cory, and I regret not saying so in my comment. I'm glad you're making a case for adversarial interop and I hope people do give it a shot.


Thank you - even I don't always agree with me.


Doctorow is arguing for adversarial interop, but also observes that companies shut down APIs when they become data leak vectors, and then use that concern to make circumvention illegal. The dilemma is real. You end up with a sense that fair access and security are meaningfully in conflict. Perhaps a bit like the anti-cheating DRM conversation in the RMS state-of-FOSS thread. I'm also reminded of the free speech vs moderation debate, where the common retort to decentralization from Twitter employees is "you don't know how impossible moderation will become."

I'm inclined to think the system has to be rearchitected to solve the dilemma, such that interop is the easier, default option. (And, ideally, one that makes data security easier by default too.) Nobody has succeeded at it yet, but it's always the theory I've chosen to pursue.


> the system has to be rearchitected to solve

If it were obvious how to do that, we'd be talking about it, but it's not even obvious how. A process to solve a problem no one knows how to solve can get co-opted and yield something awful, so I am not inclined to start one yet.

Moreover, there must be plenty of room for innovation.

We're all using TCP/IP, which is a great win for interoperability, but also an enabler of proprietary application protocols above it, because it didn't try to preclude them. We've missed out on some things from the CLNP/OSI side, but I'm not familiar enough with them to know just how tragic, if at all, that is -- we're pretty happy with TCP/IP anyway. Do note that no one forced TCP/IP on anyone. The best interop is that which arises w/o the use of force.


>> the system has to be rearchitected to solve

> If it was obvious how to, we'd be talking about that, but it's not even obvious how to

We're in agreement, both on that and on your points about how innovation does happen. Basically all of the decentralization projects of the past ten years, from federation to p2p to blockchains, have been taking a stab at the greenfield approach. None of them have worked in the sense of displacing existing approaches, but I remain hopeful that something useful can come out of it all.

As a fun aside, one of the more interesting semi-adversarial approaches I've seen is Delta Chat [1], which basically puts a chat interface on top of email.

1. https://delta.chat/en/
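To make the idea concrete, here's a minimal sketch of chat-over-email in TypeScript (Node), assuming the nodemailer library and placeholder server/credentials; Delta Chat's real implementation adds encryption and much more, so this only illustrates the transport trick:

    // A toy "chat message" is just an ordinary email: any inbox becomes a
    // chat backend, and any standards-compliant client can interoperate.
    import nodemailer from "nodemailer";

    // Placeholder SMTP server and credentials -- substitute your own.
    const smtp = nodemailer.createTransport({
      host: "smtp.example.org",
      port: 587,
      auth: { user: "alice@example.org", pass: "app-password" },
    });

    // A Delta Chat-style client hides the envelope and renders the
    // message as a chat bubble on the other end.
    async function sendChatMessage(to: string, text: string): Promise<void> {
      await smtp.sendMail({
        from: "alice@example.org",
        to,
        subject: "Chat: " + text.slice(0, 30),
        text,
      });
    }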


The only thing missing in the "dilemma" is someone, teary, begging us to "think of the children."


"The dilemma is real."

Um, authentication?

The issue with API mass data leaks is unauthenticated access to the data.

Facebook HAS a web-wide authentication service, to heap on the irony. It would of course be sabotaged, but there are other means of doing basic authentication. The users presumably know their passwords.

That is a fake issue, IMO.


Okay, so they have wide authentication infrastructure; that much is obvious. However, consider this:

If users want to use an alternative application to view their Instagram and Facebook posts, or an alternative application to interact with TikTok, you would of course need to authenticate the user against the service to read the APIs. That's fine; applications can do that with user consent, and once you do, you can reverse engineer the platform's formats, as described in this article.
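As a rough illustration, the consent flow here would look like a standard OAuth-style authorization-code exchange; the endpoints, client id, and scope below are hypothetical placeholders, not any real platform's API:

    // Step 1: send the user to the platform to grant consent.
    const AUTH_URL = "https://platform.example/oauth/authorize";
    const TOKEN_URL = "https://platform.example/oauth/token";
    const consentUrl =
      `${AUTH_URL}?client_id=alt-client&response_type=code` +
      `&redirect_uri=${encodeURIComponent("https://alt.example/callback")}` +
      `&scope=read_posts`;

    // Step 2: exchange the code returned to the redirect URI for a token.
    async function exchangeCode(code: string): Promise<string> {
      const res = await fetch(TOKEN_URL, {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: new URLSearchParams({
          grant_type: "authorization_code",
          code,
          client_id: "alt-client",
          redirect_uri: "https://alt.example/callback",
        }),
      });
      const { access_token } = await res.json();
      return access_token;
    }

    // Step 3: read the user's own data, with their explicit consent.
    async function fetchPosts(token: string): Promise<unknown> {
      const res = await fetch("https://platform.example/api/posts", {
        headers: { Authorization: `Bearer ${token}` },
      });
      return res.json();
    }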

The dilemma isn't being able to authenticate a user against an API. It's that if you actually go through and do this, Facebook, TikTok, Slack, or any other service can use the courts to shut down your business. Through case law and legislation, they have effectively made this illegal, or at best legally dubious. It hasn't always been this way: see Apple reverse engineering the Microsoft Office document formats, or Airbnb scraping Craigslist to build their listings.


Is it fake? The relationship isn't just between the user and facebook (thus, auth); it's between the user, facebook, and the third-party app. The concern is that the third parties act maliciously; see the challenges with browser extensions.

I'm not making excuses for the big companies -- I'm just trying to evaluate the challenges properly. If the core of the problem really is the behaviors of 3p apps, the security model for them is what a solution needs to address.


Then you get into... authorization.

Yes, facebook and other social platforms have issues with people snagging account access, usually by shared password vectors, but come on.

This is still basic meat and potatoes auth for a more open social "pan-platform".


Well, I'm thinking more of the problem that browser extensions had, where people would make something handy or fun, be granted permissions by the user, then get sold to people looking to mine user data. Extension authors get solicitations for these sales constantly, and the sale prices can be really tempting for devs. A friend of mine just made a silly "cursor sparkles" extension and he's already mentioned getting several of these emails.

Regardless of how you feel about it, the Cambridge Analytica ordeal was entirely about using benign-seeming quiz apps to mine user data, and the pressure Facebook received was to fix its security. That's a huge incentive to lock down fair access. So if this is the situation, we need to address it if we're going to fix the technical pressures that create monopolies.


That's a user problem: they granted the extension permission to access the data, so that's on them. You will never be able to solve that problem; after all, a user can just hand out their credentials if they want to.

Facebook's issues stem from the fact that they allowed apps in their market to access data those apps didn't need to function, and hid exactly how much access the applications were granted.

The reality here is that auth is the solution; for browser extensions, we need better UX. If an extension is asking for access to your navigation or page data, that needs to be flagged and warned about during the setup flow.

Many of these auth systems make risky API access grants appear extremely benign.
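A sketch of what that flagging could look like, in TypeScript; the risk table is illustrative, not any browser's actual policy, though the permission names are real Chrome extension permissions:

    // Map known-dangerous extension permissions to plain-language warnings.
    const RISKY: Record<string, string> = {
      "<all_urls>": "can read and change your data on every site you visit",
      "tabs": "can see the URL of every page you open",
      "webRequest": "can observe your network requests",
      "history": "can read your entire browsing history",
    };

    // Return the warnings an install prompt should surface prominently,
    // instead of burying the grant in benign-sounding boilerplate.
    function flagPermissions(requested: string[]): string[] {
      return requested
        .filter((p) => p in RISKY)
        .map((p) => `WARNING: this extension ${RISKY[p]}`);
    }

    // flagPermissions(["storage", "tabs", "<all_urls>"]) yields two warnings.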


At this rate (following the trend this article also denounces), I feel like giving up computation altogether is on the table (as supposedly happened in the Dune fictional universe).

RMS has been right all along. Although he may not deliver his message in the most effective way possible, the core of his ideology is the understanding that when software does not encourage freedom, it is all but restricting it.

(IMHO) Software is a development on par with writing itself. Writing as a technology provides the basis (the foundation) for all of what is considered "civilized"; the law (in general, any and all law) is a written artifact.

Hence, I conclude that as a society we need to acknowledge what software enables for the concept of law.

However, unlike law, which can be (and has been) used to curtail freedom all through history, written laws used to this end require police or some other kind of violent-if-necessary "enforcement". The critical point is that software does not need this kind of enforcement. Software can accomplish a comparable reduction of freedom much more softly, but no less effectively (arguably, software can be even more effective).


"(IMHO) Software is a development on par with writing itself. Writing as a technology provides the basis (the foundation) for all of what is considered "civilized", the law (in general, any and all law) is a written artifact."

This is ... great, combined with the observation "software is eating the world".


Doctorow is correct here: interoperability is necessary, from a social perspective. Governments should mandate it.


Or at the very least, not criminalize competitive compatibility attempts.


That's one of the things the upcoming Digital Markets Act is trying to tackle.


Yeah. I went in swinging with a chip on my shoulder but I couldn't find fault with the article. Solid as heck for the internet. (IMO)

The one major problem I see with all this is mass adoption: it feels like most folks just don't care, and most of the folks who do care wind up using the "bad stuff" anyway (mostly out of convenience, but also due to FOMO (Fear Of Missing Out).)

Even the right-to-repair and "comcom" mandates rest on popular support, eh?

To me that seems like the "800 lb gorilla": how do you get the average user to care about this stuff?


The Internet became mainstream because folks could use modems and phone landlines (circuit-switching technology that the new TCP/IP end-to-end packet-switching stack would eventually replace) to get online.

The path for mass adoption of the new p2p and interoperable technologies is the same: make them work in web browsers (using the browser as a full peer, not a client). The browser is the ubiquitous VM: JavaScript, WASM, IndexedDB, WebRTC+WebSockets; all the pieces are there.
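For instance, here's a minimal sketch (TypeScript, browser context) of opening a peer-to-peer data channel with nothing but built-in APIs; the offer/answer exchange still needs some signaling path (a WebSocket, say), which is omitted here:

    // The browser as a full peer: a direct data channel to another browser.
    const pc = new RTCPeerConnection({
      iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
    });
    const channel = pc.createDataChannel("interop");
    channel.onopen = () => channel.send("hello from a full peer");
    channel.onmessage = (e) => console.log("peer says:", e.data);

    // Create the offer and hand it to the remote peer via your signaling
    // path of choice; the answer comes back the same way.
    async function startOffer(): Promise<RTCSessionDescriptionInit> {
      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);
      return offer;
    }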

The building blocks for a representation / data model are all in the air as well: CRDTs, Merkle-ized structures, content-based addressing, etc.
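As a taste of the CRDT piece, here's a grow-only counter (G-Counter) in a few lines of TypeScript; each replica increments only its own slot, and merging takes the pointwise max, so replicas converge no matter the order or duplication of messages:

    // State: one count per replica id.
    type GCounter = Record<string, number>;

    // A replica only ever bumps its own entry.
    function increment(c: GCounter, replicaId: string): GCounter {
      return { ...c, [replicaId]: (c[replicaId] ?? 0) + 1 };
    }

    // Merge is a pointwise max: commutative, associative, idempotent.
    function merge(a: GCounter, b: GCounter): GCounter {
      const out: GCounter = { ...a };
      for (const [id, n] of Object.entries(b)) {
        out[id] = Math.max(out[id] ?? 0, n);
      }
      return out;
    }

    // The counter's value is the sum over all replicas.
    function value(c: GCounter): number {
      return Object.values(c).reduce((sum, n) => sum + n, 0);
    }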

I'm a believer.


> it feels like most folks just don't care, and most of the folks who do care wind up using the "bad stuff" anyway (mostly out of convenience, but also due to FOMO (Fear Of Missing Out)

They do care, but most of us lack the understanding and vocabulary to change. All of these things can be overcome: FOMO, worship of convenience, lock-in, fear of being different. Never give in to bullying by weak-minded naysayers and defeatists who shout "resistance is useless", or who demean others by presuming to know "what they want" or by saying that "ordinary people are too stupid..." etcetera. These are the vestiges of the 20th-century hacker that we must shake off as a culture.

These are mild obstacles in the end. People can be educated. Please help spread the word about Digital Vegan [1], which is an attempt to make a case for "healthier technology". Spread the word about Humane Tech [2] and the harms inflicted by incumbent monopolies.

As an ordinary citizen, speak your mind and remain true to the values of what you think technology can be, not the pale imitation offered by so-called "Big Tech". I personally refuse to use any big-tech products, and I use what influence I have to convince others to drop them too.

Interop will not happen by legislation alone; it also requires awareness, dissent, and self-determination. People must be reminded that it is their right to be masters of the technology necessary for modern life, not slaves to it.

[1] https://digitalvegan.net

[2] https://www.humanetech.com/


Imposing interoperability has costs beyond switching costs: chiefly, the opportunity cost of stifled innovation.

All costs must be taken into account, and opportunity cost is very hard to measure.

This is not an easy task.


Wouldn't imposing interoperability be the opposite of stifling innovation? It forces companies to allow others to innovate as well, since those others will actually have a chance of breaking into the market of the dominant player. If Twitter can (and will) act without interoperability, they can shut down anyone who tries to interop, stifling innovation in the space at large, while if they are forced to allow interoperability, other players can also enter the market.


It depends! For example, IPv4 did inhibit some innovation (think CLNP, OSI, IPv6, etc.), and TCP probably did too, each at its respective layer, but TCP/IP produced a great deal of innovation at the layers above, and that's easily worth the opportunity cost of missed innovation at the network and transport layers.

But now suppose we standardized on Java or Ada or something for app dev. That would not be good, and the opportunity cost of inhibited language development would be awful. Standardization of APIs is even worse if you disallow extensions, but less nasty if you do allow them.

So, yeah, "it depends" is the best I can do for an answer.

Sometimes it's possible to reduce the opportunity cost of innovation inhibition by standardizing in mature technology spaces, but sometimes a sector that appears mature is ripe for disruption, so it can be really hard to predict. And we cannot know what we've lost for quite some time after the standard is imposed.

So on the whole I'm disinclined to have standardization forced on the market.


Well, it may stifle innovation in the protocol, while encouraging innovation in the implementation and/or use. In my book, that's a net win.

Why? There's less room for innovation in the protocol. Take TCP, for instance. Could it be improved? Probably. But how much gain are you going to get by innovating there? On the other hand, how much was enabled, on both ends of the wire, by everything standardizing on TCP? Far, far more than we could ever get from a "better TCP".


TCP is a bad example, since it's actively being supplanted by QUIC simply because TCP is such an inefficient protocol.


TCP dates from 1974, QUIC from 2012. I don't think a new transport protocol after 38 years invalidates my point - that there is less room (not none) for innovation in the protocol than there is for innovation in the things that use the protocol.


TCP is a bad example because it has become so hard to replace.

SCTP failed to replace TCP for various reasons, such as too many middleboxes filtering ULPs that aren't TCP or UDP, SCTP being far too rich and complex, or just inertia.

But TCP has had a ton of problems, and while there has been a ton of research into those problems and the solution spaces for them, adopting even hacky solutions into TCP has been difficult. As a result, TCP has gotten complex and bloated over the years without at the same time solving all the problems that a new protocol could solve.

On the plus side, in spite of inhibiting innovation at the transport layer, TCP enabled innovation at the application layer by making the application layer possible at all.


I didn't say it invalidated your point, just that it was a bad example. Just because something has inertia doesn't mean it's good or even okay; it just means it timed the market correctly.

You could have used UDP, for example, which, being as old as TCP, has spawned many new protocols on top of it (including QUIC) and has in its own right been the backbone of many applications.


Perhaps. But all the core protocols of the web were done using government money and standardized as a public good. I don't see why social media protocols should not have some base level standard of interoperability, given their importance in society.

I don't begrudge FB and Twitter their success. But I think that, as with the previous generations of captains of industry, it is time that the winners take into account the public good.

Something like a living web standard, but for social media, would be nice.


The means advocated for here seem kind of strange, though. He correctly points out that the biggest advantage social media incumbents have is their users' data and network effects.

His "comcom" methodology consisting of "reverse engineering, scraping, bots, and other improvisational techniques", is not exactly something I want near my personal information because unlikely the old days you're not reverse engineering hardware schematics, you'd be scraping people's private information.


It's fine to scrape your private information under your own control.

Many people have the problem that their life's information is sitting in a proprietary silo, without a nice way to extract it. (Truth be told, Google does offer the "Takeout" thing, kudos. FB apparently does not.)


> FB apparently does not

At least as of a few years ago (~2015/2016) they did; when I deleted my own FB account I found an option that allowed me to download a zip archive of my photos and posts. No idea if they still support that now, or where they've hidden the option if so.


1) FB does have a takeout product

2) Scraping your data for your use might be fine, but what about the private data of friends and family that you have access to? Do you want grandma to be able to scrape all your posts/pictures/etc just because she wanted to take the "What kind of Harry Potter broomstick are you?" quiz?


For (2): Whatever you've once made available is available. You lose control once you make it available. You can ask nicely that certain things not be done with what you divulge, but you cannot enforce it.

So yes, as long as grandma has access to my pictures, it's fair use for her to collect them. Republishing them to a wider circle than originally posted is a different matter.

This is the reason I only post online things which I could repeat under oath, or see posted at the busiest crossing.


No, I think it's got to be something more than that. I have no idea how you'd do this from a legal perspective, but: a base level of "core functionality must interop". Social media federated by fiat.


I suspect interoperability is fundamentally at odds with ad-driven business models.

When your business runs on ads, you are essentially serving users a mixture of information they do want to see (content) and information they don't (ads). If your system must allow interoperability, then it's inevitable that a competitor will come along and separate the content from the ads to give users the experience they want, killing your business model as a consequence.
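In code, that "separation" is almost trivially easy, which is the point; a minimal sketch in TypeScript, using a hypothetical feed-item shape rather than any real platform's schema:

    // Once an interoperable client can read the feed, stripping ads is a
    // one-liner: drop everything the user didn't ask for.
    interface FeedItem {
      id: string;
      author: string;
      body: string;
      sponsored: boolean; // true for ads / promoted posts
    }

    const contentOnly = (feed: FeedItem[]): FeedItem[] =>
      feed.filter((item) => !item.sponsored);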


This is somewhat semantic, but I think it's more accurate to say that adversarial interop leverages network effects rather than subverting them. By enabling interop, you're expanding the network -- sometimes enormously so, if interop lets you bridge otherwise closed networks. What you're subverting is rent-seeking: the monopolistic exploitation by network owners.


The idea of giving random parties immunity from civil action for maximizing interoperability is downright dystopian. The direct effect would be something akin to an open-access version of Facebook's API that precludes lawsuits over another data leak like Cambridge Analytica.

The entire article seems to border on ignoring the idea that we have data in silos that we want to be protected by more than just user permission to access that data. Especially when users can give those permissions to various parties who undergo changes of ownership.

Open access is not as straightforward on the current web as it was before big data. This article did nothing to reassure me that those who believe in access have any sort of realistic plan for data privacy. Facebook is not an actor I want to trust with data, and as a result I put very little data on it. But I trust the random apps made through its API by borderline unknown developers far less. I don't think someone getting access to data I've restricted to friends or friends-of-friends, just because they played a quiz game, is something I want endlessly expanded.


I don't think the author is envisioning anything that compromises privacy and data protection. "Immunity for maximizing interoperability" means the right to ignore anti-interop ToS, not privacy.

There are many use cases where either there is no privacy concern in the first place (think public Facebook pages locked behind account creation) or the user explicitly consents and enters their login information (e.g. in alternative WhatsApp clients).



