
As I implied in an earlier comment, your conviction (if you're wrong about the inevitability of the direction) may be one of the things that leads it in that direction.

Here's my historical take: in the 1960s and 1970s, computation in general was viewed as a sinister, authoritarian thing. Many people assumed it was going to be that way, and a small minority recognised that it also had the potential to empower and grant autonomy to a wider class of people. These were the advocates of the personal computer revolution -- the idea of "computer lib", whereby the tools of control would be inverted and provided to the people at large.

You can argue about whether that strategy was a success or not, but the group that was largely irrelevant to that fight were the people who decided not to get involved, or who tried (though not very hard) to impede the development of computation in general.

To bend the trajectory of AI in general involves understanding and redeploying it, rather than rejecting it. It also involves engaging. If it's anything like the last few times, the group that is simultaneously exploring the technology and attempting to provide agency and autonomy to the maximum number of people will be smaller than both of the others: those using new tech to exploit people or maintain an unequal status quo, and those who have good intentions but throw up their hands at the possibility of using their skills to seize control of the means that could provide for a better future.




> in the 1960s and 1970s, computation in general was viewed as a sinister, authoritarian thing.

And that view was correct. We now live in surveillance states far more pervasive than Stalin's or East Germany's.


Structural incentives explain the computer trajectory. While computers were purely in the academic realm they were a force of empowerment, but that ended when economic incentives became the main driver. AI has speedrun the academic stage—if it ever existed—and is now speedrunning the enshittification stage.

But there is very little you or I can do about it except choose not to partake.


At least in my experience, this is ahistorical. Personal computing in the 1970s and 1980s lived outside of academia, as did bulletin boards. The productive, creative, and empowering elements of the Internet and the Web were subversive actions that existed -- and in some cases were barely tolerated -- within its academic usage.

You say "there is very little you or I can do about it". Even if you don't listen to me, perhaps you might listen to the coiner of the term "enshittification"? https://archive.is/CqA8w


Huh? Computing in the 70’s and 80’s was almost entirely driven by academia - even being run by schools.


I agree with the author who said that is ahistorical...at least from my perspective, and that of the people I grew up with. I grew up with computers in the 70's and 80's, and while you may be thinking of centralized computing (minicomputers and mainframes), the personal computing revolution was widely distributed, not centralized in academia. BBSes, swap meets, user groups, even the corner Radio Shack were where 'computing' was vibrant and active. (And the magazines...SO many 'zines!)

We may be talking past each other, but my experience of computing in the 70's and 80's was definitely not academic.


That didn’t start to become common until the early/mid 80’s.

Did it exist a little? Of course. But it was dwarfed by the other stuff going on. I suspect your (and a lot of other HN) experience is going to bias toward the hobbyist side, though, as does mine. I only found out much later about the much larger stuff going on at the same time.

Almost all the early networking stuff (UUCP, pre-Internet internets like ARPANET, early Usenet, Gopher, even HTML and the WWW, etc.) came out of academic institutions or related organizations.

Often with military grants/contracts. Sometimes with purely commercial contracts, but even those were almost always for some gov't project. The amount of work on basics like sorting algorithms that grew out of gov't research is mind-boggling, for instance.

There is a lot of well documented history on this.

Then PCs and halfway decent modems became available (2400 baud+), and things changed very rapidly.

Mid 80’s, BBS’s started sprouting like weeds. There were a few before then, but the truly hobbyist ones were very niche.

Then even more so with commercial services like Prodigy, then AOL, then actual ISPs, etc.


I think the compromise position here is to concentrate on the 1980s, and acknowledge that there was a lot of networking tech going on in academia in the 1970s.

However, in context, what I was trying to convey was that the personal computing revolution took place outside of academia. Generally, that lineage started in the early 1970s, with the homebrew movement, and took off with the Apple II in the United States in 1977. This is also well-documented, but a different branch, and definitely more concerned with the idea of computers as providing autonomy.



