
> No source code control

30 years ago was 1992, and we certainly had source control a long time before that!

In fact in 1992 Sun Teamware was introduced, so we even had distributed source control, more than a decade before "git invented it".

CVS dates from 1986, RCS from 1982, and SCCS from 1972. I used all four of those at various points in history.

> No multi-monitor systems, frankly anything bigger than 80x25 and you were dreaming.

In 1993 (or might've been early 1994) I had two large monitors on my SPARCstation, probably at 1280×1024.



That's like saying that "we" had computers in 1951. The future is already here – it's just not evenly distributed.

Something existing is different from something being in widespread use.

When I was a kid in the 90s, I had a computer in my room that was entirely for my personal use. There was a pretty long stretch of time where most kids I encountered didn't even have access to a shared family PC, and much longer still before they had a computer of their own.


Had a Kaypro back in '82 that I used to create a neat umpire for a board war game. It had a markup language and could run things that let me get on ARPANET and run Kermit. Lots of time has passed, and programs used to be way more "efficient". And the workstations and mini-supers that followed shortly after had great graphics; it just wasn't a card so much as a whole system. SGIs and specialized graphics hardware such as Adage and the stuff from E&S. Lots happened before PCs.


I'm certainly not saying that nothing happened before PCs, only that when talking about the past, one cannot say "we had X" based simply on whether X existed somewhere in the world; one must also consider how widespread the use of X was at the time.


There were gobs of Suns and SGIs in the '80s, just not at home. A whole lot of Unix work was done before that on PDP-11s and VAXen. Had to dial in or stay late to hack :-).


Indeed, I'm not disputing that.

However, you still need to mind the context. For instance, computers existed in 1952, but very few institutions had access to one, so saying "we had computers in 1952" is misleading. Most people learning to program a computer in 1952 wouldn't actually have regular access to one; they'd do it on paper, and some of their programs might actually get shipped off to be run on a real machine. So it would be entirely unreasonable to say "We, the people learning to program in 1952, had computers"; at most one may say "We, who were learning to program in 1952, had occasional opportunity to have our programs run on a computer".

Yes, there was lots of nice hardware in the 80s, yet LOTS of people working professionally in the field would be using something cheaper and/or older. In the context of my original post, I took issue with OP writing that "we had version control": sure, version control existed, but it was not so widely used throughout the industry that it's reasonable to say that we had it; some lucky few did.


The topic of the thread was software developer experience as a professional career, not at home.

Sure, in the early 90s I didn't have a multi-CPU, multi-monitor workstation at home; that was a $20K+ setup at work.

But at work, that was very common.


Maybe GP was in a less developed or wealthy area than you.

Often when talking to Americans about the 90s they're surprised, partly because tech was available here later and partly because my family just didn't have enough money.


Dude is literally talking about Atari. It’s surprising they didn’t have better source control by 1992; Apple certainly had centralized source control and source databases by that point. But Atari was basically out of gas by then.



