
To make a generation ship work you have to build a self-contained ecology that is stable and self-repairing, inside mechanical and software systems that are fault tolerant and either extremely redundant or self-repairing, run by a political and social system that is also fault-tolerant and self-repairing.

We know how to do exactly zero of those things.


More to the point, the ship needs to be absolutely self-sufficient: it can't even use solar power, and it has no access to outside mass whatsoever. But if you have a ship like this, you could build an orbital habitat using the same technology, and it would be much, much easier to build, since it doesn't have to accelerate, can use solar power, and has access to the rest of the resources of a solar system.

If you have all of this, why would you go to the enormous extra effort of moving the habitat to a different solar system? Even if your civilization is so old that its star is a dim brown dwarf, that's still plenty of energy for day-to-day life.


There's a CRPG I've been meaning to play where this is basically the plot: there was a generation ship heading towards some planet or another, but the social and political structure on board broke down at some point. Now no one is actually in charge, the ship is getting run down, and they probably blew past their destination a hundred years ago, if they were even still on course at all.

I remember someone pointing out that a generation ship could be problematic because you have one generation who decides to launch the expedition but will never see the end, multiple generations who didn't choose this life and won't see the benefits, and then one generation who actually gets to the planet but might not even want to be there. Without some kind of cryogenic sleep or relativistic speeds, the whole thing might fall apart simply because most of the people involved "didn't sign up for this", yet have to toil away anyway for someone else's benefit.


What of the "just so" attitude of a child growing up? Everything is taken at face value, there is no comparison, only stories (unless you have a catalogue of 30EB of 8K earth footage or something to that effect for them to fawn over). They don't have the reference frame for other situations for a while, perhaps long enough to not be able to see things differently?

This makes me think of the multi-generational migrations north out of Africa. There's only so much that can be passed on orally without loss. After 20,000 years, the group in northern Siberia doesn't see any other way of living.


> I remember someone pointing out that a generation ship could be problematic because you have one generation who decides to launch this expedition but will never see the end, multiple generations who didn't choose this life and won't get to see the benefits, and then one generation who actually gets to the planet but might not even want to be there.

That isn't really different from the way things are now. We are, in fact, traveling through the galaxy for many generations and none of us signed up for it. We just happen to be on a largeish ship and have no destination.


> run by a political and social system that is also fault-tolerant and self-repairing

That's the point of the AI: it would largely replace that system.


Well, to be frank, we currently have such a ship, but we're doing quite a lot to disrupt its capability of sustaining human life.

Of course, even if we stopped doing that, we'd need to figure out how to visit another place if our ship is passing close by. That also seems to pose a problem: both Voyagers are barely out of the exhaust fumes of our ship's motor, and getting so far took ~40 years.


There's about one particle of dust per million cubic metres. c is about 300 million metres/second. So even at 0.5c that's still a lot of particle collisions per second, each having significant kinetic energy.

Basically it would be like flying through explosive sandpaper. Each dust particle would be reduced to plasma, which creates problems of its own.
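To put rough numbers on it, here's a back-of-envelope sketch in Python. The 1,000 m² frontal area and the 1e-13 kg grain mass are my assumptions for illustration, not figures from the comment above:

    import math

    # Back-of-envelope: dust impacts on a ship cruising at 0.5c.
    C = 3.0e8            # speed of light, m/s
    v = 0.5 * C          # cruise speed, m/s
    density = 1e-6       # dust grains per m^3 (one per million cubic metres)
    area = 1000.0        # assumed frontal area, m^2
    grain_mass = 1e-13   # assumed mass of one dust grain, kg

    # Volume swept per second, times grain density, gives the impact rate.
    impacts_per_second = density * v * area          # ~150,000 hits/s

    # Relativistic kinetic energy per grain: (gamma - 1) * m * c^2.
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)      # ~1.155 at 0.5c
    energy_per_grain = (gamma - 1.0) * grain_mass * C ** 2   # ~1.4 kJ

    print(f"{impacts_per_second:,.0f} impacts/s at {energy_per_grain:,.0f} J each")

At those assumed numbers, each grain hits with roughly the energy of a rifle bullet, and the hull absorbs a continuous ~200 MW of impacts: the "explosive sandpaper" in quantitative form.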

If you're accelerating there's also the Unruh Effect, which will raise the perceived temperature. By a lot.

There's no way to make this work with any kind of engineering we know about today.


The Unruh effect is theoretical; no evidence has ever been found that it's real. It exists as nothing more than a hypothetical mathematical model, one that is still debated by people who know enough to debate it effectively, and who disagree.

Which suggests that:

1. You can have black holes inside black holes.

2. Potentially each black hole is a universe - although some are much smaller and less interesting than others.


Your electricity bill is set by the grift of archaic fossil energy industries. And nuclear qualifies as a fossil industry because it's still essentially digging ancient stuff out of the ground, moving it around the world, and burning it in huge dirty machines constructed at vast expense.

There are better options, and at scale they're capable of producing electricity that is literally too cheap to meter.

The reasons they haven't been built at scale are purely political.

Today's AI is computing's equivalent of nuclear energy - clumsy, centralised, crude, industrial, extractive, and massively overhyped and overpriced.

Real AI would be the step after that - distributed, decentralised, reliable, collaborative, free in all senses of the word.


Distressing and revealing that Apple's WWDC gets thousands of comments, and this - which is just a little more important - gets four.

That model stops working as soon as you try to distribute software at scale. You need some kind of standardisation and baseline. Otherwise your customisations conflict with someone else's customisations, with consequences that vary from mildly annoying to catastrophic. And the whole idea of shared software collapses.

It would work if you had strict access control to each feature in a common code base. But Git hadn't been invented yet.

The other issue is performance. Compiled ST isn't particularly slow, but you lose the modifiability. Interpreted ST has a mixed profile - some features are slow, some are fast - but it's generally slow compared to C.

Today that doesn't matter so much, but it was a drawback at the time.

It's a seductive model and I totally get the appeal. But it's not quite as straightforwardly superior as it might appear to be.


> standardisation and baseline

1984 "Smalltalk-80 The Interactive Programming Environment" page 500

"At the outset of a project involving two or more programmers: Do assign a member of the team to be the version manager. … The responsibilities of the version manager consist of collecting and cataloging code files submitted by all members of the team, periodically building a new system image incorporating all submitted code files, and releasing the image for use by the team. The version manager stores the current release and all code files for that release in a central place, allowing team members read access, and disallowing write access for anyone except the version manager."

https://rmod-files.lille.inria.fr/FreeBooks/TheInteractivePr...

    ~
Later "ENVY/Developer"

https://www.google.com/books/edition/Mastering_ENVY_Develope...

https://archive.esug.org/HistoricalDocuments/TheSmalltalkRep...


As far as performance, the Alto team described the performance of Smalltalk as between "majestic" and "glacial". From my experience with the Alto, Smalltalk seemed unusably slow, and I'm amazed that they could actually accomplish anything with it.

With the ParcPlace flavour, teamwork with ENVY version control was pretty slick. And performance-wise, it was mostly compiled to bytecode, with some cunning tricks in place to make interactive debugging appear to work at the source level. Truly a great environment for exploratory development. Concrete example: while building a comms server I could inspect why message handling failed, patch the code, and rerun from the failure, a much faster cycle time than most other languages/environments.

The Xerox Dorado, an ECL beast evolved from the Alto, was considered the only machine to offer a good Smalltalk experience when Smalltalk-80 was initially released. The Xerox Dolphin, for example, was only about "0.1 Dorados" despite being a serious improvement on the Alto. The Apple Lisa was only about 0.05 Dorados.

That level of performance was only achieved by PCs when we got the 50MHz 486 (for purely interpreted Smalltalk-80 virtual machines, that is; with JITs, much slower computers could match the Dorado).


Not sure about the context or timing, but there was work put towards improving performance around the same time: https://archive.org/details/ungar-smalltalk (pub. Feb 1986)

For a similar experience, boot up Windows 3.1 on a machine with 1MB RAM.

> But Git hadn't been invented yet.

Git isn't the first or oldest DVCS. Smalltalk actually had multiple DVCSs, including ENVY (VA/Cincom), which dates back to the mid-90s, and Monticello (Squeak/Pharo), which dates to the early 2000s, both of which predate Git.


With regards to performance, Craig Chambers's thesis on the just-in-time compiler for Self shows pretty definitively that a language as flexible as Smalltalk (or more so) can be very performant without losing any dynamic properties. It certainly isn't an easy compiler/runtime to recreate for any given language, but it does exist.

It didn't exist at the time you're referring to, however, so in that context the point mostly stands. I believe there was some form of flexible compiled optimization for Smalltalk (Chambers's thesis references it as prior work), but I forget when that was and how commonly it was used.


Chambers's thesis was '92 if I'm looking correctly (https://www.cs.tufts.edu/~nr/cs257/archive/craig-chambers/th...); Ungar's thesis from '86 greatly predates it: https://archive.org/details/ungar-smalltalk

The first JIT was for Smalltalk, running on a 68020 system.

That sounds about right. (Just to clarify, what I was referring to with "did not exist at the time" was specifically Self's JIT and its new techniques.)

> That model stops working as soon as you try to distribute software at scale.

True, but lots of people do not need to distribute software at scale.

IIRC Pharo has a way of exporting the code in the current image to files for version control. That should solve the problem of multiple developers working on the same code: you can modify the system while it's running to try things, then commit the final result to version control.


Does Erlang have these same properties? From what I hear, it works great for distributed systems and modifying-while-running.

Erlang systems don't do so well when multiple people are working on them with different understandings of what's actually in production. (Especially if nobody actually knows.)

I don't know that you need 'strict access control' as proposed, but you do need coordination, and a source code repository is a way to manage coordination.

Erlang doesn't have a different mode for fixed vs loadable code, although you may need to write code differently in order to be able to hotload changes, and that could affect performance; some code naturally fits the idioms needed though.
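For a rough feel of the idiom, here's a loose Python analogy (importlib-based module reloading, not real Erlang hot loading; the handler module is hypothetical). In Erlang, the trick is to make the recursive call fully qualified, as in ?MODULE:loop(State), so the next iteration resolves to the newest loaded version of the module. The Python equivalent of that late binding is re-resolving the module attribute on every call:

    import importlib

    import handler  # hypothetical module defining handle(state, msg) -> new state

    state = {"connections": 0}

    def serve(msg):
        global state
        # Looking up handler.handle on every call is the analogue of Erlang's
        # fully qualified ?MODULE:loop(State): the next message runs the new
        # code, while the accumulated state survives in the caller.
        state = handler.handle(state, msg)

    # Later, after editing handler.py on the live system:
    importlib.reload(handler)  # the next serve(msg) call runs the new code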

I find it amazingly valuable to be able to change the program without having to lose the program state; but it requires different thinking, and it is easy to mess things up in exciting ways that are different than many other systems. It's amazing to fix a bug and deploy it in seconds into a live system with millions of active connections; it's also amazing when you try to fix a bug and break millions of active connections in seconds, but in a negative way, because now you and your users have to pay the costs of reestablishing all of those connections.

OTOH, the key issue is interchange of state and messages between versions, and that's an issue that comes up in almost all distributed systems that update without downtime, so most people have to deal with that issue anyway, just usually not within the much smaller scope of a single task.
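What dealing with it usually looks like, sketched minimally in Python (the version field and the addr-to-host/port schema change are invented purely for illustration): tag messages with a version and upgrade old shapes at the boundary, so senders on either version can coexist during the rollout.

    # Hypothetical schema change: v1 messages carried a single "addr" string,
    # v2 splits it into "host" and "port". Upgrade v1 messages at the boundary.
    def decode(msg: dict) -> dict:
        version = msg.get("version", 1)  # v1 senders predate the field
        if version == 1:
            host, _, port = msg.get("addr", "").partition(":")
            return {"version": 2, "host": host, "port": int(port) if port else None}
        return msg

    # Old and new senders can then share one receiver:
    assert decode({"addr": "10.0.0.1:80"}) == {"version": 2, "host": "10.0.0.1", "port": 80}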


> stops working as soon as you try to distribute software at scale.

I disagree – just make it read-only – but to illustrate, let me try to invert your argument.

You're saying that the Lisp/Smalltalk software creation model doesn't scale and only mass-produced software can.

Instead of the traditional car analogy, let me use clothing instead. Even more people use clothes than cars.

"The idea of hand-making bespoke clothing stops working as soon as you try to distribute clothes at scale. There are billions of people and they're all slightly different sizes and shapes. You can't customise clothes for them all!

"The only way we can clothe billions of people is if they all wear the same thing. One-offs worked when we only had a small population and only the rich got a choice. There was no other way. But since then, we invented elastic and zips and belts.

"This way we can give everyone a simple all-in-one coverall, a sort of jumpsuit, and using a few zips and elasticated bands and some straps to shorten or lengthen the arms and legs, it will fit everyone. At the worst we can make a few sizes and they will scale.

"It's all about scale, and we have to make sacrifices if we want something that scales efficiently."


Why the downvotes?!

It was the late 70s/early 80s, so a certain amount of twee whimsy was obligatory.

Especially not in non-English countries.

Icons make localisation much easier. In fact, flat web design has evolved a fairly standard set of icons for basic operations. Most people know what a hamburger menu and an 'x' in the top corner of a window do. Same for copy, share, and so on.

The problem with Liquid Glass is that it's making the background style more important than the foreground content. No one cares if buttons ripple if they can't see what they do, because icons themselves are less clear and harder to read.

So I don't know what the point of this is.

Unifying the look with Apple's least successful, least popular, most niche product seems like a bizarre decision. I'm guessing the plan is to start adding VisionPro features in other products, but without 3D displays the difference between 3D and 2D metaphors is too huge to bridge.

I really liked Aqua. It was attractive and it was very usable.

This is... I don't know. It seems like style over substance for the sake of it, with significant damage to both.


Democratic voters want those things. It's not at all obvious the party establishment does.

The tell is that when Republicans push through their policies, Democratic opposition is weak and ineffectual. Instead of ferocious opposition, the Dems send one of their famous sternly worded letters.

Since at least 2000 the party establishment has absolutely refused to do any of the things it could do to change this - including packing the Supreme Court, supporting and promoting grassroots activism between elections, using the filibuster, and so on.

Biden couldn't even get any of Trump's prosecutions over the line - despite televised evidence of insurrection, and treasonous mishandling of official state secrets (!)

However it's spun, there is a very obvious reluctance to challenge the extremes of Republicanism.

The party is far more likely to censure one of its non-centrists than its centrists, while the opposite is true of the Republicans.


The Democrats operate as if they're controlled opposition. It's like their donors pay them to blunt their base. They haven't accomplished anything legislatively this century beyond passing the 1993 Republican healthcare plan under Obama's name. They couldn't even raise the minimum wage.

In my experience this is dead on. People have short attention spans, but this has been happening the whole 21st century. In 2008 Obama won the primary despite the best efforts of leadership to nominate Clinton. They even scrambled the "superdelegates" (delegates who vote for the candidate chosen by senior leadership), hoping that even if Obama won more delegates, they could override the voters' choice.

Of course, they failed, and Democrats won 2 elections in a row running a candidate labeled a radical socialist. Obama became the only 21st century president to win the poplar vote twice, and the DNC has been trying to drag the party back into the 20th century ever since, blaming their own voters when it doesn't work.

It boggles my mind that they refused to even engage with the "undecided movement", which created a grassroots get-out-the-vote movement out of thin air. In swing states, no less.

The starkest contrast between the two parties is women's rights and, to a lesser extent, LGBTQ rights. Although I'm not even sure how true this is anymore, with so many politicians backing Cuomo, who resigned because an investigation found overwhelming evidence that he sexually harassed and assaulted female employees. And I'm pretty sure people like Chuck Schumer and other centrists view the LGBTQ community as a liability.


>"Obama became the only 21st century president to win the poplar vote twice"<

Amazing! And who won the pine, elm, and oak vote? [It's "popular", not "poplar"; a poplar is an effing tree!]


An oak tree's vote adds more to our democracy than your comment does to this conversation.

itsanaccount says "An oak tree's vote adds more to our democracy than your comment does to this conversation."

Now you're calling for the trees to vote! Have you no shame, sir? I assure those reading that there is no need to panic: no unregistered trees shall be allowed to vote, even in California, as long as Donald Trump is President! Simultaneously we extend our grief to all of those in CA whose registered and unregistered trees were slaughtered by the recent fires.

"I've seen thing you people wouldn't believe... forests on fire off the hills of Redmond.... I watched fire retardants glitter in the dark streaming in the skies over San Bernadino. All those votes will be lost in time, like tears in the rain...Time to go."

- parting words of a homeless anarchist who started the blaze.


> I'm pretty sure people like Chuck Schumer and other centrists view the LGBTQ community as a liability.

Why do you think he / “other centrists” hold that view?


Both are true. The end of the Fairness Doctrine normalised the psychotic distortions and lies pumped out by Fox. But the same machine that uses Fox also runs bot farms, astroturfing operations, and curated social media algorithms to normalise even more extreme right-wing POVs.

And here we are.


Fox is a cable network.

The Fairness Doctrine only applied to limited broadcast spectrum (radio and over-the-air TV), not to cable.

