
> one has to pay to enter the toilet.

I’m not sure what people were expecting when they implemented this policy. There are plenty of places to pee in the world; a restroom is a request that it happen in one particular place.


One note on this with regard to the “flat” design - the technical reasoning for it was that it decoupled the interface from the screen. Flat designs were all vector and could scale to any screen or interface size. This effectively unpinned Apple from fixed screen sizes (I’m genuinely not sure whether any two iPhone designs have shared the same pixel counts in the last 5-10 years) and allowed them to scale the interface to any size.
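To make that concrete, here’s a hypothetical sketch (mine, not anything from Apple’s codebase): a SwiftUI control built purely from vector shapes names no pixel counts at all, so the same code renders crisply at any size or screen density.

    import SwiftUI

    // A "flat" toggle built entirely from vector shapes - no bitmap
    // assets anywhere, so it renders crisply at whatever size and
    // screen scale the layout hands it.
    struct FlatToggle: View {
        var isOn: Bool
        var body: some View {
            Capsule() // resolution-independent vector shape
                .fill(isOn ? Color.green : Color.gray.opacity(0.4))
                .overlay(alignment: isOn ? .trailing : .leading) {
                    Circle().fill(Color.white).padding(2) // the "thumb"
                }
                .frame(width: 52, height: 32) // points, not pixels
        }
    }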

I’m not sure what I think about Liquid Glass, but I do agree with the premise that it’s being driven by the move towards AR and extending interfaces outside the phone/tablet.

I think another interesting tell here will be the 20th anniversary iPhone, which should be coming in 2027 - the iPhone X set the tone for Apple devices for the next decade (so far), and I’d expect to get a better idea of what Apple’s doing here when they show off that hardware.


I’ve seen skeuomorphic designs done with vector art; surely this can’t be the only (or real) reason.

Like most things, it was probably a combination of factors:

marketing (big new design), design-trend catch-up (Metro, Android), and all those other technical reasons (memory, textures, vector graphics, enabling easy dark mode), etc.

Just my guess, but making a dark mode (more easily) possible must have been a large factor too.


Apple has worked out a scaling system to deal with differing pixel counts: one just provides assets in @2x and @3x versions, and most designers create non-vector assets once at 3x and then downscale once. The system works remarkably well, given that we long ago reached the sweet spot of screen DPI.
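For anyone who hasn’t worked with it, the convention looks roughly like this - a simplified sketch ("icon" is a made-up asset name; the real lookup lives in UIKit and asset catalogs):

    import UIKit

    // Ship the same design at each scale factor:
    //   icon.png (1x), icon@2x.png (2x), icon@3x.png (3x)
    // UIImage(named:) picks whichever variant matches the
    // device's screen scale at load time.
    let icon = UIImage(named: "icon")

    // "Design once at 3x, downscale once" is just arithmetic on
    // pixel dimensions - a 90px-wide 3x asset becomes 60px at 2x:
    func scaledWidth(widthAt3x: CGFloat, toScale scale: CGFloat) -> CGFloat {
        widthAt3x / 3.0 * scale
    }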

I have a very open mind with respect to Liquid Glass.

I expect an "AR based UI" to somehow leverage depth of field and focus. Blur and translucency/transparency used to achieve that could be amazing.

I'm reminded of prior UIs that had Z depth. One of the iOS releases had parallax.

Remember that awesome demo repurposing two Wii controllers to do head tracking? It transformed the screen into a portal that you looked through: moving your head around changed your perspective.

I want that.
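That was Johnny Lee’s Wii Remote head-tracking demo, if I recall right. The heart of the portal effect is plain parallax geometry - here’s a toy sketch with made-up parameter names, nothing from the actual demo:

    import simd

    // A world-fixed layer sitting `depth` meters behind the glass
    // projects onto the screen at headOffset * depth / (distance + depth),
    // by similar triangles: far layers track the head almost 1:1,
    // while a layer on the screen plane (depth = 0) never moves.
    func parallaxShift(headOffset: SIMD2<Float>,  // head offset from screen center, meters
                       depth: Float,              // layer depth behind the screen, meters
                       viewerDistance: Float      // head-to-screen distance, meters
    ) -> SIMD2<Float> {
        headOffset * (depth / (viewerDistance + depth))
    }

Re-deriving each layer’s on-screen position from the tracked head every frame is what sells the window illusion.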

I just started watching the WWDC videos. So far I like what I see. I'm on board with stacked components; we'll see how it pans out. I love the idea of morphing UI elements, transitioning between list <-> menu bar; I really want this to succeed.

Mostly, I want less clutter. No matter the appearance or theme, I'm overwhelmed by all the icons, options, etc.

The age-old conundrum of balancing ease of use against lots of features. Having created UIs in anger, I'm no smarter than anyone else and don't have any ideas to offer.

Further, I'm apprehensive about voice (w/ GPT). Methinks this will become the best strategy for reducing visual clutter.

Being an old, I just hate talking to my computer. Though I accept it feels natural for others, like my son's generation.


> People do not want invasive glasses, even if they make them as small as normal glasses. I just don't see it becoming anything other than a niche product.

Wait, are you arguing that consumers will reject something that puts, say, a social media feed in front of their face 24hrs a day? That will allow them to just gaze at an internet site constantly without even having to think about it? That will allow them to have videos in their peripheral vision while they “concentrate” on something else?

AR headsets will not replace computers, they’ll replace phones.


Defector had a great piece on roughly this point: https://defector.com/how-will-the-golden-age-of-making-it-wo...

So’s Azure? https://downdetector.com/status/windows-azure/

This is where we get to learn about the one common system all of our “distributed cloud” systems rely on, isn’t it?


My gut says every cloud's outage reports spike when one goes down, from people misattributing the issue.

But I suppose there's always "something something BGP", though that feels less likely.


This is neat! I’m not fully through it yet, but just wanted to emphasize this:

> And understanding molecular motion is key for everything in biology, everything in biology is vibrating molecules underneath the surface!

Coming into bio as a programmer, this is the absolute sine qua non rule you need to internalize: there are no boundaries between systems, because everything is jiggling atoms. DNA encodes for genes, except the transcription process is heavily mediated by the physical environment and the physical constraints of accessing the DNA; RNA transcribes to amino acid strings, except it’s also a molecule, and so sometimes it folds into a structure and just does shit itself; proteins have a function, except sometimes they have many functions, because the “lock and key” metaphor isn’t wrong, except that when you’ve got a billion locks and your key’s kinda floppy, it’ll probably fit more than one. Nature plays with physical systems and will repurpose anything to do anything else - the informatics only take you so far; all the real action is vibrating molecules.


> Coming into bio as a programmer, this is the absolute sine qua non rule you need to internalize: there are no boundaries between systems, because everything is jiggling atoms.

(Similar background to yours.) Another sine qua non rule is that evolution created biology: it wasn't engineered like software, and it doesn't decompose like software. Evolution creates hairballs that don't respect traditional engineering boundaries and abstraction hierarchies.

From that, along with probabilistic molecular jiggling, we get biological systems that are quite difficult to understand, predict, and control.


It's a good start to realize that what underlies all scientific understanding is a set of simplified predictive models, and usually only statistical models at that.

What this means is that running an experiment in many fields is so difficult that replication is a real challenge. There are so MANY ways you can screw up, or you could just have a statistical fluke that screws you over. A tiny contamination or a seemingly irrelevant missed step will cause a failure. That's why the idea of journals composed of failed experiments just doesn't work: unstated experimental-process assumptions are legion. Sometimes an expert can look at the result and see what you've done wrong (like the bad contacts in "Electron Band Structure In Germanium, My Ass"), and often not even that. Sometimes there's something interesting in the failure, but 99% of the time it's just that your pitch is so bad you can't hit the strike zone. Do better!

The things that are easy to replicate (and usually they've been specifically designed that way, like Starbucks' over-roasted beans) have actually been reduced to engineering. They're not on the edge where scientists can get published. That way perverse-incentive madness lies.

Enjoy the controllability of inputs, the repeatability of bugs, the near-perfection of compilers and memory allocation, and the complete independence of variables while you can. Unless, that is, you like Rowhammer and voltage-glitch attacks.


If you’re looking for a hobby computer, Framework’s laptops are a lot of fun. There’s something about a machine that’s so intentionally designed to be opened up and tinkered with - it’s not my daily driver, but it’s my go-to for silly projects now.

Advertising is a fungus. You let it into any part of your product and it’ll be everywhere before you know it.

> I admit it is exciting to make something you truly believe is good and helpful.

I want to double down on this - I’m lucky enough to have worked at places where I truly believed the world would be a better place if we “won,” and not on the margins, and it really, really makes a difference in quality of life. I’ve worked at other places, too, and the cognitive drag of knowing that your skills and efforts - your ability to change the world - are at best being wasted is something you don’t truly feel until it’s gone.


I've wasted countless years on pursuits I thought were good but later determined to have been bad, and therefore deeply regretted. I don't wish this on anyone.

I've also wasted countless years on pursuits I still think were good but overall never truly helped make the world better. This was less bad and seems inevitable.


Yeah, I’ve got a couple of places on my résumé I don’t like to talk about anymore. Turns out an awful lot of things are bad for the world in the wrong hands.

Still, if I’m going to spend a third of my life on something - and, more importantly, if I’m going to be responsible for my efforts contributing to something - I’d prefer it be something I find value in. I’ll take the risk of being wrong - although I’m certainly looking at the world through less rose-tinted glasses than I used to.


I agree, and I'm convinced selling my own software is the only way I can do that - at least for me. I just need to put it all together now: all the skill sets I've honed for decades, and the insight I might have gleaned from what people need.

I recommend “The Dawn of Everything” by David Graeber and David Wengrow - they do a good job dispelling this myth.
