
>In other words, I believe the reason this code is hard to read for many who are used to more "normal" C styles is because of its density; in just a few dozen lines, it creates many abstractions and uses them immediately, something which would otherwise be many many pages long in a more normal style.

I also spent some time with the Incunabulum and came away with a slightly different conclusion. I only really grokked it after going through and renaming the variables to colorful emojis (https://imgur.com/F27ZNfk). That made me think that, in addition to informational density, a big part of the initial difficulty is orthographic. IMO two features of our current programming culture make this coding style hard to read: (1) Most modern languages discourage or forbid symbol/emoji characters in identifiers, even though their highly distinctive shapes would make this kind of code much more readable, just as they do in mathematical notation (there's a reason APL looked the way it did!). (2) When it comes to color, most editors default to "syntax highlighting" (each different syntactic category gets a different color), whereas what's often most helpful (esp. here) is token-based highlighting, where each distinct identifier (generally) gets its own color. (This was pioneered, afaik, by Sublime Text, which calls it "hashed syntax highlighting"; it is sometimes called "semantic highlighting", though that term was later co-opted by VSCode to mean something quite different.) Once I renamed the identifiers so it became easier to recognize them at a glance by shape and/or color, the whole thing became much easier to follow.


I've experimented a few times with coloring my variables explicitly (using a prefix like R for red, hiding the letters, etc) after playing with colorforth. I agree getting color helps with small shapes, but I think the colors shouldn't be arbitrary: every character Arthur types is a choice about how the code should look, what he is going to need, and what he needs to see at the same time, and it seems like a missed opportunity to turn an important decision about what something is named (or colored) over to a random number generator.

> (1) Most modern languages discourage or forbid symbol/emoji characters in identifiers

> (2) When it comes to color,

Call me a boomer if you wish, but if you can't grasp the value of having your code readable on a 24-row by 80-column, black-and-white screen, you are not a software developer. You are not even a programmer: at most, you are a prompt typist for ChatGPT.


While I agree that, if the function at hand can't fit in a 25x80 window, it most likely should be broken into smaller functions, there are kinder ways to say that.

I also joke God made the VT100 with 80 columns for a reason.


... For the reason that IBM made their 1928 card with 80 columns, in an attempt to increase the storage efficiency of Hollerith’s 45-column card without increasing its size?

That said, ~60 characters per printed line has been the typographer’s recommendation for much longer. Which is why typographers dislike Times and derivatives when used on normal-sized single-column pages, as that typeface was made to squeeze more characters into narrow newspaper columns (it’s in the name).


The fact that the claim is wrong on multiple levels (IBM punchcards, VT100 did 132 columns as well) is part of the fun.

23x75 to allow for a status bar and the possibility that the code may be quoted in an email. Also, it’s green on black. Or possibly amber.

And yet I still have a utility named "~/bin/\uE43E"


\uExxx is in the private use area. What is it?

Fun fact: both HN and (no doubt not coincidentally) paulgraham.com ship no DOCTYPE and are rendered in Quirks Mode. You can see this in devtools by evaluating `document.compatMode`.

I ran into this because I have a little userscript I inject everywhere that helps me copy text in hovered elements (not just links). It does:

[...document.querySelectorAll(":hover")].at(-1)

to grab the innermost hovered element. It works fine on standards-mode pages, but it's flaky on quirks-mode pages.

Question: is there any straightforward & clean way as a user to force a quirks-mode page to render in standards mode? I know you can do something like:

document.write("<!DOCTYPE html>" + document.documentElement.innerHTML);

but that blows away the entire document & introduces a ton of problems. Is there a cleaner trick?


I wish `dang` would take some time to go through the website and make some usability updates. HN still uses a font-size value that usually renders to 12px by default as well, making it look insanely small on most modern devices, etc.

At a quick glance, it looks like they're still using the same CSS that was made public ~13 years ago:

https://github.com/wting/hackernews/blob/5a3296417d23d1ecc90...


I trust dang a lot; but in general I am scared of websites making "usability updates."

Modern design trends are going backwards. Tons of spacing around everything, super low information density, designed for touch first (i.e. giant hit-targets), and tons of other things that were considered bad practice just ten years ago.

So HN has its quirks, but I'd take what it is over what most 20-something designers would turn it into. See old.reddit vs. new.reddit, or even their app.


There's nothing trendy about making sure HN renders like a page from 15 years ago should. Relative font sizes are just so basic they should count as a bug fix and not "usability update".

Overall I would agree but I also agree with the above commenter. It’s ok for mobile but on a desktop view it’s very small when viewed at anything larger than 1080p. Zoom works but doesn’t stick. A simple change to the font size in css will make it legible for mobile, desktop, terminal, or space… font-size:2vw or something that scales.

It’s not ok for mobile. Misclicks all around if you don’t first pinch zoom to what you are trying to click.

Indeed, the vast majority of things I've flagged or hidden have been the accidental result of skipping that extra step of zooming.

> Zoom works but doesn’t stick.

perhaps try using a user agent that remembers your settings? e.g. firefox


Perhaps don't recommend workarounds for a failure to use standards in the first place.

Setting aside the relative merits of 12pt vs 16pt font, websites ought to respect the user's browser settings by using "rem", but HN (mostly[1]) ignores this.

To test, try setting your browser's font size larger or smaller and note which websites update and which do not. And besides helping to support different user preferences, it's very useful for accessibility.

[1] After testing, it looks like the "Reply" and "Help" links respect large browser font sizes.


Side note: pt != px. 16px == 12pt.

You are correct, it should have been "px".

Please don’t. HN has just the right information density with its small default font size. In most browsers it is adjustable. And you can pinch-zoom if you’re having trouble hitting the right link.

None of the ”content needs white space and large fonts to breathe“ stuff or having to click to see a reply like on other sites. That just complicates interactions.

And I am posting this on an iPhone SE while my sight has started to degrade from age.


Yeah, I'm really asking for tons of whitespace and everything to breathe sooooo much by asking for the default font size to be a browser default (16px) and updated to match most modern display resolutions in 2025, not 2006 when it was created.

HN is the only site I have to increase the zoom level, and others below are doing the same thing as me. But it must be us with the issues. Obviously PG knew best in 2006 for decades to come.


On the flipside, HN is the only site I don't have to zoom out of to keep it comfortable. Most sit at 90% with a rare few at 80%.

16px is just massive.


Sounds like your display scaling is a little out of whack?

Yeah, this is like keeping a sound system equalized for one album and asserting that modern mastering is always badly equalized. Tune the system to the standard, and adjust for the oddball until it's remastered.

Except we all know what happened to the "standard" with the Loudness War.

I'm not a fan of extreme compression and limiting, but doing so in a multiband fashion (as occurs due to the loudness war) actually does result in more consistent EQ from album to album, label to label, genre to genre, etc., which virtually eliminates the need to adjust EQ at playback time between each post-war selection.

You're obviously being sarcastic, but I don't think that it's a given that "those are old font-size defaults" means "those are bad font-size defaults." I like the default HN size. There's no reason that my preference should override yours, but neither is there any reason that yours should override mine, and I think "that's how the other sites are" intentionally doesn't describe the HN culture, so it need not describe the HN HTML.

On mobile at least, I find that I can frequently zoom in, but can almost never zoom out, so smaller text allows for more accessibility than bigger text.

Browser (and OS) zoom settings are for accessibility; use that to zoom out if you've got the eyes for it. Pinching is more about exploring something not expected to be readily seen (and undersized touch targets).

Don't do this.

I agree, don't set the default font size to ~12px equiv in 2025.

[flagged]


Do you think that "Don't do this" as a reply comment is following the spirit of the guidelines? It doesn't seem very thoughtful or substantive to me.

Content does need white space.

HN has a good amount of white space. Much more would be too much, much less would be not enough.


No kidding. I've set the zoom level so long ago that I never noticed, but if I reset it on HN the text letters use about 2mm of width in my standard HD, 21" display.

> but if I reset it on HN the text letters use about 2mm of width in my standard HD, 21" display.

1920x1080 24" screen here, .274mm pitch which is just about 100dpi. Standard text size in HN is also about 2mm across, measured by the simple method of holding a ruler up to the screen and guessing.

If you can't read this, you maybe need to get your eyes checked. It's likely you need reading glasses. The need for reading glasses kind of crept up on me because I either work on kind of Landrover-engine-scale components, or grain-of-sugar-scale components, the latter viewed down a binocular microscope on my SMD rework bench and the former big enough to see quite easily ;-)


Shameless plug: I made this userstyle to make HN comfortable to handle both on desktop and mobile. Minimal changes (font size, triangles, tiny bits of color), makes a huge difference, especially on a mobile screen.

https://userstyles.world/style/9931/


Thanks for that, it works well, and I like the font choice! Though personally I found the font-weight a bit light and changed it to 400.

> HN still uses a font-size value that usually renders to 12px by default as well, making it look insanely small on most modern devices, etc.

On what devices (or browsers) does it render "insanely small" for you? CSS pixels are not physical pixels: they're scaled to 1/96th of an inch on desktop computers, and for smartphones etc. the scaling takes into account the shorter typical distance between your eyes and the screen (to make the angular size roughly the same), so one CSS pixel can span multiple physical pixels on a high-PPI display. Font size specified in px should look the same on various devices. HN font size feels the same for me on my 32" 4k display (137 PPI), my 24" display with 94 PPI, and on my smartphone (416 PPI).


On my MacBook it's not "insanely small", but I zoom to 120% for a much better experience. I can read it just fine at the default.

On my standard 1080p screen I gotta set it to 200% zoom to be comfortable. Still LOTS of content on the screen and no space wasted.

> At quick glance, it looks like they're still using the same CSS that was made public ~13 years ago:

It has been changed since then for sure though. A couple of years ago the mobile experience was way worse than what it is today, so something has clearly changed. I think some infamous "non-wrapping inline code" bug in the CSS was also fixed, but I can't remember if that was months, years or decades ago.

On another note, they're very receptive to emails, and if you have specific things you want fixed, and maybe even ideas on how to do it in a good and proper way, you can email them (hn@ycombinator.com) and they'll respond relatively fast, either with a "thanks, good idea" or a "probably not, here's why". That has been my experience at least.


I hesitate to want any changes, but I could maybe get behind dynamic font sizing. Maybe.

On mobile it’s fine, on Mac with a Retina display it’s fine; the only one where it isn’t is a 4K display rendering at native resolution - for that, I have my browser set to 110% zoom, which is perfect for me.

So I have a workaround that’s trivial, but I can see the benefit of not needing to do that.


The font size is perfect for me, and I hope it doesn’t get a “usability update”.

“I don’t see any reason to accommodate the needs of others because I’m just fine”

I bet 99.9% of mobile users' hidden posts are accidentally hidden

12 px (13.333 px when in the adapted layout) is a little small - and that's a perfectly valid argument without trying to argue we should abandon absolute sized fonts in favor of feels.

There is no such thing as a reasonable default size if we stop calibrating to physical dimensions. If you choose to use your phone at a scaling where what is supposed to be 1" is 0.75" then that's on you, not on the website to up the font size for everyone.


I find it exactly the right size on both PC and phone.

There's a trend to make fonts bigger but I never understood why. Do people really have trouble reading it?

I prefer seeing more information at the same time, when I used Discord (on PC), I even switched to IRC mode and made the font smaller so that more text would fit.


I'm assuming you have a rather low-resolution display? On a 27" 4k display, scaled to 150%, the font is quite tiny, to the point where the textarea I'm currently typing this in (which uses the browser's default font size) is about 3 times the perceived size of the HN comments themselves.

Agreed. I'm on an Apple Thunderbolt Display (2560x1440) and I'm also scaled up to 150%.

I'm not asking for some major, crazy redesign. 16px is the browser default and most websites aren't using tiny font sizes like 12px any longer.

The only reason HN is using it is because `pg` made it that in 2006, at a time when it was normal and made sense.


Yup, and these days we have relative units in CSS (em, rem) such that we no longer need to hardcode pixels, so everyone wins. That way people get usability according to the browser's defaults, which makes the whole thing user-configurable.

1920x1080 and 24 inches

Maybe the issue is not scaling according to DPI?

OTOH, people with 30+ inch screens probably sit a bit further away to be able to see everything without moving their head, so it makes sense that even sites which take DPI into account use larger fonts, because it's not really about how large something is physically on the screen but about the angular size relative to the eye.


Yeah, one of the other cousin comments mentions 36 inches away. I don't think they realize just how much of an outlier they are. Of course you have to make everything huge when your screen is so much further away than normal.

I have HN zoomed to 150% on my screens that are between 32 and 36 inches from my eyeballs when sitting upright at my desk.

I don't really have to do the same elsewhere, so I think the 12px font might be just a bit too small for modern 4k devices.


I'm low vision and I have to zoom to 175% on HN to read comfortably, this is basically the only site I do to this extreme.

I have mild vision issues and have to blow up the default font size quite a bit to read comfortably. Everyone has different eyes, and vision can change a lot with age.

Even better: it scales nicely with the browser’s zoom setting.

Text size is easily fixed in your browser with the zoom setting. Chrome will remember the level you use on a per site basis if you let it.

I'm sure they accept PRs, although it can be tricky to evaluate the effect a CSS change will have on a broad range of devices.

The text looks perfectly normal-sized on my laptop.

Really? I find the font very nice on my Pixel XL. It doesn't take too much space unlike all other modern websites.

A uBlock filter can do it: `||news.ycombinator.com/*$replace=/<html/<!DOCTYPE html><html/`

You could also use Tampermonkey to do that, and also to perform the same function as the OP's userscript.

There is a better option, but generally the answer is "no"; the best solution would be for WHATWG to define document.compatMode as a writable property instead of a read-only one.

The better option is to create and hold a reference to the old nodes (as easy as `var old = document.documentElement`) and then after blowing everything away with document.write (with an empty* html element; don't serialize the whole tree), re-insert them under the new document.documentElement.

* Note that your approach doesn't preserve the attributes on the html element; you can fix this either by pro-actively removing the child nodes before the document.write call and relying on document.documentElement.outerHTML to serialize the attributes just as in the original, or by iterating through the old element's attributes and re-setting them one-by-one.
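
For what it's worth, here is a rough sketch of that approach as it might look in a userscript. It simply follows the description above; whether the write trick actually flips compatMode can vary by browser, so treat it as an experiment rather than a guaranteed fix:

  // Keep references to the original <html> element and its children.
  const oldHtml = document.documentElement;
  const oldChildren = [...oldHtml.childNodes];

  // Rewrite the document with a doctype and an *empty* <html> element.
  document.open();
  document.write("<!DOCTYPE html><html></html>");
  document.close();

  // Drop the parser-created empty <head>/<body>, restore the original
  // <html> attributes, then move the old nodes under the new root.
  const newHtml = document.documentElement;
  newHtml.replaceChildren();
  for (const attr of Array.from(oldHtml.attributes)) {
    newHtml.setAttribute(attr.name, attr.value);
  }
  newHtml.append(...oldChildren);

  console.log(document.compatMode); // "CSS1Compat" if it took effect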


On that subject, I would be fine if the browser always rendered in standards mode, or offered a user configuration option to do so.

No need to have the default be compatible with a dead browser.

Further thoughts: I just read the MDN quirks page and perhaps I will start shipping Content-Type: application/xhtml+xml, as I don't really like putting the doctype in. It is the one screwball tag and requires special-casing in my otherwise elegant HTML output engine.


Still possible in VSCode through somewhat hackish methods (esp. arbitrary CSS injection via createTextEditorDecorationType). Here are some quick screenshots of random JS/Rust examples in my installation: https://imgur.com/a/LUZN5bl


This is one of those things where both extremes of madness and genius wrap around to infinity and meet again.


Honestly, it looks like a ransom request letter! :D


Saw this comment and couldn't figure out what it implied so I clicked on the link.

Now I see it definitely made sense.


Has anyone had success getting a coding agent to use an IDE's built-in refactoring tools via MCP, especially for things like project-wide rename? Last time I looked into this, the agents I tried just did regex find/replace across the repo, which feels both error-prone and wasteful of tokens. I haven't revisited recently so I'm curious what's possible now.


That's interesting, and I haven't, but as long as the IDE has an API for the refactoring action, giving an agent access to it as a tool should be pretty straightforward. Great idea.
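
Something like this, perhaps. A minimal sketch, assuming the MCP TypeScript SDK's McpServer/tool API as documented; the localhost "rename" endpoint is purely hypothetical, standing in for whatever rename API the IDE actually exposes:

  import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
  import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
  import { z } from "zod";

  const server = new McpServer({ name: "ide-refactor", version: "0.1.0" });

  // Expose the IDE's project-wide rename as a single MCP tool.
  server.tool(
    "rename_symbol",
    {
      file: z.string(),     // file containing the symbol
      line: z.number(),     // 1-based position of the symbol
      column: z.number(),
      newName: z.string(),
    },
    async ({ file, line, column, newName }) => {
      // Hypothetical IDE-side endpoint; swap in the real refactoring API here.
      const res = await fetch("http://localhost:8090/rename", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ file, line, column, newName }),
      });
      return { content: [{ type: "text", text: await res.text() }] };
    }
  );

  await server.connect(new StdioServerTransport());

The nice part is that the agent then gets one semantically safe rename call instead of a repo-wide regex pass.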


Serena MCP does this approach IIRC


Please consider making the UI respect the user's custom text scaling settings for accessibility. I'm not referring to DPI scaling but the TextScaleFactor value at HKCU\Software\Microsoft\Accessibility (see [1][2]) that users can set in Ease of Access > Display > Make text bigger.

(Failing that, adding basic support for scaling text or UI via ctrl+plus/minus would be a huge improvement!)

With the exception of Chromium/Chrome [3], this has been a persistent issue with Windows desktop apps from Google (most of these also use hard-coded control sizes, making the problem worse).

[1] https://learn.microsoft.com/en-us/windows/apps/design/input/...

[2] https://learn.microsoft.com/en-us/uwp/api/windows.ui.viewman...

[3] https://issues.chromium.org/issues/40586200


I'm split with this. If it helps other people, then I'm all for it. But speaking as someone who is legally blind and makes extensive use of these settings, Windows 10 accessibility drives me mad. I'm waiting for fractional scaling to improve for Linux so I can make the switch.

The problem with Make text bigger and Make everything bigger is they apply to every single application that supports them. Let's say I have two applications: A is comfortable enough to see and B isn't. If I change either of these settings to help me use B, A could now be a problem because it can take up too much screen real estate, which makes it unusable for a different reason.

This doesn't sound like much of a problem until you have 5 or more applications you're trying to balance via these two settings. In reality, it's more complex than I'm describing because I may need to change both settings to help with a new application, which then means I have to continuously test every other application I use to make sure they're all somewhat comfortable enough to use.

If an application I use updates to include support for these settings, I have to go through all this unplanned work again to try and make everything usable once more. It's frustrating.

I know people make fun of Electron, but one major plus point for me is I have per application scaling when using it, and so it gives me better accessibility than Windows does by far.

> (Failing that, adding basic support for scaling text or UI via ctrl+plus/minus would be a huge improvement!)

I'd consider this to be a better option.


Try Fedora with KDE. It has fractional scaling, per display.

I set my laptop (1920x1080) to 120%, effectively making it 1600x900 but with things at a very good physical size. I set my external panel (2560x1440) to 160%, effectively making it 1600x900 also. KDE even visualizes the two panels as being the same size. On top of these basic DPI settings, you can then tweak font/text even further. It's quite amazing. Windows cannot do custom DPI per monitor, only a single custom DPI that gets applied to all monitors.

If you do go down the fractional scaling rabbit hole, make sure that whatever values you pick, both the height and width end up without any fractions after applying your custom DPI... that eliminates all blur. In my example above, 2560/1.6 and 1440/1.6 give nice round numbers, even though operating systems typically only offer 100/125/150/175/200 etc.

I built a small console app for myself that takes the resolution and tests all increments of 1% to see which scaling values give resolutions that don't end in fractions. So it tells me which effective resolutions I will get at which % settings. It's awesome and made it easy to give my laptop and external display the same amount of space (or lines of code) on screen, even though they are different physical sizes.
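
The core of that calculation is tiny. Here's a rough Node.js equivalent (the 2560x1440 input is just an example):

  // List the scale percentages that divide both dimensions into whole pixels,
  // i.e. the "clean" effective resolutions described above.
  function cleanScales(width, height, maxPercent = 300) {
    const out = [];
    for (let pct = 100; pct <= maxPercent; pct++) {
      const w = (width * 100) / pct;
      const h = (height * 100) / pct;
      if (Number.isInteger(w) && Number.isInteger(h)) {
        out.push(`${w}x${h} at ${pct}%`);
      }
    }
    return out;
  }

  console.log(cleanScales(2560, 1440)); // includes "1600x900 at 160%"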


> Windows cannot do custom dpi per monitor, only a single custom dpi that gets applied to all monitors.

This is wrong. Windows has supported per-monitor DPI since Windows 8 and has had an improved API since Windows 10. I find it the only good implementation among desktop OSes. It is the only one that guarantees that font rendering aligns with the pixel grid.

Many old apps do not support this API though. It is opt-in, and while there is a hybrid mode to let Windows scale fonts and Win32 components via API hooks, without implementing the DPI-change callback most apps turn into a blurry mess.

Usually browsers have the gold-standard implementation of those callbacks, hence why Electron is used everywhere.


Brother I'm looking right at it. I cannot set one monitor to 120% and another to 160% (both are custom values), like on KDE. If I use a custom setting it gets applied to both monitors, in fact it gets grayed out for some reason - the values don't even show properly. Only a reset button available that logs you out to reset it to 100%.

If I want to set them to different scaling factors, I have to use one of the values from the drop downs (100/125/150/175/200%), which is not what I want.


You have literally said this:

> Windows cannot do custom dpi per monitor, only a single custom dpi that gets applied to all monitors.

Here are all of my monitors at different DPIs: https://imgur.com/a/q3z2P1E . They don't have a "single DPI" that gets applied to all of them. The custom DPI setting is for changing all base system DPI.

> I cannot set one monitor to 120% and another to 160% (both are custom values), like on KDE.

Okay you're unhappy with the granularity. Yes Windows uses 25% granularity.

I don't know if this will work, but you can probably do a combination of changing the base DPI and then calculating the 25% steps from there. So you can set the base DPI to something like 120 (which is 125%) and then set the other monitor to 125%, which gives 156%.

I think the base DPI is stored in this registry key:

HKEY_CURRENT_USER\Control Panel\Desktop\WindowMetrics\AppliedDPI

It is a DWORD value


Thanks for the detailed response. Do you happen to know if this is a recent change in Fedora/KDE? I tried somewhat recently, although I can't remember quite when that was. Gnome had experimental support for fractional scaling at the time but it wasn't good enough to switch to.

> Windows cannot do custom dpi per monitor, only a single custom dpi that gets applied to all monitors.

Yeah, support for custom DPI in general isn't great. I've been using https://www.majorgeeks.com/files/details/windows_10_dpi_fix.... for years to at least partially help.

Edit: I think I answered my own question about how recent the change might have been: https://blogs.kde.org/2024/12/14/this-week-in-plasma-better-...

This seems to be just after I last tried. I'll give it another go, thanks BatteryMountain!


As a word of warning, it is still not 100% perfect. I've noticed that on my laptop, when the Zed editor is maximized, there is a tiny gap between it and the panel at the bottom. I think this happens when an app, even if it supports fractional scaling in general, can't handle a logical window size that is not a whole number. To be fair, this is one of the only apps I've really had any scaling issues with lately, and it is just a minor visual annoyance. The Linux DPI scaling story is finally pretty solid.

Also, many apps (including Electron/Chromium apps) will still run under XWayland when using a Wayland session by default, because there are still a handful of small issues and omissions in their Wayland drivers. (It's pretty minor for Electron/Chromium so you can opt to force it to use native Wayland if you want.) In case of XWayland apps, you'll have to choose between allowing X11 apps to scale themselves (like the old days) or having the compositor scale them (the scaling will be right, even across displays, but it will appear blurry at scales higher than 1x.) I still recommend the Wayland session overall; it gives a much more solid scaling experience, especially with multiple monitors.


> In case of XWayland apps, you'll have to choose between allowing X11 apps to scale themselves (like the old days) or having the compositor scale them (the scaling will be right, even across displays, but it will appear blurry at scales higher than 1x.) I still recommend the Wayland session overall; it gives a much more solid scaling experience, especially with multiple monitors.

I'm wondering if this was the problem I was running into before – it sounds eerily familiar. I never got far enough to explore individual apps outside of preinstalled ones because I couldn't get comfortable enough. I appreciate your response as I wasn't aware of the different session types.


Yeah, it probably has something to do with this. In X11 sessions, the display server does not typically handle scaling. Instead, the desktop environments provide scaling preferences to UI toolkits that then do the scaling themselves. In Wayland, the display server does handle scaling.

In both X11 and Wayland, you should usually see most applications following your scaling preferences nowadays. In Wayland sessions, you can ensure that applications always appear at the correct size, though at the cost of "legacy" applications appearing blurry. This behavior is configured in the Display Settings in KDE Plasma.

Also possibly useful: if you like the KDE Plasma session, it has a built-in magnifier; just hold Ctrl+Meta and use the scroll wheel.


> Yeah, it probably has something to do with this. In X11 sessions, the display server does not typically handle scaling. Instead, the desktop environments provide scaling preferences to UI toolkits that then do the scaling themselves. In Wayland, the display server does handle scaling.

Presumably this leads to a more unified scaling experience. This was one thing I was concerned about before, as it didn't seem that way. That's a solid improvement on its own.

> Also possibly useful: if you like the KDE Plasma session, it has a built-in magnifier; just hold Ctrl+Meta and use the scroll wheel.

This is useful yes, along with the rest of your comments. Thanks for your help.


That's why you need to calculate which "eventual resolutions" divide down to values without fractions after your scaling has been applied.

So if you take a 2560x1440 panel, 160%/1.6 scaling factor will give you 1600x900, hence there won't be any artifacts. Between 100% and 200% there are maybe 5 combinations that will give you clean resolutions.

As an example:

Enter monitor Width (1920):
Enter monitor Height (1080):

1920x1080 at 100%
1600x900 at 120%
1536x864 at 125%
1280x720 at 150%
1200x675 at 160%
960x540 at 200%
800x450 at 240%
768x432 at 250%
640x360 at 300%

Anything besides these values WILL give you artifacts at some level.


Curious about the GNOME fractional scaling issues you experience.

I currently have the experimental feature enabled at 150% scale for a laptop screen at 2560x1600 resolution. I have not had any issues with it by itself, nor with an external 3440x1400 at 100% scale, with GNOME 48.


I wish I could give you a better answer here, but I honestly don't remember. I only remember that something I needed was missing from it for me to make the switch.


So Gnome does support it, but it is terrible. Last I tried it, it also applied the custom scaling value to all the displays like Windows. KDE does it perfectly.


Ideally, applications should use the Windows settings by default, but allow configuring a different scaling. Even more ideally, Windows should allow per-application settings, but until it does it’s the applications’ job.


Part of me wonders if this is what Microsoft hoped would happen when they implemented the settings in the manner they did. But it hasn't played out that way.


It is.

Any app implementation of the windows setting could expose a multiplier of it somewhere. They already did the hard part of building a dynamic UI...


What are your thoughts on screen magnifiers? Personally I tend to increase scaling a bit and use Magnifier for anything that's too small (or increase the font size in the application if possible)


I try to avoid using them. If I can, I prefer to configure my environment to not need them, but that does take a fair amount of work. I get by because of my technical knowledge. I don't know how other people cope.


Incidentally I once ran into a mature package that had lived in the 0.0.x lane forever and treated every release as a patch, racking up a huge version number, and I had to remind the maintainer that users depending on it with caret ranges won't get those updates automatically. (In semver, caret ranges never allow the leftmost non-zero digit to change; in 0.0.x that digit is the patch version, so ^0.0.123 is just a hard pin to 0.0.123.) There may occasionally be valid reasons to stay on 0.0.x though (e.g. @types/web).
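
A quick way to sanity-check that behaviour with the npm `semver` package (the version numbers here are made up for illustration):

  const semver = require("semver");

  // In the 0.0.x lane the caret pins the exact version...
  console.log(semver.satisfies("0.0.124", "^0.0.123")); // false
  // ...a 0.y.z version still picks up patch releases...
  console.log(semver.satisfies("0.1.1", "^0.1.0"));     // true
  // ...and >=1.0.0 picks up minor releases too.
  console.log(semver.satisfies("1.3.0", "^1.2.0"));     // true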


Presumably they’re following https://0ver.org/


Isn’t vim or bash kinda like that? One of them publishes something like a few hundred patches on top of the released tarball…


Maybe that is intentional? Which package is it?


It's the type definitions for developing chrome extensions. They'd been incrementing in the 0.0.x lane for almost a decade and bumped it to 0.1.0 after I raised the issue, so I doubt it was intentional:

https://www.npmjs.com/package/@types/chrome?activeTab=versio...


This is part of the DefinitelyTyped project. DT tends to get a lot of one-off contributions just for fixing the one error a dev is experiencing. So maybe they all just copied the version incrementing that previous commits had done, and no one in particular ever took the responsibility to say "this is ready now".


threejs ?



When trying to understand a complex C codebase, I've often found it helpful to rename existing variables as emojis. This makes it much easier to track which variables are used where & to take in the pure structure of the code at one glance. An example I posted previously: https://imgur.com/F27ZNfk

Unfortunately most modern languages like Rust and JS follow the XID_Start/XID_Continue recommendation (not very well-motivated imo) which excludes all emoji characters from identifiers.


wouldn't writing a parser of sorts that would replace emojis with a valid alphabetical string identifier be trivial?


You're right that writing a preprocessor would be straightforward. But while you're actively editing the code, your dev experience will still be bad: the editor will flag emoji identifiers as syntax errors, so mass renaming & autocompletion won't work properly. Last time I looked into this in VSCode, I got TypeScript to stop complaining about syntax errors by patching the identifier validation with something like `if (code>127) return true` (if non-ascii, consider valid) in isUnicodeIdentifierStart/isUnicodeIdentifierPart [1]. But then you'd also need to patch the transpiler to JS, formatters like Prettier, and any other tool in your workflow that embeds its own version of TypeScript...

[1] https://github.com/microsoft/TypeScript/blob/81c951894e93bdc...
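
That said, the build-time replacement itself really is only a few lines. Here's a rough sketch using ES2018 Unicode property escapes, with arbitrarily generated names:

  // Replace each distinct emoji with a stable generated identifier before
  // handing the source to the real compiler. ZWJ sequences get split into
  // one name per code point, which is fine for a sketch.
  const names = new Map();

  function demojify(source) {
    return source.replace(/\p{Extended_Pictographic}/gu, (ch) => {
      if (!names.has(ch)) names.set(ch, `__emoji_${names.size}`);
      return names.get(ch);
    });
  }

  console.log(demojify("const 🐍 = parse(🧵);"));
  // -> "const __emoji_0 = parse(__emoji_1);"

It's everything downstream of that (editor, type-checker, formatter) that makes it painful.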


There's a pretty amazing video here showing a Prince Rupert's drop defeating a hydraulic press: https://www.youtube.com/shorts/ns7PQjjqHIo


Yeah, it's not really what it looks like though. They put cylinders of soft metal in place of where you would expect the press to have hardened steel.


Has this been confirmed? The original channel also posted a comparison video [1] showing what seem to be the same cylinders tested against titanium and tungsten cubes (though it's difficult to be sure they are identical).

There's also footage from another channel [2] showing a Prince Rupert's Drop bursting at 20 tons with significant damage to both the steel plate and the press.

[1] https://www.youtube.com/watch?v=4SuPFbeqqKU

[2] https://www.youtube.com/watch?v=A6NUNroyUys



You can see the steel deforming - definitely soft steel.


> Because they do things. With their hands. That no one else does

That's only true of surgeons :) What if your specialty is nonsurgical (internal medicine, pediatrics, psychiatry, etc)?


Not even true of all surgeons; the ones that make the most money use machines to work on things their hands couldn't do.


Pathologists are some of the highest-paid doctors, and they are right in the crosshairs of what AI is getting better at performing.


Do you really know what pathologists do? Apparently not...


IT engineers thought the same, until automation finally started setting them right.


I'm sure going to be amazed if the LLMs of the next 10 years suddenly acquire the ability to physically cut just the right bit of a random surgical piece, with a precise idea of where, when and in what orientation the surgeon dug it out, all that with shitty documentation. Humans will be cheaper for a long time still.


Haha. Have you actually ever seen a surgical robot yourself? Your claim is laughable. There is no automation whatsoever in any robot on the market currently.


not automation, yet


Almost all specialties do various technical procedures that only they really know how to do. The extreme is psychoanalytic psychiatry, whose practitioners are the only ones really doing nothing with their hands (yes, interventional psychiatry is a thing...). Now, you could argue 'yes, but most of the time it's done by techs/nurses'. Well, no. When things go south, and in all places where there is no one else to do the stuff (of which there are many), docs are on their own.

Regarding surgery, I expect it to be one of the easiest procedures to automate, actually (still quite hard, obviously). Because surgery is the only case where there's always advanced imaging available beforehand, and the environment is relatively fixed (OR).


Why do you think medical science wrt complexity is any different than applied math, which computer science essentially is? People can already use LLMs to assist them in diagnosing health issues, so why would it be hard to believe that doctors will be using the same kind of assistance soon too?


> Why do you think medical science wrt complexity is any different than applied math

I don't think I wrote that.

Doctors already use tech assistance. I just pointed out that while we've got efficient robots for applied math, we don't have those as agents in the physical world. People who do blue-collar jobs are less replaceable. Well, believe it or not, most doctors are actually blue-collar workers.


You sort of implied that with your replies across the thread. And since AI has already replaced part of the CS work, I was wondering why you think this would not be the case with doctors. I'm not sure I agree it's a blue-collar profession. I can easily see diagnostics being replaced with AI models.


I never wanted to imply that. But here, people frequently assume that because that's what they're used to. Diagnosis is the tip of the iceberg. Most people here aren't sick, so diagnostics are their only focus. If they get ill, they want a diagnosis. But many people are chronically ill already, and doctors spend most of their time treating, not diagnosing. Treating people is made in good part of technical procedures and practical assessments, and you need doctors for that because robots are still far behind for that kind of stuff. People actually have a completely skewed view of what a doctor is.


> People actually have a completely skewed view of what a doctor is.

It could be, but treating patients also requires continuous diagnostics, result comprehension, and final assessment, so this is certainly a part where AI could play a crucial role.

I don't think anyone thinking about the consequences of AI for medicine is arguing that it will replace manual labor such as performing procedures or psychological support. This is obviously not possible, so when I see people talking about "AI in medicine" I read that as mostly complementing the existing work with new technology.


Psychiatrists do that triangle shape with their hands.


Uh, pediatricians do a lot with their hands. I don't think my kids (or future grandkids) will be seeing an AI/robot doctor.

