
Because it's not limited to games: forcing updates cuts off a lot of apps that can't invest enough in updating.

Also, the barrier to use you're suggesting (an alternative install or emulator) is pretty high for an average user. It also breaks integration with everything else (e.g., a simple alt-tab will show the VM instead of the 2 apps running inside it).

Also, a lot of progress is regression, so having an old way to opt back into is nice.


Integration is the biggest thing. While some desktop VM hosts provide various integration bits like file sharing and rootless window support, the experience is rarely seamless.

Drawing a few examples from an old Raymond Chen blog post[1], the integrations required for seamless operation include:

• Host files must be accessible in guest applications using host paths and vice versa. Obviously this can't apply to all files, but users will at least expect their document files to be accessible, including documents located on (possibly drive-letter-mapped) network shares.

• Cut-and-paste and drag-and-drop need to work between host and guest applications.

• Taskbar notification icons created by guest applications must appear on the host's taskbar.

• Keyboard layout changes must be synchronized between host and guest.

These are, at least to a useful degree, possible. Integrations that are effectively impossible in the general case:

• Using local IPC mechanisms between host and guest applications. Chen's examples are OLE, DDE, and SendMessage, but this extends to other mechanisms like named pipes, TCP/IP via the loopback adapter, and shared memory. (A minimal loopback sketch follows this list.)

• Using plug-ins running in the guest OS in host applications and vice versa. At best, these could be implemented through some sort of shim mechanism on a case-by-case basis, assuming the plug-in mechanism isn't too heavily sandboxed, and that the shim mechanism doesn't introduce unacceptable overhead (e.g., latency in real-time A/V applications).
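
To make the loopback point concrete, here's a minimal sketch (Python; the port and function names are made up): each OS has its own 127.0.0.1, so a server bound to the host's loopback interface simply isn't there from the guest's point of view unless the hypervisor explicitly forwards the port.

    import socket

    def host_server(port: int = 9000) -> None:
        # Bound to the host's loopback interface only. Inside a guest VM,
        # 127.0.0.1 names the guest's own loopback stack, so a guest client
        # never reaches this server at that address unless the hypervisor
        # forwards or bridges the port.
        with socket.create_server(("127.0.0.1", port)) as srv:
            conn, _ = srv.accept()
            with conn:
                print("got:", conn.recv(1024).decode())

    def local_client(port: int = 9000) -> None:
        # Works when run under the same kernel as host_server(); from a guest
        # it typically fails with ConnectionRefusedError, because 127.0.0.1
        # there is a different interface entirely.
        with socket.create_connection(("127.0.0.1", port)) as c:
            c.sendall(b"hello over host-local IPC")

The same reasoning applies to named pipes and shared memory: the names and handles only exist within one kernel.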

Finally, implementing these integrations without complicated (to implement and configure) safeguards would effectively eliminate most of the security benefits of virtualization.

[1] https://web.archive.org/web/20051223213509/http://blogs.msdn...


> forcing updates cuts off a lot of apps that can't invest enough in updating.

What about emulation?


Emulation is addressed in the next sentence? Also see the sibling comment with more details on the list of issues if you "simulate" the OS instead of using the real one.

It wasn't there before, when I asked. Yes, now there is more here about emulation :).

> Bugs exist, and they're sometimes fixed more slowly than we'd like, but given the size of the GitHub ecosystem this is probably just one of many outstanding bugs.

Sorry to be blunt, but you've said nothing of substance. To address the actual criticism, you need to explain why the specific "inexcusable bugs" they cite are excusable from your perspective. Otherwise, even if the whole website didn't function for months, your statement "bugs exist, fixed more slowly than we'd like" would still apply and be just as meaningless.


There was no change of mind, just a change in the published words, from a true expression of his mind into blander corporate speak.

Yes, of course, many people use the same shortcut for the same action in all the apps

How does Windows, which has no such requirement to contribute code back, have better drivers?

I imagine because the Windows team at Microsoft have an annual budget measured in billions of dollars.

> but a non-profit structure allows others to contribute financially without fear of misappropriation or misuse of funds (as protected by legal requirements and oversight from the fiscal sponsor).

None of the parenthesised items provide strong enough guarantees to alleviate such fears. Are there not enough non-profits that misuse funds, say, on excessive executive compensation instead of product development?


Transactions are public: https://hcb.hackclub.com/ghostty/transactions

HCB staff also do not take kindly to missing receipts or fraudulent behavior.


The fiscal sponsor (Hack Club) is a sponsor to many projects, and they presumably do keep an eye on the finances.

Why would you presume that? Especially given what it is:

> We are teen hackers from around the world who code together

(Besides, "many projects" is more likely a downside here, as it spreads the oversight resources thin.)


For sure not perfect, but I would simply expect them to care and have an occasional look, which is a lot more than nothing.

Teen hackers, yes. Incompetent or neglectful, absolutely not.


It would be helpful if you actually listed the "ways" and the "else".

For one, it's way faster than both iTerm and Terminal.app, the two most used terminal apps on macOS.

Yes, it would.

> other than plain HTML. It's literally a machine-readable format

But we're talking about humans; it's not very human-readable!


You can thank the semantic web dweebs for forcing us to use <stronglyemboldenatethistext> rather than <b> and <emphaticallyitalic> rather than just <i>. They're also the ones responsible for the horrors of XML.

> What a mess! Why is the DirectX 12 documentation so scattered across so many websites in different shapes and forms? Of course, I don't know

While you're at it, do you also not know why they break the URLs from time to time so that you can't follow old guides because they point to empty pages?


Microsoft does this all the time, and it reeks of a lack of continuity or ownership internally.

This, and also the fact that positions involving writing and managing documentation typically don't offer great paths to promotion.

It's not just Microsoft; it's an industry-wide issue. It's not a job most software companies value, so ambitious people constantly leave for better positions, and the jobs constantly get moved to the cheapest cost center, where ownership and knowledge get lost and quality declines.


Many of those sites (incredible for a company valued at $4 trillion) are managed by the teams themselves on their own infra, so when one of those restructuring rounds that big corps love doing almost every year comes along, some of it gets lost.

Every few years they break all links to The Old New Thing. There is no job Microsoft can't botch.

Yup, I also enjoyed that time when they replaced everyone's usernames with their hidden emails on that blog.

"continuous development & adaption at a high paced environment"

;-)


Are there display pipelines that cache the generated-for-my-device-resolution SVGs instead of doing all the slower parsing etc. from scratch every time, getting the benefits of both worlds? And you could still have runtime-defined scaling by "just" rebuilding the cache?
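
For what it's worth, a minimal sketch of what I mean (Python; rasterize() is a stand-in for whatever SVG renderer the pipeline actually uses): the cache is keyed on (file, device scale), so a runtime scale change just re-renders the affected entries instead of parsing the SVG on every draw.

    from pathlib import Path

    def rasterize(svg_path: Path, scale: float) -> bytes:
        # Placeholder for the real SVG parser/renderer (the slow part).
        return f"{svg_path.name}@{scale}x".encode()

    _cache: dict[tuple[Path, float], bytes] = {}

    def icon_bitmap(svg_path: Path, scale: float) -> bytes:
        # Fast path: reuse the bitmap already rendered for this device scale;
        # parse and render the SVG only on a cache miss.
        key = (svg_path, scale)
        if key not in _cache:
            _cache[key] = rasterize(svg_path, scale)
        return _cache[key]

    def rebuild_for_scale(paths: list[Path], new_scale: float) -> None:
        # Runtime-defined scaling: "just" rebuild the cache at the new scale.
        for p in paths:
            _cache[(p, new_scale)] = rasterize(p, new_scale)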

Haiku (the OS) caches the vector icons rendered from HVIF [1][2] files, which are used extensively in its UI.

I didn't find details of the caching design. Possibly it was mentioned to me by waddlesplash on IRC[3].

[1] 500 Byte Images: The Haiku Vector Icon Format (2016) http://blog.leahhanson.us/post/recursecenter2016/haiku_icons...

[2] Why Haiku Vector Icons are So Small | Haiku Project (2006) https://www.haiku-os.org/articles/2006-11-13_why_haiku_vecto...

[3] irc://irc.oftc.net/haiku


Increasingly I think you’ll find that the efficient format for simple icons like this actually isn’t raster, due to (simplifying aggressively) hardware acceleration. We definitely haven’t reached that stage in wide deployment yet, but multiple C++ and Rust projects exist where I strongly suspect it’s already the case, at least on some hardware.

The best place for such a cache is a GPU texture, and in a shader that does simple texture mapping instead of rasterizing shapes, it would cost more memory reads in exchange for fewer calculations.
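
Roughly, the bookkeeping for that could look like an icon atlas (the names below are made up, not any particular library's API): all the pre-rasterized icons live in one texture, the CPU side only remembers which rectangle each icon occupies, and the shader samples those coordinates instead of rasterizing paths every frame.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class AtlasRect:
        # Normalized texture coordinates of one cached icon inside the atlas.
        u: float
        v: float
        w: float
        h: float

    # Icon name -> where its pre-rasterized pixels sit in the shared texture.
    # Drawing an icon is then a plain textured quad: memory reads from the
    # atlas instead of per-frame path rasterization.
    atlas_index: dict[str, AtlasRect] = {
        "folder": AtlasRect(0.00, 0.00, 0.25, 0.25),
        "trash":  AtlasRect(0.25, 0.00, 0.25, 0.25),
    }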
