
20 MB is less than 0.25% of my desktop machine's memory, and 2% of a 2010-era netbook. 20 MB, while much larger than what it has to be, is tiny even by the standards of decade-old computers.

RAM is cheap and plentiful. If you don't use it, its value is almost zero (the "almost" comes from OS-level caching of files and CPU-level cache misses of code).
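To illustrate that "almost": on Linux, the kernel fills otherwise-idle RAM with page cache, so "free" memory understates what's actually usable. A minimal sketch, parsing a made-up /proc/meminfo snapshot (the field names are real, the values are invented for the example):

```python
# Sketch: the gap between MemFree and MemAvailable is mostly page cache -
# RAM the kernel is "using" for you behind your back.
# Values below are made up; real /proc/meminfo varies per machine.

SAMPLE_MEMINFO = """\
MemTotal:        8000000 kB
MemFree:          500000 kB
MemAvailable:    5200000 kB
Cached:          4800000 kB
"""

def parse_meminfo(text):
    info = {}
    for line in text.splitlines():
        key, value = line.split(":")
        info[key] = int(value.strip().split()[0])  # value in kB
    return info

mem = parse_meminfo(SAMPLE_MEMINFO)
# Cache accounts for most of the difference between "free" and "available":
print(mem["MemAvailable"] - mem["MemFree"])  # → 4700000 (kB)
```

On a real machine you would read the actual `/proc/meminfo` instead of the sample string; the same parse works.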




> If you don't use it, its value is almost zero

Yet if you use all of it, its value is also zero - to every other program on the same computer. Don't be a dick; use what you need, not all you can.


This is ridiculous. 20 MB is so far from "all of it" on any modern computer that it makes me question if you're using something from the 90's. How are you even on Hacker News?

Reducing the complex trade-off between memory use, CPU, disk, development environment, ease of deployment, and the dozen other variables that go into a choice like picking a toolkit to "don't be a dick" is absurdly simplistic. It's a trade-off - not a single-axis "good or bad" decision.

Moreover, 20 MB of memory usage is going to be an acceptable trade-off for the majority of HN users, who skew webdev, not embedded.


> This is ridiculous. 20 MB is so far from "all of it" on any modern computer that it makes me question if you're using something from the 90's.

If I look at today's top-selling computers on Amazon for my country (France, the world's sixth-largest economy), the top two models both come with 4 GB of RAM (a Chromebook and a Win10 machine). The Windows one will already use ~2 GB just for the OS. That leaves 2 GB of RAM for your apps.

And that's for computers being sold today - they will still be in use in five years.


Meanwhile, you can fit 100 20MB apps into 2GB. I've never seen a normal desktop user use more than 10 graphical applications at once. I've never used more than 20 at once myself.

It's pretty clear that 20 MB of baseline memory consumption for a graphical application (we're not talking about a runtime that might be used for a bunch of background processes - that would be an issue) is a non-issue for the vast majority of users of such programs.
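The arithmetic behind "100 apps into 2 GB" checks out (illustrative only; real per-process cost varies with shared libraries, heap growth, etc.):

```python
# Back-of-the-envelope: how many 20 MB apps fit in 2 GB of free RAM?
# The 20 MB baseline and 2 GB budget are the figures from the thread.

BASELINE_MB = 20          # assumed per-app baseline memory use
AVAILABLE_MB = 2 * 1024   # 2 GB left for apps after the OS

apps_that_fit = AVAILABLE_MB // BASELINE_MB
print(apps_that_fit)  # → 102
```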


> you can fit 100 20MB apps into 2GB.

But if a browser like Chrome already uses 1.5 GB, there will be a big difference between the app that uses 500 MB (your average Electron app) and the app that uses 50 MB (your average Qt, GTK, FLTK, etc. app). One will swap and make the whole system slow; the other won't.


Conversely, from the end user's perspective: if they could choose between a bit more resource usage (knowing they can close your app, or heaven forbid uninstall it) and an extra feature, what should the dev team prioritise?


> RAM is cheap and plentiful. If you don't use it, its value is almost zero (the "almost" comes from OS-level caching of files and CPU-level cache misses of code).

I have fond memories of my DOS days and the simplicity inherent in a single-tasking environment. Nowadays, however, operating systems allow more than a single application to run at the same time, and each application should play nice with system resources. So even if RAM is cheap and plentiful, it doesn't automatically mean that every application should feel entitled to it.

(and even back in the DOS days we had TSRs, which had to be RAM-conscious too)


> and each application should play nice with the system resources, so even if RAM is cheap and plentiful it doesn't automatically mean that every application should feel entitled to it.

Yes, you shouldn't be a bad neighbor, but the OS will generally move things around to accommodate you as necessary. RAM is an afterthought for most of these applications for a reason.

This attitude is like buying a sports car and never redlining it.


> Yes, you shouldn't be a bad neighbor, but the OS will generally move things around to accommodate you as necessary.

I'm not sure what you mean by that. The OS will not "move things around" to the point where the resource abuse won't be noticeable. All it can do is swap stuff to disk, perhaps compress some RAM, and maybe unload any cold code (though code doesn't take that much RAM) - and all of that takes time, slowing down the system.

> RAM is an afterthought for most of these applications for a reason.

Yes, and that reason is the application developers' disinterest in RAM usage.

> This attitude is like buying a sports car and never redlining it.

Sports cars have nothing to do with this; I don't see the relevance.


At least on Linux, swap is a great tool for shuffling out things you statistically probably won't use again, while retaining the ability to transparently recall them if it turns out your system guessed wrong.
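That shuffle-out-and-recall behaviour can be sketched with a toy LRU model (purely illustrative - the kernel's actual page reclaim is far more involved than strict LRU):

```python
# Toy model: keep the N most-recently-used "pages" resident; evict the
# coldest to "swap". Touching an evicted page counts as a fault - that's
# the cost of the system having guessed wrong.

from collections import OrderedDict

class ToySwap:
    def __init__(self, ram_pages):
        self.ram = OrderedDict()   # resident pages, ordered by recency
        self.capacity = ram_pages
        self.faults = 0            # times a page had to come from "disk"

    def touch(self, page):
        if page in self.ram:
            self.ram.move_to_end(page)        # recently used: keep resident
        else:
            self.faults += 1                  # swapped out (or first access)
            if len(self.ram) >= self.capacity:
                self.ram.popitem(last=False)  # evict the coldest page
            self.ram[page] = None

sim = ToySwap(ram_pages=3)
for page in ["a", "b", "c", "a", "d", "b"]:
    sim.touch(page)
print(sim.faults)  # → 5 ("b" was evicted, then needed again)
```

The last access to "b" is the "guessed wrong" case: it was shuffled out as cold, then had to be recalled - transparent to the program, but not free.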

If you find yourself in a situation where your OS is actually shuffling things around for your system to function, you will find your performance and desktop experience have gone to absolute dog shit. It's entirely likely that the user will hard-reboot the machine because they conclude it has frozen.

The only way to have a nice desktop experience on Linux is to ensure you have enough RAM for everything you intend to run at once - which means having at least 8-16 GB of RAM, and not running too many apps at once from people who think unused RAM is wasted RAM.


> don't run too many apps at once from people who think unused RAM is wasted RAM

The baseline memory usage of Svelte NodeGUI is 20 MB. 400 instances of that can fit into 8 GB of RAM. Don't you even try to tell me that you've run 400 separate GUI applications at once.

Let me repeat it again: unused RAM is wasted RAM. This is a fact. It does nothing when neither you nor the OS is using it - and the value of the OS using a byte of RAM for caching is tiny compared to the value of you using it for an application you care about.

The above also has nothing to do with wasting RAM. If you've spent any significant amount of time developing programs for actual users (read: not programmers), you'll know that development is a complex, multi-variable tradeoff - and one of the biggest trade-offs is RAM usage versus performance. If you optimize solely for minimal RAM usage, you'll always end up sacrificing performance (except in the most trivial of programs, written specifically as a counterexample to this claim).

The wastefulness of 1 GB of RAM usage varies wildly depending on whether you're running a video editing program on a large file (hey, that's not that bad!) or a simple textual chat application. 20 MB for a graphical tool is an acceptable tradeoff in the vast majority of use-cases.


> Let me repeat it again

Any time you catch yourself writing this, delete the sentence if you want anyone to actually read what you are saying. I made no assertions specifically about NodeGUI. The idea I was responding to is:

> the OS will generally move things around to accommodate you as necessary

Because this isn't accurate: performance goes to hell when applications contend for RAM. If you haven't noticed it, you probably have enough RAM not to have that issue - not because your OS "moved stuff around". At least on Windows/Linux; I have never owned a Mac.

I agree that 20MB baseline for a gui app is fine.


> This attitude is like buying a sports car and never redlining it.

This is a perfect analogy. To stretch it, the majority of sports car owners are developers. The vast majority of your users are not.


We can play this game, sure.

The vast majority of cars on the road today have more than enough horsepower and can handle being redlined.

Most users have enough RAM unless they're running on some absurdly low-spec device with less than 4 GB, which is just nowhere near as common these days.


> Most users have enough RAM unless they're running on some absurdly low-spec device with less than 4 GB, which is just nowhere near as common these days.

Citation needed. My experience with people outside the tech bubble is that they don’t know what RAM is, and will not consider it when purchasing a computer. Most of these people now also live primarily on a phone or tablet too, because their computers are too slow.


In case this discussion is still about a 20 MB RAM NodeGUI application, I don't think this is a comparable situation, because DOS was really not about developing cross-platform applications with a Turing-complete styling language. It was about building text-mode tools that ran on x86 and nothing else. I swear even the "bloated" Windows 10 will have absolutely no trouble running a Win32 console application at < 1 MB RAM consumption, but that's not what this is about.


My reference to DOS was tongue-in-cheek; I made it because DOS only allowed a single program to run at any time (ignoring TSRs), so programs using all the memory wasn't much of a problem.


HN: "RAM is cheap and plentiful."

Also HN: "The MacBook Pro is worthless if I can't get it with at least 64GB of RAM."


Those don't seem contradictory to me. RAM is cheap and plentiful, and getting 64GB of RAM is easy, but Apple just doesn't want to sell that config right now. You can still get a 64GB config on another system.


Everyday users don’t get RAM upgrades. Many programmers are elitist snobs without any consideration for the end user.


> If you don't use it, its value is almost zero

But the value is also zero if you're using extra for no benefit. Actually it's negative, because it prevents other programs from using it.

You should try to use all your RAM, yes, _but in ways that are actually useful_. The OS can always use leftover RAM for caching frequently used files if you don't have a better use for it.



