I find it ludicrous that the developers of an app as insignificant as a screen recorder would think it's necessary to check for updates every 5 minutes.
The big clouds love these people too. So much of the software industry is just an outrageous combination of inexperience and "YOLO". Every problem can be solved by just giving AWS another $100,000 this month because we don't have time (and don't know how) to make even basically optimized software. So just burn the gas and electricity and give more money to the YAML merchants at Amazon.
Data centers are big and scary; nobody wanted to run their own. The hypothetical cost savings of firing half the IT department were too good to pass up.
AWS even offered some credits to get started; the first hit's free.
Next thing you know your AWS spend is out of control. It just keeps growing and growing and growing. Instead of writing better software, which might slow down development, you just spend more money.
Ultimately in most cases it's cheaper in the short term to give AWS more money.
A part of me wants to do a $5 VPS challenge: how many users can you serve for $5 per month? Maybe you'd actually need to understand what your server is doing?
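The $5 VPS challenge above is really a back-of-envelope exercise. A minimal sketch, using purely illustrative numbers (the request rate and per-user traffic are assumptions, not benchmarks):

```python
# Rough capacity estimate for a cheap single-vCPU VPS.
# All numbers below are hypothetical, for illustration only.

reqs_per_sec = 500            # assumed throughput of a lean app server on 1 vCPU
reqs_per_user_per_sec = 10 / 60  # assumed: an active user makes ~10 requests/minute

concurrent_users = reqs_per_sec / reqs_per_user_per_sec
print(f"~{concurrent_users:,.0f} concurrent active users")
```

With those assumptions you land around three thousand concurrent active users, which is the commenter's point: a tiny box goes a long way once you understand what your server is actually doing.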
I work for a big org. We’re about to deploy a small Grafana setup for monitoring some test environments. Double digit spend per month, maximum. It’s pretty close to impossible to get IT, infosec, purchasing and finance to agree to this in a period of time that I’ll still be employed (and I’m not planning on leaving).
But, on the AWS marketplace I can click a button, a line item is added to our bill, and infosec are happy because it’s got the AWS checkmark beside it. Doesn’t matter what it costs, as long as it goes through the catalog.
That’s why big companies use AWS.
At my last job, I worked for a vc backed startup. I reached out to our fund, and they put us in touch with AWS, who gave us $100k in credits after a courtesy phone call.
We are throwing everything under the bus, including the user's battery, CPU, memory, bandwidth, the company's cloud costs and energy usage, just so developers can crap out software slightly faster.
We are providing users with valuable features at a faster rate, saving them and us time, which is a far more valuable asset than "the user's battery, CPU, memory, bandwidth, the company's cloud costs and energy usage".
Doing 'the cloud' right at scale has to involve running your own cloud at some point. We should not pollute the good ideas around API-delivered infrastructure with the more questionable idea of outsourcing your infrastructure.
OpenStack has been around for 15 years powering this idea at scale for huge organizations, including Wal-Mart, Verizon, Blizzard and more.
Not really. I run several web applications on one $15 VPS. Admittedly the user count is <5, but I think it would take quite a lot of users for resource usage to climb to a critical level.
> Every problem can be solved by just giving AWS another $100,000 this month because we don't have time (and don't know how) to make even basically optimized software.
Don't forget the Java + Kafka consultants telling you to deploy your complicated "micro-services" to AWS, so you end up spending tens of millions on their "enterprise optimized compliant best practice™" solution and needing to raise money every 6 months instead of saving costs as you scale up.
Instead, you spin up more VMs and pods to "solve" the scaling issue, which loses you even more money.
Oh, cloud native. For a few years people used to look at you funny if you were ...gasp... using battle-tested open source software instead of the overpriced AWS alternative. I'm so glad we're finally seeing pushback.
This is overblown fear mongering, especially for desktop apps.
There are only a few applications with exposed attack surface (i.e., they accept incoming requests from the network) and a large enough install base to cause massive damage across the Internet. A desktop screen recorder app has no business being constructed in a manner that's "wormable", nor does it have an install base that would result in significant replication.
The software that we need the "average user" to update is stuff like operating systems. OS "manufacturers" have that mostly covered for desktop OSes now.
Microsoft, even though its customers were hit with the "SQL Slammer" worm, doesn't force automatic updates for SQL Server. Likewise, it forces updates only on mainstream desktop OS SKUs. Its server, embedded, and "Enterprise" OS SKUs can be configured to never update.
Hrm... might depend on the purpose of the update. "New feature X" announcements every few days... I hate and disable. "Warning - update now - security bug"... I want to be notified of those pretty quickly.
Ironically, the only real case for an update check every 5 minutes would be so you can quickly fix a problem like everyone downloading the update every 5 minutes.
I honestly lost so much respect for the author after reading this that I completely bailed on the article. Checking for updates every 5 minutes is entirely unhinged behaviour.
It's a fucking screen recorder. Why does it need to check for updates more than once a month? Why does it need to check for updates at all? It's an appliance. It either records your screen, or it doesn't.
Once a day would surely be sufficient.