I'm running a box I put together in 2014 with an i5-4460 (3.2 GHz), 16 GB of RAM, a GeForce 750 Ti, a first-gen SSD, and an ASRock H97M Pro4 motherboard, plus a reasonable PSU, case, and a number of fans. All of that, parted out at the time, came to $700.
I've never been more fearful of components breaking than I am today. With GPU and now memory prices being crazy, I hope I never have to upgrade.
I don't know how, but the box is still great for everyday web development with heavy Docker usage, and for video recording / editing with a 4K monitor and a second 1440p monitor hooked up. Light gaming is OK too; for example, I picked up Silksong last week and it runs very well at 2560x1440.
For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.
Don't worry, if you are happy with those specs you can get corporate e-waste Dell towers on eBay for low prices. Search "Dell Precision tower"; I just saw a listing with 32 GB of RAM and a 3.6 GHz Xeon for about $300 USD.
Personally, at work I use the latest hardware; at home I use e-waste.
I got a junk Precision workstation last year as a "polite" home server (it's quiet and doesn't look like industrial equipment, but still has some server-like qualities, particularly the use of ECC RAM). I liked it so much that it ended up becoming my main desktop.
My main desktop is temporarily a Dell server from around 2012 or so. Two were thrown out, each with two 2 GiB sticks of RAM, so I poached the other machine's RAM for a grand total of 8 GiB. I also threw in a small SSD for the / partition (/home is on the old HDD). The thing is dirt slow but I never notice, even YouTube video playback works fine. Even on hardware well over a decade old, Debian runs fine.
Ha, I bought one of those for $500 from eBay. It's a dual Xeon Silver workstation with an Nvidia Quadro P400 8GB, 128 GB RAM, and a 256 GB SSD. I threw in a 1 TB SSD and it's been working pretty well.
I too have a crippling dual CPU workstation hoarding habit. Single thread performance is usually worse than enthusiast consumer desktops, and gaming performance will suffer if the game isn't constrained to a single NUMA domain that also happens to have the GPU being used by that game.
On the other hand, seeing >1TiB RAM in htop always makes my day happier.
Personally I use eBay and find the most barebones system I can, then populate the CPU+RAM with components salvaged from e-wasted servers. There are risks with this, as I've had to return more than one badly-bent workstation that was packed poorly.
---
So the Dell Precision T7920 runs dual Intel Xeon Scalable (Skylake) CPUs and has oodles of DIMM slots (24!), but you'll need a PCIe adapter to run an NVMe drive. FlexBays give you hot-swappable SATA, and SAS too, but only if you're lucky enough to find a system with an HBA (or add one yourself). But if you manage to salvage 24x 64GB DDR4 DIMMs, you'll have a system with a terabyte and a half of ECC RAM - just expect a very long initial POST and a lot of blink codes when you encounter bad sticks. The power supply is proprietary, but can be swapped from the outside.
The T7820 is the single-CPU version, and has only 6 DIMM slots. But it is more amenable to gaming (one NUMA domain), and I have gifted a couple to friends.
If you're feeling cheap and are okay with the previous generation, the Haswell/Broadwell-based T7910 is also serviceable - but expect to rename the UEFI image to boot Linux from NVMe, and it's much less power efficient if you don't pick an E5 v4 revision CPU. I used a fully-loaded T7910 as a BYOD workstation at a previous job, worked great as a test environment.
The Lenovo ThinkStation P920 Tower has fewer DIMM slots (16) than the T7920, but has on-motherboard M.2 NVMe connectors and three full 5.25" bays. I loaded one with Linux Mint for my mother's business; she runs the last non-cloud version of QuickBooks in a beefy network-isolated Windows VM and it works great for that. Another friend runs one of these with Proxmox as a homelab-in-a-box.
The HP Z6 G4 is also a thing, though I personally haven't played with one yet. I do use a salvaged HP Z440 workstation with a modest 256GB RAM (don't forget the memory cooler!) and a 3090 as my ersatz kitchen table AI server.
>and a lot of blink codes when you encounter bad sticks
Which sadly happens quite a lot with ECC DDR4 for whatever reason.
>If you're feeling cheap and are okay with the previous generation, the Haswell/Broadwell-based T7910 is also serviceable
The T5810 is a well-known machine, very tinkerable; it just works with NVMe adapters (they show up as a normal NVMe boot option in UEFI) and even has TPM 2.0 (!!!) after a BIOS update. Overall, they are the 2nd-best affordable Haswell-EP workstations after the HP Z440, in my opinion.
>E5 v4 revision CPU
They are less efficient than v3 CPUs due to the lockdown of Turbo Boost, but then again on a Precision you'd have to flash the BIOS with an external flasher regardless to get Turbo Boost back.
Forgot about Dell gimping Turbo Boost on that firmware.
Another route is the PowerEdge T440 (tower server), which does respect Broadwell-EP turbo logic without a reflash. Not quite as quiet as a workstation, though.
The CPU is far less powerful than a single current Ryzen chip, and a new system is far more power efficient. No super-fast USB connections like a new system has (it does have a 10 Gbps USB-C connection, though).
Overall if you can live with a bit older machine, it's pretty decent.
I bought a Dell Precision 7910 2x Xeon E5-2687W v3 (10 cores, 20 threads each) with 32GB RAM and 512GB SSD for $425 including shipping. I found that Windows 11 Pro will recognize only 20 of the virtual cores/threads. I don't feel a need to upgrade to more expensive Microsoft OSs at this time, so I just run Ubuntu natively on that box, which recognizes all of it. Assuming used DDR4 RAM returns to more reasonable prices at some point, I intend to load that box up to the 768GB max.
Just performance when compared to current-generation hardware. Not significantly worse, but things like DDR4 RAM and single-thread performance show the signs of aging. Frankly, for similar $$$ you can get new hardware from Beelink or equivalent.
Got it. So basically it's one of those things you do if (1) the project interests you and/or (2) you get one dirt cheap and don't have high expectations for certain tasks.
At home some of my systems are ewaste from former employers who would just give it to employees rather than paying for disposal. A couple are eBay finds. I do have one highish-end system at a time specifically for games. Some of my systems are my old hardware reassembled after all the parts for gaming have been upgraded over the years.
I've dealt a bit with e-waste kinds of machines, old Dells and such, and have two still running here; the issue is that they use a crapton of power. I had one such e-waste Dell machine that I just had to take to the dump, it was so underperforming while using 3x more power than my other two Dells combined.
But it's a cascading effect: OpenAI gobbled up DDR5 production to the point that consumers are choosing to upgrade their older DDR4 systems instead of paying even more to move to a new system that uses DDR5. As a result, DDR4 RAM is at a new all-time high - https://pcpartpicker.com/trends/price/memory/
DDR4 prices are up 2-6x in the last couple months depending on frequency. High end, high speed modules (e.g. 128GB 3200MHz LRDIMM) are super expensive.
Isn't that due to different reasons (like the end of production for older standards)? I recall the same happening shortly after manufacturing for DDR3 ceased, before demand eventually dropped to essentially zero.
Even RDIMM / LRDIMM prices have recently started going up.
And I thought that those would be safe, because neither "big AI" nor regular consumers need them.
Except that almost nobody earns the minimum wage today; minimum-wage earners are less than 1/2 of 1% of US labor.
The median full-time wage is now $62,000. You can start at $13 at almost any national retailer, and $15 or above at CVS / Walgreens / Costco. The cashier positions require zero work background, zero skill, zero education. You can make $11-$13 at what are considered bad jobs, like flipping pizzas at Little Caesars.
This will probably be an unpopular reply, but "real median household income" — aka, inflation-adjusted median income — has steadily risen since the 90s and is currently at an all-time high in the United States. [1] Inflation includes the cost of housing (by measuring the cost of rent).
However, we are living through a housing supply crisis, and while overall cost of living hasn't gone up relative to incomes, housing's share of it has massively multiplied. We would all be living much richer lives if we could bring down the cost of housing - or at least have it flatline and let inflation take care of the rest.
Education is interesting, since most people don't actually pay the list price. The list price has gone up a lot, but the percentage of people paying list price has similarly gone down a lot: from over 50% in the 90s for state schools to 26% today, thanks to a large increase in subsidy programs (student aid). While real education costs have still gone up somewhat, they've gone up much less than the prices you're quoting lead you to believe: those are essentially a tax on the rich who don't qualify for student aid. [2]
I think everyone has quibbles about the CPI. Ultimately though, it would take a lot of cherry-picking to make it seem like overall cost of living has gone up 3x while wages have gone up less. As a counterexample, an NES game in 1990 cost $50 new (in 1990 dollars! Not adjusted for inflation). Battlefield 6 cost $70 new this year (in 2025 dollars), and there were widespread complaints about games getting "too expensive." In real terms games have become massively less expensive — especially considering that the budget for Battlefield 6 was $400MM, and the budget for Super Mario World in 1990 was less than $2MM.
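To make that concrete, a rough sanity check; the ~2.4x cumulative CPI multiplier from 1990 to 2025 is a ballpark assumption on my part, not an official figure:

    # Rough sanity check: a $50 NES game in 1990 vs. a $70 game in 2025,
    # assuming a cumulative CPI multiplier of ~2.4x between 1990 and 2025.
    CPI_MULTIPLIER_1990_TO_2025 = 2.4  # ballpark assumption

    nes_1990 = 50.0
    nes_in_2025_dollars = nes_1990 * CPI_MULTIPLIER_1990_TO_2025
    bf6_2025 = 70.0

    print(f"NES game (1990) in 2025 dollars: ${nes_in_2025_dollars:.0f}")  # ~$120
    print(f"Battlefield 6 (2025):            ${bf6_2025:.0f}")             # $70, cheaper in real terms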
There are a zillion examples like this. Housing has gone way up adjusted for inflation, but many other things have gone way, way down adjusted for inflation. I think it's hard to make a case that overall cost of living has gone up faster than median wages, and the federal reports indicate the opposite: median real income has been going up steadily for decades.
Housing cost is visible and (of course, since it's gone up so much) painful. But real median income is not underwater relative to the 90s. And there's always outrage when something costs more than it used to, even if that's actually cheaper adjusted for inflation: for example, the constant outrage about videogame prices, which have in fact massively declined despite requiring massively more labor to make and sell.
Housing, vehicles, groceries, and health insurance are all up massively. Who gives a shit how much a game costs if you can't afford groceries and rent?
In 2010 I paid 3k for a 10 year old truck with 100k miles. That same truck today costs easily 15k. Same story for rent. Same story for groceries. Same story for health insurance.
Who gives a shit how much trinkets costs if you can't afford groceries and rent?
You're identifying the right problem (school and housing costs are completely out of hand) but then resorting to an ineffective solution (minimum wage) when what you actually need is to get those costs back down.
The easy way to realize this is to notice that the median wage has increased by proportionally less than the federal minimum wage has. The people in the middle can't afford school or housing either. And what happens if you increase the minimum wage faster than overall wages? Costs go up even more, and so does unemployment when small businesses who are also paying those high real estate costs now also have to pay a higher minimum wage. You're basically requesting the annihilation of the middle class.
Whereas if you make housing cost less, that helps both the people at the bottom and the people in the middle.
>resorting to an ineffective solution (minimum wage) when what you actually need is to get those costs back down.
I'm not really resorting to any solution.
My comment is pointing out that when you only do one side of the equation (income) without considering the other side (expenses), it's worthless. Especially when you are trying to make a comparison across years.
How we go about fixing the problem, if we ever do, is another conversation. But my original comment doesn't attempt to suggest any solution, especially not one that "requests the annihilation of the middle class". It's solely to point out that adventured's comment is a bunch of meaningless numbers.
> It's solely to point out that adventured's comment is a bunch of meaningless numbers.
The point of that comment was to point out that minimum wage is irrelevant because basically nobody makes that anyway; even the entry-level jobs pay more than the federal minimum wage.
In that context, arguing that the higher-than-minimum wages people are actually getting still aren't sufficient implies an argument that the minimum wage should be higher than that. And people could read it that way even if it's not what you intended.
So what I'm pointing out is that that's the wrong solution and doing that rather than addressing the real issue (high costs) is the thing that destroys the middle class.
> (school and housing costs are completely out of hand)
On the housing side, the root problem is obvious:
Real estate cannot be both affordable and considered an investment. If it's affordable, that means the price is staying flat relative to inflation, which makes it a poor investment. If it's a good investment, that means the value is rising faster than inflation, which means unaffordability is inevitable.
The solution to the housing crisis is simple: Build more. But NIMBYs and complex owners who see their house/complex as an investment will fight tooth-and-nail against any additional supply since it could reduce their value.
> Real estate cannot be both affordable and considered an investment. If it's affordable, that means the price is staying flat relative to inflation, which makes it a poor investment. If it's a good investment, that means the value is rising faster than inflation, which means unaffordability is inevitable.
This is a misunderstanding of what makes something a good investment. Something is a good investment if it's better for you than your other alternatives.
Suppose you buy a house and then have a mortgage payment equivalent to the amount you'd have been paying in rent until the mortgage is paid off. At that point you have an asset worth e.g. $200,000 and you no longer have a mortgage payment. By contrast, if you'd been paying rent instead then you'd have to continue paying rent. That makes the house a good investment even if its value hasn't increased by a single cent since you bought it -- it could even have been a good investment if its value has gone down, because its true value is in not having to pay rent. Paying $300,000 over time for a house which is now worth $200,000 leaves you $200,000 ahead of the person who paid $300,000 in rent in order to end up with the asset you can find on the inside of an empty box.
Likewise, suppose you're in the landlord business. In one city it costs a million dollars to buy a two bedroom unit and then you can rent it out for $10,000/month. In another city the same two bedroom unit costs $10,000 to buy but then you could only rent it out for $100/month. If your business is to buy the property and rent it out, is one of these a better investment than the other? No, the ROI is exactly the same for both of them and either one could plausibly be a good investment even without any appreciation.
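A tiny sketch of that two-city comparison, using the hypothetical numbers above and ignoring taxes, maintenance, vacancy, and financing (so it's gross yield only):

    # Gross annual rental yield = yearly rent / purchase price.
    def gross_annual_yield(purchase_price, monthly_rent):
        return 12 * monthly_rent / purchase_price

    city_a = gross_annual_yield(purchase_price=1_000_000, monthly_rent=10_000)
    city_b = gross_annual_yield(purchase_price=10_000, monthly_rent=100)

    print(f"expensive city: {city_a:.1%}")  # 12.0%
    print(f"cheap city:     {city_b:.1%}")  # 12.0% -- identical ROI despite a 100x price gap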
In both cases the value of the property doesn't have to increase to make it a good investment and in both cases the value of the property may not even come into play, because if you're planning to keep the asset in order to live in it or rent it out then you can't simultaneously sell it. And for homeowners, even if you were planning to sell it eventually, you'd then still need somewhere to live, so having all housing cost more isn't doing the average homeowner any good. If they sold they'd only have to pay the higher price to live somewhere else.
However, there is one major difference between homeowners and landlords. If you increase the supply of housing, rents go down. For homeowners that doesn't matter, because they're "renting" to themselves; they pay (opportunity cost) and receive (imputed rent) in equal amounts, so it doesn't matter to them if local rents change -- or it benefits them because it lowers local cost of living and then they pay lower prices for local things. Whereas landlords will fight you on that to their last breath, because that's their actual return on investment. Which is why they're the villains and they need to lose.
Sweet! According to austintexas.gov, that's only $2.63 below the 2024 living wage ($5.55 below, if you use the MIT numbers for 2025).
As long as you don't run into anything unforeseen like medical expenses, car breakdowns, etc., you can almost afford a bare-bones, mediocre life with no retirement savings.
I don't disagree that there has been a huge issue with stagnant wages, but not everybody who works minimum wage needs to make a living wage. Some are teenagers, people just looking for part-time work, etc. Pushing the minimum wage up too high risks destroying jobs that are uneconomical at that level but would have been better than nothing for many people.
That being said, there's been an enormous push by various business groups to do everything they can to keep wages low.
It's a complicated issue and one can't propose solutions without acknowledging that there's a LOT of nuance...
>but not everybody who works minimum wage needs to make a living wage
I think this is a distraction that is usually rolled out to derail conversations about living wages. Not saying that you're doing that here, but it's often the case when the "teenager flipping burgers" argument is brought up.
Typically in conversations about living wages, people are talking about financially independent adults trying to make their way through life without starving while working 40 hours per week. I don't think anyone is seriously promoting a living wage for the benefit of financially dependent minors.
And, in any case, the solution could also be (totally, or in part) a reduction in expenses instead of increase in income.
>It's a complicated issue and one can't propose solutions without acknowledging that there's a LOT of nuance...
That's for sure! I know it's not getting solved on the hacker news comment section, at least.
> I think this is a distraction that is usually rolled out to derail conversations about living wages. Not saying that you're doing that here, but it's often the case when the "teenager flipping burgers" argument is brought up.
If you're focusing on minimum wage, they tend to be highly coupled, though some jurisdictions have lower minimum wages for minors to deal with this.
> Typically in conversations about living wages, people are talking about financially independent adults trying to make their way through life without starving while working 40 hours per week. I don't think anyone is seriously promoting a living wage for the benefit of financially dependent minors.
Few minimum wage jobs even offer the option to work full time. Many retail environments have notoriously unpredictable shifts that are almost impossible for workers to plan around. I've heard varying reasons for this, from companies liking the flexibility of more employees working fewer hours, down to keeping people off the full-time payroll so they legally don't have to offer benefits. The result is that minimum wage earners often have to juggle multiple jobs, childcare, and the negative effects of commuting to all of them.
This also ignores many other factors around poverty, such as housing costs and other inflation.
> That's for sure! I know it's not getting solved on the hacker news comment section, at least.
For sure! 99% of people on HN haven't had to experience living long term off of it. I did for a while in college, where outside of tuition I had to pay my own way in a large city (I fully acknowledge that this is anecdotal and NOT the same as living in poverty). I only had to feed myself, not think about saving for the future, and I was sharing a house with other geeky roommates where we had some of the best times of our lives. I don't think we could have pulled that off in today's economic environment...
The part-time-worker point has been sorted out, as living wage calculations assume full-time work.
Even if you are a teenager you deserve a living wage - if a teenager living at home needs to work full time, then that household likely needs some of that money.
Counterpoint: affording average rent for a 1-bedroom apartment (~$1,675) requires that exact median full-time wage. $15 an hour affords you about $740 for monthly housing expenses. One can suggest getting two roommates for a one-bedroom apartment, but they would be missing the fact that this is very unusual for the last century. It's more in line with housing economics from the early-to-mid 19th century.
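Back-of-the-envelope version of that math, assuming full-time hours and the common 30%-of-gross-income housing rule (slightly different assumptions give the ~$740 figure above):

    # $15/hr, full-time, vs. the ~$1,675 average 1-bedroom rent quoted above.
    hourly_wage = 15.00
    monthly_gross = hourly_wage * 40 * 52 / 12      # ~$2,600/month
    housing_budget = 0.30 * monthly_gross           # ~$780/month under the 30% rule

    median_rent_1br = 1675
    wage_needed = median_rent_1br * 12 / 0.30 / (40 * 52)  # ~$32/hr to afford that rent

    print(f"housing budget at $15/hr: ${housing_budget:,.0f}/month")
    print(f"hourly wage needed for ${median_rent_1br}/month rent: ${wage_needed:.2f}/hr")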
In addition to the other comments, I presume the big box retailers do not hire for full-time positions when they don't have to, and gig economy work is rapidly replacing jobs that used to be minimum wage.
In that case it should be completely uncontroversial to raise the minimum wage and help that 0.5% of labor out. Yet somehow, it's a non-starter. (BTW, googling says the number is more like 1.1%. In 1979, 13.4% of the labor force made minimum wage; this only shows how obsolete the current minimum wage level is.)
My uncle was running a number of fast food restaurants for a franchise owner making millions. His statement about this topic is simple, "they are not living wage jobs ... go into manufacturing if you want a living wage".
I don't like my uncle at all and find him and people like him to be terrible human beings.
If a business can't pay a living wage, it's not really a successful business. I, too, could become fabulously wealthy selling shoes if someone just gave me shoes for $1 so I could resell them for $50.
> If a business can't pay a living wage, it's not really a successful business.
Let's consider the implications of this. We take an existing successful business, change absolutely nothing about it, but separately and for unrelated reasons the local population increases and the government prohibits the construction of new housing.
Now real estate is more scarce and the business has to pay higher rent, so they're making even less than before and there is nothing there for them to increase wages with. Meanwhile the wages they were paying before are now "not a living wage" because housing costs went way up.
Is it this business who is morally culpable for this result, or the zoning board?
There are certainly elements of this. And there are also elements like my city, where some of the more notable local business owners and developers are all _way too cozy_ with the City Council and Planning/Zoning Boards (not just rubbing shoulders at community events and fundraisers, but at the "our families rent Airbnbs together and go on vacation together" level), which gives them greater influence.
All that being said, though, Robert Heinlein said once:
> There has grown up in the minds of certain groups in this country the notion that because a man or corporation has made a profit out of the public for a number of years, the government and the courts are charged with the duty of guaranteeing such profit in the future, even in the face of changing circumstances and contrary to the public interest. This strange doctrine is not supported by statute or common law. Neither individuals nor corporations have any right to come into court and ask that the clock of history be stopped, or turned back.
> And there are also elements like my city, where some of the more notable local business owners and developers are all _way too cozy_ with the City Council and Planning/Zoning Boards (not just rubbing shoulders at community events and fundraisers, but at the "our families rent Airbnbs together and go on vacation together" level), which gives them greater influence.
But now you're just condemning the zoning board and their cronies as it should be, as opposed to someone else who can't pay higher wages just because real estate got more expensive since it got more expensive for them too.
> Neither individuals nor corporations have any right to come into court and ask that the clock of history be stopped, or turned back.
Which is basically useless in this context because when costs increase you could apply it equally to not raising the minimum wage (the individual has to suck it up) or raising the minimum wage (the small business owner has to suck it up). Meanwhile neither of them should have to suck it up because we should instead be getting the costs back down.
But in that case they are successful; they're just not paying very much relative to the cost of living, as a result of someone else's imposition of artificial scarcity.
But they do have employees. Their employees are just unhappy because now they're barely scraping by, but most employers can't afford to pay them more because the employers are in the same boat paying the high real estate costs themselves.
Can we use the same argument for all of the businesses that are only surviving because of VC money?
I find it rich how many tech people are working for money-losing companies, using technology from money-losing companies, and/or trying to start a money-losing company and get funding from a VC.
Every job is not meant to support a single person living on their own raising a family.
That's what VC money is for. When it comes to paying below a living wage, we typically expect the government to provide support to make up the difference (so they're not literally homeless). Businesses that rely on government to pay their employees should not exist.
That's kind of the point: a mom-and-pop restaurant or a McDonald's franchise owner doesn't have the luxury of burning $10 for every $1 in revenue for years while being backed by VC funding.
Oh, and the average franchise owner is not getting rich. They are making $100K to $150K a year, depending on how many franchises they own.
Also tech companies can afford to pay a tech worker more money because you don’t have to increase the number of workers when you get more customers.
YC is not going to give the aspiring fast food owner $250K to start their business like they are going to give “pets.ai - AI for dog walkers”
In that case they probably shouldn't be running a McDonald's. They aren't owed that and they shouldn't depend on their workers getting government support just so the owners can "earn" their own living wage.
Yet tech workers are “owed” making money because they are in an industry where their employers “deserve” to survive despite losing money because they can get VC funding - funded by among others government pension plans?
I find it slightly hypocritical that people can clutch their pearls at small businesses who risk their own money while yet another BS “AI” company’s founders can play founder using other people’s money.
Classically, not all jobs are considered "living wage" jobs. That whole notion is something some people made up very recently.
A teenager in his/her first job at McDonald's doesn't need a "living wage." As a result of forcing the issue, now the job doesn't exist at all in many instances... and if it does, the owner has a strong incentive to automate it away.
> A teenager in his/her first job at McDonald's doesn't need a "living wage." As a result of forcing the issue, now the job doesn't exist at all in many instances
The majority of minimum wage workers are adults, not teenagers. This is also true for McDonald's employees. The idea that these jobs are staffed by children working summer jobs is simply not reality.
Anyone working for someone else, doing literally anything for 40 hours a week, should be entitled to enough compensation to support themselves at a minimum. Any employer offering less than that is either a failed business that should die off and make room for one that's better managed or a corporation that is just using public taxpayer money to subsidize their private labor expenses.
A teenager is presumably also going to school full time and works their job part time, not ~2000 hours per year.
If we build a society where someone working a full time job is not able to afford to reasonably survive, we are setting ourselves up for a society of crime, poverty, and disease.
Let's talk about steelmanning, shall we? Why should I show good faith when absolutely no one else in the conversation is?
Take it from the top. First reply: The majority of minimum wage workers are adults, not teenagers. This is also true for McDonald's employees. The idea that these jobs are staffed by children working summer jobs is simply not reality.
The steelman position would grant that in fact, it's traditional for teenagers working summer jobs to do just that, and proceed to explain why high minimum wages as a one-size-fits-all policy are still a net win for society. Instead, autoexec starts by attacking an unstated position -- that these are necessarily 40-hour/week full-time jobs -- and wraps up by plainly and literally denying reality.
Second reply: A teenager is presumably also going to school full time and works their job part time, not ~2000 hours per year. If we build a society where someone working a full time job is not able to afford to reasonably survive, we are setting ourselves up for a society of crime, poverty, and disease.
At least kube-system doesn't make the mistake of assuming that all jobs require 2000 hours of work per year, but they fail to acknowledge, much less address, my point that not all jobs are done for survival purposes. Not only is that not an example of steelmanning, it's followed up by an irrelevant bare assertion made without the faintest trace of historical grounding.
Moving on to array_key_first: Just the simple fact that mcdonalds is open during school hours is enough to demolish the "teenagers flipping burgers" type arguments.
Once again, the fact is that jobs such as burger-flipping have traditionally provided part-time and summer jobs for young people living at home who are looking to save up a bit of money and get some work experience. This reply doesn't care to acknowledge the basic facts of the matter, much less address the strongest possible interpretation of my argument.
Then there's this zinger from swiftcoder: Turns out our supply of underage workers is neither infinite, nor even sufficient to staff all fast food jobs in the nation. If you are looking for an example of a strawman argument in this thread, how about picking on an actual one before jumping on my case?
Come back with your "steelman" bullshit when you're willing to apply the same rules to all sides of the argument.
> At least kube-system doesn't make the mistake of assuming that all jobs require 2000 hours of work per year, but they fail to acknowledge, much less address, my point that not all jobs are done for survival purposes.
I think we both can agree that there's a lot of nuance when it comes to wages and employment. How much time is someone spending at that job? What type of living situation do they have? What other sources of income do they have in their living situation? What are their expenses? etc.
You're right that not all jobs are done for survival purposes. But colloquially, when people use the term "living wage", they're talking specifically about people who work a wage in order to survive. Which is the reason that most people have jobs.
> Not only is that not an example of steelmanning, it's followed up by an irrelevant bare assertion made without the faintest trace of historical grounding.
Do you mean that in the past people did not survive on a single income? Social and family structures in the past definitely look different than they do today. But that doesn't really have any relevance to the people living in the present. Socioeconomics changes over time. Many of the living situations of the past are straight up illegal today. I know relatives who grew up without electricity or running water and grew their own food for survival. They didn't need the modern concept of a "living wage", but at the same time you can't reasonably expect someone from today's world to do the same thing they did.
If you fast forward just a few years though, it wasn't too bad.
You could put together a decent fully parted out machine in the late 90s and early 00s for around $600-650. These were machines good enough to get a solid 120 FPS playing Quake 3.
That's kinda like saying the mid-2020s were pretty scary too: minimum wage was AMOUNT and a MacBook M4 Max was $3000.
In the mid-90s me and my brother were around 14 and 10, earning nothing but a small amount of monthly pocket money. We were fighting so much over our family PC, that we decided to save and put together a machine from second-hands parts we could get our hands on. We built him a 386 DX 40 or 486SX2 50 or something like that and it was fine enough for him to play most DOS games. Heck, you could even run Linux (I know because I ran Linux in 1994 on a 386SX 25, with 5MB RAM and 20MB disk space).
The TCO was much higher, considering how terrible and flimsy this laptop was. The power plug would break if you looked at it funny and the hinge was stiff and brittle. I know that’s not the point you are making but I am still bitter about that computer.
It was kind of the opposite in the early 90s. In 93 I got Slackware Linux on a CD-ROM that was included, but I couldn’t get it to run because our system only had 2MB RAM. At the same time, what was common in the day (MS-DOS and Windows 3.1) ran fine.
I agree with you on SSDs, that was the last upgrade that felt like flipping the “modern computer” switch overnight. Everything since has been incremental unless you’re doing ML or high-end gaming.
I know it's not the same. But I think a lot of people had a similar feeling going from Intel-Macbooks to Apple Silicon. An insane upgrade that I still can't believe.
This. My M1 MacBook felt like a similarly shocking upgrade -- probably not quite as much as my first SSD did, but still the only other time when I've thought, "holy sh*t, this is a whole different thing".
The M1 was great. But the jump felt particularly great because Intel Macbooks had fallen behind in performance per dollar. Great build quality, great trackpad, but if you were after performance they were not exactly the best thing to get
For as long as I can remember, before M1, Macs were always behind in the CPU department. PC's had much better value if you cared about CPU performance.
After the M1, my casual home laptop started outperforming my top-spec work laptops.
> For as long as I can remember, before M1, Macs were always behind in the CPU department. PC's had much better value if you cared about CPU performance.
But not if you cared about battery life, because that was the tradeoff Apple was making. Which worked great until about 2015-2016. The parts they were using were not Intel’s priority and it went south basically after Broadwell, IIRC. I also suppose that Apple stopped investing heavily into a dead-end platform while they were working on the M1 generation some time before it was announced.
It's a lot more believable if you tried some of the other Wintel machines at the time. Those Macbook chassis were the hottest of the bunch, it's no surprise the Macbook Pro was among the first to be redesigned.
I usually use an M2 Mac at work, and haven't really touched Windows since 2008. Recently I had to get an additional Windows laptop (Lenovo P series) for a project my team is working on, and it is such a piece of shit. It's unfathomable that people are tolerating Windows or Intel (and then still have the gall to talk shit about Macs).
It's like time travelling back to 2004. Slow, loud fans, random brief freezes of the whole system, a shell that still feels like a toy, a proprietary 170W power supply and mediocre battery life, subpar display. The keyboard is okay, at least. What a joke.
Meanwhile, my personal M3 Max system can render Da Vinci timelines with complex Fusion compositions in real time and handle whole stacks of VSTs in a DAW. Compared to the Lenovo choking on an IDE.
A lot of this is just Windows sucking major balls. Linux distros with even the heaviest DEs like KDE absolutely fly on mediocre or even low-end hardware.
I got a Lunar Lake laptop and slapped Fedora on it and everything is instant. And I hooked up two 1440p/240 Hz displays over Thunderbolt.
Expensive PCs are also crap. My work offers Macbooks or Windows laptops (currently, Dell, but formerly Lenovo and/or HP), and these machines are all decidedly not 'cheap' PCs. Often retailing in excess of $2k.
All my coworkers who own Windows laptops do is bellyache about random issues, poor battery life, and sluggish performance.
I used to have a Windows PC for work about 3 years ago as well, and it was also a piece of crap. Battery would decide to 'die' at 50% capacity. After replacement, 90 minute battery life off charger. Fan would decide to run constantly if you did anything even moderately intensive such as a Zoom meeting.
To me that reads as 3x, not "almost 10x". The main difference here is probably power. A desktop/server is happy to send 15 W to the SSD and hundreds of watts to the CPU, while a laptop wants the SSD running in the ~1 W range and the CPU in the tens of watts range.
There's over twice as much content in the first test. It's around 3.8 GB/s vs 30 GB/s if you divide each folder size by its du duration. That makes it 7.9 times faster, and I'm comfortable calling that "almost 10 times".
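For what it's worth, the ratio falls straight out of the division; the folder sizes and du times below are hypothetical stand-ins chosen to reproduce the ~30 GB/s and ~3.8 GB/s figures:

    # Hypothetical folder sizes and `du` wall-clock times, chosen to match the
    # effective throughput figures above.
    desktop_size_gb, desktop_seconds = 600, 20   # ~30 GB/s effective
    laptop_size_gb, laptop_seconds = 266, 70     # ~3.8 GB/s effective

    desktop_rate = desktop_size_gb / desktop_seconds
    laptop_rate = laptop_size_gb / laptop_seconds

    print(f"{desktop_rate:.1f} GB/s vs {laptop_rate:.1f} GB/s")
    print(f"ratio: {desktop_rate / laptop_rate:.1f}x")  # ~7.9x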
The total size isn't what matters in this case but rather the total number of files/directories that need to be traversed (and their file sizes summed).
I thought so too on my mini PC. Then I got my current Mac mini M4, and I have to give it to Apple, or maybe in part to ARM... It was like another SSD moment. It still hasn't spun up the fan and runs literally lukewarm at most during my office, coding, and photo work.
The only time I had this other than changing to SSD was when I got my first multi-core system, a Q6600 (confusingly labeled a Core 2 Quad). Had a great time with that machine.
"Core" was/is like "PowerPC" or "Ryzen", just a name. Intel Core i9, for instance, as opposed to Intel Pentium D, both x86_x64, different chip features.
As others mentioned, there is plenty of refurbished stuff and second-hand parts out there, so there isn't much risk of finding yourself having to buy something at insane prices if your computer were to die today.
If you don't need a GPU for gaming, you can get a decent computer with an i5, 16 GB of RAM, and an NVMe drive for USD 50. I bought one a few weeks ago.
For a DDR3-era machine, you'd be buying RAM for that on eBay, not Newegg.
I have an industrial Mini-ITX motherboard of similar vintage that I use with an i5-4570 as my Unraid machine. It doesn't natively support NVMe, but I was able to get a dual-m2 expansion card with its own splitter (no motherboard bifurcation required) and that let me get a pretty modern-feeling setup with nice fast cache disks.
About a month ago, the mobo for my 5950x decided to give up the ghost. I decided to just rebuild the whole thing and update from scratch.
So I went crazy and bought a 9800X3D and purchased a ridiculous amount of DDR5 RAM (96 GB, which matches my old machine's DDR4 quantity). At the time, it was about $400 USD or so.
I’ve been living in blissful ignorance since then. Seeing this post, I decided to check Amazon. The same amount of RAM is currently $1200!!!
>For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.
I only ever noticed it on my Windows partition. IIRC, on my Linux partition it was hardly noticeable, because Linux is far better at caching disk contents than Windows, and Linux in general can boot surprisingly fast even on HDDs if you only install the modules you actually need, so that autoconfiguration doesn't waste time probing dozens of modules in search of the best one.
Maybe not on an SSD, but it definitely helps a lot on an HDD by virtue of generating far less disk traffic. The kernel's method for figuring out which modules to load is effectively to load every single module that might be compatible with a given device, in series, and ask each module for its opinion before unloading it; once it has a list of all (self-reported) compatible modules for the device, it picks one and reloads it.
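If you're curious how much module inventory a stock kernel package carries versus what actually ends up loaded, here's a quick Linux-only sketch in Python; the paths are the usual ones but may differ per distro:

    import os

    kernel = os.uname().release  # e.g. "6.8.0-xx-generic"

    # Count every module shipped for this kernel...
    installed = sum(
        1
        for _, _, files in os.walk(f"/lib/modules/{kernel}")
        for f in files
        if f.endswith((".ko", ".ko.xz", ".ko.zst", ".ko.gz"))
    )

    # ...versus the modules actually loaded right now (one per line in /proc/modules).
    with open("/proc/modules") as fh:
        loaded = sum(1 for _ in fh)

    print(f"installed modules: {installed}, currently loaded: {loaded}")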
IDK; at the time I was using Gentoo, in which it's natural not to have more modules than necessary, because part of the installation process involves generating your own kernel configuration.
Even though it's not the normal way to install Debian, there ought to be some way to build your own custom kernels and modules without interference from the package manager (or you can just run it all manually and hope you don't end up in a conflict with apt). Gentoo is the only distro where it's mandatory, but I'm pretty sure this is supported on just about every distribution, since it would be necessary for maintainers.
That's not very encouraging because that statement has been true most every day in computing for the past 50 years. The rate at which the compute per dollar increases is what matters.
Yeah, Moore's law is slowing down for sure. I'm just pushing back on the whole sky-is-falling doomerism in the PC community. I will admit I'm lucky that my current system with 64 GB of memory and a 4090 is likely to be good for years to come, so I can wait out the RAM shortage.
A few years later but similarly - I am still running a machine built spur-of-the-moment in a single trip to Micro Center for about $500 in late 2019 (little did we know what was coming in a few months!). I made one small upgrade in probably ~2022 to a Ryzen 5800X w/ 64GB of RAM but otherwise untouched. It still flies through basically anything & does everything I need, but I'm dreading when any of the major parts go and I have to fork out double or triple the original cost for replacements...
The Intel Arc B580 (I think that's the latest one) isn't obnoxiously priced, but you're going to have to face the fact that your PCIe is really very slow. But it should work.
If you want to save even more money, get the older Arc Battlemage GPUs. I used one; it was comparable with an RTX 3060. I returned it because the machine I was running it in had a bug that was fixed two days before I returned it, but I didn't know that at the time.
I was seriously considering getting a B580 or waiting until the B*70 came out with more memory, although at this point I doubt it will be very affordable, considering VRAM prices are going up as well. A friend is supposedly going to ship me a few GTX 1080 Ti cards so I can delay buying newer cards for a bit.
By older Arc, I presume you're referring to Alchemist and not Battlemage in this case.
One of my brothers has a PC I built for him, specced out with an Intel Core i5 13400f CPU and an Intel Arc A770 GPU, and it still works great for his needs in 2025.
Surely, Battlemage is more efficient and more compatible in some ways over Alchemist. But if you keep your expectations in check, it will do just fine in many scenarios. Just avoid any games using Unreal Engine 5.
Yeah, I had an A770; it should be ~$200-$250 now on eBay, lightly used. It's, in my opinion, worth about $200 if it's relatively unused. As I mentioned, it's roughly equal to an RTX 3060, at least for compute loads, and the 16 GB is nice to have for that. But for a computer from the 4th gen I'd probably only get an A380 or A580; the A380 is $60-$120 on eBay.
Note that some tinkering may be required for modern cards on old systems.
- A UEFI DXE driver to enable Resizable BAR on systems which don't support it officially. This provides performance benefits and is even required for Intel Arc GPUs to function optimally.
I’m worried about the Valve mini PC coming out next year.
Instant buy at $700 or under. Probably a buy up to $850. At, like, $1,100, though… solid no. And I'm counting on that thing to get the power-hog of an older Windows tower, so bulky it's unplugged and in a closet half the time, out of my house.
If I needed a budget build, I'd probably look in the direction of used parts on AliExpress, you can sometimes find good deals on AM4 CPUs (that platform had a lot of longevity, even now my main PC has a Ryzen 7 5800X) and for whatever reason RX 580 GPUs were really, really widespread (though typically the 2048SP units). Not amazing by any means, but a significant upgrade from your current setup and if you don't get particularly unlucky, it might last for years with no issues.
Ofc there's also the alternate strategy of going for a mid/high end rig and hoping it lasts a decade, but the current DDR5 prices make me depressed so yeah maybe not.
I genuinely hope that at some point the market will again get flooded with good, long-lived components at reasonable prices in the next few gens: things like AM4 CPUs, that RX 580, or the GTX 1080 Ti. But I fear Nvidia has learnt its lesson and now releases stuff that pushes you toward incremental upgrades rather than making something really good for its time. Same with Intel's LGA1851 being basically dead on arrival after the reviews started rolling in (who knows, maybe at least motherboards and Core Ultra chips will eventually be cheap as old stock). On the other hand, at least something like the Arc B580 was a step in the right direction: competent and not horribly overpriced, at least when it came to MSRP; unfortunately the merchants were scumbags and often ignored it.
Man, it was just GPU for a while. But same boat. I regret not getting the 4090 for $1600 direct from Nvidia. "That's too much for a video card", and got the 4080 instead. I dread the day when I need to replace it.
Is the CUDA compatibility layer AMD has, which transparently compiles existing CUDA just fine, insufficient or buggy somehow? Or are you just stuck in the mindshare game and haven't reevaluated whether the AMD situation has changed this year?
I haven't checked out AMD's compatibility layer and know nothing about it. I tried to get vkFFT working in addition to cuFFT for a specific computation, but can't get it working right; crickets on the GH issue I posted.
I use Vulkan for graphics, but Vulkan compute is a mess.
I'm not stuck in a mindshare game, and this isn't a political thing. I am just trying to get the job done, and have observed that no alternative has stepped up to nvidia's CUDA from a usability perspective.
> have observed that no alternative has stepped up to nvidia's CUDA from a usability perspective.
I’m saying this is a mindshare thing if you haven’t evaluated ROCm / HIP. HIPify can convert CUDA source to HIP automatically, and HIP has very similar syntax to CUDA.
Don't all RAM manufacturers offer a lifetime warranty?
That said, if the shortage gets bad enough then maybe they could find themselves in a situation where they were unable/unwilling to honor warranty claims?
You can still buy DDR4 for pretty cheap, and if you're replacing a computer that old any system built around DDR4 will still be a massive jump in performance.
You should upgrade to a used or new 12900K with DDR4, since it's still cheaper than DDR5 even if it is up. Then get a used 3080 Ti with 12 GB. You'll be able to do proper H.265 decode/encode with that for up to 4K (unfortunately not 8K, but hey, no one really has that yet).
Do yourself a favor and order a $25 Xeon E3-1231 v3 or E3-1241 v3 from China. Those work in all the desktop motherboards of that (LGA1150) generation. Used DDR3 RAM is also so cheap you can bump to 32 GB for another $20.
Easy, cheap upgrades. The CPU will be noticeable straight away, the RAM only if you are running out.
I'm glad I kept my ageing Dell Studio XPS 7100. I need to stock up on some Dell Precision towers from the surplus in case my Studio XPS 7100 breaks. This AI bubble needs to burst soon.
> For general computer usage, SSDs really were a once in a generation "holy shit, this upgrade makes a real difference" thing.
The last time I really remember seeing a huge speed jump was going from a regular SATA SSD to an NVMe M.2 PCIe SSD... Around 2015 I bought one of the very first consumer motherboards with an NVMe M.2 slot and put a Samsung 950 Pro in it: that was quite something (though I was upgrading the entire machine, not just the SSD, so there's that too). Before that I don't remember when I switched from a SATA HDD to a SATA SSD.
I'm now running one of those WD Black SN850X NVMe SSDs, but my good old trusty Samsung 950 Pro, now ten years old, is still kicking (in the wife's PC). There's likely even better out there, and they're easy to find: they're still reasonably priced.
As for my 2015 Core i7-6700K: it's happily running Proxmox and Docker (but not always on).
Even consumer parts are exceptionally reliable: the only failures I remember in 15 years (and I've got lots of machines running) are a desktop PSU (replaced by a Be Quiet! one), a no-name NVMe SSD, and a laptop battery.
Oh, and my MacBook Air M1's screen died overnight for no reason after precisely 13 months, when I had a 12-month warranty (some refer to it as the "bendgate"), but that's because the first-gen MacBook Air M1s were indescribable pieces of fragile shit. I think Apple got their act together and came up with better screens in later models.
Don't worry too much: PCs are quite reliable things. And used parts for your PC from 2014 wouldn't be expensive on eBay anyway. You're not forced to upgrade to a last gen PC with DDR5 (atm 3x overpriced) and a 5090 GPU.
Am I crazy for thinking that anyone using computers for doing their job and making their income should have a $5k/year computer hardware budget at a minimum? I’m not saying to do what I do and buy a $7k laptop and a $15k desktop every year but compared to revenue it seems silly to be worrying about a few thousand dollars per year delta.
I buy the best phones and desktops money can buy, and upgrade them often, because, why take even the tiniest risk that my old or outdated hardware slows down my revenue generation which is orders of magnitude greater than their cost to replace?
Even if you don’t go the overkill route like me, we’re talking about maybe $250/month to have an absolutely top spec machine which you can then use to go and earn 100x that.
Spend at least 1% of your gross revenue on your tools used to make that revenue.
The vast majority of humans across the planet aren’t making their money with their computer, which was the qualifier in the first line of my comment.
Furthermore, even if they did, the vast majority of them still won’t be using their computer to generate revenue - they’ll be using an employer-provided one and the things I’m talking about have nothing to do with them.
What is the actual return on that investment, though? This is self indulgence justified as « investment ». I built a pretty beefy PC in 2020 and have made a couple of upgrades since (Ryzen 5950x, 64GB RAM, Radeon 6900XT, a few TB of NVMe) for like $2k all-in. Less than $40/month over that time. It was game changing upgrade from an aging laptop for my purposes of being able to run multiple VMs and a complex dev environment, but I really don’t know what I would have gotten out of replacing it every year since. It’s still blazing fast.
Even recreating it entirely with newer parts every single year would have cost less than $250/mo. Honestly it would probably be negative ROI just dealing with the logistics of replacing it that many times.
> This is self indulgence justified as « investment ».
Exactly that. There's zero way that level of spending is paying for itself in increased productivity, considering they'll still be 99% as productive spending something like a tenth of that.
It's their luxury spending. Fine. Just don't pretend it's something else, or tell others they ought to be doing the same, right?
I think my point was lost, then. I agree with you there is a HUGE falloff in productivity ROI above maybe $2k/year.
My point is that the extreme right end of the slider, where you go from "diminishing returns" to "no return whatsoever", still costs less than leasing a Kia. It costs less than my minuscule shabby office in the sketchy part of town. Compared to serious computer business revenues, it isn't even worth spending the time to talk about. I spend more on housekeepers or car insurance.
Given that, why not just smash the slider to the right and stop worrying about it? For serious computer professionals the difference between a $2k/year hardware budget and a $7k/year hardware budget does not matter.
My main workstation is similar, basically a top-end AM4 build. I recently bumped from a 6600 XT to a 9070 XT to get more frames in Arc Raiders, but looking at what the cost would be to go to the current-gen platform (AM5 mobo + CPU + DDR5 RAM) I find myself having very little appetite for that upgrade.
Disclosing that you spend half the median income on top-spec Apple hardware every year is a confession, dude. There's no justifying that spend, past, "I like having the newest toys." Happy for you and whatever sales rep whose performance review you're making a slam dunk. It's still not good advice for the vast majority of people who use their computers for work.
You're an economic elite living in what is commonly known as a "bubble"; consider the response to your initial post a momentary popping of it.
I don’t spend anywhere near that. It resells for 60-80% when I replace it a year or two later. That offsets the cost drastically.
Spending $700 per month on your work tools (where that represents 2-3% of revenue) is not unreasonable. My minuscule office space in the shitty part of town costs as much.
I think anyone running their small business that depends on high performance computers should have an annual budget of at least 1% of revenue for hardware.
> Am I crazy for thinking that anyone using computers for doing their job and making their income should have a $5k/year computer hardware budget at a minimum?
Yes. This is how we get websites and apps that don't run on a normal person's computer, because the devs never noticed their performance issues on their monster machines.
Modern computing would be a lot better if devs had to use old phones, basic computers, and poor internet connections more often.
Yes? I think that's crazy. I just maxed out my new Thinkpad with 96 GB of RAM and a 4 TB SSD and even at today's prices, it still came in at just about $2k and should run smoothly for many years.
Prices are high but they're not that high, unless you're buying the really big GPUs.
Where can you buy a new Thinkpad with 96GB and 4TB SSD for $2K? Prices are looking quite a bit higher than that for the P Series, at least on Lenovo.com in the U.S. And I don't see anything other than the P Series that lets you get 96GB of RAM.
You have to configure it with the lowest-spec SSD and then replace that with an aftermarket 4 TB SSD at around $215. The P14s I bought last week, with that and the 8 GB Nvidia GPU, came to a total of $2,150 USD after taxes, including the SSD. Their sale price today is not quite as good as it was last week, but it's still in that ballpark: with the 255H CPU, iGPU, and a decent screen, you can get the Intel P14s for $2,086 USD. That actually becomes $1,976 because you get $110 taken off at checkout. Then throw in the aftermarket SSD and it'll be around $2,190. And if you log in as a business customer you'll get another couple percent off as well.
The AMD-model P14s, with 96 GB, an upgraded CPU, the nice screen, and Linux, still goes for under $1,600 at checkout, which becomes about $1,815 when you add the aftermarket SSD upgrade.
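Itemized, using the prices quoted above (the ~$215 figure is the aftermarket 4 TB SSD mentioned earlier):

    ssd = 215                        # aftermarket 4 TB SSD

    intel_p14s = 2086 - 110 + ssd    # list price, minus checkout discount, plus SSD
    amd_p14s = 1600 + ssd            # "under $1600 at checkout", plus SSD

    print(f"Intel P14s total: ~${intel_p14s}")  # ~$2,191
    print(f"AMD P14s total:   ~${amd_p14s}")    # ~$1,815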
It's still certainly a lot to spend on a laptop if you don't need it, but it's a far cry from $5k/year.
> maybe $250/month (...) which you can then use to go and earn 100x that.
$25k/month? Most people will never come close to earning that much. Most developers in the third world don't make that in a full year, but they are still affected by rises in PC part prices.
I agree with the general principle of having savings for emergencies. For a Software Engineer, that should probably include buying a good enough computer for them, in case they need a new one. But the figures themselves seem skewed towards the reality of very well-paid SV engineers.
Yes, that's an absolutely deranged opinion. Most tech jobs can be done on a $500 laptop. You realise some people don't even make your computer budget in net income every year, right?
That's a bizarrely extreme position. For almost everyone, a ~$2000-3000 PC from several years ago is indistinguishable, from a productivity standpoint, from one they could buy now. Nobody is talking about $25 ten-year-old smartphones. Of course, claiming that a $500 laptop is sufficient is also a severe exaggeration; a used desktop, perhaps...
Overspending on your tools is a misallocation of resources. An annual $22k spend on computing is a 10-20x overspend even for a wealthy individual. I'm in the $200-300k/year, self-employed, buys-my-own-shit camp, and I can't imagine spending 1% of my income on computing needs, let alone close to 10%. There is no way to make that make sense.
Look at it a different way: if you'd invested that $10K/year you've been blowing on hardware, how much more money would you have today? How about that $800/month car payment too?
I don’t understand a world where spending $1k/mo on business equipment that is used to earn dozens of times more than that is crazy. It’s barely more than my minuscule office space costs.
My insurance is the vast majority of that $800, fwiw.
Having a 10% faster laptop does not enhance your ability to earn money in any meaningful way. Just like driving around in a luxury car doesn't enhance your ability to travel from point A to point B in any meaningful way.
It's okay to like spending money on nice things, it's your money and you get to decide what matters to you. What you're getting hate for here is claiming it's justified in some way.
Yes, you don't want to underspend on your tools to the point where you suffer. But I think you're missing the flip side: I can do my work comfortably with 32GB of RAM, and while my 1%-a-year budget could get me more, why not pocket the difference?
The goal is the right tool for the job, not the best tool you can afford.
I agree with the general sentiment - that you shouldn't pinch pennies on tools you use every day. But at the same time, someone who makes their money writing with a pen shouldn't need to spend thousands on pens. Once you have adequate professional-grade tools, you don't need to throw more money at the problem.
If you are consistently maxing out your computer's performance in a way that limits your ability to earn money at a rate greater than the cost of upgrades, and you can't offload that work to the cloud, then I guess it might make sense.
If, like every developer I have ever met, your constraint is your own time, motivation, and skills, then spending $22k per year is a pretty interesting waste of resources.
Does it make sense to buy good tools for your job? Yes. Does it make sense to buy the most expensive version of a tool when you already own last year's most expensive version of it? Rarely.
It is crazy for anyone making any amount. A $15k desktop is overkill for anything but the most demanding ML or 3D workloads, and the majority of the cost will be in GPUs or dedicated specialty hardware and software.
A developer using even the clunkiest IDE (Visual Studio - I'm still a fan and daily user, it's just the "least efficient") can get away without a dedicated graphics card and only 32GB of RAM.
> Am I crazy for thinking that anyone using computers for doing their job and making their income should have a $5k/year computer hardware budget at a minimum?
Yes, you are crazy for saying that.
> but compared to revenue it seems silly to be worrying about a few thousand dollars per year delta.
There is a very wide range of incomes among people who use computers for work, and, more to the point, $5k/yr on hardware is way past the point where, for most people using their computer for income, additional hardware expenditure has any benefit to income generation.
> Even if you don’t go the overkill route like me, we’re talking about maybe $250/month to have an absolutely top spec machine which you can then use to go and earn 100x that.
Most people using their computer to earn income do not earn anywhere close to $25,000/mo ($300k/yr), and hardware expenditures aren't the limiting factor holding them back.
Also, the minimum of $5k/yr you suggested is not $250/mo, but more than 1.5× that at $417/mo.
> Spend at least 1% of your gross revenue on your tools used to make that revenue.
Median annual wage for a US software developer is ~$140k per the most recent BLS numbers, and that's one of the higher-paying fields that people use computers for. Neither your original $5k/year nor even the $3k/year implied by your later $250/mo figure is warranted by your 1%-on-tools rule for most people earning income with their computer, especially on hardware alone, since hardware is far from the only "tool" relevant to most computer work.
Most people who use computers for the main part of their jobs literally can't spend that much if they don't want to be homeless.
Most of the rest arguably shouldn't. If you have $10k/yr in effective pay after taxes, healthcare, rent, food, transportation to your job, etc, then a $5k/yr purchase is insane, especially if you haven't built up an emergency fund yet.
Of the rest (people who can relatively easily afford it), most still probably shouldn't. Unless the net present value of your post-tax future incremental gains (raises, promotions, etc.) derived from that expenditure exceeds $5k/yr, you're better off financially doing almost anything else with that cash. That's doubly true when you consider that truly amazing computers cost about $2k total nowadays, without substantial improvements year-to-year. Contrast buying one of those every two years with your proposal: you'd need the extra ~$4k/yr to pay off somehow, by actually making use of the incremental CPU/RAM/etc. to generate that value. If it doesn't pay off, then it's just a toy you're buying for personal enjoyment, not something you should nebulously tie to revenue-generating potential with an arbitrary 1% rule. Still, maybe buy it, but be honest about the reason.
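A quick sketch of the delta being described here, using only the figures from this comment (illustrative, not a recommendation):

```python
# A very good ~$2k machine replaced every two years vs. the proposed
# $5k/yr budget. Figures come from this comment and are illustrative only.

modest_plan = 2000 / 2   # $2k machine amortized over 2 years -> $1k/yr
proposed_budget = 5000   # the suggested minimum annual budget

delta = proposed_budget - modest_plan
print(f"Incremental spend that has to pay off: ~${delta:.0f}/yr")  # ~$4000/yr
```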
So, we're left with people who can afford such a thing and whose earning potential actually does increase enough with that hardware compared to a cheaper option for it to be worth it. I'm imagining that's an extremely small set. I certainly use computers heavily for work and could drop $5k/yr without batting an eye, but I literally have no idea what I could do with that extra hardware to make it pay off. If I could spend $5k/yr on internet worth a damn I'd do that in a heartbeat (moving soon I hope, which should fix that), but the rest of my setup handily does everything I want it to.
Don't get me wrong, I've bought hardware for work before (e.g., nobody seems to want to procure Linux machines for devs even when they're working on driver code and whatnot), and it's paid off, but at the scale of $5k/yr I don't think many people do something where that would have positive ROI.
It's too late to edit, but I do have one more thought on the topic.
From the perspective of an individual, the ROI has to be large to justify a $5k/yr investment. HOWEVER, the general principle of "if something is your livelihood, then you should be willing to invest in it as appropriate" is an excellent thing to keep in mind. Moreover, at the scale of a company and typical company decisions, the advice makes a ton of sense: if a $1k monitor and $2k laptop let your employees context-switch better or something, you should almost certainly invest in that hardware. Contrasted with the employee's view of ROI, those investments are tax-deductible and just have to pay off in absolute value, and they don't have the delay and interaction with wages/promotions/etc. that introduce uncertainty and loss into the calculation. And the difference between a few hundred dollars and a few thousand dollars in total capital investment probably does make a huge difference in outcomes for a lot of computer-based employee roles.
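As a back-of-the-envelope sketch of that company-side calculation (the fully-loaded cost and minutes-saved figures below are assumptions chosen purely for illustration, not numbers from this thread):

```python
# Break-even for the company-side view described above. The loaded cost
# and minutes-saved figures are illustrative assumptions only.

hardware_cost = 3000           # e.g. a $1k monitor plus a $2k laptop
loaded_cost_per_hour = 100     # assumed fully-loaded hourly cost of the employee
minutes_saved_per_day = 10     # assumed productivity gain from the upgrade
working_days_per_year = 230

value_per_year = (minutes_saved_per_day / 60) * loaded_cost_per_hour * working_days_per_year
print(f"Value recovered per year: ~${value_per_year:.0f}")             # ~$3833
print(f"Break-even after ~{hardware_cost / value_per_year:.1f} years")  # well under a year
```

Under those assumptions the hardware pays for itself in under a year in absolute terms, which is why the calculus looks so different for a company than for an individual.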
It's when you find ways to spend the minimum amount of resources in order to get the maximum return on that spend.
With computer hardware, often buying one year old hardware and/or the second best costs a tiny fraction of the cost of the bleeding edge, while providing very nearly 100% of the performance you'll utilize.
That and your employer should pay for your hardware in many cases.
I try to come at it with a pragmatic approach. If I feel pain, I upgrade and don't skimp out.
========
COMPUTER
========
I feel no pain yet.
Browsing the web is fast enough where I'm not waiting around for pages to load. I never feel bound by limited tabs or anything like that.
My Rails / Flask + background worker + Postgres + Redis + esbuild + Tailwind based web apps start in a few seconds with Docker Compose. When I make code changes, I see the results in less than 1 second in my browser. Tests run fast enough (seconds to tens of seconds) for the size of apps I develop.
Programs open very quickly. Scripts I run within WSL 2 also run quickly. There's no input delay when typing or performance related nonsense that bugs me all day. Neovim runs buttery smooth with a bunch of plugins through the Windows Terminal.
I have no lag when I'm editing 1080p videos even with a 4k display showing a very wide timeline. I also record my screen with OBS to make screencasts with a webcam and have live streamed without perceivable dropped frames, all while running programming workloads in the background.
I can mostly play the games I want, but this is by far the weakest link. If I were more into gaming I would upgrade, no doubt about it.
========
PHONE
========
I had a Pixel 4a until Google busted the battery. It ran all of the apps (no games) I cared about, Google Maps was fast, and the camera was great.
I recently upgraded to a Pixel 9a because the repair center that broke my 4a in a number of ways gave me $350, and the 9a was $400 a few months ago. It also runs everything well and the camera is great. In my day-to-day it makes no difference from the 4a, literally none. It even has the same amount of storage, of which I have around 50% left with around 4,500 photos saved locally.
========
ASIDE
========
I have a pretty decked-out M4 MBP laptop issued by my employer for work. I use it every day, and for most tasks I feel no real difference vs my machine. The only thing it does noticeably faster is heavily CPU-bound tasks that can be parallelized. It also loads the web version of Slack about 250ms faster; that's the impact of a $2,500+ upgrade for general web usage.
I'm really sensitive to skips, hitches, and performance-related things. For real, as long as you have a decent machine with an SSD, using a computer feels really good, even for development workloads where you're not constantly compiling something.
One concern I'd have: if the short-term supply of RAM is fixed anyway, then even if all daily computer users increased their budgets to match the new pricing, demand would outstrip supply again and prices would simply rise in response, until they became unreasonable enough that demand fell back down to meet supply.
For starters, hardware doesn't improve quickly enough to justify buying a new generation every year. There was a roughly 2-year gap between Ryzen 7000 and Ryzen 9000, for example, and about another 2-year gap between Ryzen 5000 and Ryzen 7000. On top of that, most of the parts can be reused, so at most you're dropping in a new CPU and some new RAM sticks.
Second, the performance improvement just isn't there. Sure, there's a 10% performance increase in benchmarks, but that does not translate to a 10% productivity improvement for software development. Even a 1% increase is unlikely, as very few tasks are compute-bound for any significant amount of time.
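To put a rough number on that, here's an Amdahl's-law-style sketch; the 5% "share of the day spent actually waiting on the CPU" is an assumed figure for illustration, not a measurement:

```python
# Amdahl's-law-style illustration of why a 10% benchmark gain rarely shows
# up as productivity. The compute-bound share of the workday is assumed.

compute_bound_fraction = 0.05  # assumed share of the day spent waiting on the CPU
cpu_speedup = 1.10             # 10% faster in benchmarks

# Only the compute-bound portion of the day gets faster.
overall_speedup = 1 / ((1 - compute_bound_fraction) + compute_bound_fraction / cpu_speedup)
time_saved = 1 - 1 / overall_speedup
print(f"Overall time saved: {time_saved * 100:.2f}%")  # ~0.45%
```

Even if you double or triple the assumed compute-bound share, the overall saving stays around a percent or two, which is the point being made above.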
You can only get to $15k by doing something stupid like buying a Threadripper, or putting an RTX 4090 into it. There are genuine use-cases for that kind of hardware - but it isn't in software development. It's like buying a Ferrari to do groceries: at a certain point you've got to admit that you're just doing it to show off your wealth.
You do you, but in all honesty you'd probably get a better result spending that money on a butler to bring your coffee to your desk instead of wasting time by walking to the coffee machine.
I don't spend money on my computers from a work or "revenue-generating" perspective because my work buys me a computer to work on. Different story if you freelance/consult ofc.
Entirely suboptimal, granted. Massively reduced ROI for each dollar spent above maybe $2000 per year. But when the maximum extreme end of the spectrum (without going to 0 ROI) is still less than $1000 per month, why is it even a question?
People who make a fraction of what I do spend more than that on their shiny pickup truck with a perpetually empty bed. These are my work tools.
Malagasy data annotators work for like $100 a month. You're pretty crazy to suggest that they should spend more on the hardware than they earn from it.
I mean, as a frontline underpaid rural IT employee with no way to move outward from where I currently live, show me where I’m gonna put $5k a year into this budget out of my barren $55k/year salary. (And, mind you - this apparently is “more” than the local average by only around $10-15k.)
I’m struggling to buy hardware already as it is, and all these prices have basically fucked me out of everything. I’m riding rigs with 8 and 16GB of RAM and I have no way to go up from here. The AI boom has basically forced me out of the entire industry at this point. I can’t get hardware to learn, subscriptions to use, anything.
8GB or 16GB of RAM absolutely makes for a usable machine for many software development and IT tasks, especially if you set up compressed swap to stretch it further. Of course, you need to run something other than Windows or macOS. It's only very niche use cases, such as media production or running local LLMs, that absolutely require more RAM.
No modern IDE, though, and no modern Linux desktop environment either (they aren't that much more memory-efficient than macOS or Windows). Yes, you can work with not much more than a text editor. But why?
I assume those aren't US dollars? My suggestion is to go on a classifieds site and find a bargain there. You can find 2x8GB SODIMM DDR4 for like 20€ in Germany, because it's the default configuration for laptops and people are buying aftermarket RAM to upgrade to 2x16GB leaving a glut in 2x8GB configurations. Something similar happened to the desktop DIMMs but to a lesser extent because you can put four of them into a PC.
It's the "how much can the banana cost, $10?" of HN.
The point they're trying to make is a valid one - a company should be willing to spend "some money" if it saves time for the employees it's paying.
The problem is usually that the "IT budget" is owned by a separate part of the company from the "salary" budget, and the "solution" can be to force a certain dollar amount to be spent each year (with one year of carry-forward, perhaps) so that employees always have access to good equipment.
(Some companies are so bad at this that a senior engineer of 10+ years will have a ten year old PoS computer, and a new intern will get a brand new M5 MacBook.)