When I finished my degree in Spain 15 years ago, programming wasn't a well-paid or well-regarded job. In fact, I used to live paycheck to paycheck (making less than some of my friends in non-tech jobs), and programming was considered "for nerds".
I think this didn't change much for the following decade, but COVID turned the tables. Suddenly Spanish programmers could take jobs at global companies for very competitive salaries. I've heard of people making 70-80k euros working from home, which is more than 2x the national average. Recently someone told me that when they offered a Spanish programmer a job for 100k USD, she started crying.
So the "correction" that we are seeing seems more like a logical consequence of the contracting job market, plus the move away from remote work for many companies. It seems more like a correction than a crisis.
> programmers weren't a well paid or well regarded job...and programming was considered "for nerds"
It was similar in the US in the 90s. The pay was OK but nothing compared to the past 15 years or so, and socially it was not well regarded at all. The movie "Office Space" depicts what it was like pretty accurately.
In the UK, which was nowhere near as bad as Spain or Italy, programmers were making £17k/year pre-08 outside London. IT was just a cost base, minimized at all costs; the main employer of programmers was academia, and most people working in companies were hired straight out of high school. A job in IT, or even a CS degree, was career suicide: employment rates were among the lowest of any degree, and if you got a job you were just a cost.
It is hilarious to see the change: companies trying to re-sell themselves as tech companies when, ten years ago, all their "IT" staff were just working on managed services.
The reason this is relevant is that most of the recent change has been spillover from the US. If that spillover stops, there are no jobs domestically: you are a mechanic living in a horse-drawn society. The advantage the US has is compounding rapidly.
I've spent the past 7 years living in about 25 countries around the world, and one thing that really surprised me is how far most of the world is behind the US in business digitalization at the process level. You don't necessarily notice it at first as a visitor, because these companies all have websites and apps and whatnot, many of which are nicer than their US counterparts. But these are often just a superficial gloss; fundamentally, far more companies are either literally paper-driven or have created a digital copy of the same paper process they've been using for decades and called it a day, without really changing anything at all.
Japan in particular stood out because I'd always thought of it as the ultra futuristic land of robots, but a lot of simple tasks I'd expect to do on an app or website involve sending a fax or an in-person visit.
This type of approach doesn't add much value to the business. They still employ the same people doing the same work to accomplish the same result in the same amount of time, but now they need to pay a programmer as well.
Again, we had this problem in the US in the 90s. People would joke about how tech had promised a "paperless office" yet there was more paper than ever--but that was only because the old guard insisted on printing every damn thing they received digitally! So much work at the time went into making printer-friendly versions of every piece of information; experience with Crystal Reports was always an important box to tick on any job application.
I do wonder how willing EU countries would be to do this though, with an older population and a lot more sensitivity to preserving jobs at any cost.
Yes, people worrying about AI when the gain from computers still isn't really here yet.
In Europe, there is still a massive reluctance to hire programmers. So most large corporates/governments are buying from Oracle or SAP, get in an Indian contractor to implement, it inevitably goes wrong, and then they say how unreasonably expensive it all is...and it would be even more expensive if they hired onshore.
And then you talk to the person who is managing the contract for the purchaser...struggles with Excel, struggles with basic tasks in Windows...I am sure this happens in the US too btw, but it is just unreasonably extreme in Europe. The dominance of SAP in particular is an indication that something is very wrong with enterprise software.
Not sure this is true. I started contracting in the UK as a programmer in 2001, no qualifications, a high school drop out. Outside London (the north) I could make £300 a day back then pretty steady.
Never been short of work for long, always contracted. Haven't needed my CV for 15 years at this point, and £800-£900 a day is common among peers. My story also doesn't feel unusual among the people I work with.
Subjective account, but I've personally only felt the curve going up. Perhaps I'll have my shock one day.
You didn't do any IT work in the late 90s I guess? During the dotcom boom and Y2K hype it was major money. I was only starting so I didn't make a lot but every job came with a company car. They were screaming for people.
There was a bit of a lull pre crisis but even in 2008 I wrote custom callcenter integrations and it was really good money. About 50k euro plus car. And this was in Holland, nothing like London kind of money.
High incomes in tech were a symptom of the money printing that was supposed to keep the economy afloat largely never reaching past the ranks of the alumni of certain colleges and universities. They exploded from the liquidity surplus that, in a more sane world*, would have gone towards lifting, say, service and retail wages (which instead languished for the better part of a decade). Six-figure SWE salaries have less to do with expertise than QE.
*One where restrictions on advertising and data collection/sales kept software as a pay-to-play enterprise, necessitating higher wages among those who would be users.
High tech incomes don't come from empty money printing, but they come from the value that automation at scale unlocks for societies. To stay with the service industry analogy, in a service industry job you can only make X burgers an hour, but you can make software that runs on a billion computers and makes the lives of billions of people better.
Tech incomes got high precisely when Google etc. started making big money from their super scalable services. It also very happily coincided with the age of (well networked) college dropouts being able to found billion dollar companies within just a few years. So the established companies had to dry up the market and make it less profitable to join a startup, on average, than to stay at your FAANG job.
You rather have an engineer develop a service at your FAANG that will never see customers than have them build the competitor to your core money maker.
That, and the demand created by digitalization literally outpacing the number of graduates joining the industry.
Incorrect. As GP pointed out, tech salaries were relatively in line with other mid-level professional roles until the late 2000s, well after software and networked automation had transformed many industries. This was because, at the time, tech companies had to actually have a business plan that sought and gained profits relatively early in the company's life.
What changed - what allowed college dropouts to become billionaires - was first the pre-GFC bubble market (making their net worths essentially inflated). Afterwards, the endless rounds of money-printing used to escape a depression and then sustain growth drove high tech salaries themselves. Without this liquidity, startups would not have had access to VC money and would have had to turn a profit (not just attract further rounds or a successful IPO) or perish within the first few years of existence, kneecapping competition for workers.
This is also why tech salaries in Europe and Asia have stayed relatively sane. Save Japan, they didn't run massive QE regimes and so did not see the same VC environment develop. Why do you think tech companies were the first to see large layoffs after the FFR started rising? It wasn't just that they'd overhired during COVID; that personnel was genuinely needed for the projects they'd had planned. High interest rates screw mightily with their business model, particularly labor pay. You say that demand outpaces the supply of workers, so how could we see layoffs at all? Simple: it's the only alternative to salaries falling when the labor budget is shrunk (through a complex chain of knock-on effects that turn monetary policy into fiscal consequences) by the money printer getting turned off.
As for why tech, and not other industries: in software, networking, etc., it's easy to obfuscate costs and value, particularly when it's unclear what can or has been automated, what a service is worth to a customer, what that customer is actually paying with and for, etc. Arbitrage opportunities abound, and all you have to do is set up shop and choose a growth metric that you can sell the market.
All the major Internet properties are falling to pieces. I’ve encountered more hung GMail loading screens and empty IG story trays (which should be impossible if feed loads at all) in the last week than in the decade starting at 2012.
GPT-4 didn’t go hyperbolic or whatever, the regular Internet needs to keep working after all.
They’ll keep kicking for a few more months, they really want hackers brought to heel, but rates aren’t doubling again and “AI” hasn’t hockeysticked.
> All the major Internet properties are falling to pieces. I’ve encountered more hung GMail loading screens and empty IG story trays (which should be impossible if feed loads at all) in the last week than in the decade starting at 2012.
Nothing like that on my end. Sounds like a confirmation bias story: you think the tech industry is collapsing, so you see failure everywhere.
Well, I was basing that more on my years of managing IG tray loadouts than on a guess, but I suppose I have to concede that me and a few friends observing these things shit the bed isn't data.
Zero Spanish programmers would have gotten hired for salaries that were worth more than the money they brought in to the company hiring them, so the only way for a "correction" is if their work is not worth much anymore. Companies do not overpay when they outsource.
Why would any company fire a Spanish programmer that brings in money to the company, only to hire somebody more expensive in-office?
Why do you think “leaders” are pushing for return to office?
Why do you think Paul Graham compares it to communism?
It’s not because of some culture building nonsense - it’s 2023 and by now we know it works.
They do it because they hate the leverage and mobility it gives employees. How someone can't see the giant conflict of interest between SV investors and the CEOs of various companies is beyond me.
There is an economic advantage to having "nerds" in "tech cities" like SF or general Bay Area. And if you disagree, then I recommend you attend a simple Tech meetup in SF and then go to one in say, Fort Wayne, Indiana, and witness the difference.
I see where you are pointing, of course, but what is the economic advantage, and for whom? I suspect such meetups are rather a social byproduct of people concentrating in a place where the US-specific culture of networking reigns. From my observations, these kinds of events are noticeably less popular outside North America, and that doesn't really seem to hurt.
Anecdotally, my friends from India make trips to SV just for in-person interactions. Their claim is that India has good tech talent and a somewhat decent tech job scene, but bleeding-edge expertise (e.g. LLMs) can only be found in SV. That doesn't mean there are no experts in India, but they are few and scattered. Whereas you can come to SV for a week, attend 2-3 meetups each in SF / Palo Alto, couple that with another expo or conference, and get great bang for the buck on your trip.
I feel like in today's workplace we're supposed to collaborate all the time (preferably in a loud open office) and nobody is supposed to do any actual work. I love to collaborate, but my best work got done when I had several weeks of uninterrupted focus on a task.
Yes, there's a balance. Best focused, individual work is probably done alone, in quiet.
But with collaboration, sometimes weeks of work can be cut out if one discovers the right work to do. And even better if a new direction or idea can be found. This is all of course technically possible remotely, but it doesn't happen in the general case because we didn't evolve to work by ourselves without f2f contact.
> This isn't nearly as good of a point as you think it is.
Why not? If we evolved to relate to a small tribe and now are exposed to huge numbers of people then I'd say there's proof humans can and are adapting.
What does adapting mean to you? Surviving? Yes. Thriving? Epidemics of loneliness, mental health declines in young people exposed to technology and social media since birth and online echo chambers allowing violent conspiracy theorists to hype each other up to commit crimes. Doesn't seem like thriving to me.
And you didn't respond to my point about being as effective. This is all relative to baseline in office working.
It's funny to bring up hospital and grocery stores when the subthread is about RTO and collaboration in person. Those would probably be the last places to go fully remote.
I'm not sure what you're trying to say here. Are you saying because you think tech jobs aren't important then collaboration in tech jobs isn't important as well?
Yes, that's what I'm saying. The commuting and associated infrastructure are wasted on most office jobs, and they're not important enough to warrant it.
The business benefits from collaboration though, and people tend to grow professionally as well. It's not important as being a surgeon saving lives sure.
Taking your logic further, why bother with anything? Why even work remotely? Just don't work. Nothing is as important as putting food on the table and a roof over your head, so become a builder and a farmer.
Sure, but employees would rather work and live away from the grind-zero. If businesses want to attract talent, they need to accept this. Again, it's about leverage: employees need to hold the line.
As I said in another comment, part of the job as an experienced/senior engineer in tech is to work with others, and not be a hero programmer.
I actually think that there's room here for a hybrid. Have one track for remote engineers and one for in person. Remote engineers get different work, they generally have a lower ceiling in career potential. In exchange they get what they want, they work mostly by themselves with videoconf/slack as aid.
Of course there's always the exceptions. If you're the top 0.0001% in domain experts in a field, you can do whatever you want.
> employees would rather work and live away from the grind-zero
Citation needed. Most of the people I talk to prefer a hybrid setup - 2/3 days from home, 2/3 days in office. For office days, an obvious caveat is that the entire team has to be present on those days to realize the benefits.
> If businesses want to attract talent, they need to accept this.
Looking around here in SV, plenty of businesses seem to be attracting great talent even with RTO policies.
End of the day, you can always quit and do another job. Part of the role of senior+ engineers in tech is to be a force multiplier and work across silos. If you don't want to do that, there's still remote jobs out there where if you're a good enough pure coder you can get paid. Sure you won't have the same career progression but it's different priorities right?
Yes, I have. I love it. It doesn't however contradict anything I said, because unlike say the laws of thermodynamics, it's just something that someone made up one day having watched the best version of The Office.
Sure, for some jobs there's probably not much benefit to collaboration. I'm mostly talking from the perspective of FAANG type companies, where there's very high potential impacts from small groups of focused people. If you're not working at that scale, the stakes are lower and it may not matter as much.
> They do it because they hate the leverage and mobility it gives employees
This really goes both ways. If you can work from anywhere, I can hire from anywhere, possibly from somewhere where the cost of living is much lower, where people are willing to do the job for less. The leverage really is coming from there still being a limited supply of labor, but that is not a perpetual given.
Offshoring has been around for decades, and if it produced the desired results at the desired price then employers would replace every expensive employee with a cheaper one (where possible). As it stands, offshoring is commonly used by large body shops, and a non-zero amount of work is done offshore, yet the cost for tech talent has increased (at least on average, the last year or two has seen somewhat of a correction).
There are definitely highly talented individuals in low cost areas, but they quickly realise they can earn a Big Tech wage, and are willing to relocate (often relocation is a bonus; as much as we complain sometimes, quality of life in a HCOL area is comparably quite nice compared to many LCOL areas).
The threat of hiring a bunch of cheap offshore labour is a bogeyman; companies would already have done this if it were feasible.
I'm not even talking about offshoring, the difference in cost of living even within the US can be massive, especially if you take off the commute and associated expenses. Quality of life is subjective of course, but in many respects is inversely correlated with the availability of high-paying jobs.
> There are definitely highly talented individuals in low cost areas, but they quickly realise they can earn a Big Tech wage, and are willing to relocate
It is also lopsided in favor of employees in SV or other tech hubs. They can command high salaries at a mediocre talent level simply because they have no competition.
In the linked Paul Graham comment, he is not wholly supportive of remote work. The comparison is that communism, like remote work, can appear to work, but he attributes this to the fact that it sprung from a healthy foundation (his example is in-person work). If anything, Graham leans toward in-person work in this tweet.
I think you have significantly misunderstood things. Paul Graham comparing it to communism is an example of a business person attacking remote work using one of the standard boogeymen.
So, that link answers "Why do you think Paul Graham compares it to communism?"
Communism seems to be his favorite example of something that can work initially from a good starting point, but fails over time, and he also believes that's true of Work From Home.
He might "hate the leverage and mobility it gives employees", but his limited comparison of it to Communism isn't strong evidence in either direction.
You are being way too generous with this interpretation.
Paul Graham is not dumb and he knows what a loaded term communism is.
On top of that, you’re taking his comment and interpreting it in the most superficial way without even attempting to think about his choice of words or why he’s so against the idea in general.
Maybe because fully remote simply doesn't work? Covid was a forced event, and people coasted for a year on the relationships built in the office before Jan 2020. Over time, as people moved on and colleagues one had never met in person showed up, things started fraying.
Also, I believe there are some companies that can be fully remote, but that has to be in the DNA of the company culture, right down to how they communicate and collaborate. Very few people have the skills to rewire their communication habits, and so they go back to what has worked for them previously.
This is a poorly translated article with a sensationalist headline. The only real facts here reflect the general trend seen in the last year where the tech job market is slightly cooling off.
I'm kinda seeing the opposite. Recruiters are falling over themselves contacting me and this wasn't the case a year ago. I'm in cyber.
It's a bit annoying because while I'm mildly interested in other opportunities, most don't even bother reading my profile and just dump their crap on me. I selected the remote only option but most jobs that are shared with me are office based or hybrid. I'm officially hybrid now but I rarely visit the office and I want to keep it that way. Offices have become horrible places now with the hotdesking.
As someone who works in media/advertising, the first thing I check when I see something like this is their ads.txt [1] -- it looks like they're managed by Ezoic, to whom you forward your entire DNS and who inject ads in the middle (AI is mentioned, naturally). Definitely ruining this whole site's experience.
This reads as if it was written with a large number of buzzwords that did not translate well.
Also, there's another article on the same site, "End to the great era of layoffs: big technology companies reactivate hiring but with conditions". Which came first? There's no date.