What you say is true, but the amount of "grunt work" is not constant over the years. In fact, I think the amount of "grunt work" in the tech industry is growing, not shrinking; I think the following loop is quite obvious:
- amount of current grunt work: X
- new tech Z appears that reduces X to 0.1X
- at the same time Z enables new ways of doing things. Some things become grunt work because they are a byproduct of Z
- amount of current grunt work: Y (where Y ~= X)
- ...
If technological progress had stopped in the 2000s, then all the grunt work (originating in the 90s) would be essentially zero today. New tech brings automation, but it also brings new grunt work. I don't think we will ever live in a society where there's practically no grunt work.
The most recent example is AI: there are AI tools that generate sound, images, video and text... but if you want to create a differentiating product/experience, you need to combine (i.e., do the grunt work of gluing together) all the available tools (ChatGPT, Stable Diffusion, etc.).
> If technological progress had stopped in the 2000s, then all the grunt work (originating in the 90s) would be essentially zero today.
If you wanted to have a simple database application in the 1990s, Delphi, VB6 or MS-Access were most of what you needed to get it done. The UI was drag and drop, the database was SQL but you almost never touched it; mostly it was wiring up events with a few lines of code.
The work was commodified out of the way! Domain experts routinely built crude-looking but functional programs that got the job done. It was an awesome time to be a programmer: you just had to refactor an already-working system, fix a few glitches, and document everything properly, and everyone was happy.
Then everyone decided that all programs had to work on Steve Jobs' magic slab of glass in a web browser connected through janky Internet, and all that progress was lost. 8(
Are all of those proprietary products? I can't speak to your experience, but given that Linux was created in 1991, it seems like, from another angle, you're bemoaning the rise of OSS and the web.
I'm just a web developer who learned everything from online resources, so I think we are both biased at different ends of the spectrum.
Open source is great, Lazarus does a pretty good job of replacing Delphi.
Microsoft went insane with .NET so VB6 was killed in the process.
Access automatically handled table relationships, let you build queries and see them as SQL, and its report engine was pretty good. Thanks to ODBC, you could use the same database across all of them, or hook up to a real SQL server when it came time to scale up.
What's missing these days is the desktop and a stable GUI API. Windows apps from the 1990s still work, because they are distributed as binaries. Most source code from back then will not compile now because too many things have changed.
I love Open Source, but it doesn't solve everything.
> Microsoft went insane with .NET so VB6 was killed in the process.
I'd love to hear more about this perspective or any links to get more of it.
I did a (very) little hobby VB6 and loved it. Never made the switch to .NET at that time (I was young, it was a hobby).
Having recently worked through part of a .NET book, I was pretty impressed by how far MS took it (although it seems extremely mind-numbing). Obviously it took a long time and had false starts, but MS stuck with it. On a personal level, I am very opposed to the entire model in an ideological sense, but it does seem to make a lot of business sense for MS, and it seems to cover a lot of cases for a lot of businesses.
So, was Microsoft's insanity with .NET just the obsession part, or doing things poorly for a while, until eventually getting it "righter", or is the insanity still pretty apparent?
I really would love to learn more about the historical-technical aspects of this specific comment quote, from VB6 to the modern day, because it fits my experience perfectly, but I've had second thoughts about the position more recently. The more specifics, the better.
The insanity was to abandon the advantage they had with VB/COM, in order to challenge Java on its own ground. They threw away the baby with the bathwater. The C# pivot also slowed down their desktop efforts pretty dramatically, doubling the blow.
They were lucky Sun squandered the opportunity they had engineered with Java, focusing on the hardware side and missing the boat on browser, virtualization and services. If Sun had bought Netscape and then focused on building something like Azure, instead of fighting the inevitable commoditization of server hardware, they would have eaten Ballmer's lunch.
Disclaimer: I am not a .Net programmer, so these are just my thoughts and impressions as someone on the outside who followed the development from a distance.
I think a lot of the focus on .Net was driven by MS and Ballmer's fear of Java. At the time, almost all desktop computers were running Windows 9x/2k. If third-party applications were developed with cross-platform Java, customers would no longer be locked in to Windows.
First they tried the famous embrace/extend/extinguish approach by creating a Windows-specific version of Java. Sun fought back, and MS decided to push .Net instead.
It seemed to me that the initial strategy was to claim .Net was cross-platform, but focus mostly on Windows and let open source projects like Mono be their cross-platform "alibi". They changed strategies after a while, and now I guess the cross-platform support is more real.
> Windows apps from the 1990s still work, because they are distributed as binaries.
Only if you have the right libraries, and runtimes, and OS interfaces, and even if you have all that, oh no, it's a MIPS binary and you don't live in 1996!
Any proprietary API exists precisely as long as the owner says it does. Open standards don't suffer from that malady.
Totally agree. There is no backward compatibility with the .NET runtime - if your application is built/linked against a given version, it won't work with any other version of .NET.
That's simply not true. The newest .NET 8 does not need the assemblies you reference to target .NET 8 - as long as their TFM is any version of 'netstandardx.x', 'netcoreappx.x' or 'net5.0'+, it will work.
You can even make proxy projects that target netstandard2.0 but reference .NET Framework, and with certain compat shims the code will just run on .NET 8 unless it relies on some breaking changes (which mostly have to do with platform-specific behavior; there have been no breaking changes in the language itself since, I think, C# 1 or 2, some twenty-odd years ago).
As for the runtime itself - the application can restrict itself from being run by a newer version of the runtime, but normally you can absolutely do so. The lightweight executable that just loads the runtime and executes the startup assembly may complain, but just try it: build a console app with the 'net5.0' target and then run the built dll under the latest SDK with 'dotnet mynet5app.dll' (adding '--roll-forward LatestMajor' if only a newer runtime is installed) - it will work.
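To make the "just try it" part concrete, here is roughly what that experiment looks like as far as I understand the CLI; the project name is made up, and the roll-forward flag only matters when the 5.0 runtime isn't installed alongside the newer one:

    # hypothetical walkthrough; assumes a recent .NET SDK is installed
    dotnet new console -o net5app
    # edit net5app/net5app.csproj: set <TargetFramework> to net5.0
    # (and drop <ImplicitUsings> if the newer template added it)
    dotnet build net5app -c Release
    # run the built dll directly; the host's default roll-forward policy stays
    # within the same major version, so ask for the newest installed runtime
    dotnet exec --roll-forward LatestMajor net5app/bin/Release/net5.0/net5app.dll

If the 5.0 runtime happens to be installed, plain 'dotnet net5app/bin/Release/net5.0/net5app.dll' should be enough.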
I think the point is that Access and Lotus Notes tooling was somewhat ubiquitous in largish corporations.
The experience of this tooling was: make a change and it was in production. It was incredibly simple and productive to work with, given the needs of the time.
There were also plenty of opportunities to make a mess, but I don't think that has really changed.
Learning was not difficult, you just had to be prepared to spend time and some money on books and courses.
It is not a tooling set you would want to go back to for a bunch of different reasons but it worked well for the time.
> It is not a tooling set you would want to go back to for a bunch of different reasons but it worked well for the time.
I remember using lotus domino at one of my first jobs. There were all sorts of things I hated about it. But you could have a database - like the company’s mail database. And define views on that database (eg looking at your inbox, or a single email). And the views would replicate to a copy of that database living on all of your users’ computers. And so would the data they needed access to. It was so great - like, instead of making a website, you just defined the view based on the data itself and the data replicated behind the scenes without you needing to write any code to make that happen. (At least that’s how I understood it. I was pretty junior at the time.)
Programming for the web feels terrible in comparison. Every feature needs manual changes to the database. And the backend APIs. And the browser code. And and and. It’s a bad joke.
Commodification has a problem: for awkward teenagers to make the same fries every day, we have to ossify the process of making fries. But making good software needs us to work at both the level of this specific feature and the level of wanting more velocity for the 10 other similar features we're implementing. Balancing those needs is hard! And most people seem content to give up on making the tooling better, and end up using whatever libraries to build web apps. And the tools we have are worse in oh so many ways compared to Lotus Domino decades ago.
I wonder what the original Lotus Notes designers think of web development. I think they'd hold it in very low regard.
10/20/x years ago we didn't have DevOps, CloudOps, CloudFinOps, CloudSecOps, IaC experts, Cloud Architects, Cloud transformation experts, Observability architects, SREs, plus all the permutations of roles around "data" that didn't exist discretely, etc etc etc.
I'm not sure what the parent post meant exactly, but I do agree there is tons of grunt work -- I've seen big-name SV companies where large parts of their workflow include steps like "and then every hour you need to do something in a slow UI that can't be automated" to keep vital systems working. I would say that's really grunt work, and there are even people in such companies whose only task is doing such grunt work. Truly, I've been told by clients I work with that they have entire double-digit-sized teams whose members' only responsibility is to reboot VMs that breach specific resource thresholds -- easily automated and even built into most hypervisors, but for whatever reason these tech giants opted for a human to do it. The only semi-reasonable explanation I got from one client was that their infrastructure team got outsourced and they laid off the only people who knew how to use the automation tooling. It's a dumb reason for sure, but at least I can understand why they opted for the manual grunt work.
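(For what it's worth, the "easily automated" part is not an exaggeration. A rough sketch of that reboot-on-threshold job, assuming a libvirt/KVM host that exposes guest memory stats -- the 90% threshold and all the details here are mine, not anything from those clients:

    #!/bin/sh
    # Rough sketch only: reboot any running guest whose reported memory usage
    # crosses 90%. Assumes libvirt/KVM and working balloon/guest agent stats.
    for dom in $(virsh list --name); do
        stats=$(virsh dommemstat "$dom")
        avail=$(echo "$stats" | awk '$1 == "available" {print $2}')
        unused=$(echo "$stats" | awk '$1 == "unused" {print $2}')
        # skip guests that do not report balloon/agent memory stats
        if [ -z "$avail" ] || [ -z "$unused" ]; then continue; fi
        used_pct=$(( (avail - unused) * 100 / avail ))
        if [ "$used_pct" -ge 90 ]; then
            echo "rebooting $dom (memory ${used_pct}% used)"
            virsh reboot "$dom"
        fi
    done

A cron entry around something like this, or the hypervisor's own alarm/action tooling, would cover most of what that team does by hand.)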
Similarly, keep in mind a lot of this grunt work is just there to satisfy some reporting requirement from somewhere -- some person(s) in the company want to see at least X% uptime or Y LOC every day, so you get people writing a lot of yak-shaving code that basically does nothing except satisfy the metrics or ensure that the uptime % always looks good (i.e., they don't fix the cause of the downtime entirely, they just get the endpoint that is checked to determine uptime working well enough that it reports to the monitoring system, and they leave it at that).