I haven't had to deal with this in open source, but I have had to deal with coworkers posting slop for code reviews where I am the assigned reviewer.
I've noticed that slop code has certain telltale markers (such as import statements being moved for no discernible reason). No sane human does things like this; I call it "the sixth finger of code." It's important to look for these signs as early as possible.
Once one is spotted, you can generally stop reading; you are wasting your time since the code will be confusing and the code "creator" doesn't understand the code any better than you do. Any comments you post to correct the code will just be fed into an LLM to generate another round of slop.
In these situations, effort has not been saved by using an LLM; it has at best been shifted. Most likely it has been both shifted and inflated, and you bear the increased cost as the reviewer.
Although I'm not wild about the new `io` parameter popping up everywhere, I love the fact that it allows multiple implementations (thread based, fiber based, etc.) and avoids forcing the user to know and/or care about the implementation, much like the Allocator interface.
Overall, I think it's a win, especially if the stdlib ships a no-overhead, bog-standard, synchronous, blocking I/O implementation. That would follow the "don't pay for things you don't use" attitude of the rest of Zig.
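The pattern being praised here — an explicitly passed interface with swappable implementations, like Zig's `Allocator` — can be sketched by analogy in Go (all names below are hypothetical illustrations, not Zig's actual `Io` API):

```go
package main

import (
	"fmt"
	"strings"
)

// Io abstracts how I/O is performed; callers depend only on this
// interface, never on a particular implementation (thread-based,
// fiber-based, or plain blocking could all satisfy it).
type Io interface {
	ReadAll(name string) (string, error)
}

// BlockingIo is the "no overhead" synchronous case: it just
// performs the operation directly. The map stands in for a
// real filesystem to keep the sketch self-contained.
type BlockingIo struct {
	files map[string]string
}

func (b BlockingIo) ReadAll(name string) (string, error) {
	data, ok := b.files[name]
	if !ok {
		return "", fmt.Errorf("no such file: %s", name)
	}
	return data, nil
}

// countLines neither knows nor cares which implementation it got,
// mirroring how Zig code takes an Allocator parameter without
// caring which allocator is behind it.
func countLines(io Io, name string) (int, error) {
	data, err := io.ReadAll(name)
	if err != nil {
		return 0, err
	}
	return len(strings.Split(data, "\n")), nil
}

func main() {
	io := BlockingIo{files: map[string]string{"notes.txt": "a\nb\nc"}}
	n, _ := countLines(io, "notes.txt")
	fmt.Println(n) // 3
}
```

The cost is the extra parameter threading the commenters complain about; the payoff is that swapping in an async or test implementation requires no changes to `countLines`.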
Isn’t “don’t pay for what you don’t use” a myth? Some other person will use it unless you are a very small team with discipline, and then you will pay for it.
And passing an `io` around is more work than just calling I/O functions where you want them.
I don't see how those links are relevant. Nobody claimed there is no malware on linux.
However, the features and culture of software distribution very much make it safer. The overwhelming majority of malware gets distributed through ads on websites or in search results. The package managers prominently used by all Linux distros remove that attack vector, or at the very least minimize it.
Ofc it does not prevent somebody from still executing random binaries from the internet if they really want to, nothing does.
The most disturbing is "Raw Cow Milk from Farm in Glass". It is still loaded with plastic, even though it is one of the least processed things on the list.
My only question is whether the cow was milked by hand or by machine. The tubing in a milking machine almost certainly contains plastic.
I visited the largest pig farm in Ohio, and they grind up bags of old dog food, plastic bags and all. Literally pallets full of expired food, just dumped into the grinder. Then they spread the waste and sell it as organic fertilizer; plastic is now in everything.
Hay is often bound up into bales with plastic twine. Cattle happily eat bits of it by accident. They used to use wire, but that caused a much more serious problem for the cattle.
They used to use natural twines like jute, which was better. They really don't eat much of the plastic stuff, but the larger pieces of it do kill them (choking, colic, etc.).
I bring up raw milk because it is minimally processed (I don't even consume it personally). I used it as an example because it shows how much plastic is embedded in the food chain and ecosystem by looking at one of the least processed items on the list.
I use it on my own docs to remove extraneous details. I often write too many words in early drafts and LLMs summarize my writing faster than I can (although I don't know if they do it better than I would/could).
Then the next version of my doc becomes the summarization, and I only flesh out details where the summarization went too far and removed critical details.
And that increase in LLM usage has resulted in an enormous increase of code duplications and code churn in said open source projects. Any benefit from new features implemented by LLMs is being offset by the tech debt caused by duplication and the maintenance burden of constantly reverting bad code (i.e. churn).
Yes. The internet has also created a ton of email spam, but I wouldn't say "we've never seen a single valid contribution to our project that had internet help". Many millions of developers are using AI, sometimes in a good way. When that results in a good MR, they likely don't even mention they used Google, or Stack Overflow, or AI; they just submit.
I mean, I certainly would say “I’ve never seen a single commercial email that was valid and useful to me as a customer”, and this is entirely because of spam. Any unsolicited email with commercial intent goes instantly, reflexively, to the trash (plus whatever my spam filters prevent me from ever seeing to begin with). This presumably has cost me the opportunity to purchase things I genuinely would’ve found useful, and reduced the effectiveness of well-meaning people doing cold outreach for actually-good products, but spam has left me no choice.
In that sense, it has destroyed actual value as the noise crowds out the signal. AI could easily do the same to, like, all Internet communication.
I mostly like Vim because it is available everywhere and in the terminal. In its default configuration it isn't the most powerful IDE, but it is vastly more powerful than any other default experience in any other text editor (IMO). It also makes it trivially easy to shell out to external tools for modifying the text, so even with a setup that isn't heavily riced you can still do a ton.
Even when I'm forced to use another IDE for work, I try to find Vim keybindings ASAP. I like that I can learn Vim motions and actions once and use them in nearly every IDE.
My understanding is that TIOBE is primarily based on the number of search engine results. Other metrics worth considering are things like the number of active open source repos (a commit within the last 6-12 months) or the number of new repos in a given language. The project you linked seems to be one like this.
A high search rate doesn't necessarily mean high rate of real world usage. Correlating across multiple metrics would be a better way to measure popularity.
Although popularity itself may still be a weak signal depending on your purposes.
Waiting ages for basic serif pages to load over your 56k (or 128k connection if you were rich and had ISDN)? Nope.
Downloading tracks from KaZaa/WinMX/Limewire/Napster for a million hours only for them to be some warped shit that the studios planted? Nope.
Getting malware just for existing? Early software firewalls that burned CPU cycles/crashed your PC? That were the only option because hardware firewalls were stupid expensive and not at all practical for residential use? Nope.
Norton Antivirus? ABSOLUTELY NOPE.
Blue screens when you looked at IE or Navigator the wrong way? Nope.
Flash? Lol, nope.
WAP? The 2004 kind? Lol, hell nope.
"This page is best viewed on Internet Explorer", i.e. IE4/5/6 or it's basically unusable? Nope.
Having to actually go seven or eight o's into the Gooooooooooooooooooooooogle footer to find what you were looking for? Def nope.
Almost everything about using the Internet is better today IMO. Faster, prettier, more secure and more cross-platform.
You have to work hard to get hit with a virus these days, especially on iOS/macOS or Linux, though it's much harder on Android these days too. Also, I loved wasting my life on /., but Reddit is so much better, even after the API-pocalypse.
I definitely miss open messaging platforms though. AIM for life.
The old internet had something today's internet lacks: a justified belief that the future would be better. Things were new and exciting, and you saw opportunities and rapid improvements everywhere. Today it's just governments and megacorporations, and bureaucracy upon endless bureaucracy.
> Waiting ages for basic serif pages to load over your 56k (or 128k connection if you were rich and had ISDN)? Nope.
> Downloading tracks from KaZaa/WinMX/Limewire/Napster for a million hours only for them to be some warped shit that the studios planted? Nope.
I was there too, and realized that these sorts of reductions in speed made one far more mindful of what one was doing.
> Almost everything about using the Internet is better today IMO. Faster, prettier, more secure and more cross-platform
This too is particularly debatable. Applications are thin wrappers around web browsers, and there are constant annoyances (want to receive notifications for this webpage? Not now? We’ll ask you later). I bet if I pulled someone from 2005 they’d look at a lot of things on a current website and see malware. And is it really more cross-platform when we’ve achieved that by having fewer platforms?
I'm not sure anything on your list is even about "the internet", as opposed to other tech around it. Do you know what people mean when they say "the internet"?
Absolutely. I remember being about 14 or 15 years old, reading old .txt files about, like- how to build blue boxes and experiment with the phone system, C programming tutorials for MUDs that had peaked in the late 90s, IRC archives (even though IRC was still around, I had no way of finding my way to good channels), and getting this distinct sense that I had just missed something really cool, and was stuck with an Internet that had already passed its prime.