This has been recognised as the next important milestone for Rust, and there's work towards making it happen. The details are still uncertain, because move constructors are a big change for Rust (it promised that the address of owned objects is meaningless, and that they can simply be memcpy'd to a new address).
The Rust folks want a more usable Pin<> facility for reasons independent of C++ support (it matters for the async ecosystem) and Pin allows objects to keep their addresses once pinned.
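To make that pinning guarantee concrete, here's a minimal Rust sketch (my own illustration, not the actual C++-interop design): a hypothetical self-referential struct whose internal pointer stays valid precisely because, once pinned, safe code can never move the value again.

```rust
use std::marker::PhantomPinned;
use std::pin::Pin;

// Hypothetical example type: `ptr` is meant to point at `data`,
// which is only sound if the struct never moves.
struct SelfReferential {
    data: String,
    ptr: *const String,
    _pin: PhantomPinned, // opt out of Unpin, so pinning is actually binding
}

fn main() {
    // Box::pin heap-allocates and pins in one step.
    let mut boxed = Box::pin(SelfReferential {
        data: String::from("hello"),
        ptr: std::ptr::null(),
        _pin: PhantomPinned,
    });

    // Take the address only after pinning, once it's final.
    let addr: *const String = &boxed.data;

    // Mutating a pinned !Unpin value requires unsafe:
    // we promise not to move it out of its pinned location.
    unsafe {
        Pin::get_unchecked_mut(boxed.as_mut()).ptr = addr;
    }

    // Safe code can use the value but can never move it out of the Pin,
    // so `ptr` stays valid for as long as the value lives.
    assert_eq!(unsafe { &*boxed.ptr }, "hello");
}
```

That "address stays stable after pinning" property is exactly what a C++-style object holding a `this` pointer would need on the Rust side.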
Non-trivial C++ programs depend on OOP design patterns, webs of shared-mutable objects, and other features which Rust doesn't want to support (Rust's safety also comes from not having certain features that defeat static analysis).
Rust really needs things written the Rust way. It's usually easier with C, because C code won't have clever templates and deep inheritance hierarchies.
Even if your goal is to move to Rust, it may make sense to start with Carbon or Circle to refactor the code first to have more immutability and tree-like data flow.
In Cloudflare's case, Rust is much more productive.
Rust modules can handle any traffic themselves, instead of splitting work between native code and Lua that was too slow to do more than config (Lua is relatively fast for a scripting language, but on the critical path it was a "peanut butter" slowdown adding latency).
At the size and complexity of a server serving 20% of the Web, Lua's dynamic typing was scary. Modules in Rust can enforce many more requirements.
Rust's solid dependency management also helps share implementations and config logic across modules and even different products, instead of everything having to go literally through the same server.
Rust is trying to systemically improve safety and reliability of programs, so the degree to which it succeeds is Rust's problem.
OTOH we also have people interpreting it as if Rust was supposed to miraculously prevent all bugs, and they take any bug in any Rust program as a proof by contradiction that Rust doesn't work.
> Rust is trying to systemically improve safety and reliability of programs, so the degree to which it succeeds is Rust's problem.
GNU coreutils first shipped in what, the 1980s? It's so old that it would be very hard to find the first commit. Whereas uutils is still beta software, which never asked to be representative of "Rust" at all. Moreover, GNU coreutils are still sometimes not compatible with their UNIX forebears. Even by that first, more modest standard, it is ridiculous to hold this particular software to it.
You would not be able to find the first commit. The repositories for Fileutils, Shellutils, and Textutils do not exist, at least anywhere that I can find. They were merged as Coreutils in 2003 in a CVS repository. A few years later, it was migrated to git.
If anyone has original Fileutils, Shellutils, or Textutils archives (released before the ones currently on GNU's ftp server), I would be interested in looking at them. I looked into this recently for a commit [1].
In this case I agree. Small, short-running programs that don't need to change much are the easy case for C, and they had plenty of time to iron out bugs and handle edge cases. Any difficulties that C may have caused are a sunk cost. Rust's advantages on top of that get reduced to mostly nice-to-haves rather than fixing burning issues.
I don't mean to tell Rust uutils authors not to write a project they wanted, but I don't see why Canonical was so eager to switch, given that there are non-zero switching costs for others.
>OTOH we also have people interpreting it as if Rust was supposed to miraculously prevent all bugs, and they take any bug in any Rust program as a proof by contradiction that Rust doesn't work.
Yeah, that's such a tired take. If anything, this shows how good Rust's guarantees are. We had a bunch of non-experts rewrite a sizable number of tools that had 40 years of bugfixes applied, Canonical pulled the rewritten versions in all at once, and the result is mostly a few performance regressions on edge cases.
I find this such a great confirmation of the Rust language design. I've seen a few rewrites in my career, and it rarely goes this smoothly.
It might be a bit of bad publicity for those who want to rewrite as much as possible in Rust. While Rust is not to blame, it shows that just rewriting something in Rust doesn't magically make it better (as some Rust hype might suggest). Maybe Ubuntu was a bit too eager in adopting the Rust Coreutils, caring more about that hype than about stability.
> OTOH we also have people interpreting it as if Rust was supposed to miraculously prevent all bugs
That is the narrative that Rust fanboys promote. AFAIK Rust could be useful for a particular kind of bug (memory safety). Rust programs can also have coding errors or other bugs.
People in Europe don't have the automatic anti-regulation sentiment that the US has. Regulations, at least from a consumer perspective, seem to be working pretty well in the EU.
- My mobile operator wanted to charge me $6/MB for data roaming, until the anti-business EU regulation killed the golden goose. Roaming is free across EU. The mobile operator is still in business.
- USB-C not just on iPhone, but also all the crappy gadgets that used to be micro-USB. Consumer prices on electronics probably rose by $0.01 per unit.
- Chip & pin and NFC contactless payments were supported everywhere many years before ApplePay adopted them. European regulators forced banks to make fraud their problem and cooperate to fix it.
- The card payment system got upgraded despite card interchange fees being legally capped at ~0.3%. The bureaucrats killed an innovative business model of ever-increasing merchant fees given back to card owners as cashback, which made everyone paying the same prices in cash the suckers subsidising the card business.
- Apple insinuates they only give 1 year of warranty, but it magically becomes 2 years if you remind them they're in the EU.
It's hard to find good data sources for this, especially now that StackOverflow is in decline [1].
IEEE's methodology [2] is sensible given what's possible, but the data sources are all flawed in some ways (that don't necessarily cancel each other out). The number of search results reported by Google is the most volatile indirect proxy signal. Search results include everything mentioning the query, with no promise of being a fair representation of 2025. People using a language rarely refer to it literally as the "X programming language", and it's a stretch to count all publicity as "top language" publicity.
TIOBE uses this method too, and has the audacity to display it as popularity with two decimal places, but their historical data shows the "popularity" of C dropping by half over two years and then doubling the next year. Meanwhile, actual use of C didn't budge at all. This method has a +/- 50% error margin.
By far the most useful and helpful metric is job ads: they literally define the demand side of the programming language market.
Yes, that does not show us how much code is running out there, and some companies might have huge armies of developers with very low churn, so the COBOL stacks in banks don't show up, but I can't think of a more useful and directly measurable way of understanding a language's real utility.
I would assume so. I expect there to be a lot of job postings looking for more "sexy" technologies to create the image that those companies are growing and planning for the future. And conversely, I wouldn't expect any job postings for old, "streets behind" technologies like COBOL to be fake, as they wouldn't help with such signalling.
Yes to your point: COBOL, which ranks very low here, is still fundamental to the infrastructure of several major industries, with some sources [1] reporting that it is used in:
43% of all banking systems.
95% of all US ATM transactions.
80% of all in-person credit card transactions.
96% of travel bookings.
This may very well dramatically change in the next few years with such an emphasis on enterprise AI tools to rewrite large COBOL repositories. [2]
I can only speak to the two bigger German banks (i.e., Sparkasse and VR banks), but if you look at their outsourced development providers (Atruvia and Sparkasse Informatik), they're still offering incentives for their apprentices to learn COBOL, especially in the German dual apprenticeship programs, which they can steer more easily than university courses. My wife has been doing COBOL for one of them since 2012, and the demand has never diminished. If anything, it's increased because experienced developers are retiring. They even pull some of these retired developers back for particularly challenging projects.
Sparkasse and VR aren't the two largest German banks. DB is at least double the size of Commerzbank, which is again 100mn in assets ahead of DZ. I don't find it all that surprising that these small banks are still trying to keep their legacy systems alive, but it's not the case for the bigger boys. (Source: I work for several of them)
Cobol is used in pretty much all enterprise legacy systems.
But "used in" doesn't mean that it's actively being developed by more then a tiny team for maintaining it.
As this graph we're commenting on is mostly talking about popularity/most used it's never going to rate higher, because for every one Cobol dev there are more then 100 Java devs employed by the same company
That's a pretty wild claim. What counts as legacy for you? I'd consider legacy to be, e.g., J2EE crap running on web[sphere|logic], which holds most of the points in that league table vs COBOL.
Legacy software, to me, is whatever the company that employs me says is legacy software.
Pretty much every business I've worked at to date has had such legacy software, which was inevitably still used in some contexts.
It's not always obvious, because - following the previous example's numbers - only 1-2 Java devs will ever have to interact with the legacy software again, hence from the perspective of the remaining 98, Cobol doesn't exist anymore.
In retail banking I'm sure that this could be true. Working in investment banking, I never saw a single COBOL application, or had to have my C++/Java/$MODERNLANGUAGE code interact with one.
Corp bank here: everyone has rumours about COBOL systems, but no one I've ever spoken to has seen them, interacted with them, or has any other evidence that they really exist anymore either.
But I asked for a bank statement from my old savings account, going back a few years, and it took two weeks to arrive, printed in monospace dot matrix.
Or the betting company where I was a customer, which suspended betting every day at 6:30am for an hour of daily maintenance. Ironically, they would accept bets on football matches being played at that time, but the system was shut down.
You haven't seen or heard of them because they are abstracted away by APIs, circuit breakers, and proxies. Almost ALL banks, credit card companies, travel systems, and other high-throughput transaction systems run on mainframes running code written in COBOL.
I think the issue here is that people working in fintech don't seem to come across these systems much, if at all - if you know one specifically, please tell us.
It's still there at the accounting/backend level. Automated Financial Systems' Level 3 and its replacement, Vision, are commercial loan systems.
LVL3 is pure COBOL. It has recently been deprecated, but many banks own the code and are still self-hosting it, along with its IBM green-screen support.
Vision is a Java front end in front of an updated COBOL backend. When your reputation is based on your reliability and long-term code stability, at what point do you risk making the conversion, versus training new developers to work on your system?
No, we are not afraid of our own systems. The fabled computer system that everyone is too scared to touch doesn't exist (I work in payment processing). There are levels of controls far outside these systems which provide safety nets (e.g. settlement/reconciliation controls).
If the cobol is still there, it’s not due to risk. If anything, the cobol is a much higher operational risk than replacing it.
Analogously, GDSes like SABRE still ran on mainframes until very recently (c. 2023) [0]. SABRE was written in some combination of assembly and some kind of in-house dialect of PL/I, if I recall.
I worked briefly at a company that wrote applications that interacted with bank mainframes. Think endpoint bank-teller systems and in-branch customer/account management. They definitely do exist - every major bank has a mainframe written in (usually) COBOL.
But it's very abstracted; part of our main product offering WAS abstracting it. On top of our ready-to-use applications, we offered APIs for higher-level data retrieval and manipulation. Under the hood, those orchestrate mainframe calls.
But even then there could be more levels of abstraction. Not every bank used screen-level mainframe access. Some used off-the-shelf mainframe abstractors like JxChange (yes, there's a market for this).
Fintech would be even more abstracted, I imagine. At that point you only interact with the mainframe from a few levels up, but it's still there. Out of sight.
> Working in investment banking, I never saw a single COBOL application
What was the back office settlement or wire transfer system written in? There is a good chance that some part of them was written in COBOL. And while Bloomberg terminals are a vendor product, for a bloody long time, many of their screens had some COBOL.
Also, lots of quantitative software at i-banks uses LINPACK or BLAS, which are written in FORTRAN.
Well, I had a very badly specified project to write a library for our back office systems to do Swift payments from our C++ applications, via COM. There was no obvious COBOL involved, on either side, but it has to be said that the whole use case for the library was very murky. And it never worked, due to the lack of spec, not the languages.
First hand knowledge:
ERGO and MunichRE both have a lot of COBOL still doing the core business. You will most likely never run into these systems because they just run batch jobs, sometimes configured via a "nice" web UI: you configure your job, submit it, and the next morning you have your report. That's why you never actually see the COBOL.
1. Not all roles are advertised. I've actually only been interviewed for two of the jobs I've ever had, both at the same place: my current employer, because it's a public institution and so it always advertises and interviews for jobs, even if it has an internal candidate who is likely to be a good fit. In fact, the first of those jobs was basically shaped around me on purpose; another candidate was an equally good fit and they hired both of us.
Everywhere else people hired me because they knew who I was and what I could do and so in place of an "interview" maybe I grab lunch with some people I know and they explain what they want and I say yeah that sounds like a job I'd take and maybe suggest tweaks or focus changes. No shortlist of candidates, no tech interview, no tailoring a CV to match an advert. Nothing -> Lunch or Drinks -> Job offer.
So that can cause some distortion, especially for the niche languages where there are like six experts and you know them - an advert is futile there.
> measurable way of understanding a language's real utility
It feels like that metric misses "utility" and instead comes from a very American (or maybe capitalistic is the better word) mindset.
What about Max/MSP/Jitter? Huge impact in the music scene, probably a very small number of jobs available, so it'd rank fairly low while it's probably the top media/music language out there today. There are tons of languages that provide "the most utility for their domain" yet barely have any public job ads about them at all.
I think such a metric would be useful for seeing the "employability of someone who knows that language", if anything, but it's probably more pain than gain to link "# of job ads" with "utility".
Yeah, except job adverts have an enormous lag behind what's actually popular. For example, we used Rust quite a lot at my previous company, but we didn't advertise for Rust developers at all.
Also, then you're looking at which languages were popular in the past, whereas the interesting stat is which languages are being used to start new projects.
Well, we have to define what a language's popularity means. Because Rust is surely more "hyped" than Java, but Java has at least an order of magnitude more developers, software written, etc.
If you look at the list of programming languages, apart from Python and Java, most are targeted at specific platforms (databases, browsers, embedded systems) or tech (SQL for databases).
The general-purpose programming languages today are still Python, Java, and Perl. Make of this what you will.
Larry Wall at one point said that if you make something very specific to a use case (like awk, sed, PHP, etc.), it naturally starts to fall out of general-purpose use.
It's just that Kotlin, Rust, Go, SQL, Julia, JavaScript, etc. are not general-purpose programming languages.
This data is kinda worthless for popularity contests, since languages may get picked up by AUR packages, but it gives solid insight into which languages are foundational.
Yep. And the sources are too often self-reinforcing and self-referential.
Use the "right"/better tool from the toolbox, the tool you know best, and/or the tool that the customer wants and/or makes the most money. This might include Ada[0] or COBOL[1]. Or FORTH[2] or Lua[3]. Popularity isn't a measure of much of anything apart from SEO.
The sad thing is that Apple seemed more inviting to developers before they got high on the App Store cut.
Every boxed Mac OS X came with a second disc containing the SDK (Xcode has always been an unstable cow, tho). They used to publish tech notes that explained how the OS works, rather than WWDC videos with high-level overviews that feel more like advertisements.
Back then they at least made attempts to use some open standards, and allowed 3rd parties to fill gaps in the OS, instead of acting like a Smaug of APIs.
Because they were coming out of being at the edge of bankruptcy and needed any help they could get becoming profitable again.
My graduation thesis was porting a visualisation framework from NeXTSTEP to Windows, Objective-C => C++, because my supervisor saw no future in keeping the NeXT hardware on our campus. If only he knew what would happen a few years later.
They said it in the Epic vs Apple litigation, something along the lines of "we create the entire App Store market", as if the 3rd-party developers aren't the ones creating it.
Properties of a language shape the tooling and culture that develops around it.
JS exploded in popularity while Internet Explorer was still around, before the ES6 cleanup of the language. JS had lots of gotchas where seemingly obvious code didn't work correctly, and devs couldn't keep up with all the dumb hacks needed for even basic things. Working around IE6's problems used to be a whole profession (quirksmode.org).
Browsers didn't have support for JS modules yet, and HTTP/1.1 couldn't handle many small files, so devs needed a way to "bundle" their JS anyway. Node.js happened to have a solution, while also enabling code reuse between client and server, and the micro-libraries saved developers from having to deal with JS engine differences and memorize all the quirks.
In other languages, people build abstraction libraries for that, like the Apache Portable Runtime, which gives you a consistent API for most things you need to build a web server, using just one dependency. That would also save you from needing to memorise all the micro-libraries needed to work around the relevant quirks.
Splitting it into a library per quirk seems like an unforced error in that context.
One could have excused it as a way to keep code size down, but if you use npm you also usually use a build step that could drop the unused parts, so it doesn't really hold water.
All of the differences are attributable to the language.
The Apache runtime isn't sent over the network every time it's used, but JS in the browser is.
The JS ecosystem has several fix-everything-at-once libraries. However, JS is very dynamic, so even when it's "compiled", it's very hard to remove dead code. JS tooling compiling JS is also much slower than C compiling C. Both of these factors favor tiny libraries.
Abstractions in JS have a higher cost. Even trivial wrappers add overhead before the JIT kicks in, and even then very few things can be optimized out; many abstractions even prevent the JIT from working well. It's much cheaper to patch a few gaps only where they're needed than to add a foundational abstraction layer for the whole app.
I have a vague memory of using dead-code-removing JavaScript toolchains pretty long ago, like the Closure Compiler.
I'm not sure the dependency-tree madness actually translates to smaller code in the end either, given the bloat in the average web app... but to be fair, it would be perfectly plausible that JavaScript developers opted for micro-libraries motivated by performance, even if that wasn't the effect of the decision.
The press release doesn't give any concrete numbers, but even if this doubles the efficiency of Peltier coolers, they're still 3-5× less efficient than heat pumps.
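As a rough sanity check on that ratio (the COP figures here are ballpark values I'm assuming, not numbers from the press release): Peltier modules cooling across a typical A/C-like temperature difference manage a COP of roughly 0.5, while vapor-compression heat pumps reach about 3-5, so

\[ \frac{\mathrm{COP}_{\text{heat pump}}}{2 \times \mathrm{COP}_{\text{Peltier}}} \approx \frac{3\text{--}5}{2 \times 0.5} \approx 3\text{--}5. \]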
Thermoelectric cooling is notable for having no moving parts and for its ability to scale down to small sizes, so it might end up having many specialized applications, but for A/C, heat pumps are already very effective.
And what about service life? I had a mini-fridge that used this technology, and it stopped working after about 2 years. Was that just bad luck or poor quality, or some inherent lifetime of the components?
In principle, Peltier elements should be very robust over time, being a solid-state system where the only moving parts are fans (versus traditional refrigeration, which includes a high-pressure pump...).
In practice, I strongly suspect most Peltier-based systems are built very cheaply... because their inefficiency means the majority of the market is bordering on a scam. Sophisticated consumers aren't going to buy very many fridges built with them (of course, you might have a niche use case where they actually make sense and you're willing to pay for a quality product, but do most purchasers?).
Thermal cycling is murder on rigid electronic connections; the mechanical connection to the heatsink on each side of the Peltier cell is a prime example.