I don't think that's true. If you look at the computing industry, virtually all basic research is done by companies or academics who are grant funded by companies.

AI? Basically all of it is funded by the private sector, and has been for a decade. Universities try to keep up but complain that they can't.

Chip design? When was the last time academia contributed much to that? Probably RISC in the 80s? There have been tons of breakthroughs in basic silicon research throughout my entire lifetime, all of them privately funded. For example, look at the recent Optane DIMM tech (based on new chalcogenide chemistry).

Compilers? No, the most advanced compiler in the world is Graal, which is more or less funded by grants from Oracle. The second most advanced is LLVM, which Chris Lattner originally wrote as his master's thesis, i.e. he paid for it with his own student debt. Apple pretty quickly saw that it was good work and took over funding it.

PL design? Here, academia has indeed done a lot of basic research into various ML-derived languages like Haskell. But this has largely been reviewed carefully and then ignored by the languages people actually use: it was basic research into ideas that didn't work well and that not many people really care about. All the new languages that have become popular for real-world use in the past 10 years are imperative languages that trace their heritage back to C, i.e. to Bell Labs (Rust, Kotlin, Swift, Go, etc.).

The reality is that the computer industry demonstrates you don't need government funding of basic research. Industries will certainly accept such research if it happens to be useful (they were taxed to pay for it regardless), but they don't need it.
