nritchie's comments | Hacker News

It is worth noting that the research for which Martinis is being awarded the Nobel Prize was largely performed while he was at NIST (the National Institute of Standards and Technology), part of the Department of Commerce.


I was just double-billed by the third party that Enterprise Rentals uses to handle tolls. Fraud? Incompetence? Is there a difference?


What he says is still correct for academics. There are too many candidates for too few positions. The pay is lousy. The hours are long. You don't really get to follow your best creative instincts. You spend an inordinate amount of time writing grants. Teaching, particularly pre-meds, can suck. Now with Trump, the problem has only been compounded. That isn't to say that there are no non-academic jobs for PhDs that can be satisfying; it's just that you may be a glorified engineer. No shame in that, if that's what you want.


It feels like part of the problem is that society has failed to provide alternatives. There are too many academics for too few jobs. However, there aren't really any alternative intellectually stimulating careers.


Is it really society's job to provide intellectually stimulating careers? If these people are really that smart, shouldn't they be able to figure out something themselves? A way to get funded?


The oversupply has extended from PhDs down to bachelor's degrees. With 37% of Americans getting one, a bachelor's degree doesn't mean anything anymore and amounts to wasteful overtraining. I'm no Trumper, but somebody has to stop the greedy life-wasting that academics have created by overfunding a lot of stupid duplicate research and excessive college educations for people who never have an impact...


Some would say you educate people to cultivate an engaged citizenry.


Very few people say that. Overwhelmingly the rhetoric one hears is that the purpose of higher education is to get a better job.

Personally, I think it would be great if we educated people to cultivate an engaged citizenry. But if we're going to do that, we have to be up-front about it and work on an economic model that supports it. So, for example, you can't have student loans that are predicated on being able to obtain a certain level of income on graduation, and you certainly can't make those loans impossible to discharge even in bankruptcy. If you lie about it, as we have been for decades now, it all unravels sooner or later.


> Very few people say that.

It’s not very fashionable on HN because of the faux-tough utilitarian outlook, sure. In real life, there might be such a thing as over-education, but the US is certainly not there.


Thomas Jefferson said that a bunch.


You need to keep the context in mind. He lived in a time when the overwhelming majority of society was self-employed and there was no formalized, let alone compulsory, educational system whatsoever. Looking up the exact history, the first compulsory education began in 1852 (Jefferson died in 1826 at the age of 83), when children aged 8 to 14 were required to spend at least 3 months a year in 'schooling', with at least 6 weeks of it being consecutive. [1]

And in the early 19th century, close to 100% of Americans lived in rural areas where access to centralized information was minimal. There was no internet, radio, or other means of centralized communication. For that matter, there wasn't even electricity. The closest thing they'd have had would have been local newspapers. So people without any education would have had very little idea about the world around them.

And obviously I don't mean what's happening halfway around the world, but in their own country, their own rights, and so on. Among the political elite there was a raging battle over federalism vs confederalism, but that would have had very little meaning to the overwhelming majority of Americans. Jefferson won the presidency in 1800 with 45k votes against John Adams' 30k, when the country's population was 5,300,000!

[1] - https://www.ebsco.com/research-starters/history/history-publ...


Even into the 1950s and early '60s, my dad went to a one-room school, probably until he was around 14 years old. There was no running water or air conditioning, and in the colder months the job of the first student to arrive was to start a fire in the stove to heat up the room.

Had he been born a few years earlier, it would have been unlikely for him to even graduate. 1940 was the first year that the graduation rate hit 50%.


Absolutely, although by then electricity, radio, urbanization, and other such things had already radically reshaped the overall character of society to be something much closer to today than of Jefferson's time.

Jefferson, in modern parlance, would probably be a 'pragmatic libertarian.' He envisioned independent, self-reliant people, and in fact (like many of the Founding Fathers) was somewhat opposed to letting 'economically dependent' people, including wage laborers, vote, for fear that their votes could be coerced too easily and that they might otherwise be irresponsible. That's where things like property ownership as a voting requirement came from.

And a major part of self-reliance is an education that is both broad and fundamental, which is where the 'pragmatic' part comes in: strict libertarianism would, I think, view education as exclusively a matter for the private market, whereas Jefferson supported broad, public education precisely as part of this formula for independence.


Thomas Jefferson was one person, and he died over 200 years ago.


Nations with high GDP tend to be service economies. Service professions tend to require good reading and writing skills, and often a college-level specialization. (No need for PhDs, though, except for scientists.)


This seems like an example of black-and-white thinking. Did they never drink water? Beer? Wine? Of course not. A better set of questions: under what circumstances did they prefer beer? Wine? Cider? Water? And, later on, tea? Coffee? Etc.


Yes. But many myths are based on such black-and-white thinking.


Millibits per second? No, actually megabits per second...


Or maybe mebibits per second.
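(For concreteness, since the three prefixes are easy to mix up: a tiny Python sketch of how far apart the units actually are. Nothing here is assumed beyond the SI and IEC prefix definitions.)

    # m (milli) = 10**-3, M (mega) = 10**6, Mi (mebi) = 2**20.
    millibit = 10**-3   # 1 mb  = 0.001 bits -- hence the joke
    megabit  = 10**6    # 1 Mb  = 1,000,000 bits
    mebibit  = 2**20    # 1 Mib = 1,048,576 bits

    print(mebibit / megabit)   # 1.048576: a mebibit is ~4.9% larger than a megabit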


Seems like a good idea, but I do wonder what the cost is, as the overhead of allocating the extra resource space (whatever it is) would be added to every Go application.


Raising the soft limit to the hard limit is also recommended by the author of systemd: https://0pointer.net/blog/file-descriptor-limits.html

I doubt the kernel would actually allocate the resource space upfront. Like SO_SNDBUF and SO_RCVBUF, it's probably only allocated when it's actually needed.
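To make the recommendation concrete, here is a minimal sketch of that startup pattern. Python's resource module is used purely for brevity; the same two calls exist in any language, and this is POSIX-only:

    import resource

    # Raise the soft RLIMIT_NOFILE to the hard limit at process startup,
    # per the 0pointer.net recommendation. Only the limit value changes;
    # the kernel allocates per-descriptor state when a descriptor is
    # actually opened, not here.
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    if soft < hard:
        resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))

(Go 1.19 and later do effectively this automatically at startup, which is presumably the change this thread is reacting to.)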


It's just a limit; you pay the cost when you actually open a file descriptor.


Beyond mundane living expenses, bicycling and brewing supplies.


This raises the question: how many languages can be accessed via AI translators?


I had ChatGPT translate into a very local dialect (as opposed to the official language of the country I'm in). The locals couldn't believe how accurate it was.


Being able to add an interpreted script engine to a Java application is a super-power for some uses. I embedded a Jython (Python in the JVM) command line into a Java Swing app to provide a level of flexibility that I never could with a GUI. Every time I look at JRuby I wonder if Jython was the right choice. It is too late now but JRuby looks awfully nice.
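To give a flavor of what this enables, here is a hypothetical snippet of the kind of thing a user could type into such an embedded console. The Swing classes are real, but the session itself is invented for illustration; Jython scripts are ordinary Python that can import Java classes directly and treat JavaBean properties as plain attributes:

    # Hypothetical embedded-console session (Jython, i.e. Python on the JVM).
    from javax.swing import JFrame, JLabel

    # Jython accepts JavaBean properties as constructor keyword arguments.
    frame = JFrame("Scripted at runtime",
                   defaultCloseOperation=JFrame.DISPOSE_ON_CLOSE)
    frame.add(JLabel("Hello from the embedded interpreter"))
    frame.pack()
    frame.visible = True  # mapped to frame.setVisible(True)

The same trick works with JRuby or a JSR-223 JavaScript engine; the flexibility comes from handing users the whole JVM object graph rather than only whatever the GUI happens to expose.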


IMHO it is one of the big tragedies of modern IT history that JavaScript and Python 'own' the mainstream market for scripting languages.

From a purely technical perspective, I would guess JRuby or one of the JavaScript implementations would have been a better choice for scripting, especially given the poor state of Jython.

From a pragmatic perspective, given what your users are mostly able to figure out, Python might have been the best choice. I have even seen software developers with years of experience in imperative languages struggle to understand Ruby's blocks...

Out of pure interest: What was the purpose of the Java application and which aspects did you allow the users of the application to script with Jython?


I wonder the complete opposite. On Hacker News, people are excited about AI. Outside this bubble, in the real world, less so.


I read more sceptical takes about AI on Hacker News than anywhere else (since I stopped following Gary Marcus, at least). My hunch is that some people here might feel professionally threatened by it, so they want to diminish it. This is less of an issue with some of the 'normies' that I know. For them, AI is not professionally threatening; they use it to translate things, ideate about cupcake recipes, use it as a psychologist (please don't shoot the messenger), or have it help them plan lessons for teaching kids.


> My hunch is that some people here might feel professionally threatened about it so they want to diminish it.

I don't think it's this. At least, I don't see a lot of that. What I do see a lot of is people realizing that AI is massively overhyped, and a lot of companies are capitalizing on that.

Until/unless it moves on from the hype cycle, it's hard to take it that seriously.


Speaking as a software engineer, I'm not at all threatened by it. I like Copilot as fancy autocomplete when I'm bashing out code, but that's the easy part of my job. The hard part is understanding problems and deciding what to build, and LLMs can't do that and will never be able to do that.

What I am annoyed by is having to tell users and management "no, LLMs can't do that" over and over and over and over and over. There's so much overhype and just flat out lying about capabilities and people buy into it and want to give decision making power to the statistics model that's only right by accident. Which: No.

It's a fun toy to play with and it has some limited uses, but fundamentally it's basically another blockchain: a solution in search of a problem. The set of real world problems where you want a lot of human-like writing but don't need it to be accurate is basically just "autocomplete" and "spam".


I disagree with the characterisation of AI as "another blockchain: a solution in search of a problem". The two industries have opposite problems: crypto people are struggling to create demand, AI people are struggling to keep up with demand.


> crypto people are struggling to create demand, AI people are struggling to keep up with demand.

That's true today, but 10 years ago crypto was what everyone wanted: you can see how Bitcoin soared and how crypto scams were everywhere and made many billions.

And no, AI is not struggling to keep up with user demand; it is struggling to keep up with free, not paid, demand. So what you mean is that AI is struggling to keep up with investor demand: more people want to invest in AI than there is compute to buy. But that was also true of Bitcoin; mining massively raised GPU prices because of how much money investors put into it.

But investor-driven demand can disappear really quickly; that is what people mean by an AI bubble.


Google has built a multibillion-dollar business on top of "free" users. ChatGPT has more than 400 million weekly active users, and this is obviously going to grow. You are overlooking how easily that "free" demand will be monetized as soon as they slap ads on the interface.


"Obviously" is doing a lot of heavy lifting there. They have a lot of competition and no killer use case and no IP rights. From the consumer point of view, they're fungible.

That's not even considering the probability that demand could slow as people lose interest.


I'm not claiming that OpenAI will be the winner of the AI race, but _somebody_ will win big time.

Regarding people losing interest: are you willing to bet with me that there will be fewer than, let's say, 500 million active users of LLMs in 5 years?


Yep. Ten years ago HN was hyping blockchain as the future and there were a million blockchain startups. Just look at the list of startups funded by YC around then lol


HN is a highly technical audience, and AI is showing the most benefit on highly technical tasks, so it seems logical to me that HN would be more excited than "the real world". (What is the real world, btw? Do people on HN not exist in the real world?)


> AI is showing the most benefit on highly technical tasks

It must be truly abysmal everywhere else then, because it doesn't show much value on highly technical tasks when I try.


Is it really unthinkable that you could be using it in domains different from those where it provides value to other people?


My sister, who is a pretty technical kinesiology PhD student, does not know how to input Alt+F4 and insists that it is esoteric knowledge. That's a litmus test for how out of touch HN users may be with the way normal people use computers.


Exactly, and of course there's a relevant xkcd (Average Familiarity) https://xkcd.com/2501/


I met a non-technical woman who uses it all day long to help manage a landscaping business. That was a data point for me.


Anecdotally, all of my non-tech friends seem to be using ChatGPT much more than I do.


Maybe you, as a tech person, mostly have tech-enthusiast friends? Enthusiasts are much more into trends than professionals.


I don't think so. None of them seemed to be a tech enthusiast before, if you don't consider using social media a trait of a tech enthusiast.

I think people who are interested in how things work and AI users are two entirely different cohorts. From a user's perspective, ChatGPT is more like a search engine or an auto-translator than some sophisticated technical gizmo.


Is that true? I have three kids now, two of them in high school, who are perhaps more AI-savvy than I am (both good and bad). I think the article, and my limited professional view, is informed by software dev, IT infrastructure, and enterprise technology. I think a lot of younger people are happily plugging AI into their lives.


ChatGPT is the number one free iPhone app on the US App Store, and I'm pretty sure it has been the number one app for a long time. I googled to see if I could find an App Store ranking chart over time... this one[0] shows that it has been in the top 2 on the US iPhone App Store every month for the past year, and it has been number one for 10 of the past 12 months. I also checked, and ChatGPT is still the number one app on the Google Play Store too.

Unless both the App Store and Google Play Store rankings are somehow determined primarily by HN users, then it seems like AI isn't only a thing on HN.

[0]: https://app.sensortower.com/overview/6448311069?tab=category...


Close to 100% of HN users in AI threads have used ChatGPT. What do you think the percentage is in the general population? Is it more than that, or less?


I was at a get-together last weekend with mostly non-tech friends and the subject was brought up briefly. Seemed to be a fair amount of excitement and use by everyone in the conversation, minus one guy who thought it was the "devil"...only slightly joking.


If I were to write a "hard" sci-fi story of how the devil might take over the world in the near future, AI would be my top choice, and it would definitely fit with The Usual Suspects' "The greatest trick the devil ever pulled was convincing the world he didn't exist".


It's because we're excited about the possibilities. It's potentially revolutionary tech from a product perspective. Some claim that it increases their speed of development by a not insignificant amount.

The average consumer does not appear to be particularly excited about products w/ AI features though. A big example that comes to mind is Apple Intelligence. It's not like the second coming of the iPhone, which it should be, given the insane amount of investment capital and press in the tech sphere.


I don't know; I know many people (including non-technical people) who use the chatbots a lot. (I even heard some parents at the playground talking about it to each other. Parents I didn't know; it was a random public playground.)

Not sure if they are 'excited', but they are definitely using it.

Lots of interns and students also use the bots.


The real world is made of bubbles.

