We place plenty of trust in strangers to do their jobs and keep society going. What’s their error rate?
It all comes down to the track record, perception, and experience of the LLMs. Kinda like self-driving cars.
Strangers have an economic incentive to perform. AI does not. What AI program is currently able to modify its behavior autonomously to increase its own profitability? Most if not all current public models are simply chat bots trained on old data scraped off the web. Wow, we have created an economy based on cultivated Wikipedia and Reddit content from the 2010s, linked together by bots that can make grammatical sentences and cogent-sounding paragraphs. Isn't that great? I don't know; about 10 years ago, before Google broke itself, I could find information on any topic easily and judge its truth using my grounded human intelligence better than any AI today.
For one thing, AI cannot even count. Ask Google's AI to draw a woman wearing a straw hat. More often than not the woman is wearing a well-drawn hat while holding another in her hand. Why? Frequently she has three arms. Why? Tesla's self-driving vision couldn't differentiate between the sky and a light-colored tractor trailer turning across traffic, resulting in a fatality in Florida.
For something to be intelligent, it needs to be able to think and evaluate the correctness of its own thinking. Not just regurgitate old web scrapings.
It is pathetic, really.
Show me one application where a black-box LLM is generating a profit that an effectively trained human or a rules-based system couldn't do better.
Even if AI is able to replace a human in some tasks, this is not a good thing for a consumption-based economy with an already low labor force participation rate.
During the first industrial revolution, human labor was scarce, so machines could economically replace and augment labor and raise standards of living. In the present time labor is not scarce, so automation is a solution in search of a problem, and a problem itself if it increasingly leads to unemployment without universal basic income to support consumption. If your economy produces too much with nobody to buy it, economic contraction follows. Already young people today struggle to buy a house. Instead of investing in chat bots, maybe our economy should be employing more people in building trades and production occupations, where they can earn an income to support consumption, including of durable items like a house or a car. Instead, because of the FOMO and hype about AI, investors are looking for greater returns by directing money toward sci-fi fantasy, and when that doesn't materialize, an economic contraction will result.
My point is that humans make mistakes too, and we trust them not because we inspect everything they say or do, but because of how society is set up.
I'm not sure how up to date you are, but most AIs with tool calling can do math. Image generation stopped producing that kind of weirdness over a year ago. Waymo sees >82% fewer injuries/crashes than human drivers[1].
RL _is_ a model modifying its behavior to increase its own profitability, and the companies training these models will optimize for revenue when the wallet runs dry.
I do feel the bit about being economically replaced. As a frontend-focused dev, nowadays LLMs can run circles around me. I'm uncertain where we go from here, but I would hate for people to have to do menial jobs just to make a living.