The previously announced launch date was June 12, tomorrow. It's a very small scale launch: "The small fleet of vehicles — around 10 to start — will be geofenced to the 'safest' areas of Austin, according to Musk."
Someone else pointed out the problem when I suggested, a few days ago, that it would be useful to have an LLM trained on public domain materials for which copyright has expired. The Great Books series, the out-of-copyright material in the Harvard libraries, that sort of thing.
That takes us back to the days when men were men, women were women, gays were criminals, trannies were crazy, and the sun never set on the British Empire.[1]
The Tesla robo-taxi launch date in Austin was supposed to be June 12th.[1]
Today is June 9th. Real Soon Now.
The launch date has become more vague recently.[2] That article is amusing. It mentions Texas's weak regulation of self-driving as a big advantage for Tesla, contrasted with California's supposedly "heavy-handed" regulation. Yet all the real self-driving companies operate in California. Only Tesla complains and goes elsewhere.
There's a prediction market, if you want to bet against Tesla.[3]
If you submit a really detailed bug report, such as one where the problem was reproduced under a debugger, it comes across as a "the reason you suck" speech. This really upsets some dev teams. The usual responses of "turn it off and on again" and "reinstall" don't make the complaint go away.
There are two bugs in Firefox I'd like to report, but it's futile. One is that, when launched on Ubuntu, Firefox does disk I/O for about three minutes. During that period it can't load complex pages, although it loads pages without Javascript fine. The other, again on Ubuntu, is that it freezes while large text boxes are being filled in. This only happens on some sites.
I feel like there's a similar issue on Windows: after a crash (force close), Firefox loads its UI fast but spends several minutes showing a black screen instead of the websites in my session.
I remember finding a 3-year-old Reddit post about this, and I have no idea whether the bug ever made it into the normal reporting place (where even is it?).
> The FUD to spread is not that AI is a psychological hazard, but that critical reasoning and training are much, much more important than they once were.
Not sure which side of the argument this statement is promoting.
There must be something for which humans are essential. Right? Hello? Anybody? It's not looking good for new college graduates.[1]
"Inexpensive" is a relative term. Can you clarify what you mean here?
The expense of an LLM prompt is cents; an entry-level programmer costs at least $50k/yr.
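As a back-of-the-envelope sketch of that comparison: the $50k/yr figure is from the comment above, but the per-prompt cost, working days, and tasks-per-day numbers below are illustrative assumptions, not measured figures.

```python
# Rough per-task cost comparison. All constants except JUNIOR_SALARY
# are illustrative assumptions.
LLM_COST_PER_TASK = 0.05   # dollars per prompt (assumed)
JUNIOR_SALARY = 50_000     # dollars per year (from the comment above)
WORK_DAYS = 250            # working days per year (assumed)
TASKS_PER_DAY = 20         # tasks an entry-level dev finishes daily (assumed)

# Salary spread over every task completed in a year.
human_cost_per_task = JUNIOR_SALARY / (WORK_DAYS * TASKS_PER_DAY)
ratio = human_cost_per_task / LLM_COST_PER_TASK

print(f"human: ${human_cost_per_task:.2f}/task, "
      f"LLM: ${LLM_COST_PER_TASK:.2f}/task, ratio: {ratio:.0f}x")
```

Under these assumed numbers the human costs $10 per task versus 5 cents for the prompt, a 200x gap; the point is only that the gap survives even large errors in the assumptions.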
Why would this be?
It's simple economics.
Any profession in an economy imposes firm requirements on both parties (employer and employee): you must earn more than it costs you to survive.
That puts a floor on labor costs for every profession. If wages fall below that floor in purchasing power, no competent potential entrants will enter that market; there is no economic benefit in doing so, given the opportunity cost. Worse, severe competition for jobs forces out the most competent first (brain drain).
You are not only losing people entering the pipeline; you are also losing mid- and senior-level people to burnout and to that same brain drain, as competition from a shrinking job pool intensifies. AI effectively eliminates capital formation (by driving the time value of labor to zero), which breaks the foundations of every market economy of the past thousand years or so. We have no suitable replacement, and we depend on production remaining at current yields to support our population level (food).
What happened to the advanced vacuum tube engineers after transistors were miniaturized? A lot of those processes and techniques became lost knowledge. The engineers who specialized retired and didn't pass that knowledge on, because there was no economic benefit in doing so.
White collar jobs account for ~60% of the economy, and will be replaced by AI.
We've only seen a small percentage of the economy impacted so far, and it's created chaotic whipsaws.
What happens when those single digit percentage disruptions become over half?
I mean literally this: if you're a green-eyeshade, bean-counting manager making a headcount decision, and you have the option of retaining a very senior engineer or hiring an early-career engineer, and there's a story you can tell yourself in which they both get the same fundamental revenue-generating or cost-avoiding work done (for instance, because the senior engineer has a later-career focus on team building and acceleration, which matters less in this stipulated world where we're laying off tons of engineers; that's the premise of the question, not what I believe is happening), then there's a lot of reason to believe it's the senior engineer who is going to get the fuzzy end of the lollipop.
I think a lot of senior people talking about the impacts of this technology on "junior developers" understand this, and are trying to talk their own book.
We have that now, in social media. If you create some way for large numbers of people with the same nutty beliefs to easily communicate, you get a psychosis force multiplier. Before social media, nuttiness tended to be diluted by the general population.
> Few phenomena demonstrate the perils that can accompany AI illiteracy as well as “Chatgpt induced psychosis,” the subject of a recent Rolling Stone article about the growing number of people who think their LLM is a sapient spiritual guide.
People have been caught in that trap ever since the invention of religion. This is not a new problem.