If you've ever watched the movie "Enemy of the State", which came out in 1998, I don't know how you can come away from that movie thinking anything other than someone in that script writing pipeline had some insider knowledge of what was happening. So many of the things they talk about in the film were confirmed by the Snowden releases that it's kinda scary.
Today, it's almost a national societal resignation that "you have no privacy, get over it." I wish that weren't the case, but I'd like to see more representation embrace privacy as the basic right it should be again.
There's an interesting falsifiable prediction lurking here. If the language network is essentially a parser/decoder that exploits statistical regularities in language structure, then languages with richer morphological marking (more redundant grammatical signals) should be "easier" to parse — the structure is more explicitly marked in the signal itself.
French has obligatory subject-verb agreement, gender marking on articles/adjectives, and rich verbal morphology. English has largely shed these. If you trained identical neural networks on French vs English corpora, holding everything else constant, you might expect French models to hit certain capability thresholds earlier — not because of anything about the network, but because the language itself carries more redundant structural information per token.
This would support Fedorenko's view that the language network is revealing structure already present in language, rather than constructing it. The "LLM in your head" isn't doing the thinking — it's a lookup/decode system optimized for whatever linguistic code you learned.
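A very rough way to make the "redundant structure per token" idea concrete (purely an illustration, not the preregistered design; the corpus file names below are placeholders):

    # Compare per-token predictability of two corpora with a maximum-likelihood
    # bigram model. Lower conditional entropy = each token is more predictable
    # from the previous one, i.e. more redundant structure in the signal itself.
    from collections import Counter
    import math

    def conditional_entropy(tokens):
        bigrams = Counter(zip(tokens, tokens[1:]))
        unigrams = Counter(tokens[:-1])
        total = sum(bigrams.values())
        h = 0.0
        for (prev, _), count in bigrams.items():
            h -= (count / total) * math.log2(count / unigrams[prev])
        return h

    # "french.txt" / "english.txt" are placeholder file names, not real data.
    for path in ("french.txt", "english.txt"):
        tokens = open(path, encoding="utf-8").read().lower().split()
        print(path, round(conditional_entropy(tokens), 2), "bits/token")

Actual training runs on matched corpora would be the real test; this only captures the crudest version of the signal.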
(Disclosure: I'm running this exact experiment. Preregistration: https://osf.io/sj48b)
Nevertheless, all Austen's happy endings are due to the magical alignment of respect and love with security and social alliance. Jane's heroines are playing a (relatively, see below) high-risk/high-reward game of not wanting to sacrifice _anything_, which leads to their triumphs in the novels but most often led to loneliness and economic insecurity in the real world.
Similarly, all people have choices, but these choices are often pretty agonising ones, and Jane almost never has her protagonists or us confront such life-and-death, very-bad-vs-infinitely-worse choices. And this was a conscious choice since the novels of the 18th century had been more or less filled with them.
Linux, but not Rust: {'nios2', 'microblaze', 'arc', 'openrisc', 'parisc', 's390', 'alpha', 'sh'}
Rust, but not Linux: {'avr', 'bpfel', 'amdgcn', 'wasm32', 'msp430', 'bpfeb', 'nvptx', 'wasm64'}
Personally, I've never used a computer from the "Linux, but not Rust" list, although I have gotten close to a DEC Alpha that was on display somewhere, and I know somebody who had a Sega Dreamcast (`sh`) at some point.
Years back I attended a talk by someone doing an NSF outreach tour in support of the Next Generation Science Standards. She was breathtaking (literally - bated breath on "how is that question going to be handled?!?"). Heartfelt, hostile, misguided questions, too confused to even attain wrong, somehow got responses which were not merely positive and compassionate, but which managed to gracefully pull out constructive insights for the audience and questioner. One of those "How do I learn this? Can I be your apprentice?" moments.
The Wikipedia community (at least 2 decades back) was also notable. You have a world of nuttery making edits. The person off their meds going article by article adding a single letter "a". And yet a community ethos that emphasized dealing with them with gentle compassion, and as potential future positive contributors.
Skimming a recent "why did perl die" thread, one thing I didn't see mentioned... The perl community lacked the cultural infrastructure to cope with the eternal September of years of continuous newcomer questions, and became burned out and snarky. The python community emphasized its contrast with this: "If you can't answer with friendly professionalism, we don't need your reply today" (or something like that).
Moving from tar files with mailing lists, to now community repos and git and blogs/slack/etc, there's been a lot of tech learned. For example, Ruby's Gems repo was explicitly motivated by "don't be python" (then struggling without a central community repo). But there's also been the social/cultural tech learned, for how to do OSS at scale.
> My only doubt is whether this has a good or bad effect overall
I wonder if a literature has developed around this?
Two-person team, eighteen months bootstrapped.
We just shipped Synrix, a flat fixed-width memory-mapped lattice that runs fifty million persistent nodes on an eight-gigabyte Jetson Orin Nano with 186 ns hot-path latency (under 3.2 cycles steady-state), scales to 33× larger than RAM via kernel-managed streaming, and provides full ACID persistence that survives kill -9 and power yanks with zero corruption. CPU-only, no GPU, no cloud, no telemetry. Redis RESP-compatible drop-in.
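To give a flavor of the general approach (a stripped-down toy, not our actual engine; the record layout, file name, and sizes below are invented for illustration): in a fixed-width, memory-mapped store every slot lives at a fixed offset, so a lookup is pointer arithmetic plus, at worst, a page fault.

    # Toy fixed-width memory-mapped store (illustration only).
    import mmap, struct

    RECORD = struct.Struct("<Q32s")          # 8-byte key + 32-byte payload (made-up layout)
    PATH, SLOTS = "lattice.bin", 1_000_000   # placeholder file name and node count

    with open(PATH, "a+b") as f:
        f.truncate(SLOTS * RECORD.size)      # pre-size so offsets never move
        mm = mmap.mmap(f.fileno(), 0)

    def put(index, key, payload):
        RECORD.pack_into(mm, index * RECORD.size, key, payload.ljust(32, b"\0"))
        mm.flush()                           # push dirty pages back; durability hinges on this

    def get(index):
        key, payload = RECORD.unpack_from(mm, index * RECORD.size)
        return key, payload.rstrip(b"\0")

    put(42, 7, b"hello")
    print(get(42))                           # -> (7, b'hello')

The hard parts (crash-consistent metadata, streaming beyond RAM, RESP compatibility) are exactly what this sketch leaves out.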
Demo on the page: raw tegrastats, no cuts, cable pulled mid-run, everything comes back exactly where it left off.
We’ve never seen anything hit these numbers on commodity edge hardware before.
Curious what people think:
For real-world robotics, drones, or autonomy, is sub-200 ns persistent lookup actually useful or just a benchmark flex?
Are there workloads where surviving total power loss with zero data loss would change architecture decisions?
Has anyone else ever gotten close to 50 M persistent nodes on a Jetson without a GPU or external storage?
What would you try to break first if you had this running on your board tomorrow?
Happy to run it live on anyone’s hardware, share perf and cachegrind traces, or just talk through the weirdest edge cases you’ve seen. Feel free to check out our website for more info!
IANA economist, but if there were a debt crisis, it would ultimately be about the psychology of the investors who would buy government debt. They want to be very, very confident that they will be paid back (which is why they're willing to accept a low interest rate).
If those investors are satisfied with a return to a late-80s fiscal posture, then great. But if they're worried that spending would just creep up again once the pressure is off, they might "demand" further cuts.
This is the collision between two cultures that were never meant to share the same data: "move fast and duct-tape APIs together" startup engineering, and "if this leaks we ruin people's lives" legal/medical confidentiality.
What's wild is that nothing here is exotic: subdomain enumeration, unauthenticated API, over-privileged token, minified JS leaking internals. This is a 2010-level bug pattern wrapped in 2025 AI hype. The only truly "AI" part is that centralizing all documents for model training drastically raises the blast radius when you screw up.
The economic incentive is obvious: if your pitch deck is "we'll ingest everything your firm has ever touched and make it searchable/AI-ready", you win deals by saying yes to data access and integrations, not by saying no. Least privilege, token scoping, and proper isolation are friction in the sales process, so they get bolted on later, if at all.
The scary bit is that lawyers are being sold "AI assistant" but what they're actually buying is "unvetted third party root access to your institutional memory". At that point, the interesting question isn't whether there are more bugs like this, it's how many of these systems would survive a serious red-team exercise by anyone more motivated than a curious blogger.
I think even 50 years ago, most people started businesses because they had a skill and could use it to help others meet their needs.
HP started (more than 50 years ago) with two friends who wanted to make better electronic test equipment. Profit was not at the forefront of their minds like it is for an MBA graduate today. Hewlett and Packard wanted to provide quality test equipment to people, because a lot of the test equipment of the day seemed subpar to them.
By the time the 80s rolled around, they paid 100% of an employee's college education (no matter how high they wanted to go with that) and paid them 75% of their salary while they were away at school. College was cheaper then, but zero employers today would even briefly consider paying people any amount at all to not be at work while also paying for the thing keeping them away from work.
corner stores in crowded neighborhoods are not started to maximize profit potential for shareholders. corner stores are started because someone saw the need for a corner store and wanted to make a living running it; they wanted that to be their job.
Until the invention of the MBA, I don't think most people who started businesses did so purely for money. There are many easier ways to make money. Today people can start shitting out mobile games with pay-to-win mechanics and they will be rich when the first one takes off. No one creates mobile games with pay-to-win mechanics because they want people to experience the joy of microtransactions.
Every business today (certainly every tech business) is designed to find out what people want via market research, pick the thing that looks the most profitable, then through a very well developed process, turn that business into a source of retirement money for the founder(s) and a source of return for the investors. It is literally a photocopy model of business creation. "Follow the process and you will succeed."
No one is opening shops today to help their neighbor. No one is opening new bakeries because their town needs one. No one is doing anything that one used to see people doing everywhere they went. Profit-driven motivation ruins everything it touches. Everything.
Everything is profit driven, now. Everything. The MBA is the most disastrous degree ever devised. It makes people think that starting a business purely to make money is a perfectly normal and healthy thing to do, and it simply isn't.
Here are 12 Sysadmin/DevOps (they're synonyms now!) challenges, straight from the day job:
1. Get a user to stop logging in as root.
2. Get all users to stop sharing the same login and password for all servers.
3. Get a user to upgrade their app's dependencies to versions newer than 2010.
4. Get a user to use configuration management rather than scp'ing config files from their laptop to the server.
5. Get a user to bake immutable images w/configuration rather than using configuration management.
6. Get a user to switch from Jenkins to GitHub Actions.
7. Get a user to stop keeping one file with all production secrets in S3, and use a secrets vault instead.
8. Convince a user (and management) you need to buy new servers, because although "we haven't had one go down in years", every one has a faulty power supply, hard drive, network card, RAM, etc., and the hardware's so old you can't find spare parts.
9. Get management to give you the authority to force users to rotate their AWS access keys which are 8 years old (see the key-age audit sketch after this list).
10. Get a user to stop using the aws root account's access keys for their application.
11. Get a user to build their application in a container.
12. Get a user to deploy their application without you.
After you complete each one, you get a glass of scotch. Happy Holidays!
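For #9, a rough sketch of the audit you'd walk into that meeting with (assumes boto3 and IAM read access; the 365-day cutoff is just an example):

    # List IAM access keys older than a cutoff so the "8 years old" claim
    # comes with names attached.
    from datetime import datetime, timedelta, timezone
    import boto3

    cutoff = datetime.now(timezone.utc) - timedelta(days=365)
    iam = boto3.client("iam")

    for page in iam.get_paginator("list_users").paginate():
        for user in page["Users"]:
            keys = iam.list_access_keys(UserName=user["UserName"])["AccessKeyMetadata"]
            for key in keys:
                if key["CreateDate"] < cutoff:
                    print(user["UserName"], key["AccessKeyId"], key["CreateDate"].date())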
Higher ed is like employer based health insurance in that they are both weird path dependent historical accidents.
People want cheap healthcare, and it got shoehorned into an odd employer fringe benefit system that really is not at all related to healthcare in any intrinsic way.
People want job training, and it got shoehorned into extra departments at liberal arts universities intended as aristocrat finishing schools. Job training really has little to no relationship to liberal arts.
And now both of those systems are failing to deliver those benefits, because the benefits, which were initially afterthought add-ons, have outgrown the institutions that were their hosts. It's akin to a parasitic vine that is now much larger than the tree it grew on and is crushing it under its weight. Both will die as a result.
When I started working, more than 25 years ago, we had one team meeting per week (1 hour) and very few other meetings. Cellphones were getting mainstream and people had these funny ringtones, but since calls were expensive, phones were not ringing often. The office phone rang even less often. We had no ticketing system. Managers just trusted you to do your work. When going to someone else's desk we would start with "may I disturb you?", and the answer may have been "give me five minutes". We had like 2-3 emails a day. Someone had a radio in the office. That was in Belgium and the radio was in Flemish. This was not a big deal since I do not understand Flemish. Despite being rather cramped, I remember this office as quiet. It was not a large open space though.
I cannot remember the turning point. Of course "agile" did a lot of damage, then ticketing systems, the illusion that developers are swappable, and now the constant notification stream.
> When you're an employee, you can hide behind process and blame the market.
You can still blame the market. A good market makes everything easier, a bad market makes everything harder.
But here’s the catch: You choose the market.
To share an example: When I started my react teaching side business in 2015 it was so easy. Growing 2x year over year, I thought I was some kind of business genius. Then one day it stopped. React became old, no longer the exciting new thing, the market consolidated into 2 or 3 big players with The Default resources and my stuff wasn’t one of them. I totally missed the land grab aspect of the early market phase and didn’t go hard enough on pure growth. Not a business genius after all.
You want to get a pulse on the market, or you need to clean up the backlog of recruiter invitations because you're a bit OCD and feel bad about not answering people who probably used AI to send you that apparently-so-personal message. And you were also born poor, so even after all those years living a comfortable life, your first instinct towards people who might want to hire you is to be a nice, agreeable sort, as old survival instincts never really die.
Then it's like the things that are not good for you but that you do anyway: drinking sugary beverages, staying up too late at night, having that last couple of beers you didn't need and didn't enjoy, which then give you that morning headache.
It's pretty clear what regulation should exist for 3rd party markets of licensed digital content: none.
If you choose to blow your finances on digital skins, or make it your trade and primary job, and you lose everything, then I have one word for you: "ha-ha".
May seem harsh, but we shouldn't encourage digital speculation nor the companies that seek to invent (more of) these markets whole cloth.
I study and write quite a bit of tech history. IMHO from what I've learned over the last few years of this hobby, the primary issue is quite simple. While hardware folks study and learn from the successes and failures of past hardware, software folks do not. People do not regularly pull apart old systems for learning. Typically, software folks build new and every generation of software developers must relearn the same problems.
I came here to post this. We make a lot of the same sorts of optimizations for our OS distro (debian based) -- disabling frequency scaling, core pinning, etc. Critically, CPU0 has a bunch of stuff you cannot push elsewhere, and you're better off using one of the other cores as an isolated island.
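As a tiny illustration of the pinning part (Linux-only; the core number below is made up, and in practice you'd also isolate that core from the scheduler at boot with isolcpus/nohz_full or similar):

    # Pin the current process to a single reserved core so the scheduler
    # never migrates it; pair this with boot-time isolation of that core.
    import os

    os.sched_setaffinity(0, {3})      # restrict the calling process to CPU 3
    print(os.sched_getaffinity(0))    # -> {3}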
This is what the scheduler latency looks like on our isolated core:
# Total: 000300000
# Min Latencies: 00001
# Avg Latencies: 00005
# Max Latencies: 00059
# Histogram Overflows: 00000
But dietary cholesterol is not linked to cholesterol in the blood vessels:
> We confirm from the review of the literature on epidemiological data, meta-analysis, and clinical interventions where dietary cholesterol challenges were utilized that there is not a direct correlation between cholesterol intake and blood cholesterol.
Reminds me of when Reddit posted their year-end roundup https://web.archive.org/web/20140409152507/http://www.reddit... and revealed their “most addicted city” to be the home of Eglin Air Force Base, host of a lot of military cyber operations. They edited the article shortly afterward to remove this inconvenient statistic.
The slow, consistent degeneration of File Explorer, the primary core service for users in the entire OS, is truly a sign of the times at Msoft.
This shit crashes all the time in our office workflow, requiring a hard restart of the explorer process. USB issues, soft freezes on certain types of file transfer.
Mapped drives and credentials integration have been bugged as fuck for at least 3 years. I shouldn’t have to edit every connection in our office in Credential Manager to fix this.
And anything happening with the explorer process is infuriating to try and diagnose. It almost never throws an error code or event code when it fucks up. So good luck trying to fix it yourself.
Lay off the engineers and promote the mouthpieces. It’s not a Microsoft issue, it’s an industry one.
They’re making poor choices because there was a major shift to incompetence: using web technologies on the desktop when we as engineers know the web was birthed as a whack-a-mole hack job and continues today as an accumulation of human-centric decisions, not computing (engineered) ones.
Applications really don’t need such flexibility in how they look (i.e. the DOM); they need to function and cohabitate to be resourceful, which clearly they’re not.
And that’s not to say engineers haven’t tried to fix those mistakes but the catalyst was already set.
Design should never have taken precedence over compute, just as interpreted/runtime code shouldn't have over compiled. They need to be balanced, if not swayed back.
We still have the capability to learn the machine and shift the narrative, as long as we’re willing to lose the brand for the generic. That’s the biggest obstacle, because we sell out to the language, the architecture, etc. through the marketing of efficiencies in time-to-create rather than compute, and each iteration of that deteriorates the experience and the art.
The thing is, they work very well for that 15%. I suspect the eventual conclusion will be that depression is a syndrome with multiple causes rather than a single condition, and SSRIs treat one of the causes.
Edit: Mark Horowitz is one of the authors of both studies.
The NNT[1] of Prozac, and SSRIs in general, has been previously estimated around 6. Meaning that treatment is more helpful than a sugar pill in only 1 out of 6 cases (a dirty secret).
Meanwhile the NNH[2] is as low as 21, that is, 1 in 21 patients will stop treatment due to negative side effects.
People who have never experienced a particular challenge are quick to assume credit for its absence and assign moral failings to others who experience it. It's insufferable but common.
Imagine a millionaire who had millionaire parents lecturing his children on how they're not allowed to be poor. Lo! They're not! The lectures worked.
I was very young when my mom started Prozac but do remember how angry and sad she was before compared to after.
Years later there was a time when me and my sister noticed our mom was acting a bit strange -- more snappish and irritable than usual, and she even started dressing differently. Then at dinner she announced proudly that she had been off Prozac for a month. My sister and I looked at each other and at the same time went, "Ohhhh!" Mom was shocked that we'd noticed such a difference in her behavior and started taking the medication again.
I've been on the exact same dose as her for 15 years, and my 7-year-old son just started half that dose.
If I have a good day it's impossible to say whether that's due to Prozac. But since starting Prozac I have been much more likely to have good days than bad. So, since Prozac is cheap and I don't seem to suffer any side effects, I plan to keep taking it in perpetuity.
What I tell my kids is that getting depressed, feeling sad, feeling hopeless -- those are all normal feelings that everyone has from time to time. Pills can't or shouldn't keep you from feeling depressed if you have something to be depressed about. Pills are for people who feel depressed but don't have something to be depressed about -- they have food, shelter, friends, opportunities to contribute and be productive, nothing traumatic has happened, but they feel hopeless anyway -- and that's called Depression, which is different from "being depressed."
We had/have a lot of reservations about it too, and discussed it at length with our pediatrician over months of observation. We decided what was more horrifying was hearing a 7-year-old — who has supportive family and friends, good health, no traumatic events, no major life changes going on, never worries where food/shelter is coming from — say he feels like "he shouldn't be on Earth anymore" and suddenly react with extreme physical anxiety to almost everything. It was bad enough that he couldn't really implement any of the coping skills he learned in therapy. His therapist hoped that medication would bring him to a baseline where he was able to benefit more from therapy. My family's historical success with Prozac also made the decision more palatable since depression appears to be hereditary.
There has been a phenomenal positive shift in his behavior since he started medication. All that said, another commenter pointed out that the study specifically says that Prozac is no better than placebo for depression, which is similar to but distinct from anxiety, which is what my son is being treated for. My mom and I were both diagnosed with depression, but anxiety may be more accurate -- I'm not sure.