Hacker News | bananabiscuit's comments

the Pfizer scientist explained that it was not biologically plausible: "RNA cannot reverse transcribe to DNA and transport from the cytoplasm to the nucleus and then integrate. That requires a set of molecules and enzymes that don't exist in humans and are largely reserved for retroviruses."

How does https://www.jefferson.edu/about/news-and-events/2021/6/disco... relate to what he is saying?


All mRNA vaccines currently in use rely on modified nucleosides that cannot be replaced via our polymerases. Pseudouridine, I think, is the one.


But didn't people also think unmodified RNA cannot be transcribed back into DNA until this article showed it was possible just four years ago?

Also, we are relying on mRNA vaccines being manufactured correctly, but that link in the chain of reasoning doesn't seem to be too reliable either, as many studies seem to have shown. Here's one of the more recent ones: https://pubmed.ncbi.nlm.nih.gov/40913499/


I love non-technical writing. Outside of work, the majority of what I like to read is non-technical.

It’s a mistake to confound non-technical writing with bad writing. Sometimes you have to admit when the emperor is nude.


Really? I’m a decent way through the piece and I still don’t understand what she is rambling about.


summarized via LLM

In her essay "Why I Cannot Be Technical," Cat Hicks, a psychologist specializing in software environments, explores the structural and social dynamics that define the label "Technical" in the tech industry. She argues that despite her expertise in human-centered aspects of software development—such as behavior, culture, and organizational change—she is often excluded from being considered "Technical" because the term is narrowly defined to prioritize engineering and coding skills.

Hicks emphasizes that this exclusion is not due to a lack of capability but stems from systemic biases related to gender, class, race, and professional roles. She notes that the designation of "Technical" often serves as a gatekeeping mechanism, determining who is deemed legitimate within tech spaces. This legitimacy is frequently withheld from those whose work focuses on human factors, regardless of its complexity or impact.


Maybe the author should just write that?

There's a reason there's a saying: "brevity is the sister of talent".

Whatever point you are trying to make would surely benefit from reaching more than the few percent of people who don't give up on ten pages of rambling that should, in reality, have been a paragraph.


> Whatever point you are trying to make

I rather believe she is not intending to "make a point" but instead express herself. One may prefer brief self-expression but certainly not all do. Put another way: the expression is the point.


It’s mostly a pleasant read, but even as someone who prefers the humanities and studied a lot more of that than most folks in computer jobs, I’ve made it about 40% in and couldn’t tell you what the author means by capital-T “Technical”.

Is this something that would be clear to me if I’d worked in FAANG or similar? Is it a cultural thing there? Something to do with a corner of social media I don’t engage with?

The closest I can come up with connected to my experience is the opposite: “tech” related labels used to exclude people and dismiss their ideas, in decision-making or business-social contexts, and design processes. I’ve not seen it used in this power(? I think? I really can’t figure this out)-conferring way.

[EDIT] The anecdotes are so confusing.

> An example of this is every time evidence of efficacy is not able to exert any power versus the votes of engineering disengagement. You could put your diligent little psychologist heart into it and make a good program or policy change and muster up extremely critical evidence for something no one else bothered to measure but you could not demand that all of the engineering managers do it, for instance. The engineering managers always had the power and always would.

This is a manager thing. Specifically, modern management culture. Management wants to appear "evidence based" and "scientific" but the appearing is the only part they consistently care about. The "technical" run into this same wall, when they mistakenly believe surface claims that management's serious about working with evidence and "metrics" and such, and try to sincerely help as if that's the actual goal—it isn't.

[EDIT 2]

> This is one of the paradoxes of software teams: rich people, rich teams, rich environments, described and experienced as utter wastelands by (statistically speaking) men who have (statistically speaking) socked away more than I’ve ever touched and more than generations of my family ever touched, and their entire ownership of not having enough.

OK, I think this is confirmation that the piece is about a slice of the tech industry I've not really engaged with, which may explain why I am nearly at the end of the piece and am still not sure what it's about.

[EDIT 3]

> Tech is immensely global in its activity and so fanatically geo-located in its employment that even the most senior and most unquestionably Technical people worry about moving away from 2-3 certain US cities.

OK, yes, this is about a tiny percentage of "tech". Under this article's usage, I'm not "Technical", and few or none of the programmers I personally know are. That helps, wish that'd been stated up front.


> Is this something that would be clear to me if I’d worked in FAANG or similar? Is it a cultural thing there? Something to do with a corner of social media I don’t engage with?

> The closest I can come up with connected to my experience is the opposite: “tech” related labels used to exclude people and dismiss their ideas, in decision-making or business-social contexts, and design processes. I’ve not seen it used in this power(? I think? I really can’t figure this out)-conferring way.

There's a recent-ish (5 or so years?) style change people have pushed to capitalize "Black" in news and articles [0], and I think this author is trying to do the same here. Whatever this distinction is, it's entirely possible it's in their own mind and nowhere else.

[0] https://apnews.com/article/archive-race-and-ethnicity-910566...


Do you mean you will likely make an error when implementing a standard textbook linked list? Or that a standard textbook linked list has non-obvious errors in its design? If so, would be curious to learn more.


Most programmers can implement a linked list.

Extremely few can use one safely.

They're an elegant weapon for a more civilized age - namely one without multithreading, aliasing-based optimization, and a host of other nice things.


Multithreading is exactly where linked lists are useful: in the implementation of MPSC, MPMC, SPMC queues, and so on, they allow one to safely construct objects out of view of other threads and then use a CAS to append the new item to the list, making it visible for other threads to consume, without even needing to allocate additional memory.

They're also pretty useful for free-list structures, which overwrite entries scheduled for removal (the assumption being that the object is now dead) with linked-list nodes that fit in that space, so you never need to allocate anything extra: you can reuse existing space.
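
Not the parent's actual code, but a minimal sketch in C11 of the pattern described above: the node is fully constructed while still private to one thread, and a single compare-and-swap on the list head publishes it (a Treiber-style lock-free stack push).

    #include <stdatomic.h>
    #include <stdio.h>
    #include <stdlib.h>

    struct node {
        int value;
        struct node *next;
    };

    /* Shared head of a lock-free singly linked list. */
    static _Atomic(struct node *) head;

    void push(struct node *n) {
        struct node *old = atomic_load_explicit(&head, memory_order_relaxed);
        do {
            n->next = old;  /* the node is still invisible to other threads here */
        } while (!atomic_compare_exchange_weak_explicit(
            &head, &old, n,
            memory_order_release,    /* success: publish the fully built node */
            memory_order_relaxed));  /* failure: retry with the refreshed head */
    }

    int main(void) {
        struct node *n = malloc(sizeof *n);
        n->value = 42;
        push(n);  /* after this CAS, other threads can see and consume it */
        printf("%d\n", atomic_load(&head)->value);
        free(n);
        return 0;
    }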


Yeah, wait-free data structures often contain linked lists. That doesn’t cancel out all of the other problems.

Anecdotally, I’ve seen wait-free structures used incorrectly many more times than I’ve seen them used correctly. Some people think they are magically faster because “acquiring a mutex lock is expensive”, but that’s far from always the case.


A CAS loop, as suggested in the message you're replying to, is certainly not wait-free, only lock-free.


Thanks, my mistake. :-)


I don't see a problem with their safety, even in C.

More the opposite, their simplicity makes them safer to deal with than the alternatives. And sometimes their unique features (mostly the intrusive variant) are exactly what you need.

Multi-threaded accesses to a vector would be just as bad?
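
As a side note on the intrusive variant mentioned above, here is a minimal sketch (my own illustration in C, not anyone's actual code) of how the link field lives inside the object itself, so insertion and removal never allocate:

    #include <stddef.h>
    #include <stdio.h>

    struct list_node {
        struct list_node *next;
    };

    struct task {
        int id;
        struct list_node link;   /* embedded node: the list reuses the object's own space */
    };

    /* Push the task's embedded node onto a singly linked list. */
    static void list_push(struct list_node **head, struct list_node *n) {
        n->next = *head;
        *head = n;
    }

    /* Recover the containing struct from a pointer to its embedded node. */
    #define container_of(ptr, type, member) \
        ((type *)((char *)(ptr) - offsetof(type, member)))

    int main(void) {
        struct task a = { .id = 1 }, b = { .id = 2 };
        struct list_node *head = NULL;
        list_push(&head, &a.link);
        list_push(&head, &b.link);
        for (struct list_node *n = head; n; n = n->next)
            printf("task %d\n", container_of(n, struct task, link)->id);
        return 0;
    }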


Implementing anything is more error prone than using something battle-tested.


It’s necessary to whatabout when threads about Russia come up because there are usually plenty of top-rated comments (see above) dehumanizing Russians and calling for their extermination. I have never seen any thread critical of US meddling in world affairs express the same hatred of US citizens and supporters. “Whataboutism” should not be used as a defense against double standards. Double standards should be clearly called out.


I don't see any comments calling for the extermination of Russians. It sounds like you're making things up to support your own agenda.


[flagged]


What's an "Azovite"?

And "ancestral hatred of Russians and Russia"? If my grandparents and parents were victims of Soviet aggression/invasion/genocide, what right do you have to question "ancestral hatred"?

The fact that Russia is doing it again is enough to arouse intense emotions in many European populations.


https://en.wikipedia.org/wiki/Azov_Brigade You know, the ones with the funny tattoos. I’m sure your ancestors would have recognized those.

>Soviet

Equating modern Russia and the SU is a widely deployed sleight of hand at the moment. Same with pre-Soviet Russia. No matter how little sense it actually makes. “Show me on this map where the Tsar/Stalin hurt you.”

But there is a common thread: Russia was, is, and will remain for a long while the predominant regional power. Pays to have a friendly relationship with them. Ami will go home. Sooner or later.


[Replying to a now deleted post misquoting https://en.wikisource.org/wiki/Address_concerning_the_events...]

>claims these things

What he actually claims:

I would like to emphasise again that Ukraine is not just a neighbouring country for us. It is an inalienable part of our own history, culture and spiritual space. These are our comrades, those dearest to us – not only colleagues, friends and people who once served together, but also relatives, people bound by blood, by family ties.

>criticizes Lenin and Stalin

He criticizes the Soviets for creating Ukraine as we know it—Donbass, Crimea and so on. Artificial borders, a people divided; precisely to prevent the SU from becoming synonymous with Russia.

And it wasn’t just the Soviets who sought to use Ukraine against Russia.

Every day Ukraine-supporters whine about what they consider lackluster support from the West. What’s actually happening is obvious but wishful thinking and a quality reality distortion field prevent enlightenment: While there are real limits of supply, and at times they might have in fact entertained the idea of “winning,” the basic goal is and has always been to cause as much death and destruction as possible without actually getting into a shooting war with Russia. And, of course, most of that death and destruction happens in Ukraine.

And then they have the gall to cry “scorched earth.”


Kind of a way of turning something obvious on its head and painting it as a negative. When you are interacting with people, there is pressure to conform to consensus reality, regardless of the merit of that reality. Anybody who goes against the grain, whether they are wrong, or correct and eventually vindicated, first has to face negative social pressure from their peers.


This is definitely true. I find there's a certain irony in society: we condemn people who aren't sceptical of anything, and then, once they are, we condemn them again.

Some of the world's greatest thinkers and innovators suffered social execution from their peers and were deeply depressed.


That's not really the mechanism at work here. Lonely people don't turn to conspiracy theorizing because they somehow happen upon "secret knowledge" that, in the absence of any external social pressures, spontaneously becomes apparent to them.

Rather, for these lonely people, conspiracy theories are how they project their own unmet emotional needs outward onto the perceived world around them -- as a coping strategy. They need to feel like they belong and are relevant in the world. This leads them to harbor resentment toward the rest of the world whom they perceive to be in league with an amorphous "them". The conspiracist's belief that he possesses "secret knowledge" about the world fulfills his needs for belonging and relevance by making him feel as if he is part of an in-group superior to the one he perceives to be alienating him[1].

But why do this through "secret knowledge"? Usually this is the conspiracy theorist's way of coping with some inexplicable world-changing event like 9/11, the COVID-19 pandemic, or the loss of their chosen political candidate in an election; they need to "know" why immediately, but cannot, and therefore move straight from the "thinking" stage (which requires holding uncertainty and multiple possible explanations) into the "knowing" stage (which is a kind of faith-based certainty about the world)[2][3].

--

1. https://www.apa.org/pubs/journals/releases/bul-bul0000392.pd...

2. https://overcast.fm/+CuhudQ56w

3. https://psmag.com/social-justice/thinking-vs-knowing-when-fa...


Very insightful comment. Did you also come across info on how to pull these people (mostly older) out of it?


Separate them from their sources of misinformation by blocking youtube/tiktok/etc. and give them better things to do with their time that fulfill their social and emotional needs for belonging and acceptance (e.g., community volunteering programs, family outings, meetup hobby groups, etc.).


Thanks for highlighting the role of "secret knowledge" in fringe theories (I say "fringe" not "conspiracy" b/c when you look at the "lost Atlantis" thing it may be just a story (Plato probably thought so?), or a plot by the establishment to gaslight the sheeple, or else something in between—a "true" story about a forgotten civilization that few know of (well actually no, but anyway) and still fewer believe in, for reasons that don't necessarily constitute a conspiracy).

While I'm at it I might just as well recommend David Miano's YouTube channel World of Antiquity, here's a link: https://www.youtube.com/watch?v=1dHNq8SURTU to his video "LIES told by Atlantis proponents" (accidentally had this tab open. Coincidence!??!! I don't think so!).

BTW another red flag that occurs across all these Ancient and/or Alien Civilization fringe theories is that the argumentation of the believers disparages certain groups of people, mostly non-Europeans—"they could've never achieved this level of precision, they were much too primitive".

> move straight from the "thinking" stage [...] into the "knowing" stage

I've seen this time and again, people who talk, argue and act as if it's enough for a thought to cross the mind, and it's already almost taken for granted, accepted as fact.


It comes across to most people as obvious and negative from the get go. They’re not antonyms.


That is one of the possibilities highlighted in the abstract, among other theories, including that lonely individuals seek friendship within conspiracy communities. Humans are very good at post-hoc rationalisation.


The only difference between a genius observation and a conspiracy theory is whether it turns out to be true or not.


To be fair, most conspiracy theories are already proven obviously false.


Anecdotally in my experience, autistic people are a lot more likely to "not care" about social consequences and also more often think about or believe conspiracy theories. It's also not a bad thing to have some people believing in conspiracies, especially when the media/government decides that any stories about them are crazy conspiracies and everyone believes them.


Herd mentality means you share conspiracy theories with the herd, so they don't get recognized as conspiracy theories. Think Nazi Germany etc, that only gets recognized afterwards.

So you might identify more crazy stuff in autistic people who don't align with the herd, but if you sum up all the crazy stuff the herd thinks, it is roughly on a similar level to the average autistic person; it's just all aligned within a thought bubble, so people don't see it unless they look outside their bubble.

To illustrate: do you think a devout Democrat feels an average autistic person or a Republican believes in more dumb stuff? And vice versa. Regardless of which side is right, it still adds up to a ton of crazy shit for the average herd.


This article is propaganda.


Most people commuting to Manhattan are not wealthy or middle class, judging by the makes, models, and visible condition of the cars you see on the bridges entering Manhattan.


Is there something about RISC that still makes it better than CISC when it comes to per-watt performance? Seems like nobody has any success making an x86 processor that's as power efficient as ARM or RISC.


> Is there something about RISC that still makes it better than CISC when it comes to per-watt performance?

CLASSIC CISC was micro-coded (for example, the IBM S/360 had a feature letting you write custom microcode for compatibility with your inherited equipment, like IBM 1401 machines or the IBM 7XXX series, or for other purposes), while RISC was pipelined from birth.

Second, as I understand it, many CISC machines existed as multi-chip boards or even as multiple boards, so they suffered great losses on the wires, while RISC appeared in the 1990s as a single die right away (only the external cache was added as a separate IC), but I could be mistaken on this.

> nobody has any success making an x86 processor that's as power efficient as ARM or RISC

Rumor said the Intel Atom (essentially a CMOS version of the first-generation Pentium) was very good in mobiles, but ARM far exceeded it in software support for a huge number of power-saving features (a modern ARM SoC allows nearly any part of the chip to be turned off at any time, and the OS supports this), and because of that lack of software support, smartphones with Intel chips had poor battery life.

More or less official info said that Intel made a bad power-conversion circuit, so the Atom consumed too much in the modes between deep sleep and full speed, but I don't believe them, as that is too obvious a mistake for a hardware developer.


> CLASSIC CISC was micro-coded

Sometimes. Far from always. Some would have a complicated hardwired state machine. Some would have a complicated hardwired state machine and be pipelined. Some would have microcode and be pipelined (by flowing the microcode bits through the pipeline and, of course, dropping those that have already been used, so fewer and fewer microcode bits survive at each stage).


Please give examples of classic CISC designs that were not microcoded, and explain why you consider them classic.

In my opinion, NONE of the microprocessors could be considered classic CISC.


The PDP11-20 (the first PDP11) was not microcoded.

The PDP11 is the machine that Unix was developed on.


Do you know for what purposes (targets) minicomputers were made, and why they were limited?


Well, as I see you don't have the courage to answer a simple question about the purpose of minicomputers, I will.

When computers first appeared, they were big, simply because technology limitations made small machines very expensive to use, so scale was used to make computation cheaper.

In the early 1970s, technology advanced to the stage where it became possible to make simplified versions of big computers for certain limited tasks, though they were still too expensive for wide use.

A simple illustration: an IBM 3033 mainframe with 16M of RAM could serve 17,500 3270 terminals, while a PDP of the same era could serve about a few tens (maybe 50, I don't know exactly), so mainframes, even though they were very expensive, gave a good cost per workplace.

A known example: a PDP was used to control a scientific nuclear reactor. The PDP was chosen not because it had the best MIPS/price ratio, but because it was the cheapest adequate machine for the task, and therefore affordable on a limited budget.

For a very long time, minis stayed in the niche of limited machines used to avoid much more expensive full-scale mainframes. They were used to control industrial automation (CNC), chemical plants, and other small things.

Once microcomputers (a CPU on one chip) appeared, they began eating the minis' space from the bottom, while mainframes kept becoming more cost-effective (more terminals with the appearance of cheap modems, etc.) and ate the minis' space from the top.

And in the 1990s, when affordable 32-bit microprocessors appeared and megabytes of RAM became affordable, minis disappeared, because their place was captured by micros.

To be honest, I don't know of anything we could not call a microcomputer now, as even IBM Z mainframes now have single-chip processors and the largest supercomputers are practically clouds of SoCs (NUMA architecture).

And I must admit, I still see PDPs (or VAXes) in enterprises, where they still control old machines from the 1990s (they are very reliable, even if limited by modern standards, and they still work).

As I remember, the last symmetric multiprocessor supercomputer was the Cray Y-MP; later machines became ccNUMA, or just NUMA, or even cloud-like.

https://en.wikipedia.org/wiki/LINPACK

Unix was a simplified version of Multics, a system designed to run on mainframes (BTW, there even exists an officially certified Unix for mainframes).

You can try mainframe software yourself; it is very accessible now with an emulator (sure, be careful about the license):

https://en.wikipedia.org/wiki/Hercules_(emulator)

And you will see for yourself how many things modern OSes borrowed from mainframes.

This is the nature of things: people choose the simpler, cheaper thing.


Have there been recent attempts? Maybe it's just, like, speciation, by this point in time.


A great discussion on this: Lex Fridman's interview of David Patterson.


Maybe modern society was a mistake.


how much time have you spent in a non-modern society? cuz I got deployed twice and saw some pretty backwards-ass places overseas.

like we can just pass laws that respect privacy, dawg. build some better public transit.


That’s pretty tough to argue by any measurement I can think of.

