
I think it's the writing.

I learned long ago that I could read a book, study it, think about it. And I still wouldn't really master the material until I built with it.


For me, it's a bit like pair programming. I have someone to discuss ideas with. Someone to review my code and suggest alternative approaches. Someone who uses different features than I do, so I learn from them.

I guess if you enjoy programming with someone you can never really trust, then yeah, sure, it's "a bit like" pair programming.

Trust, but verify ;]

This is how I use it too. It's great at quickly answering questions. I find it particularly useful if I have to work with a language or framework that I'm not fully experienced in.

> I find it particularly useful if I have to work with a language or framework that I'm not fully experienced in

Yep - my number 1 use case for LLMs is as a template and example generator. It actually seems like a fairly reasonable use for probabilistic text generation!



Discussed here:

Crypto investor charged with kidnapping and torturing - https://news.ycombinator.com/item?id=44085188 - May 2025 (67 comments)


I think he's saying you can click and edit on the diagram, which mermaid doesn't support. This does propagate back into the source.

I think you are talking about "just change the text and regenerate", which achieves much the same goal.

I'm not sure in what cases the former is better.


Off-the-cuff thought:

Could you solve the empty string hashes to zero problem by just adding one when computing hash codes?


But then strings with the hash code HASH_MAX would wrap to 0 instead.
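The wrap-around can be sketched in a few lines. This assumes a hypothetical 32-bit hash scheme where `HASH_MAX` is the largest representable value; the point is just that adding one moves the zero collision rather than removing it:

```python
# Hypothetical 32-bit hash scheme: adding one so the empty string
# (raw hash 0) no longer maps to 0 just shifts the problem elsewhere.
HASH_MAX = 2**32 - 1

def shifted_hash(raw: int) -> int:
    # Add one, wrapping modulo 2**32 as a 32-bit value would.
    return (raw + 1) & 0xFFFFFFFF

print(shifted_hash(0))         # the empty string now hashes to 1 ...
print(shifted_hash(HASH_MAX))  # ... but HASH_MAX now wraps to 0
```

So any string whose raw hash happens to be `HASH_MAX` inherits the exact ambiguity the empty string had before.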


You could, but that would break backwards compatibility.


Doesn't AGPL give you LESS rights? (fewer?)

AGPL is pretty much a non-starter for any commercial development.

The RSALv2 let you use Redis, unless you were a service provider. Now, you can't use it for anything (EVERYTHING is accessed over a network now) without sharing your source.

So, before, everyone but Amazon could use it for commercial purposes; now no one can.


1) you can still use it under RSAL.

2) you only have to share what you change. You don’t have to change it to provide it as a service, but if you do, you have to share the code. That doesn’t seem particularly harsh and certainly doesn’t prevent providing it commercially.


Unless you modify Redis, what do you want to share? You don't have to share any of your server code or anything. No reason to be so dramatic. :)


You have to make the choice about sharing at the start. If you use AGPL and later decide to make changes, you'll have to share. So most don't want to risk it.

But more important, AGPL software is banned at many large corporations. It's simply not an option.


Automated dependency resolution has made it so the default is frequently

> Someone imports a single method from the RIGHT package

and hundreds of megabytes come in for what might be one simple function.


Well, the video linked above is labelled "Esgrima de Machete y Garrote" ("Machete and Stick Fencing").


No one seems to be talking about the fact that we need to change the definition of cheating.

People's careers are going to be filled with AI. College needs to prepare them for that reality, not to get jobs that are now extinct.

If they are never going to have to program without AI, what's the point in teaching them to do it? It's like expecting them to do arithmetic by hand. No one does.

For every class, teachers need to be asking themselves "is this class relevant?" and "what are the learning goals in this class, and will students still need them in a world with AI?"


I believe we need to practice critical thinking through actual effort. Doing arithmetic by hand and working through problems ourselves builds intuition in ways that shortcuts can't. I'm grateful I grew up without LLMs, as the struggle to organize and express my thoughts on paper developed mental muscles I still rely on today. Some perspiration is necessary for genuine learning—the difficulty is actually part of the value.


Critical thinking is not a generic, standalone skill that you can practise in a targeted way. As in, critical thinking doesn't translate across knowledge domains. To think critically you need extensive knowledge of the domain in question; that's one reason why memorizing facts will always remain necessary, despite search engines and LLMs.

At best what you can learn specifically regarding critical thinking are some rules of thumb such as "compare at least three sources" and "ask yourself who benefits".


I think you'd find many would disagree with each of those claims.


I hope they'll apply the critical thinking rule of thumb to check for themselves what modern research has to say on this!

Edit: And how can you critically assess if that research is any good? To do it well you need... domain knowledge.


And would they amount to a larger number than those who oppose vaccines?


Indeed. The problem, however, is that they write papers with AI (and will also do so when working for a company), but it's riddled with falsehoods.

So you make them take exams in-class, and you check their papers for mistakes and irresponsible AI use and punish this severely.

But actually using AI ought not to be punished.


> It's like expecting them to do arithmetic by hand. No one does.

But those who traditionally learnt arithmetic have had this training, which _enables_ higher order thinking.

Being reliant on AI to do this means they would not have had that same level of training. It could prevent them from being able to synthesize new patterns or recognize them (and so if the AI also cannot do the same, you get stagnation).


I suspect schools spend a lot less time on arithmetic than they used to, however.

You used to _actually_ need to do the arithmetic, now you just need to understand when a calculator is not giving you what you expected. (Not that this is being taught either, lol)

You can get to the higher order thinking sooner than if you spent years grinding multiplication tables.


> you just need to understand when a calculator is not giving you what you expected

How do you do that if you can't do arithmetic by hand though? At most, when working with integers, you can count digits to check if the order of magnitude is correct.


You can do arithmetic by hand without being fast or accurate. It's still useful for checking that calculations are correct; it's just too slow for everyday uses like tallying up a bill.


> It's like expecting them to do arithmetic by hand. No one does.

Don't all children learn by doing arithmetic by hand first?


That's such an irresponsible take. If you don't know how to program, you can't even begin to judge the output of whatever model. You'll be the idiotic manager that tells the IT department to solve some problem, and it has to be done in two weeks. No idea if that's reasonable or feasible. And when you can't do that, you certainly can't design a larger system.

What's your next rant: know nead too learn two reed and right ennui moor? Because AI can do that for you? No need to think? "So, you turned 6 today? That over there is your place at the assembly line. Get to know it well, because you'll be there the rest of your life."

> For every class, teachers need to be asking themselves "is this class relevant" and "what are the learning goals in this class?

That's already how schools organize their curriculum.


That's brilliant!

I mean, arithmetic is the same way, right? Nobody should do the arithmetic by hand, as you say. Kindergarten teachers really ought to just hand their kids calculators, tell them to push these buttons like this, and write down the answers. No need to teach them how to do routine arithmetic like 3+4 when a calculator can do it for them.


I'm not sure whether you're being a little bit sarcastic, but essentially that's true.


If kids don't go through the struggle of understanding arithmetic, higher math will be very very difficult. Just because you can use a calculator, doesn't mean that's the best way to learn. Likewise for using LLMs to program.


I have no anecdata to counter your thesis. I do agree that immersion in the doing of a thing is the best way to learn. I am not fully convinced that doing a lot of arithmetic hand calculation precludes learning the science of patterns that is mathematics. They should still be doing something mathematical but why not go right into using a calculator. I have no experience as an educator and I bet it's hard to get good data on this topic of debate. I could be very wrong.


I'm not an educator but I know from teaching my own children that you don't introduce math using symbols and abstract representations. You grab 5 of some small object and show them how a pile of 2 objects combined with a pile of 3 objects creates a pile of 5 objects.

Remember, language is a natural skill all humans have. So is counting (a skill that may not even be unique to humans).

However, writing is an artificial technology invented by humans. Writing is not natural in the sense that language itself is. There is no part of the brain we're born with that comes ready to write. Instead, when we learn to write, other parts of our brain that are associated with language and hearing and vision are co-opted into the "writing and reading parts".

Teaching kids math using writing and symbolism is unnatural and often an abstraction too far for them (initially). Introducing written math is easier and makes more sense once kids are also learning to read and write, since their brains are being rewired by that process. However, even a toddler can look at a pile of 3 objects and a pile of 5 objects and know which one is more, even if they can't explicitly count them using language, let alone read and write.


There's a wealth of research on how children learn to do math, and one of the most crucial things is having experiences manipulating numbers directly. Children don't understand how the symbols we use map to different numbers and the operations themselves take time to learn. If you just have them use a black-box to generate answers, they won't understand how the underlying procedures conceptually work and so they'll be super limited in their mathematical ability later on.


Can you explain further why you think nobody has tried teaching first graders math exclusively using calculators in the 30 years they've been dirt cheap?

That is, after all, the implication of your assessment that there would be no good data.


That was sarcastic, because that's wrong. And I cannot conceive how one can think this is a good approach to learning.


And doesn't everyone have a smartphone? So why not just use OCR to read things? No need to learn to read. Just use speech recognition and OCR.


This basically turns research papers as a whole into a big generative adversarial network.

