I fully agree with you about liability. I was advocating for the other point of view.

Some people argue that it doesn’t matter if there are mistakes (though it depends which kind), and that with time they will cost nothing.

I argue that if we give up learning and let LLMs do the assignments, then what is the extent of my knowledge, and what is my value to be hired in the first place?

We hired a developer who did everything with ChatGPT, all the code and documentation he wrote. At first it was all bad, because out of the infinity of possible answers ChatGPT doesn't pinpoint the best one in every case. But did he have enough knowledge to understand that what he did was bad? We need people with experience who have confronted hard problems themselves and found their way out. How else can we confront and critique an LLM's answer?

I feel students' value is diluted, leaving them at the mercy of the companies providing the LLMs, and we may lose some critical knowledge and critical thinking from the students in the process.




I agree entirely with your take on education. I feel like there is a place where LLMs are useful without impacting learning, but it's definitely not in the "discovery" phase of learning.

However, I really don't need to implement some weird algorithm myself every time (ideally I'm using a well-tested library). The point is that you learn so you're able to, and also so you can modify or compose the algorithm in ways the LLM couldn't easily do.


Why did you hire someone who produced bad code and docs? Did he manage to pass the interview without AI?



