This community has no ability to separate researched information from conjecture, especially on this topic, since neural nets are a black box: we can never know whether any given behavior is fully intentional.
To be honest, I had the right idea back then... This technology has fundamental qualities that force it to produce inaccurate token predictions that are merely statistically probable. And they aren't even trying to change this, other than finding more data to train on, saying you have to keep adding more layers, or saying it's the user's responsibility.
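To illustrate the "statistically probable" point: a language model scores candidate next tokens and then samples from the resulting distribution, so a low-probability (and possibly wrong) token can still be emitted. A toy sketch (the vocabulary and logit values here are invented for illustration, not from any real model):

```python
import math
import random

# Hypothetical logits a model might assign to candidate next tokens
# after "The capital of France is". Real vocabularies have ~100k entries.
logits = {"Paris": 4.0, "London": 2.5, "Berlin": 2.0, "banana": -1.0}

def softmax(scores):
    """Convert raw logits into a probability distribution."""
    m = max(scores.values())  # subtract max for numerical stability
    exps = {t: math.exp(s - m) for t, s in scores.items()}
    total = sum(exps.values())
    return {t: e / total for t, e in exps.items()}

probs = softmax(logits)

# Sampling: usually "Paris", but every token has nonzero probability,
# so an inaccurate completion is always possible.
token = random.choices(list(probs), weights=probs.values(), k=1)[0]
```

The point is that correctness is never guaranteed by the mechanism itself; the output is a draw from a distribution, which is why more data or more layers only shift the probabilities rather than eliminate the possibility of error.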
There has long been the obvious notion that digitizing the world's information is not enough, and that hasn't changed.
They make, or at least used to make, their notebook format public, and I always thought that if I ever saw someone else adopt it, maybe the project would be worth looking at again. I think there could be better ways to capitalize on d3, but this one seems self-limiting.
That's abusive, unacceptable, and not even a complete list!
You can't go after another user like this on HN, regardless of how right you are or feel you are or who you have a problem with. If you keep doing this, we're going to end up banning you, so please stop now.
They said "production systems", not "critical production applications".
Also, the 'if' doesn't negate anything, since they say "I still", meaning the behavior is actively happening or ongoing; they don't use a hypothetical or conditional after "still", as in "I still would".
Off-topic, but Django is really bad and a huge pile of code smell. (Not a Django programmer. I manage them and can compare Django-infected projects to normal projects.)
I'm curious why you think it doesn't work, when there are plenty of people saying it does.
There are limitations at the moment, and I don't see many people disputing that, but it must be doing something right, and its abilities are improving every day. It's learning.
Sometimes I get the feeling a lot of antis painted themselves into a corner early on, and will die on this hill despite constant improvements in the technology.
I have seen similar things many times in my career. There was a time when everyone was very skeptical of high-level languages, writing everything in assembler come hell or high water, for example.
At some point it is going to single-shot an entire OS or refactor a multi-million-line codebase. Will that be enough to convince you?
From my perspective, I like to be prepared, so I'm doing what I have always done: understand and gain experience with these new tools. I much prefer that to missing the boat.
And it's quite fun, and better than you might imagine, as long as you put a bit of effort in.
Maybe, but any time someone keeps doing mental gymnastics and theorizing that there are new forces at play, something comes out and says no, it was something very straightforward. Hammock Driven Development describes a zen, internalized way an expert does exactly what you describe, except it's nicer: you don't have to pay per token. To be clear, I think this all falls under the rubber-duck umbrella, which is fine, but seemingly impossible to design a controlled study for?
I agree that the study is biased (it compared people unused to a tool with people not using the tool. Duh. The only person who was more efficient was the one who knew how to use the tool. Duh indeed.)
However, I don't see how hammock-driven dev allows you to validate an idea with a prototype built in an hour as opposed to a month.