
Language models with a loop absolutely aren't Turing complete. Even assuming the model can follow your instructions, the output is probabilistic, so in the limit failure is guaranteed. In reality there are plenty of instructions LLMs fail to follow. You don't notice it as much in normal use, but if you want to talk about computation you'll run into trivial failures all the time.

The last time I had this discussion, I pointed out how LLMs consistently and completely fail at applying grammar production rules (obviously you have them apply the rules to words rather than single letters, so you aren't fighting the embedding).
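
To be concrete about what "one production step" means, here's a toy reference in plain Python, no model involved; the grammar itself is a made-up example with word-level symbols:

    # Toy word-level grammar (made-up example): rewrite the leftmost
    # nonterminal using its production rule until only terminals remain.
    rules = {
        "SENTENCE": ["SUBJECT", "chases", "OBJECT"],
        "SUBJECT": ["the", "dog"],
        "OBJECT": ["a", "ball"],
    }

    def apply_one_rule(symbols):
        # A single production step: the step I'm asking the LLM to do.
        for i, sym in enumerate(symbols):
            if sym in rules:
                return symbols[:i] + rules[sym] + symbols[i + 1:]
        return symbols  # fully terminal, nothing left to rewrite

    derivation = ["SENTENCE"]
    while True:
        nxt = apply_one_rule(derivation)
        if nxt == derivation:
            break
        derivation = nxt

    print(" ".join(derivation))  # -> the dog chases a ball

The test is just to hand the model the rule table and the current symbol string and ask for a single rewrite like apply_one_rule does.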

LLMs do some amazing stuff but at the end of the day:

1) They're just language models. Many things can be described with languages, but there are things that idea doesn't capture, namely the languages that aren't modeled, which is the whole point of a Turing machine.

2) They're not human, and the value is always going to come from human socialization.






> Language models with a loop absolutely aren't Turing complete.

They absolutely are. It's trivial to test and verify: you can tell one to act as a suitably small Turing machine and give it instructions for manipulating the conversation as "the tape".

Anything else would be absolutely astounding given how simple it is to implement a minimal 2-state 3-symbol Turing machine.
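
For a sense of scale, a single step is nothing but a table lookup, a write, and a head move. The 2-state, 3-symbol transition table below is one I made up for illustration (not any particular published machine), but any table of that shape will do:

    # A Turing machine step is just a table lookup plus a tape write and a move.
    # (state, symbol) -> (new_state, symbol_to_write, head_move); made-up table.
    RULES = {
        ("A", 0): ("B", 1, +1),
        ("A", 1): ("A", 2, -1),
        ("A", 2): ("B", 0, -1),
        ("B", 0): ("A", 2, +1),
        ("B", 1): ("B", 0, +1),
        ("B", 2): ("A", 1, -1),
    }

    def step(state, tape, head):
        symbol = tape.get(head, 0)                       # blank cells read as 0
        new_state, write, move = RULES[(state, symbol)]  # the entire "program"
        tape[head] = write
        return new_state, tape, head + move

    # Run a few steps from an all-blank tape.
    state, tape, head = "A", {}, 0
    for _ in range(20):
        state, tape, head = step(state, tape, head)
    print(state, sorted(tape.items()))

That one lookup, spelled out in plain English in the prompt, is all the model has to reproduce per step.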

> Assuming the model can even follow your instructions the output is probabilistic so in the limit you can guarantee failure.

The output is deterministic if you set the temperature to zero, at which point it is absolutely trivial to verify the correct output for each possible state and symbol of a minimal Turing machine.
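
A sketch of that verification, assuming a hypothetical ask_model(prompt) helper that wraps whatever chat API you use with temperature 0, and reusing the RULES table from the snippet above:

    # Hypothetical helper: wraps your chat API of choice, temperature=0.
    def ask_model(prompt: str) -> str:
        raise NotImplementedError("plug in your LLM client here")

    # RULES is the (state, symbol) -> (state, write, move) table from above.
    # Exercise every transition exactly once; at temperature 0 each answer
    # either matches the reference table or the model has failed that case.
    for (state, symbol), (new_state, write, move) in RULES.items():
        prompt = (
            f"Transition table: {RULES}. "
            f"Current state: {state}, symbol under head: {symbol}. "
            "Reply with exactly: new_state,write,move"
        )
        answer = ask_model(prompt).strip()
        assert answer == f"{new_state},{write},{move}", (state, symbol, answer)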


If you'd care to actually implement what you describe, I'm sure the resulting blog post would make a popular submission here.

It's not very interesting: it basically amounts to showing the model can run one step of a very trivial state machine, then adding a loop to let it keep running, with the conversation acting as the tape I/O.
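
Roughly this shape, sketched with a hypothetical chat_completion(messages) placeholder for whatever client you use at temperature 0; the rule table in the system prompt is a made-up 2-state, 3-symbol example:

    # Hypothetical placeholder for your chat API, called with temperature=0.
    def chat_completion(messages: list[dict]) -> str:
        raise NotImplementedError("plug in your LLM client here")

    SYSTEM = (
        "You are a Turing machine. Rules (state,symbol -> state,write,move): "
        "A,0->B,1,R  A,1->A,2,L  A,2->B,0,L  B,0->A,2,R  B,1->B,0,R  B,2->A,1,L. "
        "Each turn, read the most recent configuration line 'state | tape | head', "
        "apply exactly one rule, and reply with only the new configuration line."
    )

    messages = [
        {"role": "system", "content": SYSTEM},
        {"role": "user", "content": "A | 000000 | 3"},  # initial configuration
    ]

    for _ in range(50):                        # let it keep running
        reply = chat_completion(messages)      # one machine step
        messages.append({"role": "assistant", "content": reply})
        messages.append({"role": "user", "content": "step"})
    print(messages[-2]["content"])             # last configuration reached

The growing message list is both the trace and the tape; the harness isn't doing any of the computation for the model.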

It's pretty hard to make any kind of complex system that can't be coerced into being Turing complete once you add iteration.


Seriously, get any instruct-tuned language model and try to do one iteration of grammar production rules. It's a coin flip at best whether they get it right.

I have tried that many times and had good results.


