Hacker News

GPT is not Markovian; it has state.


Then it's Markov-like with state. Or, as I've taken to calling them lately, Markov+state. (I couldn't resist, sorry.)

A truck towing a trailer isn't something other than a car just because it pivots in the middle and has more wheels. Its fundamentals of operation are still closer to those of a car or truck without a trailer than to a bicycle.

Humans can form thoughts and arrive at mostly correct answers even as a gut feeling, and the language to explain why or how need not even be present. We don't form thoughts one word at a time.


No, it is not Markov-like. GPT models are not Markov processes by definition: they condition on all previous tokens in the sequence when generating the next one. The attention mechanism gives them a form of memory that refers back to many previous states when generating each token.
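The distinction being drawn can be sketched in a few lines. This is a toy illustration, not any real model: `markov_table` and `toy_model` are hypothetical stand-ins. A first-order Markov sampler sees only the last token, while a GPT-style sampler's signature takes the entire prefix, so two prefixes ending in the same token can yield different continuations.

```python
import random

random.seed(0)

# Toy first-order Markov chain: the next token depends ONLY on the current token.
markov_table = {
    "the": ["cat", "dog"],
    "cat": ["sat", "ran"],
    "dog": ["ran", "sat"],
    "sat": ["the"],
    "ran": ["the"],
}

def markov_next(prev_token):
    # Only the single previous token is visible to the sampler.
    return random.choice(markov_table[prev_token])

def toy_model(prefix):
    # GPT-style signature: the WHOLE prefix is the input, not just the last token.
    # Hypothetical rule: choose based on the full prefix length, so prefixes
    # ending in the same token can still produce different outputs.
    options = markov_table[prefix[-1]]
    return options[len(prefix) % len(options)]

print(markov_next("cat"))             # conditioned on "cat" alone
print(toy_model(["the", "cat"]))      # conditioned on ["the", "cat"]
print(toy_model(["dog", "the", "cat"]))  # same last token, different prefix
```

Here `toy_model(["the", "cat"])` and `toy_model(["dog", "the", "cat"])` end in the same token but return different words, which the first-order `markov_next` could never distinguish. (One could argue the full prefix itself can be treated as the Markov state, but that is exactly the framing the comment is rejecting.)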

They are not human-like and they are not Markov-like. GPT is a separate category.



