The issue is that LLMs have no way to organise their memory by importance, and the problem gets worse as the context grows.
So when they use tools, they will become more dangerous over time.