an admirable goal!
given the fallibility of LLMs, are you sure it's a good idea to let people forget about it?
that seems like it has the same risks as having no security (perhaps worse, lulling people into a false sense of security)
are you sure the LLM doing security can't be tricked/attacked using any of the usual methods (prompt injection, jailbreaking, etc.)?