The creators understand it well. There's a lot of math, but you can literally do it with pen and paper. There are plenty of blog posts[1] showing the process.
Anyone claiming AI is a black box no one understands is a marketing-level drone trying to sell something that THEY don't understand.
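To be concrete about "pen and paper": the core operation is just matrix multiplies and a softmax. A minimal sketch of one attention step with made-up toy numbers (not any real model's weights), small enough to check by hand:

```python
import math

# Toy single-head attention over 2 tokens with 2-dim vectors.
# All numbers are invented for illustration; the point is that
# every step is ordinary arithmetic.
Q = [[1.0, 0.0], [0.0, 1.0]]   # query vectors, one row per token
K = [[1.0, 0.0], [1.0, 1.0]]   # key vectors
V = [[2.0, 0.0], [0.0, 2.0]]   # value vectors

def matmul_t(A, B):
    # computes A @ B^T
    return [[sum(a * b for a, b in zip(ra, rb)) for rb in B] for ra in A]

def softmax(row):
    m = max(row)                       # subtract max for stability
    exps = [math.exp(x - m) for x in row]
    s = sum(exps)
    return [e / s for e in exps]

scale = 1 / math.sqrt(len(Q[0]))       # 1/sqrt(d_k)
scores = [[s * scale for s in row] for row in matmul_t(Q, K)]
weights = [softmax(row) for row in scores]   # attention weights per token
out = [[sum(w * V[j][d] for j, w in enumerate(row))
        for d in range(len(V[0]))] for row in weights]
print(out)   # weighted mixture of value vectors
```

Each row of `weights` sums to 1, and the output is just a weighted average of the value rows; nothing here is beyond hand calculation, which is the (narrow) sense in which the mechanics are fully understood.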
No, they only understand it at a superficial level. The behavior of these systems emerges from simpler components, yes, but the end result is difficult to reason about. Just look at Claude's system prompt [1] that leaked some time ago: it reads like an almost desperate attempt by the creators to nudge the system in a certain direction and keep it from saying the wrong things.
We probably need a New Kind of Soft Science™ to fill this gap.