The corollary works well too: "If it's AI, it doesn't work."
That's because it's the same mechanism at play. When people can't explain the underlying algorithm, they can't say when it would work and when it wouldn't. One of the truisms of computer systems is that a known algorithm produces the same outputs for the same inputs. If you don't get the same outputs, you don't understand all of the inputs.
But that helps set your expectations for a technology.
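Here's a minimal sketch of that truism (the function names are hypothetical, just for illustration): a deterministic function gives the same outputs for the same inputs, while a function with a hidden input, such as an unseeded random draw, does not.

```python
import random

def scale(values, factor):
    # Deterministic: the output depends only on the explicit inputs.
    return [v * factor for v in values]

def scale_with_hidden_input(values):
    # Not reproducible: the factor is a hidden input (a random draw),
    # so two calls with the same argument can return different outputs.
    factor = random.random()
    return [v * factor for v in values]

data = [1.0, 2.0, 3.0]

# Same inputs, same outputs.
assert scale(data, 2.0) == scale(data, 2.0)

# Usually prints False: the "same" call disagrees with itself,
# which tells you there's an input you haven't accounted for.
print(scale_with_hidden_input(data) == scale_with_hidden_input(data))
```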