The only rational capacity LLMs have is what has been trained into them, and they've also been trained on mountains of gut reactions, tribalism, and propaganda. These things aren't Data from Star Trek. They're not coldly logical. In fact, it's a struggle to get them to be logical at all.
I think an LLM could still do a better job than 99.9% of humans at helping us spot the bias and propaganda we're fed daily.