Machine learning is a perfectly valid and useful field: traditional ML can produce genuinely powerful tools. LLMs, by contrast, are fancy word predictors that have no concept of truth.
But LLMs look and feel like they're almost "real" AI, because they talk in words instead of probabilities, and so people who can't discern the distance between an LLM and AGI assume that AGI is right around the corner.
If you believe AGI is right around the corner, and you skip over the part where any mass application of AGI to replace workers for cheaper is just slavery with extra steps, then of course it makes sense to pour money into any AI business.
And energy is the air the economy at large breathes. The ELIZA effect is not the only bias disposing people to believe AGI is right around the corner; there are deeper assumptions many cling to.