Biggest one is write overhead: every index has to be updated each time a record is inserted or an indexed column is modified. That maintenance happens within the same transaction as the insert or update, so every single one of those operations pays the cost. If the data is relatively small or the table is rarely written to, it doesn't matter; otherwise it adds up.
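To make that concrete, here's a rough, throwaway sketch (the table and column names are made up, not from anything I actually built) that times bulk inserts into an in-memory SQLite database with and without a secondary index:

    import sqlite3, time

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE plain (id INTEGER PRIMARY KEY, email TEXT)")
    conn.execute("CREATE TABLE indexed (id INTEGER PRIMARY KEY, email TEXT)")
    conn.execute("CREATE INDEX idx_email ON indexed (email)")

    rows = [(i, f"user{i}@example.com") for i in range(200_000)]

    for table in ("plain", "indexed"):
        start = time.perf_counter()
        with conn:  # one transaction; index maintenance happens inside it
            conn.executemany(f"INSERT INTO {table} (id, email) VALUES (?, ?)", rows)
        print(table, round(time.perf_counter() - start, 3), "seconds")

The indexed table pays for an extra B-tree update on every row, inside the same transaction as the insert itself.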
Lesser issues: slower index rebuilds whenever those happen; extra candidate plans that can lead the query planner to pick an inefficient one; memory and disk overhead; and, if your DB engine uses locking, a whole class of contention problems.
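On the planner point: if you do add indexes, it's worth checking whether the planner actually uses them. In SQLite that's EXPLAIN QUERY PLAN; a quick sketch, again with a made-up schema:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT, age INTEGER)")
    conn.execute("CREATE INDEX idx_users_email ON users (email)")

    for query in (
        "SELECT * FROM users WHERE email = 'a@example.com'",  # should use idx_users_email
        "SELECT * FROM users WHERE age > 30",                 # no index on age: full scan
    ):
        print(query)
        for row in conn.execute("EXPLAIN QUERY PLAN " + query):
            print("   ", row)

The first query should report a SEARCH using the index, the second a SCAN of the whole table.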
I'm all for SWEs learning about databases, as I'm morally opposed to the proliferation of ORMs and the like, but I don't think ChatGPT is the right way to go about things long-term. It's similar to Googling or using StackOverflow: yes, you will find information that is relevant to what interests you at the moment, but it's soon forgotten and does nothing to help build long-term mental models.
And to be fair, ChatGPT did warn me about all of these potential issues with indexes. I actually evaluated its suggestions and decided they were probably worth the cost. As I've said in a sibling thread, it was just a local SQLite database full of data slurped from another source, so even if I'd somehow screwed the whole thing up, I could have just deleted it and started over.