But not 100%. And the things they hallucinate can be very subtle. That’s the problem.
If they're asked about a band that does not exist, then to be useful they should say "I'm sorry, I know nothing about this". Instead they MAKE UP A BAND, ITS MEMBERSHIP, ITS DISCOGRAPHY, etc. etc. etc.
But sure, let’s play your game.
All of the information on Infected Rain is out there, including their lyrics. So is all of the information on Jim Thirlwell's various "Foetus" projects. Including lyrics.
Yet ChatGPT, DeepSeek, and Claude will all three hallucinate tracks, misattribute them, or invent lyrics that don't exist to show parallels in the two bands' musical themes.
So there are your objective facts, readily available, that LLMbeciles are still completely and utterly fucking useless for.
So they’re useless if you ask about things that don’t exist and will hallucinate them into existence on your screen.
And they're useless if you ask about things that do exist, because they hallucinate attributes onto them that those things don't have.
They. Are. Fucking. Useless.
That people are looking at these things and saying “wow, this is so accurate” terrifies the living fuck out of me because it means I’m surrounded not by idiots, but by zombies. Literally thoughtless mobile creatures.
Dude. Go be a reply guy somewhere else. You bore the fuck out of me.