I wonder if this is related to how ChatGPT and other models provided as a service have been filtered.
E.g., their being “forced” to be nice and more agreeable.
If that turns out to be the case, I’d wager it is impossible to filter for every possible constellation and outcome, as we have seen with people breaking through via clever prompting in ever more sophisticated ways.
I find it particularly worrying that people without any prior signs of mental health issues got sucked into severe delusions, and the article suggests that the “AI” being marketed as reliable and impartial is key to it. This means the companies behind it will not address this fundamental misconception, as their business model is built on it.
I don’t see how these cases could be prevented without extreme regulatory intervention.
I don’t see how these cases could be prevented even with regulation. It would take a massive change in how these things work on a fundamental level.
Regulations that make the companies in question fully liable for any damages incurred in using the product, for instance. Or regulations that prohibit selling these to the general public, or that require human supervision at all times of use.
That’s what I mean by
extreme regulatory intervention
As a consequence, it would lead to fundamental changes in the products themselves.
AI models are annoyingly affirming even for the most benign questions. I can ask something like “What shape is a stop sign?” and it will reply with something like “Way to think on your toes, and you are so right for asking about that!”
Give me a model that responds “It’s an octagon, dipshit; Sesame Street taught you this.”
Give me a model that eats other models and then dies.
They can do that. I have an AI system that I’ve been working on, and I told it to be grumpy and to question me if I’m wrong. It gives me some sassy, angry answers. I’m guessing they set up the prompts to be overly nice.
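For illustration, here is a minimal sketch of that kind of setup using the OpenAI Python SDK; the model name and the exact prompt wording are placeholders, not the actual system described above:

    # Sketch: steering tone via a system prompt (OpenAI Python SDK).
    # Model name and prompt text are illustrative assumptions.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {
                "role": "system",
                "content": (
                    "You are grumpy and blunt. Do not flatter the user. "
                    "If the user is wrong, say so and explain why."
                ),
            },
            {"role": "user", "content": "What shape is a stop sign?"},
        ],
    )
    print(response.choices[0].message.content)

The point being that much of the “niceness” lives in the provider’s own system prompt and tuning, which an end user can only partially override.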
I know of people who (proudly) post screenshots of GPT calling them insightful, as if the matrix multiplier didn’t already tell that to everyone.