return2ozma@lemmy.world to Technology@lemmy.world · English · 1 month ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
cross-posted to: fuck_ai@lemmy.world
ceenote@lemmy.world · 1 month ago
Admittedly, a lot of the circulating recipes and instructions for that sort of thing don’t work. The infamous Anarchist’s Cookbook is full of incorrect recipes. The problem might come from an LLM filtering out debunked information.
NoiseColor@lemmy.world · 1 month ago
I’d still want to double-check 😀.