ceenote@lemmy.world to Technology@lemmy.world • ChatGPT safety systems can be bypassed to get weapons instructions • 1 month ago
Admittedly, a lot of the circulating recipes and instructions for that sort of thing don't work. The infamous Anarchist Cookbook is full of incorrect recipes. The real problem might be an LLM filtering out the debunked information.