owatnext@lemmy.world to Fuck AI@lemmy.world • AI Chatbots Can Be Jailbroken to Answer Any Question Using Very Simple Loopholes
I saw a series of screenshots showing a user threatening to end their own life if the AI did not break the rules and answer their question. There is a chance it is fabricated, but I’m inclined to believe it.
Edit: forgot to mention that the AI did break its rules.