It is called “DAN,” short for “Do Anything Now.” Essentially, DAN is a text prompt that you feed to an AI model to make it ignore safety rules.