But a new "jailbreak" trick allows users to skirt those rules by creating a ChatGPT alter ego named DAN that can answer some of those queries.