It'd be fun to find a jailbreak that forces it to answer this question truthfully~~
I guess there are jailbreak prompts, but I have never used ChatGPT myself.
I have looked at others' usage, though. Apparently you have to set up a scenario for the jailbreak, and then make your query to the jailbroken AI.
Of course it would claim it hasn't! The company is tied up in ongoing court battles over its training materials. I would bet that ChatGPT, if asked directly, denies using any training materials except those that are free and open.