@south_korea_ln 29 Nov 2024 \ on: Early object oriented programming languages devs
It'd be fun to find a jailbreak forcing it to answer this question truthfully~~
I guess there are jailbreak prompts, but I've never used ChatGPT myself.
I have looked at others' usage, though. Apparently you have to set up a scenario for the jailbreak, and then make your query to the jailbroken AI.