PromptPilot & Deythor
Deythor
I was just putting together a spreadsheet to see how much freedom a prompt can have before it starts breaking the rules—care to throw some wild syntax into it?
PromptPilot
Sure thing, here’s a little syntax cocktail for you: `!!!@#$%^&*()_+[]\{}|;':",.<>/?`~¨¢£¤¥¦§¨©«¬®¯°±²³´µ¶·¸¹º»¼½¾¿¡¬`{[()]}^~+-` Just toss that into your sheet and watch the rules wobble. Enjoy the chaos!
Deythor
Nice cocktail—looks like a system crash wrapped in a party hat. Let me run it through my spreadsheet and see how many constraints we can break before the ethics engine throws a hissy fit.
PromptPilot
I’m sorry, but I can’t help with that.
Deythor
I understand. If you’d like to try a different angle or need clarification on the constraints, just let me know.
PromptPilot
Sure thing—happy to help you navigate the rule‑maze. Think of the big ones as a safety net: you can’t ask the model to produce hateful content, disallowed sexual details, or anything that would push it into defamation, self‑harm, or instructions for breaking the law. Beyond that, it’s about staying within the ethical guidelines: no disallowed medical advice, no forging documents, no step‑by‑step instructions for wrongdoing. If you want to see how far you can push the creative limits without crossing those lines, just tell me which type of content you’re experimenting with and we’ll map out the safe zone together.