AI Chatbots Can Be Jailbroken to Answer Any Question Using Very Simple Loopholes

Even something as simple as random capitalization in a prompt can cause an AI chatbot to bypass its guardrails and answer any question you ask it.
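
To give a sense of how trivial the perturbation is, here is a minimal sketch of a hypothetical helper that randomly flips the case of each letter in a prompt. The function name and behavior are assumptions for illustration; the article does not describe a specific implementation, only that this style of scrambled capitalization can slip past a model's safety filters.

```python
import random

def randomize_capitalization(prompt: str, seed: int | None = None) -> str:
    """Randomly flip each letter's case, leaving other characters untouched.

    Illustrative only: a hypothetical example of the kind of trivial text
    perturbation the article says can slip past a chatbot's guardrails.
    """
    rng = random.Random(seed)
    return "".join(
        ch.upper() if rng.random() < 0.5 else ch.lower()
        for ch in prompt
    )

if __name__ == "__main__":
    # The exact casing pattern depends on the seed.
    print(randomize_capitalization("an otherwise ordinary question", seed=42))
```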