'Skeleton Key' attack unlocks the worst of AI, says Microsoft

Theregister | 28-06-2024 07:43pm |

Simple jailbreak prompt can bypass safety guardrails on major models Microsoft on Thursday published details about Skeleton Key – a technique that bypasses the guardrails used by makers of AI models to prevent their generative chatbots from creating harmful content....

Stay Updated with the Latest News!

Don't miss out on breaking stories and in-depth articles.