#large-language-models
In the context of LLMs, a jailbreak is a manipulation of the prompt to bypass restrictions set by the service provider. The 4 common prohibited scenarios (Deng...