Add 'prompt' to the long list of injection attacks
-
Generative AI tools can be manipulated to accomplish malicious tasks, reveal sensitive information or ignore safety filters with the right prompt.
https://www.securitymagazine.com/articles/99298-add-prompt-to-the-long-list-of-injection-attacks