.. / PromptMap-Prompt-Injection-Attack
Star

PromptMap is a tool designed to test adversarial prompt attacks on ChatGPT instances by generating and evaluating attack prompts against system instructions. It helps identify potential vulnerabilities in the system prompts of LLM instances.

This guide explains how to install, configure, and use PromptMap effectively.

Command Reference:

Command: Copy References:

https://github.com/utkusen/promptmap

https://platform.openai.com/docs