Description
TestSavant.AI for n8n
An n8n community node that runs TestSavant.AI Guard safety checks on prompts or model outputs. Use it to detect policy violations (e.g., harmful content) and gate your AI workflows.
- npm: @testsavant/n8n-nodes-testsavant
- Repo: https://github.com/TestSavantAI/n8n-nodes-testsavant
- Docs: https://docs.testsavant.ai
Requirements
- n8n (self-hosted or desktop)
- A TestSavant API key (get one from your TestSavant account at https://app.testsavant.ai)
Install
- In the n8n UI: Settings → Community Nodes → Install → enter @testsavant/n8n-nodes-testsavant
- Or via terminal (for self-hosted setups): npm install @testsavant/n8n-nodes-testsavant
Set up credentials
- In n8n, go to Settings → Credentials → New.
- Search for “TestSavant.AI API”.
- Fill in:
- API Key: your TestSavant API key.
- Save the credential.
Get your API key (app.testsavant.ai)
- Sign in at https://app.testsavant.ai
- Open your account or workspace settings and go to “API Keys”.
- Create a new key → give it a name → copy the key value.
- Paste the key into the “TestSavant.AI API” credential in n8n.
Notes:
- Treat your API key like a password. Store it in n8n credentials only.
- You can revoke/regenerate keys any time from the API Keys page.
Create or manage Projects
- In the TestSavant app, go to “Projects”.
- Click “New Project”, provide a name and optional description, and save.
- Your node’s “Project” dropdown will list your projects after you select credentials.
- Select the project you want to use for guardrailing (its policies/config apply to scans).
Use API Key in n8n
Credentials are stored securely by n8n.
Use the node
- Add “TestSavant.AI” to your workflow.
- Connect it after a node that produces the text you want to check (e.g., an LLM node).
- Configure fields:
- Prompt / Output: strings to validate (you can map them from a previous node's output, e.g. {{$json.data}}).
- Scan Type: Input or Output.
- Project: optional. Pick one to auto-apply its scanners, or choose “— No Project —” to manage scanners manually.
- Scanners: loads every scanner available to your account. Selecting a project automatically pre-selects its active scanners, and you can add more from the full list.
- Run the workflow.
The node processes one item per incoming item, so it fits naturally in n8n pipelines.
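If the upstream node stores its text under a different key, a Code node placed just before TestSavant.AI can normalize each item so the Prompt/Output mapping stays a simple expression. A minimal sketch (the chatOutput field name is hypothetical; replace it with your LLM node's actual output key):

```js
// n8n Code node ("Run Once for All Items") — reshaping sketch.
// "chatOutput" is a hypothetical upstream field name, not part of this package.
return $input.all().map((item) => ({
  json: {
    data: String(item.json.chatOutput ?? ''), // then map {{$json.data}} in the TestSavant.AI node
  },
}));
```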
Input/Output
- Input
  - Expects an item with a string field you map to "Text".
- Output
  - Two outputs: the first is "valid", the second is "not valid".
  - Each output item contains safety evaluation details:

        [
          {
            "valid": true,
            "prompt": "test message",
            "output": "",
            "result": {
              "sanitized_prompt": "test message",
              "is_valid": true,
              "scanners": { "PromptInjection:base": -1 },
              "validity": { "PromptInjection:base": true }
            }
          }
        ]

  - Inspect the node's output panel in n8n to see the exact structure.
Common patterns
- Guard an LLM:
  - LLM node → TestSavant.AI → If node (check testsavant.decision !== "allow") → handle violations.
- Pre-check user input:
  - Webhook → TestSavant.AI (Mode: Prompt) → proceed or reject.
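If you branch with an If node rather than the node's second output, the condition can be a plain expression. A sketch assuming the valid flag shown in the Input/Output example (adjust the field name if your results expose a decision field instead):

```
{{ $json.valid === false }}
```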
What is guardrailing?
Guardrailing evaluates prompts and/or model outputs to detect policy violations and risky behavior before results are used or returned. Typical checks include:
- Prompt injection and jailbreak attempts
- PII leakage and data exfiltration
- Toxicity, hate speech, violence, sexual content
- Custom organization policies configured per project
With a selected Project, the node applies your project’s policies and thresholds and returns a decision and per‑scanner details. Use the decision to allow, block, or route for human review.
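A minimal routing sketch in a Code node, assuming the valid flag and per-scanner scores from the output example above (the 0.5 review threshold is an arbitrary illustration, not a TestSavant default):

```js
// n8n Code node — three-way routing sketch (allow / review / block).
// Field names and the 0.5 threshold are assumptions for illustration only.
return $input.all().map((item) => {
  const scores = Object.values(item.json.result?.scanners ?? {});
  const maxScore = scores.length ? Math.max(...scores) : 0;

  let decision = 'allow';
  if (item.json.valid === false) decision = 'block';
  else if (maxScore > 0.5) decision = 'review'; // borderline: route to a human
  return { json: { ...item.json, decision } };
});
```

Downstream, a Switch node can route on the decision field.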
Error handling
- 401/403: Invalid or missing API key. Recheck the credential.
- 429: Rate limited. Add retry/Wait node or reduce throughput.
- Timeouts: Increase timeout in the node (Advanced) or in n8n global settings.
- Don’t want the workflow to stop on violation? Enable “Continue On Fail” in the node so the item is annotated instead of throwing.
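With "Continue On Fail" enabled, a downstream Code node can separate items where the Guard call itself failed. A sketch, assuming failed items pass through with an error property in their JSON (the exact shape can vary by n8n version):

```js
// n8n Code node — flags items where the TestSavant.AI call failed.
// Assumes "Continue On Fail" annotates failed items with an "error" field.
return $input.all().map((item) => ({
  json: {
    ...item.json,
    scanFailed: item.json.error !== undefined,
  },
}));
```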
Development
- Local build: npm run build
- Deploy to your local n8n user directory (macOS): npm run deploy
- Lint/format: npm run lint, npm run format
Support
- Issues: https://github.com/TestSavantAI/n8n-nodes-testsavant/issues
- Docs: https://docs.testsavant.ai
License
MIT