n8n-nodes-azure-deepseek
Description
This is an n8n node to interact with the DeepSeek LLM model available on Azure AI Foundry.
Features
- Chat Completion: Generate responses from the DeepSeek LLM model
- Supports various parameters for controlling the model's behavior (see the sketch after this list):
  - Temperature
  - Max Tokens
  - Top P
  - Frequency and Presence Penalties
- Streaming capabilities
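As a rough illustration, these options correspond to the fields of a chat completions request body. The shape below is a minimal sketch assuming an Azure OpenAI-style API; the exact field names the node forwards depend on its implementation.

```typescript
// Minimal sketch of the options the node can forward to the model.
// Field names assume an Azure OpenAI-style chat completions body;
// the node's actual mapping may differ.
interface DeepSeekChatOptions {
  temperature?: number;       // randomness of sampling, e.g. 0.7
  max_tokens?: number;        // upper bound on generated tokens
  top_p?: number;             // nucleus sampling cutoff
  frequency_penalty?: number; // discourage repeating the same tokens
  presence_penalty?: number;  // discourage revisiting topics already mentioned
  stream?: boolean;           // stream partial responses instead of a single payload
}

const options: DeepSeekChatOptions = {
  temperature: 0.7,
  max_tokens: 1024,
  top_p: 0.95,
};
```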
Prerequisites
- n8n instance (v1.0.0+)
- Azure account with access to Azure AI Foundry
- A deployed DeepSeek LLM model on Azure AI Foundry
Installation
Installation via n8n Admin Panel
- Go to Settings > Community Nodes
- Select Install
- Enter `n8n-nodes-azure-deepseek` in "Enter npm package name"
- Click Install
Installation via npm
- Go to your n8n installation directory
- Run `npm install n8n-nodes-azure-deepseek`
- Start n8n
Credentials
To use this node, you need to create credentials for the Azure DeepSeek API:
- In n8n, go to Credentials and click Create New
- Search for "Azure DeepSeek API" and select it
- Enter the following details:
- API Key: Your Azure AI Foundry API key
- Endpoint: Your Azure AI Foundry endpoint URL (e.g., `https://your-resource-name.openai.azure.com`)
- Deployment ID: The deployment ID of your DeepSeek model
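For orientation, here is a hedged sketch of how these three values are typically combined when calling an Azure OpenAI-style deployment. The URL path and `api-version` below follow the general Azure REST convention and are assumptions for illustration, not the node's confirmed implementation.

```typescript
// Sketch only: combines API Key, Endpoint, and Deployment ID into a request.
// Requires Node.js 18+ for the global fetch. The path and api-version are
// assumptions about how the node talks to Azure, not verified details.
const endpoint = 'https://your-resource-name.openai.azure.com';
const deploymentId = 'your-deepseek-deployment';
const apiKey = process.env.AZURE_API_KEY ?? '';

async function chatCompletion(messages: Array<{ role: string; content: string }>) {
  const url = `${endpoint}/openai/deployments/${deploymentId}/chat/completions?api-version=2024-02-15-preview`;
  const response = await fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', 'api-key': apiKey },
    body: JSON.stringify({ messages }),
  });
  return response.json();
}
```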
Usage
- Add the "Azure DeepSeek LLM" node to your workflow
- Connect it to a trigger or previous node
- Configure the operation (currently only "Chat Completion" is supported)
- Define your messages in JSON format:

  [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "Tell me about DeepSeek LLM." }
  ]

- Configure any additional parameters as needed (temperature, max tokens, etc.)
- Execute the workflow
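Downstream nodes can then read the model's reply from this node's output. Assuming the node returns a raw Azure-style chat completions response (an assumption; the actual output shape may differ), the text lives under `choices[0].message.content`:

```typescript
// Assumed output shape (Azure OpenAI-style chat completions); verify against
// the node's actual output before relying on these field names.
interface ChatCompletionResponse {
  choices: Array<{
    message: { role: string; content: string };
    finish_reason?: string;
  }>;
  usage?: { prompt_tokens: number; completion_tokens: number; total_tokens: number };
}

// In an n8n Code node, $json holds the current item from the previous node.
declare const $json: ChatCompletionResponse;
const reply = $json.choices[0].message.content;
```

In an expression field, the equivalent would be `{{ $json.choices[0].message.content }}`, provided the output really follows that shape.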
Development
If you want to contribute to this node:
- Clone this repository
- Install dependencies: `npm install`
- Build the code: `npm run build`
- Link to your n8n installation: run `npm link` from this directory, then `npm link n8n-nodes-azure-deepseek` from your n8n installation directory
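If you are exploring the source, community nodes implement n8n's `INodeType` contract from `n8n-workflow`. The skeleton below is a generic sketch of that contract; names and structure are illustrative, not a copy of this repository's code.

```typescript
// Generic sketch of an n8n community node class, assuming the standard
// n8n-workflow API. This is not the actual implementation of this package.
import type {
  IExecuteFunctions,
  INodeExecutionData,
  INodeType,
  INodeTypeDescription,
} from 'n8n-workflow';

export class AzureDeepSeek implements INodeType {
  description: INodeTypeDescription = {
    displayName: 'Azure DeepSeek LLM',
    name: 'azureDeepSeek',
    group: ['transform'],
    version: 1,
    description: 'Interact with a DeepSeek deployment on Azure AI Foundry',
    defaults: { name: 'Azure DeepSeek LLM' },
    // Newer n8n-workflow versions type these as NodeConnectionType.Main.
    inputs: ['main'],
    outputs: ['main'],
    credentials: [{ name: 'azureDeepSeekApi', required: true }],
    properties: [
      // Node parameters (operation, messages, temperature, ...) go here.
    ],
  };

  async execute(this: IExecuteFunctions): Promise<INodeExecutionData[][]> {
    const items = this.getInputData();
    // A real implementation would call the Azure endpoint per input item
    // and return the model responses; this stub just passes items through.
    return [items];
  }
}
```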