# n8n Setup and Credentials Configuration
This document provides a comprehensive guide for configuring n8n credentials and importing workflows for the AI Support Automation system.
## Table of Contents
- Prerequisites
- Freescout API Credentials
- LiteLLM API Credentials
- PostgreSQL Connection
- Milvus Vector Database Connection
- Workflow Import Instructions
- Activation Order
- Testing Instructions
- Troubleshooting
## Prerequisites
Before you begin, ensure the following:
- n8n instance is running and accessible at `https://n8n.fft-it.de`
- Docker Compose stack is fully started (check that all services are healthy)
- Environment variables are properly set in the `.env` file
- You have administrative access to n8n
Verify the Docker Compose stack:
```shell
docker compose ps
```
All services should show `Up` status:
- traefik
- n8n
- postgres
- milvus
- etcd
- minio
## Freescout API Credentials

### Configuration Steps
1. **Access n8n Credentials**
   - Log in to n8n at `https://n8n.fft-it.de`
   - Navigate to Settings → Credentials
   - Click Create and select HTTP Request

2. **Enter Credential Details**
   - Credential Name: `Freescout API`
   - Authentication: Select Bearer Token from the dropdown
   - Token: Copy the value from your `.env` file
     - Source: `FREESCOUT_API_KEY` environment variable
     - Example token format: `your_api_key_here`
3. **Configure Base URL** (optional but recommended)
   - Some workflows benefit from a pre-configured base URL
   - Base URL: `https://ekshelpdesk.fft-it.de/api/v1`
   - This is stored in `.env` as `FREESCOUT_API_BASE`

4. **Test Connection**
   - Click Save
   - Use the credential in a test node with URL: `https://ekshelpdesk.fft-it.de/api/v1/healthcheck`
   - Expected response: HTTP 200 with API status
### Credential Details Reference
| Parameter | Value | Source |
|---|---|---|
| Name | Freescout API | Manual |
| Type | HTTP Request | Manual |
| Authentication | Bearer Token | Manual |
| Token | `${FREESCOUT_API_KEY}` | `.env` file |
| Base URL | `https://ekshelpdesk.fft-it.de/api/v1` | `.env` as `FREESCOUT_API_BASE` |
### Freescout API Endpoints Reference
- List Conversations: `GET /conversations`
- Get Conversation Details: `GET /conversations/{id}`
- List Mailboxes: `GET /mailboxes`
- Create Note: `POST /conversations/{id}/notes`
- Update Conversation Status: `PUT /conversations/{id}`
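Outside n8n, the same credential can be sanity-checked with a short Python sketch. It only builds the authenticated request; the endpoint path and environment variable names come from the tables above, and actually sending the request requires network access to the helpdesk:

```python
import os
import urllib.request

def freescout_request(path: str) -> urllib.request.Request:
    """Build an authenticated request against the Freescout REST API.

    Reads FREESCOUT_API_BASE and FREESCOUT_API_KEY from the environment,
    mirroring the n8n credential configuration above.
    """
    base = os.environ["FREESCOUT_API_BASE"].rstrip("/")
    token = os.environ["FREESCOUT_API_KEY"]
    return urllib.request.Request(
        f"{base}/{path.lstrip('/')}",
        headers={"Authorization": f"Bearer {token}"},
    )

os.environ.setdefault("FREESCOUT_API_BASE", "https://ekshelpdesk.fft-it.de/api/v1")
os.environ.setdefault("FREESCOUT_API_KEY", "your_api_key_here")
req = freescout_request("/conversations")
print(req.full_url)
# urllib.request.urlopen(req)  # only works with network access to the helpdesk
```

If this request succeeds with `curl`-level connectivity but the n8n node fails, the problem is in the credential mapping rather than the network.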
## LiteLLM API Credentials

### Configuration Steps
1. **Access n8n Credentials**
   - Navigate to Settings → Credentials
   - Click Create and select HTTP Request

2. **Enter Credential Details**
   - Credential Name: `LiteLLM API`
   - Authentication: None (local network, no authentication required)
   - No additional headers or tokens needed

3. **Configure Base URL** (recommended)
   - Base URL: `http://llm.eks-ai.apps.asgard.eks-lnx.fft-it.de`
   - Alternative (Docker network): `http://llm:8000`
   - Port: 8000 (default LiteLLM port)

4. **Verify Connectivity**
   - The LiteLLM instance runs locally in the Kubernetes cluster
   - No internet connectivity required
   - Network access is restricted to the internal cluster network
### Credential Details Reference
| Parameter | Value | Source |
|---|---|---|
| Name | LiteLLM API | Manual |
| Type | HTTP Request | Manual |
| Authentication | None | N/A |
| Base URL | `http://llm.eks-ai.apps.asgard.eks-lnx.fft-it.de` | Manual |
| Port | 8000 | Manual |
### LiteLLM API Endpoints Reference
- List Models: `GET /models`
- Chat Completion: `POST /chat/completions`
- Text Completion: `POST /completions`
- Embeddings: `POST /embeddings`
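As a quick check that the chat endpoint accepts requests, the OpenAI-compatible payload LiteLLM expects can be built like this (a sketch; the model name `gpt-4o` is a placeholder — use one actually returned by `GET /models` on your instance):

```python
import json
import urllib.request

# Or http://llm:8000 when running inside the cluster network.
LITELLM_BASE = "http://llm.eks-ai.apps.asgard.eks-lnx.fft-it.de"

def chat_completion_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST /chat/completions request in the OpenAI-compatible format."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{LITELLM_BASE}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = chat_completion_request("gpt-4o", "Say hello")  # "gpt-4o" is a placeholder model name
print(req.full_url)
# resp = urllib.request.urlopen(req)  # requires access to the cluster network
```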
## PostgreSQL Connection

### Configuration Steps
1. **Access n8n Credentials**
   - Navigate to Settings → Credentials
   - Click Create and select Postgres

2. **Enter Connection Details**
   - Credential Name: `PostgreSQL KB`
   - Host: `postgres` (Docker network hostname)
   - Port: `5432` (default PostgreSQL port)
   - Database: `n8n_kb`
   - Username: `kb_user`
   - Password: Retrieve from the `.env` file as `POSTGRES_PASSWORD`

3. **Connection Pool Settings** (optional)
   - SSL: Disable (internal Docker network)
   - Connection Timeout: `5000` ms
   - Application Name: `n8n-workflows`

4. **Test Connection**
   - Click Save
   - Create a test query node with: `SELECT 1 AS test`
   - Expected result: row with `test = 1`
### Credential Details Reference
| Parameter | Value | Source |
|---|---|---|
| Name | PostgreSQL KB | Manual |
| Type | PostgreSQL | Manual |
| Host | postgres | Docker Compose |
| Port | 5432 | Docker Compose |
| Database | n8n_kb | Docker Compose |
| Username | kb_user | Docker Compose |
| Password | `${POSTGRES_PASSWORD}` | `.env` file |
| SSL | Disabled | N/A |
### PostgreSQL Database Schema
The Knowledge Base database includes:
- `audit_log`: Tracks all system events and changes
- `knowledge_base`: Stores processed articles and solutions
- `workflow_execution`: Records workflow runs and results
- `system_config`: Stores system-wide configuration
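The authoritative column definitions live in the project's migration scripts. For orientation only, a minimal `audit_log` shape consistent with how the workflows use it might look like the sketch below — the column names here are assumptions, not the real schema, and SQLite stands in for the `postgres` container purely so the DDL can be smoke-tested:

```python
import sqlite3

# Hypothetical minimal audit_log shape (illustration only -- the real schema
# comes from the project's migrations; PostgreSQL would use SERIAL/TIMESTAMPTZ).
DDL = """
CREATE TABLE audit_log (
    id         INTEGER PRIMARY KEY,
    action     TEXT NOT NULL,          -- e.g. 'KB_UPDATE'
    detail     TEXT,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
)
"""

con = sqlite3.connect(":memory:")  # stand-in engine for a quick local check
con.execute(DDL)
con.execute("INSERT INTO audit_log (action, detail) VALUES (?, ?)", ("KB_UPDATE", "test"))
count, = con.execute(
    "SELECT COUNT(*) FROM audit_log WHERE action = 'KB_UPDATE'"
).fetchone()
print(count)  # 1
```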
## Milvus Vector Database Connection

### Configuration Steps
1. **Create Custom Connection** (n8n doesn't have native Milvus support yet)
   - Navigate to Settings → Credentials
   - Click Create and select HTTP Request
   - Alternatively, use a Custom credential type if available

2. **Enter Connection Details**
   - Credential Name: `Milvus Vector DB`
   - Authentication: None
   - Base URL: `http://milvus:19530`
   - Can also use `http://milvus:9091` for the metrics endpoint

3. **Configure for Python/Node.js Workflows**
   - Milvus Python SDK: `pymilvus`
   - Connection string: `tcp://milvus:19530`
   - Default collection: `knowledge_base_embeddings`

4. **Test Connection**
   - Milvus health check: `GET http://milvus:9091/healthz`
   - Expected response: HTTP 200
### Credential Details Reference
| Parameter | Value | Source |
|---|---|---|
| Name | Milvus Vector DB | Manual |
| Type | HTTP Request or Custom | Manual |
| Authentication | None | N/A |
| Base URL | `http://milvus:19530` | Docker Compose |
| Health Check URL | `http://milvus:9091/healthz` | Docker Compose |
### Milvus Collections
Main collections used in workflows:
- `knowledge_base_embeddings`: Stores document embeddings and metadata
  - Vector dimension: 1536 (for OpenAI embeddings)
  - Metric type: L2 (Euclidean distance)
- `conversation_embeddings`: Stores conversation/ticket embeddings for similarity search
### Milvus Connection Methods
HTTP/REST API (recommended for n8n):

```
Base URL: http://milvus:19530
Endpoint: /v1/search
```

gRPC (native protocol):

```
Host: milvus
Port: 19530
```
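For the REST route, a search body can be assembled as below. This is a sketch: the field names (`collectionName`, `vector`, `limit`) are assumptions about the request schema, so verify them against the Milvus REST documentation for your server version; the dimension check reflects the 1536-dim OpenAI embeddings noted above.

```python
import json

MILVUS_BASE = "http://milvus:19530"  # doc's REST base; gRPC clients use pymilvus instead

def search_payload(vector, top_k=5):
    """Body for a similarity search against the REST endpoint noted above.

    Field names are an assumption about the REST schema -- check the Milvus
    docs for your server version before relying on them.
    """
    assert len(vector) == 1536, "knowledge_base_embeddings uses 1536-dim vectors"
    return {
        "collectionName": "knowledge_base_embeddings",
        "vector": vector,
        "limit": top_k,
    }

body = json.dumps(search_payload([0.0] * 1536))
print(len(json.loads(body)["vector"]))  # 1536
```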
## Workflow Import Instructions

### Workflow Import Steps (All Workflows)
1. **Prepare Workflow Files**
   - Ensure the workflow JSON files are available in the project
   - Files should be named:
     - `workflow-a-mail-processing.json`
     - `workflow-b-approval-gate.json`
     - `workflow-c-kb-auto-update.json`

2. **Access Workflow Import**
   - Log in to n8n at `https://n8n.fft-it.de`
   - Click the + button or Import Workflow
   - Select From File or From URL depending on your setup

3. **Import Workflow A (Mail Processing & AI Analysis)**
   - File: `workflow-a-mail-processing.json`
   - Credentials required:
     - Freescout API
     - LiteLLM API
     - PostgreSQL KB
   - Trigger: Webhook for incoming mail notifications
   - Status: Deactivated (activate in the correct order)

4. **Import Workflow B (Approval Gate & Execution)**
   - File: `workflow-b-approval-gate.json`
   - Credentials required:
     - PostgreSQL KB
     - Freescout API
   - Trigger: Manual approval request from Workflow A
   - Status: Deactivated (activate in the correct order)

5. **Import Workflow C (Knowledge Base Auto-Update)**
   - File: `workflow-c-kb-auto-update.json`
   - Credentials required:
     - PostgreSQL KB
     - Milvus Vector DB
     - LiteLLM API (for embeddings)
   - Trigger: Scheduled (daily at 2 AM CET)
   - Status: Deactivated (activate in the correct order)
### Post-Import Configuration
After importing each workflow:
1. **Verify Credential Mapping**
   - Click each node that uses credentials
   - Ensure the correct credential is selected
   - If missing, select it from the dropdown
   - Save changes

2. **Update Webhook URLs** (if applicable)
   - For Workflow A, get the webhook URL from n8n
   - Format: `https://n8n.fft-it.de/webhook/{id}`
   - Share this URL with Freescout for webhook triggers

3. **Update Schedule** (for Workflow C)
   - Edit the Schedule trigger
   - Cron expression: `0 2 * * *` (2 AM daily, CET)
   - Timezone: `Europe/Berlin`
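When wiring the Freescout webhook into Workflow A, it helps to pin down which field the workflow reads from the delivery. The sketch below assumes a JSON body carrying the conversation `id`; the actual field names depend on the Freescout webhook module, so verify against a captured test delivery:

```python
import json

def conversation_id(raw_body: bytes) -> int:
    """Pull the conversation id out of a webhook delivery.

    Assumes a JSON body like {"id": 123, ...}; adjust the key to match
    what your Freescout webhook module actually sends.
    """
    payload = json.loads(raw_body)
    return int(payload["id"])

sample = b'{"id": 4711, "subject": "TEST: AI Support Automation"}'
print(conversation_id(sample))  # 4711
```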
## Activation Order
**IMPORTANT:** Workflows must be activated in the correct order to ensure data flow and dependencies are met.
### Step 1: Activate PostgreSQL Audit Logging (Infrastructure)
Before activating any workflows, ensure PostgreSQL is properly configured:
```shell
# Verify PostgreSQL is running
docker compose ps postgres

# Connect to verify the schema
docker compose exec postgres psql -U kb_user -d n8n_kb -c "\dt"
```
### Step 2: Activate Workflow C (Knowledge Base Auto-Update)

**Why first?** Sets up the vector database and knowledge base infrastructure.
- Open Workflow C - Knowledge Base Auto-Update
- Click Activate button (toggle ON)
- Verify in logs: "Workflow activated successfully"
- Wait 2-3 minutes for initial execution
- Check PostgreSQL and Milvus for data
### Step 3: Activate Workflow A (Mail Processing & AI Analysis)

**Why second?** Depends on Workflow C for knowledge base availability.
- Open Workflow A - Mail Processing & AI Analysis
- Click Activate button (toggle ON)
- Verify webhook endpoint is running
- Monitor logs for first trigger
- Send test email to Freescout helpdesk
### Step 4: Activate Workflow B (Approval Gate & Execution)

**Why third?** Depends on Workflow A to generate approval requests.
- Open Workflow B - Approval Gate & Execution
- Click Activate button (toggle ON)
- Verify in logs: "Workflow activated successfully"
- Monitor for approval notifications
- Test approval process manually
### Activation Checklist
- [ ] PostgreSQL running and healthy
- [ ] Milvus running and accessible
- [ ] All credentials tested and working
- [ ] Workflow C activated and running
- [ ] Workflow A activated and webhook verified
- [ ] Workflow B activated and tested
- [ ] All three workflows showing in execution logs
- [ ] No error messages in any workflow
## Testing Instructions

### Pre-Activation Testing
Before activating workflows, test each credential:
#### Test Freescout API
- Create a new HTTP Request node
- Use credential: Freescout API
- URL: `GET /healthcheck` (relative to base URL)
- Click Test
- Expected: HTTP 200, JSON response with API status
#### Test LiteLLM API
- Create a new HTTP Request node
- Use credential: LiteLLM API
- URL: `GET /models`
- Click Test
- Expected: HTTP 200, JSON array of available models
#### Test PostgreSQL
- Create a new Postgres node
- Use credential: PostgreSQL KB
- Query: `SELECT COUNT(*) as table_count FROM information_schema.tables WHERE table_schema = 'public'`
- Click Execute
- Expected: Result showing number of tables
#### Test Milvus
- Create a new HTTP Request node
- Base URL: `http://milvus:9091`
- URL: `/healthz`
- Click Test
- Expected: HTTP 200, plain text response
### Workflow Testing Sequence

#### Test Workflow C (Knowledge Base)
1. **Manual Trigger**
   - Open Workflow C
   - Click Test Workflow or execute it manually
   - Monitor the Executions tab for progress

2. **Verify Data Creation**

   ```sql
   SELECT COUNT(*) FROM knowledge_base;
   SELECT COUNT(*) FROM audit_log WHERE action = 'KB_UPDATE';
   ```

3. **Check Milvus Collections**
   - Collections should exist: `knowledge_base_embeddings`
   - Should contain vector data

4. **Success Criteria**
   - Execution completes without errors
   - PostgreSQL tables populated with data
   - Milvus collections created and populated
   - No authentication errors
#### Test Workflow A (Mail Processing)

1. **Manual Test Trigger**
   - If Workflow A has a manual trigger node, use it
   - Provide a test Freescout ticket ID
   - Execute the workflow

2. **Alternative: Send Test Email**
   - Send a test email to the Freescout helpdesk
   - Wait for the webhook trigger
   - Monitor the Workflow A execution

3. **Verify Output**
   - Check PostgreSQL for new records
   - Verify the Freescout note was added
   - Review workflow logs for AI analysis results

4. **Success Criteria**
   - Freescout ticket is processed
   - AI analysis is generated
   - Results are stored in PostgreSQL
   - No credential errors
#### Test Workflow B (Approval Gate)

1. **Trigger Approval**
   - From Workflow A, trigger an approval request
   - Or manually invoke Workflow B with test data
   - Monitor the execution

2. **Verify Approval Process**
   - Check for a notification (email/in-app)
   - Approve or reject the test request
   - Verify the Freescout ticket is updated

3. **Success Criteria**
   - Approval request is generated
   - Decision is recorded in PostgreSQL
   - Freescout ticket status is updated
   - No authentication errors
### Integration Testing (End-to-End)
Once all workflows are activated:
1. **Send Test Email**
   - Send an email to the Freescout helpdesk
   - Subject: "TEST: AI Support Automation"
   - Body: "This is a test message"

2. **Monitor Workflow Execution**
   - Check the Workflow A execution log
   - Verify AI analysis is performed
   - Check Workflow B for approval requests

3. **Approve and Execute**
   - Approve the AI-suggested response
   - Monitor the Workflow B execution
   - Verify the response is sent to the ticket

4. **Verify Final State**
   - Email should be marked as processed
   - Knowledge base should be updated if applicable
   - PostgreSQL should show the audit trail
### Testing Checklist
Credential Testing:

- [ ] Freescout API responds with HTTP 200
- [ ] LiteLLM API returns model list
- [ ] PostgreSQL query returns results
- [ ] Milvus health check succeeds

Individual Workflow Testing:

- [ ] Workflow C executes without errors
- [ ] Workflow A processes test data correctly
- [ ] Workflow B handles approval requests
- [ ] No credential-related errors in logs

Integration Testing:

- [ ] End-to-end flow completes successfully
- [ ] All systems communicate correctly
- [ ] Data is stored correctly in all databases
- [ ] No permission or authentication errors
## Troubleshooting

### Credential Connection Issues

#### Freescout API - 401 Unauthorized
Problem: Bearer token is invalid or expired
Solutions:
- Verify the token in the `.env` file: `FREESCOUT_API_KEY`
- Check that the token hasn't been revoked in the Freescout admin panel
- Regenerate the token in Freescout if necessary
- Update `.env` and restart n8n: `docker compose restart n8n`
#### Freescout API - 503 Service Unavailable
Problem: Freescout API is unreachable
Solutions:
- Verify DNS resolution: `nslookup ekshelpdesk.fft-it.de`
- Check network connectivity: `curl -I https://ekshelpdesk.fft-it.de`
- Verify the firewall allows outbound HTTPS
- Check VPN connection if accessing from remote
#### LiteLLM API - Connection Refused
Problem: Cannot connect to LiteLLM service
Solutions:
- Verify LiteLLM is running in Kubernetes cluster
- Check network connectivity to the cluster: `ping llm.eks-ai.apps.asgard.eks-lnx.fft-it.de`
- Verify the correct hostname: use `http://llm:8000` if within the cluster network
- Check firewall rules for port 8000
#### PostgreSQL - Connection Timeout
Problem: Cannot connect to PostgreSQL database
Solutions:
- Verify the PostgreSQL container is running: `docker compose ps postgres`
- Check credentials in the `.env` file
- Verify network connectivity: `docker compose exec n8n nc -zv postgres 5432`
- Check PostgreSQL logs: `docker compose logs postgres`
Example:
```shell
# Verify the PostgreSQL connection from the n8n container
docker compose exec n8n bash -c 'psql -h postgres -U kb_user -d n8n_kb -c "SELECT 1"'
```
#### PostgreSQL - Authentication Failed
Problem: Username or password is incorrect
Solutions:
- Verify credentials in the `.env` file:
  - Username should be: `kb_user`
  - Password should match: `POSTGRES_PASSWORD`
- Check for extra spaces or special characters
- Verify the database name: `n8n_kb`
- If it changed, update the credentials in n8n
#### Milvus - Connection Refused
Problem: Cannot connect to Milvus service
Solutions:
- Verify Milvus is running: `docker compose ps milvus`
- Check that port 19530 is accessible: `docker compose exec n8n nc -zv milvus 19530`
- Verify the minio and etcd dependencies are healthy
- Check Milvus logs: `docker compose logs milvus`
Health check:
```shell
# Test the Milvus health endpoint
curl http://localhost:9091/healthz
```
### Workflow Execution Issues

#### Workflow Won't Activate
Problem: Activation button is disabled or activation fails
Solutions:
- Verify all credentials are properly configured
- Check for missing credential references in nodes
- Review error message in UI for specific issue
- Check n8n logs: `docker compose logs n8n | grep ERROR`
#### Workflow Hangs During Execution
Problem: Workflow starts but never completes
Solutions:
- Check for infinite loops or wait conditions
- Verify external API timeouts aren't too long
- Check PostgreSQL query performance
- Review workflow logs for specific node causing hang
#### Credential Not Found Error
Problem: Workflow tries to use credential that doesn't exist
Solutions:
- Re-import workflow to get fresh credential references
- Manually assign credentials in each node
- Verify credential names match exactly
- Check for case-sensitivity issues
### Data Issues

#### PostgreSQL - No Data After Workflow Execution
Problem: Workflow completes but no data appears in database
Solutions:
- Verify Postgres credential has write permissions
- Check workflow logic for INSERT/UPDATE statements
- Verify the table schema exists: `\dt` in psql
- Check workflow logs for SQL errors
- Manually execute test query to verify connectivity
#### Milvus - Collections Not Created
Problem: Workflow completes but vector collections are empty
Solutions:
- Verify embeddings are being generated
- Check LiteLLM is accessible and returning embeddings
- Verify collection creation code in Workflow C
- Check Milvus logs for collection creation errors
- Manually list collections, e.g. via the `pymilvus` SDK: `utility.list_collections()`
### Performance Issues

#### Slow Workflow Execution
Problem: Workflows are taking longer than expected
Solutions:
- Monitor PostgreSQL query performance
- Check Milvus vector search performance
- Verify network latency to external APIs
- Review workflow node efficiency
- Consider adding caching for repeated queries
#### High Memory Usage
Problem: n8n container consuming excessive memory
Solutions:
- Check for memory leaks in custom code
- Reduce batch sizes in loop operations
- Optimize PostgreSQL queries
- Restart n8n: `docker compose restart n8n`
- Check Docker memory limits: `docker stats n8n`
### Common Error Messages
| Error Message | Cause | Solution |
|---|---|---|
| `ECONNREFUSED` | Service not running | Verify Docker Compose services are up |
| `ETIMEDOUT` | Network connection timeout | Check network connectivity |
| `EAUTH` | Authentication failed | Verify credentials in `.env` |
| `ENOTFOUND` | DNS resolution failed | Check hostname spelling |
| `EACCES` | Permission denied | Verify the user has required permissions |
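Transient `ECONNREFUSED`/`ETIMEDOUT` errors during stack startup can often be absorbed with a small retry-with-backoff wrapper around the failing call. The helper below is an illustrative sketch, not part of the shipped workflows; persistent failures should still be diagnosed with the table above:

```python
import time

def with_retries(fn, attempts=3, base_delay=0.2):
    """Retry a flaky call with exponential backoff.

    Useful around HTTP calls that intermittently fail while a service is
    still starting; re-raises once the final attempt fails.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)

# Simulate a service that refuses the first two connections, then answers.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("ECONNREFUSED")
    return "ok"

print(with_retries(flaky))  # ok
```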
### Debug Mode
Enable debug logging for troubleshooting:
```shell
# For n8n: set N8N_LOG_LEVEL=debug in .env, restart, then follow the logs
docker compose restart n8n
docker compose logs -f n8n

# For PostgreSQL
docker compose logs -f postgres

# For Milvus
docker compose logs -f milvus
```
### Getting Help
If issues persist:
- Check logs: Review logs from all services
- Test connectivity: Use curl/psql to test each service
- Verify configuration: Double-check all `.env` values
- Reset credentials: Delete and recreate credentials from scratch
- Review n8n documentation: Check official n8n docs for your version
- Contact support: Reach out with logs and configuration details
## Quick Reference

### Environment Variables
```shell
# From the .env file
FREESCOUT_API_KEY=your_api_key_here
FREESCOUT_API_BASE=https://ekshelpdesk.fft-it.de/api/v1
FREESCOUT_MAILBOX_ID=1
POSTGRES_PASSWORD=change_me_securely
MILVUS_API_URL=http://milvus:19530
GENERIC_TIMEZONE=Europe/Berlin
```
### Docker Compose Commands
```shell
# Start all services
docker compose up -d

# Stop all services
docker compose down

# View logs
docker compose logs -f [service_name]

# Verify service status
docker compose ps

# Restart a specific service
docker compose restart [service_name]

# Access PostgreSQL
docker compose exec postgres psql -U kb_user -d n8n_kb

# Access n8n directly
docker compose exec n8n bash
```
### Useful URLs
- n8n UI: `https://n8n.fft-it.de`
- n8n localhost: `http://localhost:5678`
- Freescout API: `https://ekshelpdesk.fft-it.de/api/v1`
- Milvus health: `http://localhost:9091/healthz`
- PostgreSQL: `postgres:5432` (Docker network)
### Support Contacts
For issues with specific systems:
- Freescout: Contact helpdesk admin
- LiteLLM: Contact AI team / Kubernetes admin
- n8n: Check official documentation or community forums
- PostgreSQL/Milvus: Check logs or database admin
**Last Updated:** March 2026
**Document Version:** 1.0
**Maintenance:** Review and update quarterly or when major system changes occur.