# n8n Setup and Credentials Configuration

This document provides a comprehensive guide for configuring n8n credentials and importing workflows for the AI Support Automation system.

## Table of Contents

1. [Prerequisites](#prerequisites)
2. [Freescout API Credentials](#freescout-api-credentials)
3. [LiteLLM API Credentials](#litellm-api-credentials)
4. [PostgreSQL Connection](#postgresql-connection)
5. [Milvus Vector Database Connection](#milvus-vector-database-connection)
6. [Workflow Import Instructions](#workflow-import-instructions)
7. [Activation Order](#activation-order)
8. [Testing Instructions](#testing-instructions)
9. [Troubleshooting](#troubleshooting)

## Prerequisites

Before you begin, ensure the following:

- n8n instance is running and accessible at `https://n8n.fft-it.de`
- Docker Compose stack is fully started (check that all services are healthy)
- Environment variables are properly set in the `.env` file
- You have administrative access to n8n

Verify the Docker Compose stack:

```bash
docker compose ps
```

All services should show `Up` status:

- traefik
- n8n
- postgres
- milvus
- etcd
- minio

## Freescout API Credentials

### Configuration Steps

1. **Access n8n Credentials**
   - Log in to n8n at `https://n8n.fft-it.de`
   - Navigate to **Settings** → **Credentials**
   - Click **Create** and select **HTTP Request**

2. **Enter Credential Details**
   - **Credential Name**: `Freescout API`
   - **Authentication**: Select **Bearer Token** from the dropdown
   - **Token**: Copy the value from your `.env` file
     - Source: `FREESCOUT_API_KEY` environment variable
     - Example token format: `your_api_key_here`

3. **Configure Base URL** (optional but recommended)
   - Some workflows benefit from a pre-configured base URL
   - Base URL: `https://ekshelpdesk.fft-it.de/api/v1`
   - Stored in `.env` as `FREESCOUT_API_BASE`
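Before testing the credential inside n8n, the bearer token can be sanity-checked from any machine with Python. A minimal sketch using only the standard library — the base URL and token shown here are placeholders; substitute your real `FREESCOUT_API_BASE` and `FREESCOUT_API_KEY`:

```python
import urllib.request


def build_freescout_request(base_url, path, token):
    """Build an authenticated GET request against the Freescout API."""
    url = base_url.rstrip("/") + "/" + path.lstrip("/")
    return urllib.request.Request(
        url,
        headers={
            # Same Bearer scheme the n8n credential uses
            "Authorization": f"Bearer {token}",
            "Accept": "application/json",
        },
    )
```

Opening the request with `urllib.request.urlopen(...)` should return HTTP 200 when the token is valid, and 401 when it is not.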
4. **Test Connection**
   - Click **Save**
   - Use the credential in a test node with URL: `https://ekshelpdesk.fft-it.de/api/v1/healthcheck`
   - Expected response: HTTP 200 with API status

### Credential Details Reference

| Parameter | Value | Source |
|-----------|-------|--------|
| Name | `Freescout API` | Manual |
| Type | HTTP Request | Manual |
| Authentication | Bearer Token | Manual |
| Token | `${FREESCOUT_API_KEY}` | `.env` file |
| Base URL | `https://ekshelpdesk.fft-it.de/api/v1` | `.env` as `FREESCOUT_API_BASE` |

### Freescout API Endpoints Reference

- **List Conversations**: `GET /conversations`
- **Get Conversation Details**: `GET /conversations/{id}`
- **List Mailboxes**: `GET /mailboxes`
- **Create Note**: `POST /conversations/{id}/notes`
- **Update Conversation Status**: `PUT /conversations/{id}`

## LiteLLM API Credentials

### Configuration Steps

1. **Access n8n Credentials**
   - Navigate to **Settings** → **Credentials**
   - Click **Create** and select **HTTP Request**

2. **Enter Credential Details**
   - **Credential Name**: `LiteLLM API`
   - **Authentication**: **None** (local network, no authentication required)
   - No additional headers or tokens needed

3. **Configure Base URL** (recommended)
   - Base URL: `http://llm.eks-ai.apps.asgard.eks-lnx.fft-it.de`
   - Alternative (Docker network): `http://llm:8000`
   - Port: 8000 (default LiteLLM port)
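LiteLLM exposes the OpenAI-compatible chat schema, so the request body the workflows send can be sketched as below. The model name is a placeholder — list the real ones via `GET /models`:

```python
# Docker-network form of the chat endpoint; swap in the cluster hostname if
# calling from outside the Docker network.
LITELLM_CHAT_URL = "http://llm:8000/chat/completions"


def build_chat_payload(model, user_message, system_prompt=""):
    """Build an OpenAI-style chat completion payload, which LiteLLM accepts."""
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": user_message})
    return {"model": model, "messages": messages}
```

POST this payload as JSON to `LITELLM_CHAT_URL`; no auth header is needed on the internal network.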
4. **Verify Connectivity**
   - The LiteLLM instance runs locally in the Kubernetes cluster
   - No internet connectivity required
   - Network access is restricted to the internal cluster network

### Credential Details Reference

| Parameter | Value | Source |
|-----------|-------|--------|
| Name | `LiteLLM API` | Manual |
| Type | HTTP Request | Manual |
| Authentication | None | N/A |
| Base URL | `http://llm.eks-ai.apps.asgard.eks-lnx.fft-it.de` | Manual |
| Port | 8000 | Manual |

### LiteLLM API Endpoints Reference

- **List Models**: `GET /models`
- **Chat Completion**: `POST /chat/completions`
- **Text Completion**: `POST /completions`
- **Embeddings**: `POST /embeddings`

## PostgreSQL Connection

### Configuration Steps

1. **Access n8n Credentials**
   - Navigate to **Settings** → **Credentials**
   - Click **Create** and select **Postgres**

2. **Enter Connection Details**
   - **Credential Name**: `PostgreSQL KB`
   - **Host**: `postgres` (Docker network hostname)
   - **Port**: `5432` (default PostgreSQL port)
   - **Database**: `n8n_kb`
   - **Username**: `kb_user`
   - **Password**: Retrieve from the `.env` file as `POSTGRES_PASSWORD`

3. **Connection Pool Settings** (optional)
   - **SSL**: Disable (internal Docker network)
   - **Connection Timeout**: `5000` ms
   - **Application Name**: `n8n-workflows`
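These fields amount to a libpq-style connection string plus a reachable TCP port. A hedged sketch that mirrors the values above (the DSN builder and port probe are illustrative helpers, not part of n8n):

```python
import os
import socket


def build_dsn(host="postgres", port=5432, dbname="n8n_kb", user="kb_user", password=None):
    """Assemble a libpq-style connection string from the credential fields."""
    if password is None:
        password = os.environ.get("POSTGRES_PASSWORD", "")
    return f"host={host} port={port} dbname={dbname} user={user} password={password}"


def port_open(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds (like `nc -zv`)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

`port_open("postgres", 5432)` from inside the Docker network is a quick pre-check before debugging credentials.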
4. **Test Connection**
   - Click **Save**
   - Create a test query node with: `SELECT 1 AS test`
   - Expected result: Row with `test = 1`

### Credential Details Reference

| Parameter | Value | Source |
|-----------|-------|--------|
| Name | `PostgreSQL KB` | Manual |
| Type | PostgreSQL | Manual |
| Host | `postgres` | Docker Compose |
| Port | `5432` | Docker Compose |
| Database | `n8n_kb` | Docker Compose |
| Username | `kb_user` | Docker Compose |
| Password | `${POSTGRES_PASSWORD}` | `.env` file |
| SSL | Disabled | N/A |

### PostgreSQL Database Schema

The Knowledge Base database includes:

- **audit_log**: Tracks all system events and changes
- **knowledge_base**: Stores processed articles and solutions
- **workflow_execution**: Records workflow runs and results
- **system_config**: Stores system-wide configuration

## Milvus Vector Database Connection

### Configuration Steps

1. **Create Custom Connection** (n8n doesn't have native Milvus support yet)
   - Navigate to **Settings** → **Credentials**
   - Click **Create** and select **HTTP Request**
   - Alternatively, use the **Custom** credential type if available

2. **Enter Connection Details**
   - **Credential Name**: `Milvus Vector DB`
   - **Authentication**: **None**
   - **Base URL**: `http://milvus:19530`
   - Can also use `http://milvus:9091` for the metrics endpoint

3. **Configure for Python/Node.js Workflows**
   - Milvus Python SDK: `pymilvus`
   - Connection string: `tcp://milvus:19530`
   - Default collection: `knowledge_base_embeddings`
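The health endpoint can also be probed outside n8n. A minimal stdlib-only sketch (the URL matches the Docker Compose hostname above; from the host machine use `http://localhost:9091/healthz` instead):

```python
import urllib.error
import urllib.request

MILVUS_HEALTH_URL = "http://milvus:9091/healthz"  # Docker network hostname


def milvus_healthy(url=MILVUS_HEALTH_URL, timeout=5.0):
    """Return True if the Milvus health endpoint answers HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```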
4. **Test Connection**
   - Milvus health check: `GET http://milvus:9091/healthz`
   - Expected response: HTTP 200

### Credential Details Reference

| Parameter | Value | Source |
|-----------|-------|--------|
| Name | `Milvus Vector DB` | Manual |
| Type | HTTP Request or Custom | Manual |
| Authentication | None | N/A |
| Base URL | `http://milvus:19530` | Docker Compose |
| Health Check URL | `http://milvus:9091/healthz` | Docker Compose |

### Milvus Collections

Main collections used in workflows:

- **knowledge_base_embeddings**: Stores document embeddings and metadata
  - Vector dimension: 1536 (for OpenAI embeddings)
  - Metric type: L2 (Euclidean distance)
- **conversation_embeddings**: Stores conversation/ticket embeddings for similarity search

### Milvus Connection Methods

**HTTP/REST API** (recommended for n8n):

```
Base URL: http://milvus:19530
Endpoint: /v1/search
```

**gRPC** (native protocol):

```
Host: milvus
Port: 19530
```

## Workflow Import Instructions

### Workflow Import Steps (All Workflows)

1. **Prepare Workflow Files**
   - Ensure the workflow JSON files are available in the project
   - Files should be named:
     - `workflow-a-mail-processing.json`
     - `workflow-b-approval-gate.json`
     - `workflow-c-kb-auto-update.json`

2. **Access Workflow Import**
   - Log in to n8n at `https://n8n.fft-it.de`
   - Click the **+** button or **Import Workflow**
   - Select **From File** or **From URL** depending on your setup

3. **Import Workflow A** (Mail Processing & AI Analysis)
   - File: `workflow-a-mail-processing.json`
   - Credentials required:
     - Freescout API
     - LiteLLM API
     - PostgreSQL KB
   - Trigger: Webhook for incoming mail notifications
   - Status: **Deactivated** (activate in the correct order)

4. **Import Workflow B** (Approval Gate & Execution)
   - File: `workflow-b-approval-gate.json`
   - Credentials required:
     - PostgreSQL KB
     - Freescout API
   - Trigger: Manual approval request from Workflow A
   - Status: **Deactivated** (activate in the correct order)
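Manual import through the UI is the documented path. As an alternative sketch, imports can be scripted against n8n's public REST API, assuming an n8n API key has been created; the payload fields follow the public API v1 (`name`, `nodes`, `connections`, `settings`). Treat this as a starting point, not a drop-in script:

```python
import json
import urllib.request


def build_import_request(n8n_base, api_key, workflow_path):
    """Build a POST /api/v1/workflows request from an exported workflow JSON file."""
    with open(workflow_path, encoding="utf-8") as fh:
        workflow = json.load(fh)
    # Keep only the fields the public API accepts; export-only fields
    # (id, tags, ...) are dropped here.
    body = json.dumps({
        "name": workflow.get("name", "imported-workflow"),
        "nodes": workflow.get("nodes", []),
        "connections": workflow.get("connections", {}),
        "settings": workflow.get("settings", {}),
    }).encode("utf-8")
    return urllib.request.Request(
        n8n_base.rstrip("/") + "/api/v1/workflows",
        data=body,
        method="POST",
        headers={"X-N8N-API-KEY": api_key, "Content-Type": "application/json"},
    )
```

Workflows created this way also arrive deactivated, matching the activation-order requirement below.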
5. **Import Workflow C** (Knowledge Base Auto-Update)
   - File: `workflow-c-kb-auto-update.json`
   - Credentials required:
     - PostgreSQL KB
     - Milvus Vector DB
     - LiteLLM API (for embeddings)
   - Trigger: Scheduled (daily, 2 AM CET)
   - Status: **Deactivated** (activate in the correct order)

### Post-Import Configuration

After importing each workflow:

1. **Verify Credential Mapping**
   - Click each node that uses credentials
   - Ensure the correct credential is selected
   - If missing, select it from the dropdown
   - Save changes

2. **Update Webhook URLs** (if applicable)
   - For Workflow A, get the webhook URL from n8n
   - Format: `https://n8n.fft-it.de/webhook/{id}`
   - Share this URL with Freescout for webhook triggers

3. **Update Schedule** (for Workflow C)
   - Edit the Schedule trigger
   - Cron expression: `0 2 * * *` (2 AM daily, CET)
   - Timezone: `Europe/Berlin`

## Activation Order

**IMPORTANT**: Workflows must be activated in the correct order to ensure data flow and dependencies are met.

### Step 1: Activate PostgreSQL Audit Logging (Infrastructure)

Before activating any workflows, ensure PostgreSQL is properly configured:

```bash
# Verify PostgreSQL is running
docker compose ps postgres

# Connect to verify schema
docker compose exec postgres psql -U kb_user -d n8n_kb -c "\dt"
```

### Step 2: Activate Workflow C (Knowledge Base Auto-Update)

**Why first?** Sets up the vector database and knowledge base infrastructure.

1. Open **Workflow C - Knowledge Base Auto-Update**
2. Click the **Activate** button (toggle ON)
3. Verify in the logs: "Workflow activated successfully"
4. Wait 2-3 minutes for the initial execution
5. Check PostgreSQL and Milvus for data

### Step 3: Activate Workflow A (Mail Processing & AI Analysis)

**Why second?** Depends on Workflow C for knowledge base availability.

1. Open **Workflow A - Mail Processing & AI Analysis**
2. Click the **Activate** button (toggle ON)
3. Verify the webhook endpoint is running
4. Monitor the logs for the first trigger
5. Send a test email to the Freescout helpdesk

### Step 4: Activate Workflow B (Approval Gate & Execution)

**Why third?** Depends on Workflow A to generate approval requests.

1. Open **Workflow B - Approval Gate & Execution**
2. Click the **Activate** button (toggle ON)
3. Verify in the logs: "Workflow activated successfully"
4. Monitor for approval notifications
5. Test the approval process manually

### Activation Checklist

- [ ] PostgreSQL running and healthy
- [ ] Milvus running and accessible
- [ ] All credentials tested and working
- [ ] Workflow C activated and running
- [ ] Workflow A activated and webhook verified
- [ ] Workflow B activated and tested
- [ ] All three workflows showing in execution logs
- [ ] No error messages in any workflow

## Testing Instructions

### Pre-Activation Testing

Before activating workflows, test each credential:

#### Test Freescout API

1. Create a new HTTP Request node
2. Use credential: **Freescout API**
3. URL: `GET /healthcheck` (relative to the base URL)
4. Click **Test**
5. Expected: HTTP 200, JSON response with API status

#### Test LiteLLM API

1. Create a new HTTP Request node
2. Use credential: **LiteLLM API**
3. URL: `GET /models`
4. Click **Test**
5. Expected: HTTP 200, JSON array of available models

#### Test PostgreSQL

1. Create a new **Postgres** node
2. Use credential: **PostgreSQL KB**
3. Query: `SELECT COUNT(*) AS table_count FROM information_schema.tables WHERE table_schema = 'public'`
4. Click **Execute**
5. Expected: Result showing the number of tables

#### Test Milvus

1. Create a new HTTP Request node
2. Base URL: `http://milvus:9091`
3. URL: `/healthz`
4. Click **Test**
5. Expected: HTTP 200, plain text response

### Workflow Testing Sequence

#### Test Workflow C (Knowledge Base)

1. **Manual Trigger**
   - Open Workflow C
   - Click **Test Workflow** or execute manually
   - Monitor the **Execution** tab for progress
2. **Verify Data Creation**

   ```sql
   SELECT COUNT(*) FROM knowledge_base;
   SELECT COUNT(*) FROM audit_log WHERE action = 'KB_UPDATE';
   ```

3. **Check Milvus Collections**
   - The `knowledge_base_embeddings` collection should exist
   - It should contain vector data

4. **Success Criteria**
   - Execution completes without errors
   - PostgreSQL tables populated with data
   - Milvus collections created and populated
   - No authentication errors

#### Test Workflow A (Mail Processing)

1. **Manual Test Trigger**
   - If Workflow A has a manual trigger node, use it
   - Provide a test Freescout ticket ID
   - Execute the workflow

2. **Alternative: Send Test Email**
   - Send a test email to the Freescout helpdesk
   - Wait for the webhook trigger
   - Monitor the Workflow A execution

3. **Verify Output**
   - Check PostgreSQL for new records
   - Verify the Freescout note was added
   - Review workflow logs for AI analysis results

4. **Success Criteria**
   - Freescout ticket is processed
   - AI analysis is generated
   - Results are stored in PostgreSQL
   - No credential errors

#### Test Workflow B (Approval Gate)

1. **Trigger Approval**
   - From Workflow A, trigger an approval request
   - Or manually invoke Workflow B with test data
   - Monitor the execution

2. **Verify Approval Process**
   - Check for a notification (email/in-app)
   - Approve or reject the test request
   - Verify the Freescout ticket is updated

3. **Success Criteria**
   - Approval request is generated
   - Decision is recorded in PostgreSQL
   - Freescout ticket status is updated
   - No authentication errors

### Integration Testing (End-to-End)

Once all workflows are activated:

1. **Send Test Email**
   - Send an email to the Freescout helpdesk
   - Subject: "TEST: AI Support Automation"
   - Body: "This is a test message"

2. **Monitor Workflow Execution**
   - Check the Workflow A execution log
   - Verify AI analysis is performed
   - Check Workflow B for approval requests

3. **Approve and Execute**
   - Approve the AI-suggested response
   - Monitor the Workflow B execution
   - Verify the response is sent to the ticket
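The test email used for end-to-end testing can also be generated programmatically. A sketch using the Python standard library — the helpdesk and sender addresses are placeholders for your own mailbox setup:

```python
from email.message import EmailMessage


def build_test_email(helpdesk_address, sender):
    """Build the end-to-end test message with the expected subject and body."""
    msg = EmailMessage()
    msg["To"] = helpdesk_address
    msg["From"] = sender
    msg["Subject"] = "TEST: AI Support Automation"
    msg.set_content("This is a test message")
    return msg
```

Send it with `smtplib.SMTP(...).send_message(msg)` against your mail relay, then watch the Workflow A execution log for the webhook trigger.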
4. **Verify Final State**
   - Email should be marked as processed
   - Knowledge base should be updated if applicable
   - PostgreSQL should show the audit trail

### Testing Checklist

**Credential Testing:**

- [ ] Freescout API responds with HTTP 200
- [ ] LiteLLM API returns model list
- [ ] PostgreSQL query returns results
- [ ] Milvus health check succeeds

**Individual Workflow Testing:**

- [ ] Workflow C executes without errors
- [ ] Workflow A processes test data correctly
- [ ] Workflow B handles approval requests
- [ ] No credential-related errors in logs

**Integration Testing:**

- [ ] End-to-end flow completes successfully
- [ ] All systems communicate correctly
- [ ] Data is stored correctly in all databases
- [ ] No permission or authentication errors

## Troubleshooting

### Credential Connection Issues

#### Freescout API - 401 Unauthorized

**Problem**: Bearer token is invalid or expired

**Solutions**:

1. Verify the token in the `.env` file: `FREESCOUT_API_KEY`
2. Check the token hasn't been revoked in the Freescout admin panel
3. Regenerate the token in Freescout if necessary
4. Update `.env` and restart n8n: `docker compose restart n8n`

#### Freescout API - 503 Service Unavailable

**Problem**: Freescout API is unreachable

**Solutions**:

1. Verify DNS resolution: `nslookup ekshelpdesk.fft-it.de`
2. Check network connectivity: `curl -I https://ekshelpdesk.fft-it.de`
3. Verify the firewall allows outbound HTTPS
4. Check the VPN connection if accessing remotely

#### LiteLLM API - Connection Refused

**Problem**: Cannot connect to the LiteLLM service

**Solutions**:

1. Verify LiteLLM is running in the Kubernetes cluster
2. Check network connectivity to the cluster: `ping llm.eks-ai.apps.asgard.eks-lnx.fft-it.de`
3. Verify the correct hostname: use `http://llm:8000` if within the cluster network
4. Check firewall rules for port 8000

#### PostgreSQL - Connection Timeout

**Problem**: Cannot connect to the PostgreSQL database

**Solutions**:

1. Verify the PostgreSQL container is running: `docker compose ps postgres`
2. Check credentials in the `.env` file
3. Verify network connectivity: `docker compose exec n8n nc -zv postgres 5432`
4. Check PostgreSQL logs: `docker compose logs postgres`

**Example**:

```bash
# Verify PostgreSQL connection from the n8n container
# (the n8n image may not include psql; if so, test from the postgres
#  container instead: docker compose exec postgres psql -U kb_user -d n8n_kb -c "SELECT 1")
docker compose exec n8n bash -c 'psql -h postgres -U kb_user -d n8n_kb -c "SELECT 1"'
```

#### PostgreSQL - Authentication Failed

**Problem**: Username or password is incorrect

**Solutions**:

1. Verify credentials in the `.env` file:
   - Username should be: `kb_user`
   - Password should match: `POSTGRES_PASSWORD`
2. Check for extra spaces or special characters
3. Verify the database name: `n8n_kb`
4. If changed, update the credentials in n8n

#### Milvus - Connection Refused

**Problem**: Cannot connect to the Milvus service

**Solutions**:

1. Verify Milvus is running: `docker compose ps milvus`
2. Check port 19530 is accessible: `docker compose exec n8n nc -zv milvus 19530`
3. Verify the minio and etcd dependencies are healthy
4. Check Milvus logs: `docker compose logs milvus`

**Health check**:

```bash
# Test Milvus health endpoint
curl http://localhost:9091/healthz
```

### Workflow Execution Issues

#### Workflow Won't Activate

**Problem**: Activation button is disabled or activation fails

**Solutions**:

1. Verify all credentials are properly configured
2. Check for missing credential references in nodes
3. Review the error message in the UI for the specific issue
4. Check n8n logs: `docker compose logs n8n | grep ERROR`

#### Workflow Hangs During Execution

**Problem**: Workflow starts but never completes

**Solutions**:

1. Check for infinite loops or wait conditions
2. Verify external API timeouts aren't too long
3. Check PostgreSQL query performance
4. Review workflow logs for the specific node causing the hang

#### Credential Not Found Error

**Problem**: Workflow tries to use a credential that doesn't exist

**Solutions**:

1. Re-import the workflow to get fresh credential references
2. Manually assign credentials in each node
3. Verify credential names match exactly
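Since credential lookups match names exactly, a small helper can flag case-only mismatches between the names a workflow expects and the credentials that actually exist. This is purely illustrative — the names come from your own n8n instance:

```python
def find_credential_mismatch(required, available):
    """Return problems for required credential names, flagging case-only mismatches."""
    by_lower = {name.lower(): name for name in available}
    problems = []
    for name in required:
        if name in available:
            continue  # exact match, nothing to report
        if name.lower() in by_lower:
            problems.append(f"{name!r} exists only as {by_lower[name.lower()]!r} (case mismatch)")
        else:
            problems.append(f"{name!r} is missing")
    return problems
```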
4. Check for case-sensitivity issues

### Data Issues

#### PostgreSQL - No Data After Workflow Execution

**Problem**: Workflow completes but no data appears in the database

**Solutions**:

1. Verify the Postgres credential has write permissions
2. Check workflow logic for INSERT/UPDATE statements
3. Verify the table schema exists: `\dt` in psql
4. Check workflow logs for SQL errors
5. Manually execute a test query to verify connectivity

#### Milvus - Collections Not Created

**Problem**: Workflow completes but vector collections are empty

**Solutions**:

1. Verify embeddings are being generated
2. Check LiteLLM is accessible and returning embeddings
3. Verify the collection creation code in Workflow C
4. Check Milvus logs for collection creation errors
5. List collections to confirm they exist, e.g. via the `pymilvus` SDK: `utility.list_collections()`

### Performance Issues

#### Slow Workflow Execution

**Problem**: Workflows are taking longer than expected

**Solutions**:

1. Monitor PostgreSQL query performance
2. Check Milvus vector search performance
3. Verify network latency to external APIs
4. Review workflow node efficiency
5. Consider adding caching for repeated queries

#### High Memory Usage

**Problem**: n8n container consuming excessive memory

**Solutions**:

1. Check for memory leaks in custom code
2. Reduce batch sizes in loop operations
3. Optimize PostgreSQL queries
4. Restart n8n: `docker compose restart n8n`
5. Check Docker memory limits: `docker stats n8n`

### Common Error Messages

| Error Message | Cause | Solution |
|---------------|-------|----------|
| `ECONNREFUSED` | Service not running | Verify Docker Compose services are up |
| `ETIMEDOUT` | Network connection timeout | Check network connectivity |
| `EAUTH` | Authentication failed | Verify credentials in `.env` |
| `ENOTFOUND` | DNS resolution failed | Check hostname spelling |
| `EACCES` | Permission denied | Verify user has required permissions |

### Debug Mode

Enable debug logging for troubleshooting. Note that `docker compose logs` has no log-level flag; for n8n, set `N8N_LOG_LEVEL=debug` in the service environment and restart:

```bash
# For n8n (after setting N8N_LOG_LEVEL=debug in .env or docker-compose.yml)
docker compose restart n8n
docker compose logs -f n8n

# For PostgreSQL
docker compose logs -f postgres

# For Milvus
docker compose logs -f milvus
```

### Getting Help

If issues persist:

1. **Check logs**: Review logs from all services
2. **Test connectivity**: Use curl/psql to test each service
3. **Verify configuration**: Double-check all `.env` values
4. **Reset credentials**: Delete and recreate credentials from scratch
5. **Review n8n documentation**: Check the official n8n docs for your version
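For the transient errors in the table above (`ECONNREFUSED`, `ETIMEDOUT`), wrapping a call in retries with exponential backoff is often enough; a minimal sketch:

```python
import time


def with_retries(fn, attempts=3, base_delay=1.0, retriable=(ConnectionError, TimeoutError)):
    """Call fn(), retrying transient failures with exponential backoff (1s, 2s, 4s, ...)."""
    for attempt in range(attempts):
        try:
            return fn()
        except retriable:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the error
            time.sleep(base_delay * (2 ** attempt))
```

Inside n8n itself, the equivalent is the per-node "Retry On Fail" setting.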
6. **Contact support**: Reach out with logs and configuration details

## Quick Reference

### Environment Variables

```bash
# From .env file
FREESCOUT_API_KEY=your_api_key_here
FREESCOUT_API_BASE=https://ekshelpdesk.fft-it.de/api/v1
FREESCOUT_MAILBOX_ID=1
POSTGRES_PASSWORD=change_me_securely
MILVUS_API_URL=http://milvus:19530
GENERIC_TIMEZONE=Europe/Berlin
```

### Docker Compose Commands

```bash
# Start all services
docker compose up -d

# Stop all services
docker compose down

# View logs
docker compose logs -f [service_name]

# Verify service status
docker compose ps

# Restart specific service
docker compose restart [service_name]

# Access PostgreSQL
docker compose exec postgres psql -U kb_user -d n8n_kb

# Access n8n directly
docker compose exec n8n bash
```

### Useful URLs

- **n8n UI**: `https://n8n.fft-it.de`
- **n8n Localhost**: `http://localhost:5678`
- **Freescout API**: `https://ekshelpdesk.fft-it.de/api/v1`
- **Milvus Health**: `http://localhost:9091/healthz`
- **PostgreSQL**: `postgres:5432` (Docker network)

### Support Contacts

For issues with specific systems:

- **Freescout**: Contact the helpdesk admin
- **LiteLLM**: Contact the AI team / Kubernetes admin
- **n8n**: Check the official documentation or community forums
- **PostgreSQL/Milvus**: Check logs or contact the database admin

---

**Last Updated**: March 2026

**Document Version**: 1.0

**Maintenance**: Review and update quarterly or when major system changes occur.