I've just released a video showing how to build a chat widget that actually knows what page your customer is viewing. Not through some clever JavaScript hack, but through proper context-aware routing that feeds directly into your n8n workflows.
If you've seen the video (it's embedded below for blog readers), you'll have watched a property management system spring to life with intelligent routing, AI agents that know which property you're looking at, and automation that handles everything from lead capture to maintenance tickets. This post is the technical companion piece, focusing on the bits that matter for implementation: deploying Directus properly, setting up the MCP server, and understanding how these pieces connect to create production-ready systems.
For those reading via email, the video demonstrates a fully working implementation where clicking a "Viewings" button on a property page automatically gives the AI agent complete context about which property you're interested in, without typing a word. The architecture we're about to explore makes that possible.
Understanding Context-Aware Routing
When a user clicks the chat widget's "Viewings" button on a property page, three critical pieces of data flow through to your n8n webhook:
- Page URL: The exact location where the user initiated contact
- Route context: Which button they pressed (viewings, maintenance, general enquiry)
- Origin path: Parsed data like property IDs extracted from the URL
This context awareness transforms a basic chatbot into an intelligent assistant that already knows what the user is looking at. The n8n workflow receives this data, routes it to the appropriate AI agent tools, and responds with context-specific information without the user typing a single word about their location in the system.
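To make that concrete, here's an illustrative sketch of the kind of payload the widget might post to your n8n webhook. The field names are assumptions for this example; the exact schema depends on how you configure your routes in the widget designer.

```typescript
// Illustrative payload shape only — field names depend on your widget configuration.
interface ChatContextPayload {
  pageUrl: string;                               // exact page the user was on
  route: "viewings" | "maintenance" | "general"; // which button was pressed
  originPath: { propertyId?: string };           // data parsed from the URL
  message?: string;                              // free-text input, if any
}

const examplePayload: ChatContextPayload = {
  pageUrl: "https://example.com/properties/42",
  route: "viewings",
  originPath: { propertyId: "42" },
};
```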
The widget itself is available at Chat Widget Designer, with both premium and free versions. The configuration interface lets you define custom routes, style the interface to match your brand, and set up those critical button mappings that drive the entire contextual experience.
Directus: Your Data Platform Foundation
Directus positions itself as an open data platform rather than just another headless CMS. It wraps any SQL database with real-time REST and GraphQL APIs, whilst providing a genuinely usable interface for non-technical users to manage content.
Why Directus for This Architecture?
Three reasons make Directus particularly suited for this type of implementation:
Dynamic API Generation: Directus automatically generates API endpoints based on your database schema. Create a "properties" collection, define some fields, and you immediately have full CRUD operations available via REST and GraphQL. No controller writing required.
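As a quick illustration, here's roughly what a request to one of those auto-generated endpoints looks like, assuming the properties collection used throughout this post and a token with read access:

```typescript
// A minimal sketch: querying the auto-generated REST endpoint for a collection.
// "properties" and its fields are assumptions from this post's example schema.
const response = await fetch(
  "https://your-directus-url.com/items/properties" +
    "?filter[city][_eq]=Leeds&fields=id,address,bedrooms,rent",
  { headers: { Authorization: "Bearer your-directus-token" } }
);
const { data } = await response.json(); // Directus wraps results in a "data" key
console.log(data);
```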
Built-in Automation: Directus Flows provide event-driven automation directly within the platform. You can trigger webhooks to n8n, transform data, send notifications, or chain multiple operations together without writing a single line of code.
MCP Server Integration: This is where things get properly interesting. Directus ships with a Model Context Protocol server that lets AI tools like Claude, Cursor, and Windsurf interact directly with your database structure and content. More on this shortly.
Deployment Options: Choosing Your Path
Directus offers three primary deployment approaches, each with distinct trade-offs.
Directus Cloud
Directus Cloud handles everything for you. Select from over 15 global regions, benefit from automatic scaling, and never worry about infrastructure management. Pricing uses a pass-through model where you pay for what you consume.
This option makes sense if you value time over cost control, need guaranteed uptime with multiple availability zones, or simply don't want to become a DevOps specialist to run your CMS.
Self-Hosting
Self-hosting Directus gives you complete control. You choose the hardware, configure every environment variable, and own your data entirely.
Directus uses a Business Source Licence (BSL 1.1), which is free for organisations with less than $5 million in total annual finances. Beyond that threshold, you'll need a commercial licence. The software itself runs as a Docker container, making deployment relatively straightforward on any platform that supports Docker.
Key requirements for self-hosting:
- PostgreSQL, MySQL, MS SQL, SQLite, OracleDB, MariaDB, or CockroachDB
- Redis for caching (recommended for production)
- S3-compatible storage for file uploads (essential for persistent storage)
- Node.js runtime
Official deployment guides cover major cloud providers including AWS, Azure, GCP, and Digital Ocean. The core pattern remains consistent: run the Docker container, set your environment variables, connect your database, and you're running.
Here's a minimal Docker Compose configuration to get started:
```yaml
services:
  database:
    image: postgis/postgis:13-master
    environment:
      POSTGRES_USER: "directus"
      POSTGRES_PASSWORD: "directus"
      POSTGRES_DB: "directus"
    volumes:
      - ./data/database:/var/lib/postgresql/data

  cache:
    image: redis:6

  directus:
    image: directus/directus:latest
    ports:
      - 8055:8055
    volumes:
      - ./uploads:/directus/uploads
      - ./extensions:/directus/extensions
    environment:
      SECRET: "replace-with-secure-random-secret"
      DB_CLIENT: "pg"
      DB_HOST: "database"
      DB_PORT: "5432"
      DB_DATABASE: "directus"
      DB_USER: "directus"
      DB_PASSWORD: "directus"
      CACHE_ENABLED: "true"
      CACHE_STORE: "redis"
      REDIS: "redis://cache:6379"
```
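Save that as docker-compose.yml, run docker compose up -d, and once the containers are up Directus is available on http://localhost:8055.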
Railway: One-Click Deployment
Railway provides the middle ground between managed cloud and self-hosting. Their Directus templates deploy everything you need with a single click: Directus, PostgreSQL, Redis, and S3 storage configuration.
Railway's private networking means your database and Redis communicate internally without egress fees. The platform handles SSL certificates, provides a deployment dashboard, and scales vertically as needed.
Multiple community templates exist with different configurations:
- Basic Directus template (Directus + PostgreSQL + optional S3)
- Directus with Redis and auto backups
- Directus with PostGIS for spatial data
The Railway approach suits teams who want infrastructure management without the overhead of maintaining it themselves, whilst retaining more control than a fully managed service provides.
Important note: Railway's filesystem is ephemeral. Files uploaded to the local filesystem disappear on redeployment. You must configure S3-compatible storage for persistent file uploads.
The MCP Server: AI-Powered Development
This is where Directus differentiates itself significantly from other headless CMS platforms.
What Is MCP?
The Model Context Protocol is a standard for connecting AI tools to external data sources and services. Directus v11.12+ includes a built-in MCP server that exposes your entire database structure, content, and automation capabilities to AI assistants.
Rather than copying data back and forth between your AI tool and Directus, the AI connects directly to your instance. Claude can read your schema, create collections, build relationships, generate Flows, and manage content, all through natural language.
Setting Up MCP
The MCP server is disabled by default for security. Enable it through Settings in your Directus instance:
- Navigate to Settings → AI in your Directus dashboard
- Enable the MCP server
- Create a dedicated user account with appropriate permissions for AI operations (don't use your admin account)
- Generate an access token for this user
The MCP server becomes available at https://your-directus-url.com/mcp.
Connecting Claude Desktop
For Claude Desktop, edit your MCP configuration file (File → Settings → Developer → Edit Config):
```json
{
  "mcpServers": {
    "directus": {
      "type": "http",
      "url": "https://your-directus-url.com/mcp",
      "headers": {
        "Authorization": "Bearer your-generated-token"
      }
    }
  }
}
```
Restart Claude Desktop and you'll see the Directus tools appear in your conversation. You can now ask Claude to examine your schema, create collections, build relationships, or generate automation workflows.
Connecting Cursor
For Cursor IDE, create or edit .cursor/mcp.json in your project root:
```json
{
  "mcpServers": {
    "directus": {
      "command": "npx",
      "args": ["@directus/content-mcp@latest"],
      "env": {
        "DIRECTUS_URL": "https://your-directus-url.com",
        "DIRECTUS_TOKEN": "your-directus-token"
      }
    }
  }
}
```
The local MCP server approach (using npx) works for older Directus versions (pre-11.12) or if you prefer running the MCP server locally rather than hitting your production instance directly. The official repository contains additional configuration options.
What MCP Can Do
Once connected, AI tools can:
- Examine schema: "Tell me about my database structure"
- Create collections: "Build me a CRM with organisations, contacts, and deals"
- Configure relationships: "Set up a many-to-many between posts and tags"
- Generate Flows: "Create an automation that sends an email when a support ticket is created"
- Manage content: "Add 10 sample blog posts with proper categories and authors"
- Query data: "Show me all high-priority tickets created this week"
The system respects the permissions of the connected user account. AI tools can only access what that user is explicitly allowed to see and modify.
Custom Prompts for Context
Directus supports custom system prompts that guide how the AI interacts with different parts of your application. You can define prompts that automatically inject context about specific collections, encourage certain patterns, or enforce your team's conventions.
Building Your Data Model
Let's examine how to construct the type of relational data structure needed for a property management system.
Collections and Fields
Directus collections represent database tables. Each collection contains fields that define the data structure and how it displays in the Directus interface.
Creating a properties collection:
- Navigate to Settings → Data Model
- Create a new collection called "properties"
- Select your primary key type (auto-incrementing integer, UUID, or manual)
- Add fields for your data
Common field types:
- String: Text data like addresses or property names
- Integer: Numeric data like bedroom counts or price
- Boolean: Yes/no flags like "featured" or "available"
- DateTime: Date fields for viewing schedules or availability
- JSON: Structured data like amenities or floor plans
- M2O/O2M/M2M: Relational fields linking to other collections
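As a rough sketch of how those field types surface through the generated API, creating an item is a single POST. The field names here are illustrative; they should match whatever you defined in your data model:

```typescript
// Hedged sketch: creating a property through the auto-generated API.
// Field names are assumptions — adjust to your own schema.
await fetch("https://your-directus-url.com/items/properties", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: "Bearer your-directus-token",
  },
  body: JSON.stringify({
    name: "2 Bedroom Flat, Leeds",
    bedrooms: 2,
    rent: 1200,
    available: true,
  }),
});
```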
Relationships: The Critical Architecture
Directus relationships follow standard relational database patterns: Many-to-One (M2O), One-to-Many (O2M), Many-to-Many (M2M), and a few Directus-specific compound types.
Example: Properties and Tenants
A tenant occupies one property at a time (an M2O field from the tenant's perspective), while a single property can be linked to many tenants over time (an O2M from the property's perspective).
Setting up the M2O relationship:
- In the tenants collection, add a new field
- Select "Many to One" interface
- Set the field name to current_property
- Choose properties as the related collection
- Configure how properties display in the dropdown
Directus automatically creates the foreign key in your database. The relationship now works in both directions. Tenants can see their property, and properties can display which tenant occupies them.
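As an illustrative sketch, the relationship also means related data can be expanded in a single API call using dot notation in the fields parameter (field names assumed from the example above):

```typescript
// Sketch: expanding the M2O relationship with nested field selection.
const res = await fetch(
  "https://your-directus-url.com/items/tenants" +
    "?fields=id,name,current_property.address,current_property.rent",
  { headers: { Authorization: "Bearer your-directus-token" } }
);
const { data: tenants } = await res.json();
// each tenant now includes a nested current_property object
```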
Example: Properties and Features (M2M)
Properties have many features (parking, garden, balcony), and features can belong to many properties. This requires a junction collection.
When you create an M2M field, Directus automatically creates the junction collection for you:
- In the properties collection, add a new field
- Select "Many to Many" interface
- Name it features
- Choose or create the features collection
- Directus creates the properties_features junction collection automatically
The junction collection contains two M2O relationships, one to properties and one to features. You can add additional fields to the junction collection if needed (like "date_added" or "notes").
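Queries traverse the junction automatically. Here's a hedged sketch, assuming the default field names Directus generates on the junction collection:

```typescript
// Sketch: the path goes property -> junction row -> related feature (features_id).
const res = await fetch(
  "https://your-directus-url.com/items/properties" +
    "?fields=id,name,features.features_id.name",
  { headers: { Authorization: "Bearer your-directus-token" } }
);
const { data: properties } = await res.json();
```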
Display Templates
Display templates control how items appear in relational fields. Instead of seeing "Property #42", you can display "2 Bedroom Flat, Leeds, £1200/month".
Configure display templates in the collection settings under "Display Template". Use double curly braces to reference fields: {{bedrooms}} Bedroom {{type}}, {{city}}, £{{rent}}/month.
Directus Flows: Automation Without Code
Flows are Directus's answer to workflow automation. Each flow consists of a trigger, a series of operations, and a data chain that passes information between steps.
Anatomy of a Flow
Triggers start your flow. Five types exist:
- Event Hook: Fires when data changes in your database (create, update, delete)
- Webhook: Triggered by HTTP requests to a unique URL
- Schedule (CRON): Runs on a schedule
- Manual: Adds a button to collections for user-triggered flows
- Another Flow: Chains flows together
Operations are the individual actions your flow performs:
- Read/Create/Update/Delete data
- Transform data with custom JSON
- Send HTTP requests to external APIs
- Run JavaScript code in a sandbox
- Send emails and notifications
- Conditional logic (if/else branching)
- Log messages for debugging
Data Chain is a JSON object that flows through your entire workflow. Each operation appends its results under a unique key ($trigger, $operation_key), making that data available to subsequent operations.
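To make that more tangible, here's a hedged sketch of what the data chain might contain mid-flow, assuming a webhook trigger and a read operation keyed read_tenant (all keys and values here are illustrative):

```typescript
// Illustrative only — real keys depend on how you name your operations.
const dataChain = {
  $trigger: {
    body: { tenant_email: "sam@example.com", issue: "Boiler not heating" },
  },
  read_tenant: {
    id: 7,
    email: "sam@example.com",
    property: { id: 42, address: "12 Example Street, Leeds" },
  },
  // Subsequent operations reference these values with mustache syntax,
  // e.g. {{$trigger.body.issue}}
};
```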
Example: Support Ticket Creation
Let's examine the support ticket flow from the video in detail:
- Webhook Trigger: Creates a unique URL for the flow. When your frontend calls this URL (from n8n or directly), the flow starts. Any data sent in the request body becomes available in $trigger.
- Condition Operation: Checks if the incoming request contains valid tenant credentials. Uses filter rules to validate the data structure.
- Read Data Operation: If the tenant is valid, queries the tenants collection to retrieve their full details including property relationships.
- Create Item Operation: Creates a new support ticket in the tickets collection, linking it to the tenant and their property through relationships.
- Webhook Request Operation: Calls back to your n8n instance with the ticket details, triggering the next phase of your automation (sending confirmation emails, notifying maintenance staff, etc).
Each operation has access to data from previous steps through the data chain. If operation #3 returned tenant data, operation #4 can reference it with {{$operation3.email}} or {{$operation3.property.address}}.
Connecting Flows to n8n
The webhook operation in Flows provides the bridge to n8n. Configure it with:
- Method: Usually POST for sending data
- URL: Your n8n webhook URL
- Headers: Authentication tokens if needed
- Body: JSON payload with data from the flow
Your n8n workflow receives this webhook, processes it through whatever logic you've built (AI agents, external APIs, database operations), and can optionally send a response back to Directus if the flow is configured for synchronous operation.
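As an illustrative sketch, the Body field of that webhook operation might look something like this, using data chain placeholders (the keys are assumptions for this example):

```json
{
  "event": "ticket_created",
  "ticket_id": "{{$last.id}}",
  "tenant_email": "{{$trigger.body.email}}",
  "property_id": "{{$trigger.body.property_id}}"
}
```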
n8n Workflow Architecture (Brief Overview)
The n8n side of this architecture deserves its own deep dive, but briefly: your workflow starts with a webhook trigger that receives data from the chat widget. This data includes the page context, user input, and route information.
An AI Agent node processes the input with access to specialised tools. These tools query your Directus API (using standard HTTP requests or the Directus SDK) to retrieve property information, create leads, update tickets, and more.
The key insight: context-aware routing in the widget tells the AI agent which tools are relevant for the current interaction. A user on the properties page gets property-related tools. A tenant reporting maintenance gets ticket creation tools. The agent doesn't need to guess; the route tells it explicitly.
Production Considerations
Building a demo is straightforward. Running it in production requires addressing several concerns:
Security
2FA Implementation: The video demonstrates AI-generated verification codes. In production, keep verification codes outside the AI's access entirely. Store them in your database with a short expiration time. When the user submits a code, validate it programmatically and send only a boolean flag back to the AI (verified: true/false).
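A minimal sketch of that pattern, with a hypothetical storage helper standing in for wherever you keep the codes (a dedicated Directus collection works well):

```typescript
type StoredCode = { code: string; expiresAt: number };

// Hypothetical lookup — replace with your own Directus or database query.
declare function getStoredCode(tenantId: number): Promise<StoredCode | null>;

async function verifyCode(tenantId: number, submitted: string): Promise<boolean> {
  const stored = await getStoredCode(tenantId);
  if (!stored) return false;
  const valid = stored.code === submitted && Date.now() < stored.expiresAt;
  // The AI agent only ever receives this boolean, never the code itself.
  return valid;
}
```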
API Authentication: Use Directus access tokens with appropriate scopes. Create separate tokens for different operations and services. Don't use admin tokens in client-facing applications.
Rate Limiting: Implement rate limiting on your webhook endpoints to prevent abuse. Railway and most hosting platforms provide built-in rate limiting, but verify it's configured appropriately.
Performance
Caching: Enable Redis caching in Directus. Set appropriate cache durations for collections that change infrequently (FAQ content, feature lists) versus frequently (property availability, ticket statuses).
Database Indexing: Add database indexes on fields you query frequently. Property searches by location or price benefit massively from proper indexing.
Webhook Optimisation: For high-traffic implementations, make your webhook responses asynchronous. Acknowledge receipt immediately and process the request in a background job.
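Here's a minimal sketch of that acknowledge-then-process pattern using an Express-style handler; the queue helper is hypothetical and stands in for whatever background mechanism you use:

```typescript
import express from "express";

const app = express();
app.use(express.json());

app.post("/webhook/ticket", (req, res) => {
  // Respond immediately so the caller isn't held open while the work happens.
  res.status(202).json({ received: true });
  // Hand the payload off to a background job (hypothetical helper).
  processTicketInBackground(req.body).catch(console.error);
});

// Hypothetical worker — push to a queue, call n8n, write to Directus, etc.
async function processTicketInBackground(payload: unknown): Promise<void> {
  // ...
}

app.listen(3000);
```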
Error Handling
Flow Error Catching: Use the Throw Error operation in Flows to handle exceptional cases gracefully. Return meaningful error messages that help users understand what went wrong.
n8n Workflow Errors: Configure error workflows in n8n that catch failures and log them appropriately. Consider using a dedicated error collection in Directus to track automation failures.
API Timeout Handling: Set reasonable timeouts for all external API calls. AI agent responses can take several seconds, so design your UX to handle this wait gracefully.
Your Implementation Checklist
Ready to build this yourself? Here's the suggested sequence:
- Deploy Directus: Choose your hosting approach and get a Directus instance running. Railway provides the fastest path to a working system.
- Design Your Data Model: Map out your collections and relationships before building. Use pen and paper or a tool like dbdiagram.io to visualise the structure.
- Enable MCP: Set up the MCP server and connect it to Claude or Cursor. Use AI to help build your schema rather than clicking through the UI manually.
- Build Core Collections: Create your primary collections (properties, tenants, tickets) with all necessary fields and relationships.
- Test Your API: Make test requests to your Directus API to verify everything works. The built-in API documentation at /admin/docs helps immensely.
- Configure Basic Flows: Start with simple flows like "send notification when ticket created" before building complex multi-step automations.
- Set Up n8n: Deploy n8n and create a basic webhook workflow that successfully receives data from your chat widget.
- Integrate the Chat Widget: Deploy the Chat Widget Designer widget on your test site and configure your routes.
- Build AI Agent Tools: Create n8n tools that query your Directus API for relevant data based on the widget's context.
- Iterate and Refine: Test with real users. Monitor your Flow logs in Directus and workflow execution history in n8n to identify bottlenecks and errors.
Common Pitfalls to Avoid
Over-engineering relationships: Start simple. You can always add complexity later. A flat structure with fewer relationships is easier to understand and debug than a highly normalised schema you don't yet need.
Ignoring permissions: Directus's permission system is powerful but requires configuration. Don't leave it as an afterthought. Design your roles and permissions early.
Forgetting file storage: Railway and most container platforms have ephemeral filesystems. Configure S3 or another storage service before you start uploading files, or you'll lose everything on the next deployment.
Misunderstanding the data chain: In Flows, take time to understand how data moves through operations. Use the Log operation liberally during development to see what's actually in your data chain at each step.
Hardcoding values: Use environment variables for URLs, tokens, and configuration that differs between development and production environments.
Where This Approach Excels
The architecture demonstrated in the video isn't specific to property management. The pattern works brilliantly for:
- SaaS customer support: Context-aware help based on which feature the user is viewing
- E-commerce: Intelligent product recommendations and inventory queries that understand the current product page
- Educational platforms: Course-specific assistance that knows exactly which lesson the student is on
- Healthcare: Patient enquiries that understand appointment context and medical history (with appropriate security)
- Internal tools: Company wikis and knowledge bases that provide relevant information based on the current document
The key is context awareness. When your automation system knows what the user is looking at, it can respond intelligently without forcing the user to re-explain their situation.
Further Resources
Directus maintains excellent documentation covering everything discussed here, from deployment through to the data model, Flows, and the MCP server.
The Chat Widget Designer offers both paid and free versions, with the free version providing enough functionality to build a working prototype.
This architecture combines modern headless CMS capabilities with intelligent automation to create systems that feel genuinely helpful rather than frustratingly generic. The effort invested in proper setup pays dividends in maintenance, scalability, and user experience.
