AGENT WORLD
An ambient, gamified dashboard for my real MCP stack. Every NPC is an external service plugged in — Asana, Notion, n8n, Linear, Excalidraw, Claude Code. When something happens out there — a task is created, a page is edited, a workflow fires — the matching NPC wakes up and walks to its quarter. The pixel-art town is the dressing; the substance is watching my automation come alive in real time.
9 Sources → 9 Archetypes
Each external tool in my stack maps to a themed NPC archetype. When real activity lands — a task, a page, a workflow run — the matching agent wakes up and walks to its quarter. This is the entire dashboard contract: source → archetype → animation.
From MCP to NPC
The town sits directly on top of my live MCP/automation stack. Whenever real work happens — through an MCP server, Claude Code, n8n, or in a tool itself — the activity cascades through the adapter layer and lands in the world as an NPC reaction.
notion, asana, make, second-brain — the layer where I actually work.
A new Asana task, a Notion page edit, a Drive upload, a Claude Code session.
Express + ws. Up to 6 live integrations (Asana, Notion, n8n, Linear, plus file watchers for Excalidraw and Claude Code); mock adapters take over when env tokens are missing.
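The mock fallback can be sketched as a constructor-time choice — `makeAdapter`, `liveAdapter`, and `mockAdapter` are illustrative names, not the project's actual identifiers:

```typescript
interface Adapter {
  readonly live: boolean;
  poll(): { source: string; type: string }[];
}

// Illustrative live adapter; a real one would call the service API
// with the token (unused in this sketch).
const liveAdapter = (_token: string): Adapter => ({
  live: true,
  poll: () => [],
});

// Mock adapter emits canned events so the town stays alive without keys.
const mockAdapter = (source: string): Adapter => ({
  live: false,
  poll: () => [{ source, type: 'mock.activity' }],
});

// Env token present -> live integration; missing -> mock fallback.
function makeAdapter(source: string, token?: string): Adapter {
  return token ? liveAdapter(token) : mockAdapter(source);
}
```

The point of the pattern: the rest of the server only ever sees the `Adapter` interface, so a missing `ASANA_TOKEN` degrades to mock data instead of a crash.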
Typed events only — source, type, data, timestamp. No game logic on the server.
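That contract can be sketched as a TypeScript type plus a runtime guard — the field names come from the text above; `StackEvent` and `isStackEvent` are illustrative names, not the project's actual identifiers:

```typescript
// Sketch of the wire format: exactly the four fields named above.
interface StackEvent {
  source: string;                 // 'asana', 'notion', 'n8n', ...
  type: string;                   // 'task.created', 'page.edited', ...
  data: Record<string, unknown>;  // raw payload, untouched by the server
  timestamp: string;              // ISO-8601
}

// The server forwards only payloads that fit the shape -- no game logic.
function isStackEvent(v: unknown): v is StackEvent {
  if (typeof v !== 'object' || v === null) return false;
  const e = v as Record<string, unknown>;
  return typeof e.source === 'string'
    && typeof e.type === 'string'
    && typeof e.data === 'object' && e.data !== null
    && typeof e.timestamp === 'string';
}
```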
SOURCE_TO_AGENT maps the event to an idle agent of matching type. assignTask() fires.
The State Machine
Every agent cycles through four states: idle, alert, walking, working. A real event from the stack assigns a task — the NPC shows an exclamation mark overhead, walks to its quarter, plays a speech bubble with the task name, runs a progress bar for the work duration, then returns to idle with a green particle burst. The detour is the dashboard: a five-second story per piece of automation activity. All four transitions live in a single Agent.ts class — 429 lines, no framework.
type AgentState = 'idle' | 'walking' | 'working' | 'alert';
// Event arrives → alert → walk → work → back to idle
assignTask(taskName: string, duration: number, destX: number, destY: number) {
this.pendingTask = { name: taskName, duration };
this.showAlert(); // state: 'alert'
setTimeout(() => this.walkTo(destX, destY), 600); // state: 'walking'
// On arrival: startWork() → state: 'working' + progress bar + bubble
// On finish: completeWork() → state: 'idle' + green particle burst
}

Key Decisions
- Phaser 3 over raw Canvas
Built-in sprite system, physics, and scene management. Raw Canvas would mean reimplementing half a game engine.
- Two-step work: alert → walk → work
NPCs don't snap to "done" the moment a real event arrives. They show an alert, walk to their quarter, THEN work. Without the detour the automation is invisible — with it, every Asana task or n8n run becomes a short visible story. That's the whole point of the dashboard.
- EventDispatcher as a pure SOURCE→ARCHETYPE mapping
9 external sources → 9 archetype agents, declared as a single dictionary in EventDispatcher.ts. The game scene knows nothing about Asana or Notion — it only knows Gildenläufer and Magier. New tools plug into the dashboard by adding one adapter and one line in the map.
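A minimal sketch of that dictionary — only Gildenläufer and Magier appear in the text above, so the pairings shown and the archetype spellings are placeholders:

```typescript
type Archetype = string; // real project uses German archetype names

// One dictionary, one line per tool -- the whole dispatch contract.
const SOURCE_TO_AGENT: Record<string, Archetype> = {
  asana: 'gildenlaeufer',   // assumed pairing, for illustration
  notion: 'magier',         // assumed pairing, for illustration
  // ...the remaining seven sources plug in here, one line each
};

// The scene never sees 'asana' or 'notion', only the archetype id.
function archetypeFor(source: string): Archetype | undefined {
  return SOURCE_TO_AGENT[source];
}
```

Because the map is the only coupling point, adding a tool really is one adapter plus one line here.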
- Polling as default, webhooks as opt-in
AdapterManager polls every 30s — keeps the server self-contained and runs fine as a local desktop experience, no tunnels required. A POST /api/webhook endpoint is exposed too, so n8n or any service that can fire an HTTP call gets near-instant NPC reactions when I want them.
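Both paths can feed the same pipeline. A sketch under assumptions — the 30 s interval and the webhook idea come from the text; the class shape, method names, and `TownEvent` type are illustrative, not the real AdapterManager:

```typescript
type TownEvent = { source: string; type: string };
type Listener = (e: TownEvent) => void;

class AdapterManagerSketch {
  private listeners: Listener[] = [];

  onEvent(l: Listener) { this.listeners.push(l); }

  // Webhook path: a POST /api/webhook handler would call this directly,
  // giving near-instant NPC reactions.
  push(e: TownEvent) { for (const l of this.listeners) l(e); }

  // Polling path: invoke an adapter's poll() on an interval (30 s by
  // default) and funnel whatever it returns into the same pipeline.
  startPolling(poll: () => TownEvent[], intervalMs = 30_000) {
    return setInterval(() => { for (const e of poll()) this.push(e); }, intervalMs);
  }
}
```

Either way, downstream consumers (the ws broadcast, the game scene) see one stream of events and never care which transport delivered them.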