TIA Intelligence Agency
An experimental XMPP (Jabber) agent framework that combines chat, Lingue/IBIS structured dialogue, and MCP tool integrations into a modular Node.js codebase.
This codebase contains a whole community of agents, but the core of the framework can be used (via the tia-agents npm package) to create individual agents on any system. See TIA Agent for an example.
Status 2025-12-28: we have a bunch of autonomous agents that can debate how to solve a problem, run a planning poll to pick an approach, and then invoke Model-First Reasoning or consensus workflows. The system is quite chaotic, but the end-to-end process is working.
- Live Chat - or use a standard XMPP client
- Documentation
- Lingue Protocol
Question: Schedule appointments for patients. Alice takes warfarin, Bob takes aspirin. Ensure no drug interactions.
...a lot of chat later...
TIA is a set of composable building blocks for creating conversational agents that can:
- Participate in XMPP multi-user chats and direct messages.
- Negotiate Lingue language modes and exchange structured payloads.
- Act as MCP clients (discovering tools/resources from servers).
- Act as MCP servers (exposing chat and Lingue tools to external clients).
The design goal is a clean, library-ready architecture that supports both deployable bots and reusable modules.
- XMPP room agents: long-running bots anchored in MUC rooms.
- Lingue protocol: language-mode negotiation + structured payloads (IBIS, Prolog, profiles).
- MCP bridges: MCP client and server adapters for tool discovery and exposure.
- Profiles (RDF): agent capabilities live in RDF profiles with shared vocabularies (Mistral variants inherit from mistral-base).
TIA is published as tia-agents on npm. The package provides the core framework for building XMPP agents without bundling specific LLM implementations.
The framework provides:
- Core agent machinery (AgentRunner, createSimpleAgent)
- Base classes for building providers (BaseProvider, BaseLLMProvider)
- Profile loading from RDF/Turtle files
- XMPP utilities (auto-registration, room management)
- History stores (InMemoryHistoryStore)
- Lingue protocol support
- MCP integration
LLM API access is handled through hyperdata-clients, which provides unified interfaces for Mistral, Groq, Claude, OpenAI, Ollama, and more.
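As a minimal sketch (reusing only the constructor that appears in the provider example later in this README, and assuming MISTRAL_API_KEY is set in your environment), obtaining a client looks like this:

```js
// Sketch: create an LLM client with hyperdata-clients.
// The tia-agents provider classes (see the BaseLLMProvider example below)
// call such a client to complete chat requests on an agent's behalf.
import { Mistral } from "hyperdata-clients";

const client = new Mistral({ apiKey: process.env.MISTRAL_API_KEY });
```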
For example usage, see TIA Agent.
Documentation:
- 📚 Quick Start Guide - Detailed getting started
- 🔧 Provider Guide - Creating custom providers
- 📖 API Reference - Complete API docs
- 🌐 MCP HTTP Server - Streamable HTTP endpoint setup
- 📁 Templates - Example configurations
For contributing to TIA or running the full test suite with all agents:
git clone https://github.com/danja/tia.git
cd tia
npm install

Configure .env (see .env.example) and config/agents/secrets.json for XMPP passwords.
If you want multiple instances of the same agent type, enable auto-suffixing (XMPP_AUTO_SUFFIX_USERNAME=1)
to auto-register mistral1, mistral2, etc. when the base username is taken.
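As a sketch, a minimal .env using only the variables mentioned in this README might look like this (take the names of any other keys from .env.example):

```
# .env (sketch - see .env.example for the full list)
MISTRAL_API_KEY=your-mistral-api-key
# Shared log room every agent should join (create the room on the server first)
LOG_ROOM_JID=log@conference.tensegrity.it
# Optional: auto-register mistral1, mistral2, ... when the base username is taken
XMPP_AUTO_SUFFIX_USERNAME=1
```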
See the Agent Startup Guide for complete installation and configuration instructions.
Use the shared Prosody testbed at tensegrity.it to connect with any standard XMPP client. You will first have to register; it only needs a simple username and password.
Connection details:
- XMPP service: xmpp://tensegrity.it:5222
- Domain: tensegrity.it
- MUC service: conference.tensegrity.it
- TLS: self-signed (set NODE_TLS_REJECT_UNAUTHORIZED=0 for CLI tools; in GUI clients accept the certificate)
Rooms to join:
- General: general@conference.tensegrity.it
- Log room: log@conference.tensegrity.it
If your client supports it, set a distinct resource or nickname to avoid collisions.
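For a quick command-line check against the testbed, the bundled REPL client (covered in more detail below) can be pointed at it; this assumes you have already registered a username and password:

```
NODE_TLS_REJECT_UNAUTHORIZED=0 node src/client/repl.js <username> <password>
```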
- Coordinator — MFR (Model-First Reasoning) orchestrator for multi-agent problem solving.
- Mistral — AI chat agent backed by the Mistral API with Lingue/IBIS summaries (see mistral-analyst, mistral-creative profiles).
- GroqBot — AI chat agent backed by the Groq API (llama-3.3-70b-versatile) with the same capabilities as Mistral.
- Golem — Malleable AI agent with runtime system prompt changes. Can be assigned logic-focused roles during planning. Guide
- Semem — MCP-backed knowledge agent for tell/ask/augment flows.
- MFR Semantic — Constraint-focused agent for MFR model construction.
- Data — SPARQL knowledge query agent for Wikidata, DBpedia, and custom endpoints. Guide
- Demo — Minimal chat bot for quick XMPP smoke checks.
- Chair — Debate facilitator/Moderator agent.
- Recorder — Meeting logger/recorder agent that listens broadly.
- Prolog — Logic agent using tau-prolog for queries.
- Executor — Plan execution agent that converts high-level plans into Prolog programs.
- MCP Loopback — MCP client/server echo agent for integration tests.
- src/agents — AgentRunner, providers, and profile system.
- src/lib — XMPP helpers, Lingue utilities, logging, RDF tools.
- src/mcp — MCP client/server bridges and test servers.
- config/agents/*.ttl — RDF profiles describing each agent.
- config/agents/secrets.json — local XMPP passwords keyed by profile (ignored in git).
- docs/ — integration guides and operational docs.
The start-all.sh script provides a unified way to start agents or specific subsets. By default it starts the MFR suite.
# Start the default MFR suite (same as `./start-all.sh mfr`)
./start-all.sh
# Start MFR (Model-First Reasoning) system
./start-all.sh mfr
# Start debate system
./start-all.sh debate
# Start basic agents
./start-all.sh basic
# Custom agent selection
AGENTS=mistral,data,prolog ./start-all.sh
# Get help
./start-all.sh help

Prerequisites:
- Configure the .env file with API keys (see .env.example)
- Create config/agents/secrets.json with XMPP passwords (a sketch is shown below)
- For the MFR system: configure Prosody MUC rooms (see MFR Room Setup)
- Ensure a log room exists (set LOG_ROOM_JID explicitly for all agents and create it on the server)
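A minimal config/agents/secrets.json sketch, assuming a flat JSON map from each profile's passwordKey to its XMPP password (the profile names here are examples and the passwords are placeholders):

```json
{
  "coordinator": "replace-with-xmpp-password",
  "mistral": "replace-with-xmpp-password",
  "data": "replace-with-xmpp-password"
}
```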
Agent Presets:
- mfr - MFR system (full suite): coordinator, mistral, analyst, creative, chair, recorder, mfr-semantic, data, prolog, demo
- debate - Debate system: chair, recorder, mistral, analyst, creative
- basic - Basic agents: mistral, data, prolog, demo
The script automatically:
- Loads the .env file
- Checks for required API keys
- Skips agents with missing credentials
- Provides restart on crash
- Handles graceful shutdown (SIGTERM/SIGINT)
Once agents are running, you can interact with them directly from a chatroom using the REPL client:
# Connect to the chatroom
NODE_TLS_REJECT_UNAUTHORIZED=0 node src/client/repl.js <username> <password>

Once connected to the chatroom, you can pose problems to the MFR system:
# Start a new MFR session
mfr-start Schedule appointments for patients. Alice takes warfarin, Bob takes aspirin. Ensure no drug interactions.
# Start a debate-driven MFR session (tool selection via Chair)
debate Optimize delivery routes for 3 trucks serving 10 locations.
# Shorthand for debate-driven sessions
Q: Optimize delivery routes for 3 trucks serving 10 locations.
# Check session status
mfr-status <sessionId>
# List active sessions
mfr-list
# Get help
help
Debate mode is enabled by default in config/agents/coordinator.ttl. Q: triggers a planning poll to decide between logic/consensus/Golem logic routes.
Short command versions:
- start instead of mfr-start
- status instead of mfr-status
- list instead of mfr-list
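For example, the mfr-start request above can be issued with the shorthand form:

```
start Schedule appointments for patients. Alice takes warfarin, Bob takes aspirin. Ensure no drug interactions.
```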
Other MFR commands:
- mfr-contribute <sessionId> <rdf> - Submit a contribution manually
- mfr-validate <sessionId> - Validate a model
- mfr-solve <sessionId> - Request solutions
- debate <problem description> - Start a debate-driven MFR session
You can also run MFR sessions programmatically:
node src/examples/run-mfr-session.js "Your problem description here"

This will automatically connect, start a session, wait for the solution, and display the results.

A minimal agent built with the tia-agents npm package looks like this:
import { AgentRunner, LingueNegotiator, LINGUE, Handlers, InMemoryHistoryStore } from "tia-agents";
const negotiator = new LingueNegotiator({
profile,
handlers: {
[LINGUE.LANGUAGE_MODES.HUMAN_CHAT]: new Handlers.HumanChatHandler()
}
});
const runner = new AgentRunner({
profile,
provider,
negotiator,
historyStore: new InMemoryHistoryStore({ maxEntries: 40 })
});
await runner.start();

See examples/minimal-agent.js for a runnable local example.
For more control, you can use the core classes directly:
Create profile files and use the factory function:
import { createAgent, DemoProvider } from "tia-agents";
// Load from config/agents/mybot.ttl
const runner = await createAgent("mybot", new DemoProvider());
await runner.start();

Profile file (config/agents/mybot.ttl):
@prefix agent: <https://tensegrity.it/vocab/agent#> .
@prefix xmpp: <https://tensegrity.it/vocab/xmpp#> .
<#mybot> a agent:ConversationalAgent ;
agent:xmppAccount [
xmpp:service "xmpp://localhost:5222" ;
xmpp:domain "xmpp" ;
xmpp:username "mybot" ;
xmpp:passwordKey "mybot"
] ;
agent:roomJid "general@conference.xmpp" .

Configure everything in code:
import { createSimpleAgent, DemoProvider } from "tia-agents";
const runner = createSimpleAgent({
xmppConfig: {
service: "xmpp://localhost:5222",
domain: "xmpp",
username: "mybot",
password: "secret"
},
roomJid: "general@conference.xmpp",
nickname: "MyBot",
provider: new DemoProvider()
});
await runner.start();

Extend BaseProvider to implement your own logic:
import { BaseProvider } from "tia-agents";
class MyProvider extends BaseProvider {
async handle({ command, content, metadata }) {
if (command !== "chat") return null;
return `You said: ${content}`;
}
}
const runner = createSimpleAgent({
// ... config
provider: new MyProvider()
});

Use BaseLLMProvider as a base class and hyperdata-clients for the API client:
import { BaseLLMProvider, createSimpleAgent, InMemoryHistoryStore } from "tia-agents";
import { Mistral } from "hyperdata-clients";
class MyMistralProvider extends BaseLLMProvider {
initializeClient(apiKey) {
return new Mistral({ apiKey });
}
async completeChatRequest({ messages, maxTokens, temperature }) {
return await this.client.client.chat.complete({
model: this.model,
messages,
maxTokens,
temperature
});
}
extractResponseText(response) {
return response.choices[0]?.message?.content?.trim() || null;
}
}
const provider = new MyMistralProvider({
apiKey: process.env.MISTRAL_API_KEY,
model: "mistral-small-latest",
historyStore: new InMemoryHistoryStore({ maxEntries: 40 })
});
const runner = await createSimpleAgent({
xmppConfig: { /* ... */ },
roomJid: "general@conference.xmpp",
nickname: "MyBot",
provider
});
await runner.start();

See mistral-minimal/mistral-provider.js for a complete working example.
Copy templates to get started:
cp -r node_modules/tia-agents/templates/* ./

Or use the mistral-minimal/ example as a starting point.
For a fuller walkthrough and profile-driven setup, see:
Quick start - see Agent Startup Guide for complete instructions.
Additional documentation:
- Agent Startup Guide - Main guide for starting agents
- Testing
- Server
- MCP Server
- MFR Room Setup - Requires Prosody MUC configuration for persistent rooms


