Graphonomous is a continual learning memory engine for AI agents. It stores knowledge as a graph, routes reasoning via the κ invariant, and now includes tier-aware deliberation plus proactive attention loops for bounded autonomy.
v0.1.12 · Phase 2: Deliberator + Attention + Model-Tier · Elixir/OTP · MCP Server · Apache 2.0

The system analyzed a 4-node business cycle, routed reasoning depth automatically, and now supports both topology-aware deliberation and proactive attention cycles with model-tier adaptation.
routing: fast
max_kappa: 0
action: Single-pass retrieval. No deliberation needed.

routing: deliberate
max_kappa: 1
scc_count: 1
fault_line: Product Quality → Market Share
budget: max_iterations: 2, agents: 1, confidence: 0.75
{
  "routing": "deliberate",
  "max_kappa": 1,
  "scc_count": 1,
  "sccs": [{
    "id": "scc-0",
    "nodes": ["market-share", "revenue", "r-and-d", "product-quality"],
    "kappa": 1,
    "approximate": false,
    "fault_line_edges": [{
      "source": "product-quality",
      "target": "market-share"
    }],
    "routing": "deliberate",
    "deliberation_budget": {
      "max_iterations": 2,
      "agent_count": 1,
      "timeout_multiplier": 1.5,
      "confidence_threshold": 0.75
    }
  }],
  "dag_nodes": []
}
Live result from the Graphonomous MCP server. The system detected a circular dependency between market share, revenue, R&D, and product quality, and identified the exact edge (Product Quality → Market Share) where the feedback loop is weakest. No other agent memory system does this.
Agents store episodic, semantic, and procedural knowledge as typed graph nodes with confidence scores and provenance. Edges capture causal, temporal, and associative relationships.
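As a minimal sketch of that node and edge shape, in Python for illustration (the field names below are assumptions, not the actual Graphonomous schema):

```python
from dataclasses import dataclass, field

# Illustrative knowledge-node shape; field names are assumed,
# not taken from the Graphonomous schema.
@dataclass
class KnowledgeNode:
    id: str
    kind: str              # "episodic" | "semantic" | "procedural"
    content: str
    confidence: float      # 0.0 - 1.0
    provenance: list = field(default_factory=list)  # where this knowledge came from

# Edges capture the relationship type between two nodes.
@dataclass
class Edge:
    source: str
    target: str
    relation: str          # "causal" | "temporal" | "associative"
```

Typed nodes plus typed edges are what make the topology analysis below possible: causal edges in particular are what form the feedback loops the system detects.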
On every retrieval, Graphonomous computes the topological structure of the relevant subgraph. Tarjan's SCC algorithm detects circular dependencies. The κ invariant measures entanglement depth.
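The cycle-detection step can be illustrated with a standard Tarjan's SCC implementation over the four-node business cycle from the example above (edge directions are assumed for illustration):

```python
import sys

def tarjan_sccs(graph):
    """Standard recursive Tarjan's algorithm: returns strongly connected components."""
    index, low = {}, {}
    stack, on_stack = [], set()
    sccs, counter = [], [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, []):
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:          # v is the root of an SCC
            scc = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                scc.append(w)
                if w == v:
                    break
            sccs.append(scc)

    for v in graph:
        if v not in index:
            strongconnect(v)
    return sccs

# The 4-node business cycle from the example above (edge directions assumed):
graph = {
    "market-share": ["revenue"],
    "revenue": ["r-and-d"],
    "r-and-d": ["product-quality"],
    "product-quality": ["market-share"],
}
cycles = [s for s in tarjan_sccs(graph) if len(s) > 1]
print(cycles)  # one SCC containing all four nodes
```

Tarjan runs in O(V + E), so this analysis is cheap enough to perform on every retrieval.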
κ = 0 routes to fast retrieval. On constrained model tiers, low-κ regions can be enriched without full deliberation. Higher-κ regions route to deliberate: decompose along fault lines, reconcile, and write conclusions back. Attention then prioritizes what to do next under autonomy and budget controls.
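That routing rule can be sketched as follows. The budget scaling is an assumption inferred from the example output earlier in this document (κ = 1 produced max_iterations 2, one agent, threshold 0.75); the actual κ computation is defined by the OS protocols and not reproduced here.

```python
def route(max_kappa: int) -> dict:
    """Hypothetical routing policy mirroring the outputs shown above.

    Assumed scaling: iterations and agents grow with kappa; the real
    policy lives in the Graphonomous deliberator, not here.
    """
    if max_kappa == 0:
        # No entanglement: single-pass retrieval, no deliberation.
        return {"routing": "fast"}
    return {
        "routing": "deliberate",
        "deliberation_budget": {
            "max_iterations": 2 * max_kappa,
            "agent_count": max(1, max_kappa),
            "timeout_multiplier": 1.5,
            "confidence_threshold": 0.75,
        },
    }
```

For the κ = 1 business cycle above, this sketch reproduces the budget shown in the JSON output.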
| Tool | Description |
|---|---|
| store_node | Persist knowledge nodes with type, confidence, metadata |
| retrieve_context | Semantic search + neighborhood expansion + topology annotations + tier-aware enrichment |
| analyze_topology | Compute SCCs, κ values, routing decision, fault-line edges |
| deliberate | κ-driven focused reasoning over cyclic regions with optional crystallization |
| attention_survey | Ranked attention map across goals, coverage, and topology signals |
| attention_run_cycle | Trigger one survey/triage/dispatch attention cycle with autonomy override |
| learn_from_outcome | Update confidence across causal chains from grounded outcomes |
| query_graph | List, filter, similarity search across the graph |
| manage_goal | Goal lifecycle — create, transition, link nodes |
| review_goal | Inspect goal state, coverage, progress |
| run_consolidation | Trigger decay, prune, merge cycle |
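As a rough illustration, an MCP client invokes one of these tools via a JSON-RPC `tools/call` request. The argument names below are hypothetical, not the actual Graphonomous tool schema:

```python
import json

# Hypothetical MCP tools/call request for analyze_topology.
# The "arguments" keys are illustrative assumptions, not the real schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "analyze_topology",
        "arguments": {"query": "business cycle drivers"},
    },
}
wire = json.dumps(request)  # what goes over the MCP transport
print(wire)
```

The server's response to such a call is the kind of JSON payload shown earlier: routing decision, SCCs, κ values, and fault-line edges.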
Every agent memory system retrieves context. Graphonomous is the only one that tells you the shape of what you retrieved.
Graphonomous computes κ — a proved graph-theoretic invariant — on every retrieval. When your knowledge has feedback loops, the system tells you exactly where they are and how to reason through them.
The κ invariant has been verified against 1,926,351 finite systems with zero counterexamples. The proof is browser-runnable at opensentience.org.
The theoretical foundations, deliberation protocol, attention engine, and governance model are published as open research protocols OS-001 through OS-006.