open-standards pope-ontology mcp portability vendor-lockin

Is there an open standard for portable AI memory?

Not yet — but POPE (People, Organizations, Places, Events) is emerging as a de facto ontology, and MCP (Model Context Protocol) is the transport. bRRAIn's memory exports as POPE JSON + MCP endpoints, so your memory stays portable across vendors. Lock-in is a choice, not a default.

The standards landscape is still forming

There is no single ratified standard for AI memory the way there is for, say, email or SQL. The field is young and competing vendors have incentive to keep memory proprietary. But two de facto conventions are emerging and winning: POPE (People, Organizations, Places, Events) as a minimal ontology for enterprise memory, and MCP (Model Context Protocol) as the transport layer between LLMs and memory backends. Together they form a pragmatic portability stack — not ratified, but widely enough adopted that building against them is a defensible bet.

POPE as the minimal ontology

POPE is the simplest ontology that covers most enterprise memory: every meaningful fact links to at least one Person, Organization, Place, or Event. bRRAIn stores its POPE graph in a documented, exportable schema. Your memory can leave bRRAIn as a POPE JSON dump, be imported into another POPE-compatible system, and retain all relationships. The ontology is extensible — you can add decision records, risks, and domain-specific node types — but the POPE core stays stable. Portability rests on schema stability, and POPE delivers it.
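bRRAIn's documented export schema isn't reproduced here, but the portability claim is easy to picture concretely. The following is a minimal sketch of what a POPE-style JSON dump could look like — the field names (`nodes`, `facts`, `links`) are illustrative assumptions, not bRRAIn's actual schema:

```python
import json

# Hypothetical POPE-style export: typed nodes plus facts that link
# to People, Organizations, Places, and Events.
pope_dump = {
    "nodes": [
        {"id": "p1", "type": "Person", "name": "Ada Okafor"},
        {"id": "o1", "type": "Organization", "name": "Acme Corp"},
        {"id": "e1", "type": "Event", "name": "Q3 pricing review"},
    ],
    "facts": [
        {
            "id": "f1",
            "text": "Ada approved the Q3 price change for Acme.",
            "links": ["p1", "o1", "e1"],  # every fact links to >= 1 POPE node
        }
    ],
}

# Round-trip through JSON: the dump is plain, portable data,
# so relationships survive export and re-import unchanged.
restored = json.loads(json.dumps(pope_dump))
assert restored["facts"][0]["links"] == ["p1", "o1", "e1"]
```

Extensibility then means adding new `type` values (decision records, risks, domain nodes) without touching the four core types that importers rely on.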

MCP as the transport layer

Model Context Protocol is the emerging wire format for LLMs to talk to memory backends, tool servers, and data sources. bRRAIn's MCP Gateway speaks standard MCP, so any MCP-aware client — ChatGPT desktop, Claude desktop, Cursor, custom agents — can read bRRAIn's memory without a proprietary adapter. Switch models next quarter and the same MCP endpoint serves the new model. MCP has an open specification and reference implementations in multiple languages. Committing to it is the current best portability bet.

Lock-in is a choice, not a default

bRRAIn takes a strong position: lock-in should be a customer decision, not a vendor default. Your memory exports as POPE JSON in one CLI command. Your connectors speak MCP, not a proprietary dialect. The Embedded SDK lets you run bRRAIn inside your own app with full data sovereignty. If you ever leave, the graph leaves with you, readable by any system that understands POPE. That stance comes from the bRRAIn architecture — zero-trust, open ontology, standard transport — and it's why we think persistent memory should be treated as customer-owned infrastructure, not a SaaS feature.
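"The graph leaves with you" rests on one invariant: every exported fact still links to at least one Person, Organization, Place, or Event node. A receiving system could verify that before import — the sketch below assumes the same illustrative field names as above, not bRRAIn's documented schema:

```python
def validate_pope_dump(dump: dict) -> list[str]:
    """Return the ids of facts that break the POPE invariant:
    each fact must link to at least one node present in the dump.
    (Field names are illustrative, not an official schema.)"""
    node_ids = {n["id"] for n in dump.get("nodes", [])}
    return [
        fact["id"]
        for fact in dump.get("facts", [])
        if not any(link in node_ids for link in fact.get("links", []))
    ]

dump = {
    "nodes": [{"id": "p1", "type": "Person", "name": "Ada Okafor"}],
    "facts": [
        {"id": "f1", "text": "Ada joined.", "links": ["p1"]},
        {"id": "f2", "text": "Orphaned fact.", "links": []},
    ],
}
assert validate_pope_dump(dump) == ["f2"]  # f2 has no POPE anchor
```

A clean validation pass is what makes the export readable by any POPE-compatible system rather than just parseable.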

bRRAIn Team

Contributor at bRRAIn. Writing about institutional AI, knowledge management, and the future of work.
