Uxopian Blog

ECM, AI and standardization

Written by Alain Escaffre | Jan 27, 2025 1:34:01 PM

This conversation began with a thought-provoking LinkedIn post by my former colleague at Hyland, Moun Ndyaye, EMEA Director of Ecosystem Sales and Programs. I encourage you to follow Moun1 if you don't already; he always tries to shake things up and bring great ideas! This time, he was wondering whether the CMIS standard needs revisiting in today's AI-driven content management world. During the discussion, Angel (another former colleague!) mentioned Anthropic's new Model Context Protocol (MCP), suggesting it might already address the challenges we were debating.

After reading the LinkedIn exchange, a few questions came to mind:

  • Does content management really need another standard?
  • What exactly is MCP?
  • Is it fair to compare CMIS and MCP, or are they fundamentally different?

About CMIS and the Need for Standards

CMIS was created to standardize access to content repositories, much like JDBC did for databases. Designed 15 years ago, its goal was to enable developers to build solutions compatible with multiple content management vendors by coding once.

It achieved some successes—several vendors used CMIS to build connectors and integrations to fetch content. However, it also had notable failures. Many business solutions require workflows and processes that CMIS doesn’t address, making it hard to create solutions truly compatible with various ECM repositories.

Fast-forward to today: even outside the AI conversation, how necessary is a standard like CMIS? My guess: integration costs are plummeting. Generative AI tools like ChatGPT and Claude can already produce advanced code just by "reading" documentation. (Fun fact: they can also do the reverse, writing documentation by analyzing code!) Integration tasks, such as writing connector code, rarely involve rocket science or groundbreaking architectures. That makes them prime candidates for automation by GenAI.

In just a few years, we may see systems that dynamically read Swagger files and auto-generate the code needed to perform searches and CRUD operations via native APIs. (Side note: this could be a great feature for the roadmap of our Fast2 content migration orchestrator!)
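As a toy illustration of what such automation could build on, here is a minimal sketch that walks an OpenAPI (Swagger) description and lists the operations a generated connector would need to cover. The spec fragment and its operation names are invented for the example, not a real Fast2 or vendor API.

```python
import json

# A made-up fragment of an OpenAPI (Swagger) spec for a fictional content repository.
OPENAPI_SPEC = json.loads("""
{
  "paths": {
    "/documents": {
      "get":  {"operationId": "searchDocuments", "summary": "Search documents"},
      "post": {"operationId": "createDocument",  "summary": "Create a document"}
    },
    "/documents/{id}": {
      "get":    {"operationId": "getDocument",    "summary": "Fetch a document"},
      "delete": {"operationId": "deleteDocument", "summary": "Delete a document"}
    }
  }
}
""")

def list_operations(spec):
    """Enumerate (HTTP method, path, operationId) triples a generated connector must cover."""
    ops = []
    for path, methods in spec["paths"].items():
        for method, details in methods.items():
            ops.append((method.upper(), path, details["operationId"]))
    return ops

for method, path, op_id in list_operations(OPENAPI_SPEC):
    print(f"{method:6} {path:18} -> {op_id}")
```

A code-generation step would then turn each triple into a native API call; the point is that everything a connector needs is already machine-readable.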

So my answer to my own question is this: as GenAI dramatically reduces integration costs, multivendor standardized APIs like CMIS will become less essential. Native APIs will generally offer better performance and value anyway.

Back to MCP

After diving deeper into MCP, it quickly became clear that it isn’t comparable to CMIS. MCP isn’t a standardized API for accessing data or content repositories like CMIS. Instead, it’s an open protocol designed to enable LLMs to make API calls autonomously. This allows the LLM to provide more complete responses by interacting with external systems.

For example, when I asked ChatGPT:
"What will the weather be the day after tomorrow in Marseille?"
It responded:
"I’m sorry, but I don’t have real-time weather data or the ability to fetch up-to-date forecasts."

Here’s where MCP shines. The protocol would allow The Weather Channel to deploy a “tool” for Claude Desktop (a desktop wrapper for Claude LLM) to fetch weather data from its site. The tool, described in plain text with its methods clearly outlined, enables the MCP Host layer to identify it as relevant based on the user’s prompt. The LLM could then call the API, retrieve the temperature for Marseille, and enrich its response. (Hint: the temperature in Marseille is always lovely!)
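Concretely, an MCP server advertises each tool as a name, a human-readable description, and a JSON Schema for its inputs, and the host matches those descriptions against the conversation. Here is what the descriptor of such a weather tool might look like, shown as Python dicts; the structure (name / description / inputSchema, tools/call) follows MCP, but the tool itself is invented for illustration:

```python
# Hypothetical descriptor a weather MCP server could return in a "tools/list" response.
# The field names (name, description, inputSchema) follow the MCP tool declaration shape.
weather_tool = {
    "name": "get_forecast",
    "description": "Fetch the weather forecast for a city on a given date.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name, e.g. Marseille"},
            "date": {"type": "string", "description": "ISO date, e.g. 2025-01-29"},
        },
        "required": ["city"],
    },
}

# Having judged the tool relevant to the prompt, the host invokes it
# through a "tools/call" request:
call_request = {
    "method": "tools/call",
    "params": {
        "name": "get_forecast",
        "arguments": {"city": "Marseille", "date": "2025-01-29"},
    },
}
```

The LLM never parses the weather site itself; it only sees the tool's description and the structured result of the call.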

With MCP, the possibilities extend far beyond weather data. Tools could fetch information from databases, Slack, GitHub, or even content repositories, provided an MCP server is in place. The magic is that you can declare multiple tools and let the host determine on its own which ones to use based on the prompt. Even better, it can combine them: in the end it becomes possible to grab information from one place, process it, and send it somewhere else, all in natural language. Welcome to the agentic AI era!
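To make the "declare several tools and let the host combine them" idea tangible, here is a deliberately naive sketch: tools are plain Python functions paired with descriptions, and a toy "host" chains a fetch tool into a send tool. A real MCP host uses the LLM itself to pick and sequence tools from their descriptions; everything here (tool names, data) is invented for illustration.

```python
# Toy tool registry: each entry pairs a description with a plain function.
# In real MCP, these would live behind one or more MCP servers.
TOOLS = {
    "fetch_record": ("Fetch a record from the database",
                     lambda key: {"key": key, "status": "open"}),
    "post_message": ("Post a message to a chat channel",
                     lambda text: f"posted: {text}"),
}

def toy_host(record_key):
    """Chain two tools: pull information from one system, push it to another."""
    _, fetch = TOOLS["fetch_record"]
    _, post = TOOLS["post_message"]
    record = fetch(record_key)  # step 1: grab the information
    return post(f"record {record['key']} is {record['status']}")  # step 2: send it on

print(toy_host("case-42"))  # -> posted: record case-42 is open
```

The interesting part in real MCP is that the chaining plan is not hardcoded as it is here; the host derives it from the prompt and the tool descriptions.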

MCP in the Uxopian Context

For Uxopian Software, implementing an MCP server could enable natural language access to Flowerdocs repositories. Users could search, fetch content, or perform CRUD operations directly through an LLM's messaging interface.

But MCP isn’t limited to data retrieval. It supports non-idempotent operations that modify the external world. Imagine an MCP server for eProcess:

  • A customer could send a case to a department by typing a natural language instruction.
  • A validation workflow could be initiated with a simple command.
  • With Flowerdocs and ARender, you could redact a document and send a PDF version with a single request.
  • Or, you could even design your repository—structure, content types, and metadata—using natural language.
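None of these tools exist today, but to sketch what an eProcess / Flowerdocs MCP server might advertise for the scenarios above, here is a hypothetical tool list in the MCP declaration shape. Only the structure (name / description / inputSchema) follows the protocol; every tool name and parameter is invented.

```python
# Entirely hypothetical tool declarations for an eProcess / Flowerdocs MCP server.
eprocess_tools = [
    {
        "name": "route_case",
        "description": "Send a case to a department for handling.",
        "inputSchema": {
            "type": "object",
            "properties": {
                "case_id": {"type": "string"},
                "department": {"type": "string"},
            },
            "required": ["case_id", "department"],
        },
    },
    {
        "name": "start_validation_workflow",
        "description": "Initiate a validation workflow on a document.",
        "inputSchema": {
            "type": "object",
            "properties": {"document_id": {"type": "string"}},
            "required": ["document_id"],
        },
    },
    {
        "name": "redact_and_send_pdf",
        "description": "Redact a document with ARender and send it as a PDF.",
        "inputSchema": {
            "type": "object",
            "properties": {
                "document_id": {"type": "string"},
                "recipient": {"type": "string"},
            },
            "required": ["document_id", "recipient"],
        },
    },
]
```

A user typing "redact contract 123 and send the PDF to legal" would never see these schemas; the host would map the sentence onto `redact_and_send_pdf` and fill in the arguments.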

The potential is exciting! For now, MCP's adoption is limited by the fact that only Claude Desktop implements the "host" layer of the protocol. We could already imagine using it to create new connectors for ARender, or to configure Flowerdocs, in natural language. However, MCP is gaining traction and evolving rapidly, so I'm optimistic about what's to come. We'll keep an eye on this space and look for opportunities to experiment, hopefully with some of our customers!

🔗 For more context, you can read the original LinkedIn discussion that inspired this article.

1LinkedIn Profile of Moun Ndyaye