Anthropic releases Model Context Protocol to standardize AI-data integration

One decision many enterprises have to make when implementing AI use cases revolves around connecting their data sources to the models they’re using. 

Frameworks like LangChain exist to integrate databases, but developers must write new code every time they connect a model to a new data source. Anthropic aims to change that paradigm by releasing what it hopes will become a standard for data integration. 

Anthropic released its Model Context Protocol (MCP) as an open-source tool to provide users with a standard way of connecting data sources to AI use cases. In a blog post, Anthropic said the protocol will serve as a “universal, open standard” to connect AI systems to data sources. The idea is that MCP allows models like Claude to query databases directly. 

Alex Albert, head of Claude Relations at Anthropic, said on X that the company’s goal is “to build a world where AI connects to any data source” with MCP as a “universal translator.”

“Part of what makes MCP powerful is that it handles both local resources (your databases, files, services) and remote ones (APIs like Slack or GitHub’s) through the same protocol,” Albert said. 

A standard way of integrating data sources not only makes it easier for developers to point large language models (LLMs) directly to information but also eases data retrieval issues for enterprises building AI agents.

Since MCP is an open-source project, the company said it encourages users to contribute to its repository of connectors and implementations. 

A standard for data integration

No standard way of connecting data sources to models exists just yet; the decision is left to enterprise users and to model and database providers. Developers tend to write specific Python code or a LangChain instance to point LLMs to databases. Because each LLM functions a little differently from the others, developers need separate code for each one to connect to a given data source. The result is often several models calling the same databases without the ability to work together seamlessly. 

Other companies extend their databases to make it easier to create vector embeddings that can connect to LLMs. One example is Microsoft integrating Azure SQL with Fabric. Smaller firms like Fastn also offer a different method for connecting data sources. 

Anthropic, though, wants MCP to work even beyond Claude as a step toward model and data source interoperability. 

“MCP is an open standard that enables developers to build secure, two-way connections between their data sources and AI-powered tools. The architecture is straightforward: developers can either expose their data through MCP servers or build AI applications (MCP clients) that connect to these servers,” Anthropic said in the blog post. 
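MCP messages follow a JSON-RPC-style request/response shape between a client (the AI application) and a server (the data source). The sketch below is a toy, in-process illustration of that two-way exchange; the method names, resource URIs, and fields are illustrative stand-ins, not the normative MCP specification.

```python
import json

# Toy MCP-style "server": a registry of resources plus a dispatcher that
# maps JSON-RPC method names to handlers. Names like "resources/list" are
# meant to mirror the spirit of MCP, not quote the spec verbatim.
RESOURCES = {"postgres://orders": "rows from the orders table"}

def handle_request(raw: str) -> str:
    """Take a JSON-RPC request string, return a JSON-RPC response string."""
    req = json.loads(raw)
    method, params = req["method"], req.get("params", {})
    if method == "resources/list":
        result = {"resources": list(RESOURCES)}
    elif method == "resources/read":
        result = {"contents": RESOURCES[params["uri"]]}
    else:
        return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                           "error": {"code": -32601, "message": "method not found"}})
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

# An MCP client (e.g. a desktop AI app) would send requests like this over
# stdio or another transport; here we simply call the handler directly.
response = handle_request(json.dumps(
    {"jsonrpc": "2.0", "id": 1, "method": "resources/list"}))
print(response)
```

The point of the sketch is the shape of the contract: the server advertises what it has, the client asks for it by URI, and the same message format covers both local and remote sources.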

Several commenters on social media praised the announcement of MCP, especially its open-source release. Some users in forums like Hacker News were more cautious, questioning the value of a standard like MCP. 

Of course, MCP is a standard only for the Claude family of models right now. However, Anthropic released pre-built MCP servers for Google Drive, Slack, GitHub, Git, Postgres and Puppeteer. 

VentureBeat reached out to Anthropic for additional comment. 

The company said early adopters of MCP include Block and Apollo, with providers like Zed, Replit, Sourcegraph and Codeium working on AI agents that use MCP to get information from data sources. 

Developers interested in MCP can start using the protocol immediately by installing the pre-built MCP servers through the Claude desktop app. Enterprises can also build their own MCP servers in Python or TypeScript. 
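Wiring a pre-built server into the Claude desktop app is done through a local configuration file that tells the app how to launch each server process. A configuration shaped roughly like the following registers a Postgres server; the exact file name, keys, and package names are assumptions that may vary across versions, so treat this as a sketch rather than authoritative documentation.

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres",
               "postgresql://localhost/mydb"]
    }
  }
}
```

Each entry simply names a server and gives the command used to start it; the app then speaks the protocol to that process on the model's behalf.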
