MCP Server Directory & Submission Hub
Why Build on the Model Context Protocol (MCP)?
MCP Servers
Frequently Asked Questions about MCP Servers
What is an MCP server?
How do I publish my MCP server?
Which AI assistants support MCP servers?
MCP vs. OpenAI “function‑calling” — what’s the difference?
Scope: OpenAI function‑calling is an API‑specific JSON mechanism that lets the model request calls to developer‑defined functions within a single request cycle; the application executes the call and returns the result. MCP is an open, transport‑agnostic protocol that works with any LLM or IDE and supports persistent state, resource streaming, and multi‑tool suites.
Transport: Function‑calling occurs over HTTPS to the OpenAI API. MCP supports STDIO for local processes and HTTP‑based transports (Server‑Sent Events, superseded by Streamable HTTP in newer spec revisions) for remote servers, enabling CLI‑level latency and bi‑directional progress events.
Security model: Function‑calling inherits the security context of the backend service that executes the function. MCP adds tool‑level capability descriptors, allowing clients to review and approve each server before use.
Bottom line: choose MCP when you need an open ecosystem where any LLM, IDE or agent framework can reuse the same server; choose function‑calling for quick single‑model prototypes on OpenAI.
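To make the scope difference concrete, here is a minimal sketch of the same tool expressed in both protocols: an MCP `tools/call` JSON‑RPC request (as sent over STDIO or HTTP) next to an OpenAI function‑calling tool schema (as embedded in a single chat request). The `get_weather` tool name and its arguments are illustrative, not part of either specification.

```python
import json

# MCP: a transport-agnostic JSON-RPC 2.0 request. The same message works
# over STDIO to a local process or over an HTTP transport to a remote server.
mcp_call = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",              # illustrative tool name
        "arguments": {"city": "Berlin"},
    },
}

# OpenAI function-calling: a tool schema supplied inside one API request;
# the model responds with a call request, and your code runs the function.
openai_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

print(json.dumps(mcp_call, indent=2))
```

Note that the MCP message stands alone as a protocol-level request any client can issue, while the OpenAI schema only has meaning inside a specific chat-completion call — which is the portability difference the bottom line above describes.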