What is the point of an MCP server? If you want to make an RPC from an agent, why not... just use an RPC?

fennecfoxy
Standardising tool use, I suppose.

Not sure why people treat MCP like it's much more than smashing tool descriptions together and concatenating them to the prompt, but here we are.

It is nice to have a standard definition of tools that models can be trained/fine-tuned for, though.
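
For what it's worth, here is roughly what such a standard tool definition looks like in practice. A minimal sketch using the official MCP Python SDK's FastMCP helper (the "mcp" package); the weather tool itself is a made-up example:

    # Sketch of a standardized tool definition via the official Python SDK.
    # The decorator turns the function signature and docstring into the
    # JSON Schema that clients discover via tools/list. Tool is hypothetical.
    from mcp.server.fastmcp import FastMCP

    server = FastMCP("weather")

    @server.tool()
    def get_forecast(city: str) -> str:
        """Return a short plain-text forecast for the given city."""
        return f"Forecast for {city}: sunny, 22C"  # stubbed response

    if __name__ == "__main__":
        server.run()  # speaks MCP over stdio by default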

ethbr1
Also nice to have a standard(ish) for evolution purposes. I.e. +15 years from now.
lobsterthief
Agreed. Without standards, we wouldn’t have the rich web-based ecosystem we have now.

As an example, anyone who's coded email templates will tell you: it's hard. While the major browsers adopted the W3C specs, email clients (i.e. email renderers) never adopted them, or such a W3C spec for email HTML never existed. So something that renders correctly in Gmail looks broken in Yahoo Mail, in Safari on iOS, etc.

antupis
It is easier to communicate and sell that we have this MCP server that you can just plug and play vs some custom RPC stuff.
freeone3000
But MCP deliberately doesn’t define endpoints, or arguments, or return types… it is the definition of custom RPC stuff.

How does it differ from providing a non-MCP REST API?
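
(For context: what MCP does pin down is the envelope. Tool semantics stay custom, but discovery and invocation go through fixed JSON-RPC methods like tools/list and tools/call. A rough sketch of a call, shown as a Python dict; the tool name and arguments are hypothetical, and field names follow my reading of the current spec:)

    # Rough shape of an MCP tools/call request. The JSON-RPC envelope and
    # method name are fixed by the spec; the tool name and arguments are
    # whatever the server defined (hypothetical example values here).
    call_request = {
        "jsonrpc": "2.0",
        "id": 3,
        "method": "tools/call",
        "params": {
            "name": "get_forecast",           # server-defined tool
            "arguments": {"city": "Berlin"},  # checked against inputSchema
        },
    }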

hobofan
The main alternative for a plug-and-play (just configure a single URL) non-MCP REST API would be to publish OpenAPI definitions and have clients ingest them accordingly.

However, as someone who has tried to use OpenAPI for that in the past (both via OpenAI's "Custom GPTs" and by auto-converting OpenAPI specifications to a list of tools), in my experience almost every existing OpenAPI spec out there is insufficient as a basis for tool calling in one way or another:

- Largely insufficient documentation on the endpoints themselves

- REST is too open to interpretation, and without operationIds (which almost nobody in the wild defines), there is usually context missing on what "action" is being triggered by POST/PUT/DELETE endpoints (e.g. many APIs delete a resource via a POST or PUT, and some APIs use DELETE to archive resources; see the sketch below)

- baseUrls are often wrong/broken and assumed to be replaced by the API client

- underdocumented AuthZ/AuthN mechanisms (usually only present in the general description comment on the API, and missing on the individual endpoints)

In practice you often have to remedy that by patching the officially distributed OpenAPI specs to make them good enough as a basis for tool calling, which makes the whole thing not very plug-and-play.
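
A hypothetical sketch of the operationId problem mentioned above: without one, a converter has to synthesize tool names from the HTTP method and path, which tells the model nothing about the actual action:

    # Hypothetical helper for deriving a tool name from an OpenAPI operation.
    # With an operationId you get a meaningful name; without one you fall
    # back to guessing from method + path, which is where ambiguity creeps in.
    def tool_name(method: str, path: str, operation: dict) -> str:
        if "operationId" in operation:       # the well-behaved case
            return operation["operationId"]  # e.g. "archiveDocument"
        # Fallback: synthesize something like "post_documents_id", which says
        # nothing about whether the endpoint creates, deletes, or archives.
        slug = path.strip("/").replace("/", "_").replace("{", "").replace("}", "")
        return f"{method.lower()}_{slug}"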

I think the biggest upside MCP brings, all "content"/"functionality" being equal, is that using it instead of just plain REST acts as a badge that says "we had AI usage in mind when building this".

On top of that, MCP also standardizes mechanisms such as elicitation, which with traditional REST APIs are completely up to the client to implement.
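
(Elicitation here means the server asking the client/user for structured input mid-call. Roughly, per the current spec revision, the request looks like this, shown as a Python dict; field names may drift as the spec evolves, and the message/schema are hypothetical:)

    # Rough shape of an elicitation request from server to client.
    elicitation_request = {
        "jsonrpc": "2.0",
        "id": 7,
        "method": "elicitation/create",
        "params": {
            "message": "Which GitHub org should I search?",
            "requestedSchema": {             # flat JSON Schema subset
                "type": "object",
                "properties": {"org": {"type": "string"}},
                "required": ["org"],
            },
        },
    }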

freeone3000
I can’t help but notice that so many of the things mentioned are not at all due to flaws in the protocol, but to developers specifying their endpoints incorrectly. We’re one step away from developers putting everything as a tool call, which would put us in the same situation with MCP that we’re in with OpenAPI. You can get that badge with a literal badge; for a protocol, I’d hope for something at least novel over HATEOAS.
ethbr1
REST for all the use cases: We have successfully agreed on what words to use! We just disagree on what they mean.
rco8786
Standardization. You spin up a server that conforms to MCP, and every LLM instantly knows how to use it.
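
The client side of that standardization, as a sketch using the official Python SDK (the "weather_server.py" command is a placeholder): any MCP-compatible client discovers any server's tools the same way.

    # Sketch: a generic MCP client discovering a server's tools over stdio.
    import asyncio
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main():
        params = StdioServerParameters(command="python",
                                       args=["weather_server.py"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                result = await session.list_tools()  # standardized discovery
                for tool in result.tools:
                    print(tool.name, "-", tool.description)

    asyncio.run(main())
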
vidarh
MCP is an RPC protocol.
refulgentis
Not everyone can code, and not everyone who can code is allowed to write code against the resources I have.
nsonha
you have to write code for an MCP server, and code to consume it too. It's just that the LLM vendors decided to build the consuming side in, which people question, since they could just as well have done the same for OpenAPI, gRPC, and whatnot, instead of a completely new thing.

The analogy that was used a lot is that it's essentially USB-C for connecting your data to LLMs. You don't need to fight 4,532,529 standards - there is one (yes, I am familiar with the XKCD comic). As long as your client is MCP-compatible, it can work with _any_ MCP server.
fennecfoxy
The whole USB-C comparison they make is eyeroll-inducing, imo. All they needed to state was that it's a specification for function calling.

My gripe is that they had the opportunity to spec out tool use in models and they did not. The client->LLM implementation is left up to the implementor, and models differ in the tags they use, like <|python_call|> etc.
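
That glue in practice: the MCP side hands you a uniform tool list, but each model API wants it in its own dialect. A hypothetical adapter for an OpenAI-style function-calling schema (other providers need their own):

    # Hypothetical adapter: MCP Tool objects -> OpenAI-style tool schema.
    # MCP tools already carry JSON Schema in inputSchema, so this is mostly
    # reshaping; the provider-specific dialect is the unstandardized part.
    def to_openai_tools(mcp_tools: list) -> list[dict]:
        return [
            {
                "type": "function",
                "function": {
                    "name": t.name,
                    "description": t.description or "",
                    "parameters": t.inputSchema,
                },
            }
            for t in mcp_tools
        ]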

lsaferite
Clearly they need to try explaining it in easy terms. The number of people that cannot or will not understand the protocol is mind-boggling.

I'm with you on the need for an industry-standard Agent -> LLM spec. The APIs are all over the place and it's frustrating. If there were a spec for that, then agent development becomes simply focused on the business logic, and the LLM and the Tools/Resources are just standardized components you plug together like Lego. I've basically done that for our internal agent development. I have a Universal LLM API that everything uses. It's helped a lot.
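
One way such a "Universal LLM API" might look, as an entirely hypothetical sketch (names are illustrative): a thin interface that hides provider differences so agent code targets one shape, and swapping providers means writing one adapter.

    # Hypothetical uniform LLM interface for agent code to target.
    from typing import Protocol

    class ChatModel(Protocol):
        def complete(self, messages: list[dict],
                     tools: list[dict] | None = None) -> dict:
            """Return a provider-agnostic assistant message."""
            ...

    def run_agent(model: ChatModel, task: str, tools: list[dict]) -> dict:
        # Business logic depends only on ChatModel; each provider gets
        # one adapter implementing complete().
        return model.complete([{"role": "user", "content": task}], tools=tools)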

ethbr1
The comparison to USB C is valid, given the variety of unmarked support from cable to cable and port to port.

It has the physical plug, but what can it actually do?

It would be nice to see a standard aiming for better UX than USB C. (Imho they should have used colored micro dots on device and cable connector to physically declare capabilities)

fennecfoxy
Certainly valid, in that just as various USB-C cables support slightly different data rates or power capacities, MCP doesn't deal with my aforementioned issue of the glue between the MCP client and the model you've chosen; that exercise is still left up to us.
ethbr1
My gripe with USB-C isn't really with the standard's nature, but with the UX and modality of capability discovery.

If I am looking at a device/cable, with my eyes, in the physical world, and ask the question "What does this support?", there's no way to tell.

I have to consult documentation and specifications, which may not exist anymore.

So in the case of standards like MCP, I think it's important to come up with answers to discovery questions, lest we all just accept that nothing can be done and the clusterfuck in +10 years was inevitable.

A good analogy might be imagining how the web would have evolved if we'd had TCP but no HTTP.
