I agree with the article, and I love how the author is (mis-)using MCP. I just want to rephrase what the accident actually is.
The accident isn't that somehow we got a protocol to do things we couldn't do before. As other comments point out, MCP (the specification) isn't anything new or interesting.
No, the accident is that the AI Agent wave made interoperability hype, and vendor lock-in old-fashioned.
I don't know how long it'll last, but I sure appreciate it.
Hype, certainly.
But the way I see it, AI agents created incentives for interoperability. Who needs an API when everyone's job security comes from being a slow desktop user?
Well, your new personal assistant who charges by the watt-hour NEEDS it. Just as the CEO will personally drive to get pizzas for that hackathon because it's practically free labor, everyone now wants everything connected.
For those of us who rode the API wave before integrating became hand-wavey, it sure feels like the world caught up.
I hope it will last, but I don’t know either.
Unfortunately, I think we're equally likely to see shortsighted lock-in attempts like this [0] one from Slack.
I tried to find a rebuttal to this article from Slack, but couldn't. I'm on a flight with slow wifi though. If someone from Slack wants to chime in that'd be swell, too.
I've made the argument to CFOs multiple times over the years why we should continue to pay for Slack instead of just using Teams, but y'all are really making that harder and harder.
[0]: https://www.reuters.com/business/salesforce-blocks-ai-rivals...
I wasn’t aware of this, it’s extremely shortsighted. My employees’ chats are my company’s data, and I should be able to use them as I see fit. Restricting API access to our own data moves them quickly into the 'too difficult to continue doing business with' category.
The reality is that Slack isn’t that sticky. The only reason I fended off the other business units who've demanded Microsoft Teams through the years is my software-engineering teams' QoL. Slack has polish and is convenient, but now that Slack is becoming inconvenient and not allowing me to do what I want, I can't justify fending off the detractors. I’ll gladly invest the time to swap them out for a platform that respects our ownership and lets us use our data however we need to. We left some money on the table, but I'm glad we didn’t bundle and upgrade to Slack Grid and lock ourselves into a three-year enterprise agreement...
Precisely the situation I'm in. I've fought off slack-to-teams migrations at multiple orgs for the same QoL reasons, but this will make that much (much) harder to justify.
> I wasn’t aware of this, it’s extremely shortsighted. My employees’ chats are my company’s data, and I should be able to use them as I see fit.
True, and if you're the only one sitting on the data and using it, then what you say is true.
The moment you use another platform, agreeing to terms of service and more, it stops being "your and/or your company's data," though. Slack will do whatever they deem fit with it, including preventing you from getting all of the data out, because having it would make it easier for you to leave.
Sucks, yeah, but it is the situation we're in, until lawmakers in your country catch up. Luckily, other jurisdictions are already better for things like this.
We migrated from Slack to Teams and while it does work, it’s also not very good (UI/UX wise). We also did try out Rocket.Chat and Mattermost and out of all of those Mattermost was the closest to Slack and the most familiar to us.
I’d go for Discord if it had a business version without all the gaming stuff.
The dedicated voice/video channels are great for ad-hoc conversations when remote and a lot better than Slack’s huddles. They’re like dedicated remote meeting rooms except you’re not limited by office space.
> I’d go for Discord if it had a business version without all the gaming stuff.
Granted, my Discord usage has been relatively limited, but what "gaming stuff"? In the servers unrelated to gaming I don't think I see anything gaming related, but maybe I'm missing something obvious.
We migrated a 1,000+ person product team to Mattermost two years ago.
Super happy with it. No bullshit upgrades that break your way of working. Utilitarian approach to everything, the basics just work. Still has some rough edges, but in a workhorse kind of way.
Endorse.
Sounds sort of like an innovator's dilemma response. New technology appears and the response is gatekeeping and building walls rather than adaptation.
Slack was never an innovator. By the time they showed up there were lots of chat apps. They just managed to go beyond the others by basically embedding a browser engine into their app at a time when most thought of that as heresy; I mean, a chat app that requires 1 GB to run was a laughable proposition to us techies. But here we are… MS Teams is even heavier, but users seem to care nothing about that anyway.
They were never an innovator, they just did this thing nobody else did, which some years later became the norm?
Blitzscaled? Yep.
I'm happier we went with Zulip each day.
(Still) such an undervalued alternative to all these "ephemeral but permanent" chat apps. For folks who like a bit more structure and organization, but still want "live communication" like what Slack et al. offer, do yourself a favor and look into Zulip.
If you are interested in scraping Slack for personal use, I made a local-only Slack scraper MCP: https://github.com/kimjune01/slunk-mcp
Thanks for building this, but it's also ridiculous that you had to. I miss IRC even though Slack is objectively better.
Hard disagree, IRC is still the best chat application out there.
Jabber & XMPP was the peak of instant messaging. Since then it's been downhill.
A big part of my short thesis with Apple is that they'll try to do this sort of thing and it will mean real AI integration like what their customers want will simply never be available, driving them to more open platforms.
I think you'll see this everywhere. LLMs mean "normal" people will suddenly see computers the way we do and a lot of corporate leadership just isn't intuitively prepared for that.
It's going to take more people willing to move away from Slack for those purposes.
As it is, I'm going to propose that we move more key conversations outside of Slack so that we can take advantage of feeding them into AI. It's a small jump from that to looking for alternatives.
The argument used to be “Let’s move FOSS conversation out of {Slack, Discord} because they prevent conversations from being globally searchable, and they force individuals into subscription to access history backlog.”
Getting indexed by AI crawlers appears to be the new equivalent to getting indexed by search engines.
I use both daily and Teams absolutely sucks.
Very aware, zero desire to use Teams, that's why I've fought to keep Slack despite the cost.
But now they're actively making it more difficult for people like me to say "engineers like it more" and that be a compelling-enough argument.
> But the way I see it, AI agents created incentives for interoperability.
There are no new incentives for interoperability. Companies that were already providing API access added MCP servers of varying quality.
The rest couldn't care less, unless they can smell an opportunity to monetize the hype.
Well, interoperability requires competition and if there's one thing we've learnt it's that the tech industry loves a private monopoly.
MCP: it's like an API, but with 100,000x the operating cost!
Reminds me of the days of Winsock.
For those that don't remember/don't know: networking on Windows used to require each vendor's own proprietary TCP/IP stack and API.
Then one day, a bunch of vendors got together and decided to have a shared standard to the benefit of basically everyone.
https://en.wikipedia.org/wiki/Winsock
Trumpet Winsock! Brings back memories :)
I think we're seeing a wave of hype marketing on YouTube, Twitter and LinkedIn, where people with big followings create posts or videos full of buzzwords (MCP, vibe coding, AI, models, agentic) with the sole purpose of promoting a product like Cursor, Claude Code or Gemini Code, or getting people to use Anthropic's MCP instead of Google's A2A.
It feels like 2 or 3 companies have paid people to flood the internet with content that looks educational but is really just a sales pitch riding the hype wave.
Honestly, I just saw a project manager on LinkedIn telling his followers how MCP, LLMs and Claude Code changed his life. The comments were full of people asking how they can learn Claude Code, like it's the next Python.
Feels less like genuine users and more like a coordinated push to build hype and sell subscriptions.
They’re not being paid, at least not directly. They don’t need to be. “Educational” “content” is a play to raise one's personal profile as a “thought leader.” This turns into invitations to conferences and ultimately funnels into sales of courses and other financial opportunities.
Hype marketing can look spontaneous, but it's almost always planned. And once the momentum starts, others jump in. Influencers and opportunists ride the wave to promote themselves
Nah it's the same motivation as all the gen z tiktok kids. It's all for clout.
People write those medium articles wanting engagement/clout/making it big/creating a brand.
Trumpet + PPP on a university library mainframe was my first experience of the internet.
The main benefit is not that it made interoperability fashionable, or that it makes things easy to interconnect. It is the LLM itself, if it knows how to wield tools. It's like you build a backend and the frontend is not your job anymore; AI does it.
In my experience Claude and Gemini can take over tool use and all we need to do is tell them the goal. This is huge; we always had to specify the steps to achieve anything on a computer before. Writing a fixed program to deal with a dynamic process is hard, while an LLM can adapt on the fly.
The issue holding us back was never that we had to write a frontend — it was the data locked behind proprietary databases and interfaces. Gated behind API keys and bot checks and captchas and scraper protection. And now we can have an MCP integrator for IFTTT and have back the web we were promised, at least for a while.
Indeed, the frontend itself is usually the problem. If not for data lock in, we wouldn't need that many frontends in the first place - most of the web would be better operated through a few standardized widgets and a spreadsheet and database interfaces - and non-tech people would be using it and be more empowered for it.
(And we know that because there was a brief period in time where basics of spreadsheets and databases were part of curriculum in the West and people had no problem with that.)
So... How do MCPs magically unlock data behind proprietary databases and interfaces?
It doesn't do it magically. The "tools" an LLM agent calls to create responses are typically REST APIs for these services.
Previously, many companies gated these APIs but with the MCP AI hype they are incentivized to expose what you can achieve with APIs through an agent service.
Incentives align here: user wants automations on data and actions on a service they are already using, company wants AI marketing, USP in automation features and still gets to control the output of the agent.
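To make that concrete, here's a rough sketch of what many of these vendor MCP servers amount to internally; the endpoint, tool name, and fields are all invented for illustration:

    # Hypothetical: an MCP "tool" that is a thin wrapper over the REST
    # API the vendor already had. Endpoint and arguments are made up.
    import requests

    API_KEY = "..."  # the same key the gated API always needed

    def handle_tool_call(name: str, arguments: dict) -> dict:
        if name == "search_tickets":
            r = requests.get(
                "https://api.example-saas.com/v2/tickets",
                params={"q": arguments["query"]},
                headers={"Authorization": f"Bearer {API_KEY}"},
            )
            # MCP tool results are content blocks, typically plain text
            return {"content": [{"type": "text", "text": r.text}]}
        raise ValueError(f"unknown tool: {name}")

The agent never sees the REST API directly; it sees a named tool with a description, and the server does the translation and keeps control of what comes back.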
> Previously, many companies gated these APIs but with the MCP AI hype they are incentivized to expose what you can achieve with APIs through an agent service.
Why would they be incentivized to do that if they survived all the previous hype waves and still have access gated?
> user wants automations on data and actions on a service they are already using,
How many users want that? Why didn't companies do all this before, since the need for automation has always been there?
> Why would they be incentivized to do that if they survived all the previous hype waves and still have access gated?
Because they suddenly now don't want to be left out of the whole AI hype/wave.
Is it stupid? Yes. Can we still reap the benefits of these choices driven by stupid motivations? Also yes.
> Because they suddenly now don't want to be left out of the whole AI hype/wave.
https://news.ycombinator.com/item?id=44405491 and https://news.ycombinator.com/item?id=44408434
What is IFTTT?
If This Then That: a Zapier-type glue provider.
Minor chronological point but Zapier is an IFTTT-type glue provider.
IFTTT was announced Dec. 14, 2010 and launched on Sept. 7, 2011.
Zapier was first pitched Sept. 30, 2011 and their public beta launched May 2012.
Minor point, but arguably both are Yahoo Pipes-type glue providers, which itself was a basic no-code glue thing. The difference is that IFTTT erred on the side of dumbing the product down too much, and Zapier erred on the side of being too B2B-focused, so they both missed the mark on becoming the universal glue.
Then of course we have Node-RED too, which is probably too developer-focused (and generally just lacking any sort of focus at the same time, strangely), so it also doesn't sit somewhere closer to the middle.
Do we really not have anything coming close to the usefulness of Yahoo Pipes yet? What would a modern alternative look like, and how would it work? Someone has to be thinking about this.
If we're considering Node-Red, it would not be amiss to also mention N8N - which, mirroring the IFTTT/Zapier split, is basically the opposite of Node-Red on the "let's turn this into an enterprise product" scale.
> Do we really not have anything coming close to the usefulness of Yahoo Pipes yet?
I don't know of anything. There are some new products I saw heavily promoted on LinkedIn, but at first glance they feel like IFTTT with a shiny coat of paint.
> What would a modern alternative look like and how would it work?
At this point I think ComfyUI or the node editor in Blender would be the best; they're oriented for different kinds of workflows, but UIs of both are excellent in their own ways, and the rest is a matter of implementing the right blocks.
And to answer the obvious next question: yes, there is an MCP for IFTTT https://mcp.pipedream.com/app/ifttt
It’s more useful the other way :p
But Zapier is easily Googleable and therefore useful as a reference name even if the commenter hasn't heard of it.
Not as easily Googled as “IFTTT”, with five characters versus six.
I don't understand what you mean.
> It (the main benefit?) is the LLM itself, if it knows how to wield tools.
LLMs and their ability to use tools are not a benefit or feature that arose from MCP. There has been tool usage/support with various protocols and conventions way before MCP.
MCP doesn't have any novel aspects that are making it successful. It's relatively simple and easy to understand (for humans), and luck was on Anthropic's side. So people were able to quickly write many kinds of MCP servers and it exploded in popularity.
Interoperability and interconnecting tools, APIs, and models across providers are the main benefits of MCP, driven by its wide-scale adoption.
To me it feels like an awkward API that creates opportunities to work around the limitations of a normal API... which to me is not a great thing. Potentially useful, sure, but not great.
the AI Agent wave made interoperability hype, and vendor lock-in old-fashioned
Perhaps, but we see current hypes like Cursor only using MCP one way: you can feed into Cursor (e.g. browser tools), but not out (e.g. conversation history, context, etc.).
I love Cursor, but this "not giving back" mentality, originally reflected in its closed-source fork of VS Code, leaves an unpleasant taste in the mouth and I believe will ultimately see it lose developer credibility.
Lock-in still seems to be locked in.
The VSCode extension Continue provides similar capabilities and gives you full access to your interaction traces (local database and JSON traces)
Remember Web 2.0? Remember the semantic web? Remember folksonomies? Mash-ups? The end of information silos? The democratizing power of HTTP APIs? Anyone? Anyone?
I think we found a new backronym for MCP: Mashup Context Protocol.
(The mashup hype was incredible, btw. Some of the most ridiculous web contraptions ever.)
And some of the most ridiculous songs. I remember (vaguely) a Bootie show at DNA Lounge back in the early 2000s that was entirely mashups. It was hilarious: Lady Gaga mashed up with Eurythmics, Coldplay with Metallica, Robert Palmer with Radiohead.
(I hereby claim the name "DJ MCP"...)
Yes. Pieces of all of those things surround us now. And where we are with respect to lock-in and interop is far beyond where we were when each of those fads happened.
MCP is a fad; it's not long-term tech. But I'm betting shoveling data at LLM agents isn't. The benefits are too high for companies to allow vendors to lock the data away from them.
> MCP is a fad; it's not long-term tech. But I'm betting shoveling data at LLM agents isn't.
I'd bet that while "shoveling data at llm agents" might not be a fad, sometime fairly soon doing so for free while someone else's VC money picks up the planet destroying data center costs will stop being a thing. Imagine if every PHP or Ruby on Rails, or Python/Django site had started out and locked themselves into a free tier Oracle database, then one day Oracle's licensing lawyers started showing up to charge people for their WordPress blog.
I know we're all just soaked by a wave of hype right now but I think MCP will go the way of other "but it works" tech, like zip files, RSS and shell scripts.
Remember when GraphQL was making REST obsolete? This rhymes.
> MCP is a fad; it's not long-term tech
One that won't be supported by any of the big names except to suck data into their walled gardens and lock it up. We all know the playbook.
Yahoo Pipes, XMPP, self-hosted blogs, RSS-based social networks, pingbacks; the democratized p2p web that briefly was. I bet capitalism will go 2 for 2 against the naive idealism that gatekeepers will stop gatekeeping.
Giving value away is unacceptable… for MBAs and VCs, anyway.
I don't understand your point. Some of those things were buzzwords, some were impossible dreams, some changed the way the web works completely. Are you just saying that the future is unknown?
No. What they are saying is best said with a quote from Battlestar Galactica:
> All of this has happened before, and all of this will happen again.
“It” here being the boom and inevitable bust of interop and open API access between products, vendors and so on. As a millennial, my flame of hope was lit during the API explosion of Web 2.0. If you’re older, your dreams were probably crushed already by something earlier. If you’re younger, and you’re genuinely excited about MCP for the potential explosion in interop, hit me up for a bulk discount on napkins.
And then there are "architecture astronauts" dreaming of an entire internet of MCP-speaking devices, an "internet of agents" if you will, which would then require its own separate DNS, SMTP, BGP, etc.
I think Battlestar Galactica must be quoting one of the Eddas. I've only read of it from Borges in Spanish, but it conveys the same meaning: "Estas cosas han pasado. Estas cosas también pasarán." ("These things have happened. These things will also pass.")
I'm older and would like a discount, please. The "this time it's different" energy comes from the assumption that, since a human can interact with the system and vision models can drive a GUI, who cares if there's an actual API? Just have the AI interact with the system as if it were coming in as a human.
What's the point of stating the obvious if the obvious won't change anything? Things evolve. Winners win and losers lose. Change is constant. And? Does that somehow mean there's nothing to see here and we should move on?
No, things can change, but we programmers tend to see everything as a technical problem, and assume that if only we can find a good technical solution we can fix it. But the problem isn’t technical: the APIs were shut down because consumer tech is governed by ads, which are not part of APIs (or would be trivial to strip out). You have surely noticed that APIs are alive and well in enterprise. Why? Because they have customers who pay money, and API access does not generally break their revenue stream (although even there some are skittish). As mere "users", our economic function in consumer tech is to provide data and impressions to advertisers. Thou shalt not bypass their sidebar, where the ads must be seen.
Nowadays the lockdown is not just of APIs: there's anti-scraping, login walls, paywalls, fingerprinting, and so on and so forth. It's a much more adversarial landscape today than during Web 2.0. When they threw copyright in the trash with the fair-use loophole for AI, that obviously caused even more content-lockdown panic. And in the midst of this giant data Mexican standoff, people are gonna put down their guns and see the light because of a new meta-API protocol?
I take their point to be that the underlying incentives haven't changed. The same forces and incentives that scuttled those things are likely to scuttle this as well.
I actually disagree with the OP in this sub-thread:
> "No, the accident is that the AI Agent wave made interoperability hype, and vendor lock-in old-fashioned."
I don't think that's happened at all. I think some interoperability is here to stay, but overwhelmingly for the products where interoperability was already the norm. The enterprise SaaS that your company is paying for will support their MCP servers. But they also probably already support various other plugin interfaces.
And they're not doing this because of hype or new-fangledness, but because their incentives are aligned with interoperability. If their SaaS plugs into [some other thing], it increases their sales. In fact, the lowering of integration effort is all upside for them.
Where this is going to run into a brick wall (and I'd argue: already has to some degree) is that closed platforms that aren't incentivized to be interoperable still won't be. I don't think we've really moved the needle on that yet. Uber Eats is not champing at the bit to build the MCP server that orders your dinner.
And there are a lot of really good reasons for this. In a previous job I worked on a popular voice assistant that integrated with numerous third-party services. There has always been vehement pushback to voice assistant integration (the ur-agent and to some degree still the holy grail) because it necessarily entails the service declaring near-total surrender about the user experience. An "Uber Eats MCP" is one that Uber has comparatively little control over the UX of, and has poor ability to constrain poor customer experiences. They are right to doubt this stuff.
I also take some minor issue with the blog: the problem with MCP as the "everything API" is that you can't really take the "AI" part out of it. MCP tools are not guaranteed to communicate in structured formats! Instead of getting an HTTP 401 you will get a natural language string like "You cannot access this content because the author hasn't shared it with you."
That's not useful without the presence of an NL-capable component in your system. It's not parseable!
Also importantly, MCP inputs and outputs are intentionally not versioned nor encouraged to be stable. Devs are encouraged to alter their input and output formats to make them more accessible to LLMs. So your MCP interface can and likely will change without notice. None of this makes for good API for systems that aren't self-adaptive to that sort of thing (i.e., LLMs).
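To illustrate the difference (a sketch; the result shape follows the MCP spec, the payload itself is invented):

    # A web API failure is machine-checkable:
    #     HTTP/1.1 401 Unauthorized
    # An MCP tool-call failure is often prose in a success-shaped envelope:
    mcp_result = {
        "content": [{
            "type": "text",
            "text": "You cannot access this content because the author "
                    "hasn't shared it with you.",
        }],
        "isError": True,  # optional; not every server bothers to set it
    }
    # Nothing here to branch on except string-matching prose that the
    # developer is free to reword tomorrow to better suit the LLM.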
Nobody remembers the semantic web anymore
Palantir remembers..
https://www.palantir.com/docs/foundry/ontology/overview
That isn’t the same at all. Ontologies in Foundry are not defined with triplets like they are in the semantic web. Ontologies in Palantir are just a set of object types with properties, and objects of each type.
It was supposed to be Web 3.0, but then Web3 happened.
In all seriousness though, I think HN has a larger-than-average amount of readers who've worked or studied around semantic web stuff.
> It was supposed to be Web 3.0, but then Web3 happened
There's a shocking overlap of GPU-slinging tech-bros hyping both. Crypto-bros turned LLM-experts out of necessity when mining crypto on their rigs became unprofitable.
I haven't seen an app that didn't have an API create one via MCP. The only MCP servers I've seen were for things that I could already access programmatically.
I am pondering if I should do this.
If I have an app whose backend needs to connect to, say, a CRM platform, I wonder if, instead of writing integrations against Dynamics or Salesforce or Hubspot specifically, there's benefit in abstracting a CRM interface with an MCP so that switching CRM providers later (or adding additional CRMs) becomes easier?
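A hypothetical sketch of the shape I'm imagining; every name here is invented:

    # Hypothetical: one generic tool surface, pluggable CRM backends.
    # Swapping or adding a CRM means writing one adapter, not touching
    # every caller of the tool.
    from typing import Protocol

    class CRM(Protocol):
        def create_contact(self, name: str, email: str) -> str: ...

    class SalesforceCRM:
        def create_contact(self, name: str, email: str) -> str:
            # call the Salesforce REST API here
            return "sf-contact-id"

    class HubspotCRM:
        def create_contact(self, name: str, email: str) -> str:
            # call the HubSpot API here
            return "hs-contact-id"

    backend: CRM = SalesforceCRM()  # chosen by config; callers never know

    def tool_create_contact(arguments: dict) -> dict:
        contact_id = backend.create_contact(arguments["name"], arguments["email"])
        return {"content": [{"type": "text", "text": f"created {contact_id}"}]}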
One of us doesn't understand MCP well enough, and it might very well be me, but how can MCP be used without an LLM? Most of the structure is in human language.
MCP itself doesn't require the use of the LLM. There are other concepts, but for this use, Tools are key. A Tool is an operation, like a search.
Have a look at the Filesystem example MCP server - https://github.com/modelcontextprotocol/servers/blob/main/sr.... It has a collection of Tools - read_file, read_multiple_files, write_file, etc.
The LLM uses MCP to learn what tools the server makes available; the LLM decides when to use them. (The process is a little more complicated than that, but if you're just trying to call tools without an LLM, those parts aren't really important.) If you ask the LLM, "Find all files with an asterisk," it might deduce that it can use the search_files tool provided by that MCP to gain the needed context, invoke that tool, then process the results it returns. As an engineer, you can just call search_files if you know that's what you want.
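And doing that by hand is less exotic than it sounds: MCP is just JSON-RPC 2.0 over stdio, so a script can list and call tools with no LLM involved. A minimal sketch (the search_files argument names are from memory; check the server's tools/list output):

    import json
    import subprocess

    # Launch the reference filesystem server, rooted at /tmp
    proc = subprocess.Popen(
        ["npx", "-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)

    def rpc(msg):
        proc.stdin.write(json.dumps(msg) + "\n")
        proc.stdin.flush()
        if "id" in msg:  # notifications get no response
            return json.loads(proc.stdout.readline())

    rpc({"jsonrpc": "2.0", "id": 1, "method": "initialize",
         "params": {"protocolVersion": "2024-11-05", "capabilities": {},
                    "clientInfo": {"name": "not-an-llm", "version": "0.1"}}})
    rpc({"jsonrpc": "2.0", "method": "notifications/initialized"})

    tools = rpc({"jsonrpc": "2.0", "id": 2, "method": "tools/list"})
    print([t["name"] for t in tools["result"]["tools"]])  # read_file, ...

    hits = rpc({"jsonrpc": "2.0", "id": 3, "method": "tools/call",
                "params": {"name": "search_files",
                           "arguments": {"path": "/tmp", "pattern": ".md"}}})
    print(hits["result"]["content"][0]["text"])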
Yeah, MCP isn't really doing a whole lot. You can give an LLM a generic HTTP extension, then list a series of GET/POST/PUT calls and ask it to form the calls and parse the responses. The problem is it's not really ideal: the calls aren't natural language, and it's common for the model to misguess the next token and mess up things like the route, body, or headers with a hallucination. So people started shortening these calls to simple things like read_file, etc. Prior to MCP there were a ton of playgrounds doing this with simple Playwright functions.
The thing that surprises me is that with MCP we have discarded all of the existing tooling around OpenAPI specs, OIDC, etc. We could have created a system where all 'services' expose an mcp.slack.com/definition endpoint or something that spits back a list of shortcut terms like send_message and a translation function that composes them into the correct HTTP API call (which is what most MCP servers do). For security, we could have had the LLM establish its identity via all our existing systems like OIDC that combine authentication and authorization.
In the system above you would not "install an MCP package" as in a code repo or server. Instead you would allow your LLM to access Slack; it would then prompt you to log in via OIDC and establish your identity and access level. Then it would grab the OpenAPI spec (machine readable) and the LLM-focused shortcuts 'send_message', 'read_message', etc. The LLM composes 'send_message Hello World', which translates to HTTP POST slack.com/message or whatever, and Bob's your uncle.
If you wanted to do fancy stuff with local systems then you could still build your own server the same way we have all built HTTP servers for decades and just expose the mcp.whatever.com subdomain for discovery. Then skip OIDC or allow ALL or something to simplify if you want.
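Roughly what I mean, as a sketch; mcp.slack.com/definition doesn't exist, and the shortcut format is invented:

    # Hypothetical discovery flow: fetch the service's shortcut
    # definitions, then translate shortcut calls into plain HTTP.
    import requests

    defn = requests.get("https://mcp.slack.com/definition").json()
    # imagined response:
    # {"send_message": {"method": "POST", "path": "/api/chat.postMessage",
    #                   "params": ["channel", "text"]}}

    def call_shortcut(oidc_token: str, name: str, **kwargs) -> dict:
        op = defn[name]
        return requests.request(
            op["method"], "https://slack.com" + op["path"],
            headers={"Authorization": f"Bearer {oidc_token}"},
            json=kwargs,
        ).json()

    # call_shortcut(token, "send_message", channel="#general", text="Hello World")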
That's my understanding as well, but then you'll be missing the discovery part. You'll have to hardcode the API, at which point you may as well just use the underlying API the MCP itself wraps.
Getting the LLM to use your tool is actually tougher than it should be; you don't get to decide that deterministically. I don't get what benefit there would be to building an MCP server without an LLM-based agent. You might as well build an API and get the value from a strict, predictable interface and results.
My thought too when I read TFA about MCP as a universal interface, but I suppose one can distinguish between interface and automated discovery/usage.
The MCP exposed API is there for anyone/anything to use, as long as you are able to understand how it works - by reading the natural language description of the exposed methods and their JSON input schemas. You could read this yourself and then use these methods in any way you choose.
Where LLMs come in is that they understand natural language, so in the case where the MCP client is an LLM, then there is automatic discovery and integration - you don't need to grok the MCP interface and integrate it into your application, but instead the LLM can automatically use it based on the interface description.
You write a script which pretends to be an LLM to get the data you want reliably.
But… you don’t really need to pretend you’re an LLM, you can just get the data using the same interface as a model would.
I will also freely admit to not understanding MCP much, but using it without an LLM was (at least to my reading) pretty much the main thesis of the linked article.
"Okay but. But. What if you just... removed the AI part?
What if it's just "a standardized way to connect literally anything to different data sources and tools"?"
(which had struck out text 'AI models' in between 'connect' and 'literally')
You’re right. It’s unclear, however, how your application will handle knowing when to call a tool/MCP. That’s the part where LLMs are so good at: understanding that to do a certain job, this or that tool would be useful, and then knowing how to provide the necessary parameters for the tool call (I say tool here because MCP is just a convenient way to pack a normal tool, in other words, it’s a plugin system for tools).
How ironic given the amount of APIs that were locking down access in response to AI training!
Though the general API lockdown was started long before that, and like you, I’m skeptical that this new wave of open access will last if the promise doesn’t live up to the hype.
MCP seems to be about giving you access to your own data. Your Slack conversations, your Jira tickets, your calendar appointments. Those wouldn't go into AI training datasets anyway, locked down APIs or not.
The APIs of old were about giving you programmatic access to publicly available information. Public tweets, public Reddit posts, that sort of thing. That's the kind of data AI companies want for training, and you aren't getting it through MCP.
Interesting perspective because MCPs are safer when they give you access to your own content from trusted providers or local apps on your computer than when they give you access to public data which may have prompt injection booby traps.
MCP is supposed to grant "agency" (whatever that means), not merely expose curated data and functionality.
In practice, the distinction is little more than the difference between different HTTP verbs, but I think there is a real difference in what people are intending to enable when creating an MCP server vs. standard APIs.
Might be another reflection of McLuhan's "the medium is the message", in that APIs are built with the intended interface in mind.
To this point, GUIs; going forward, AI agents. While the intentions rhyme, the meanings of these systems diverge.
I don't think it's ironic at all. The AI boom exposed the value of data, and there are two inevitable consequences when the value of something goes up: the people who were previously giving it away for free start charging for it, and the people who weren't previously selling it at all start selling it.
The APIs that used to be free and now aren't were just slightly ahead of the game; all these new MCP servers aren't going to be free either.
> Want spell check? MCP server.
> Want it to order coffee when you complete 10 tasks? MCP server.
With a trip through an LLM for each trivial request? A paid trip? With high overhead and costs?
the whole point of the article is that it doesn't need to be an LLM, MCP is just a standard way to expose tools to things that use tools. LLMs can use tools, but so can humans.
So the whole point of the article is that an API is an API and anything can call an API?
There is a long tail of applications that are not currently scriptable and have no public API. The kind that every so often makes you think, "if only I could automate this instead of clicking through this exact same dialog 25 times."
Before, "add a public API to this comic reader/music player/home accounting software/CD archive manager/etc." would be a niche feature to benefit 1% of users. Now more people will expect to hook up their AI assistant of choice, so the feature can be prioritized.
The early MCP implementations will be for things that already have an API, which by itself is underwhelming.
You would think Apple would have a leg up here with AppleScript already being a sanctioned way to add scriptable actions across the whole of macOS, but as far as I can tell they don't hook it up to Siri or Apple Intelligence in any way.
That's not where the money is. It's in adding a toll charge for tokens to talk to widely used APIs.
> There is a long tail of applications that are not currently scriptable or have a public API.
So how does MCP help with this?
The theory is, I guess, that creating an MCP API is a lot easier than creating a regular API. A regular API is a very costly thing to develop and it has on-going costs too because it's so hard to change. You have to think about data structures, method names, how to expose errors, you have to document it, make a website to teach devs how to use it, probably make some SDKs if you want to do a good job, there's authentication involved probably, and then worst of all: if you need to change the direction of your product you can't because it'd break all the connected apps.
An MCP API dodges all of that. You still need some data structures but beyond that you don't think too hard, just write some docs - no fancy HTML or SDKs needed. MCP is a desktop-first API so auth mostly stops being an issue. Most importantly, if you need to change anything you can, because the LLM will just figure it out, so you're way less product constrained.
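For a sense of how little ceremony that means in practice, here's a minimal sketch using the FastMCP helper from the official Python SDK; the tool itself is invented:

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("todo-app")

    @mcp.tool()
    def complete_task(task_id: str) -> str:
        """Mark a task as done and return a confirmation."""
        # ...whatever the product does internally...
        return f"Task {task_id} completed."

    if __name__ == "__main__":
        mcp.run()  # stdio transport; a connected LLM discovers the tool
                   # from its name, docstring, and type hints

The docstring and type hints are the whole "spec." Rename or reshape the tool next week and an LLM client adapts, which is exactly the flexibility (and the instability for non-LLM callers) described above.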
Maybe phrasing it this way will be the lightbulb moment for everyone who hasn’t got that yet.
I see the point as "let's not overcomplicate the API with complex schemas and such; let's not use GraphQL for everything; just create a simple API and call it to extend stuff." Am I wrong?
Part of the reason AI agents and MCP work is that the AI can programmatically determine at runtime which plug-ins to use. Without the AI part, how does the host app know when to call an MCP server function?
Same way it would call any other api: exactly when it was programmed to.
same as any other api function call in an app - because an app developer programmed it to call that function.
That only works for the MCPs your app knows about, which is not that great. The usefulness of a plugin system like MCP is that an app can automatically use it. But MCPs are literally just a function, with some metadata about what it does and how to invoke it. The only thing generic enough to figure out how to use a function given only this metadata seems to be an LLM. And not even all of them; only some support "tool calling".
It is new and exciting if you just learned to vibe code and you don't even know what a REST API is.
I genuinely believe that low-code workflow orchestrators like Zapier or IFTTT will be the first major victims of agentic LLM workflows. Maybe not right now but already it’s easier to write a prompt describing a workflow than it is to join a bunch of actions and triggers on a graph.
The whole hype around AI replacing entire job functions does not have as much traction as the concept of using agents to handle all of the administrative stuff that connects a workflow together.
Any open source model that supports MCP can do it, so there’s no vendor lock in, no need to learn the setup for different workflow tools, and a lot of money saved on seats for expensive SaaS tools.
> made interoperability hype, and vendor lock-in old-fashioned
I always imagined software could be written with a core that does the work and the UI would be interchangeable. I like that the current LLM hype is causing it to happen.
> I don't know how long it'll last
I'm just baffled no software vendor has already come up with a subscription to access the API via MCP.
I mean, obviously paid API access is nothing new, but "paid MCP access for our enterprise users" is surely in the pipeline everywhere, after which the openness will die down.
I think for enterprise it’s going to become part of the subscription you’re already paying for, not a new line item. And then prices will simply rise.
Making it an optional paid add-on will kill adoption, and these things are absolutely things you HAVE to be able to play with to discover the value (because it's a new and very weird kind of tool that doesn't work like existing tools).
And I expect there'll eventually be a way for an AI to pay for an MCP use microtransaction style.
Heck, if AIs are at some point given enough autonomy to simply be given a task and a budget, there'll be efforts to try to trick AIs into thinking paying is the best way to get their work done! Ads (and scams) for AIs to fall for!
> Heck, if AIs are at some point given enough autonomy to simply be given a task and a budget, there'll be efforts to try to trick AIs into thinking paying is the best way to get their work done!
We're already there, just take a look at the people spending $500 a day on Claude Code.
Mapbox is just a small step away from that with their MCP server wrapping their pay-by-use API. I wouldn’t be surprised to see a subscription offering with usage limits if that somehow appealed to them. MapTiler already offers their service as a subscription so they’re even closer if they hosted a server like this on their own.
https://github.com/mapbox/mcp-server
The pressure to monetize after all those humongous investments into AI will surely move some things that have been stuck in their ways and stagnant. It looks like this time the IT industry itself will be among those being disrupted.
AI agents didn't only make adversarial interoperability hype, they've also made it inevitable! From here all the way until they're probing hardware to port Linux and write drivers.
I joked with people in IRC that it took LLMs to bring answer engines into the terminal
All vendor lock in is being transmuted to model access.
Exactly. It’s a crack in the MBA’s anti-commoditization wall. Right now it’s like USB-C. If we’re lucky it will turn into TCP/IP and transform the whole economy.