LLMs just got a major upgrade with the Model Context Protocol


MCP (Model Context Protocol) is like the GraphQL of LLMs, but possibly even more impactful given the insanely fast adoption rate of modern LLMs.

I am typically not on the side of "AI will/should replace everything," but this is the type of thing that makes me think an Iron Man Jarvis-level AI is just around the corner. Perhaps not in its current form, but in the future.

What is MCP? There are plenty of great videos out there on it; Fireship has an amazing one I strongly recommend watching. But I will do my best to summarize.

MCP is a layer that lives between the LLM client and whatever data source you want to feed it as context. You define "Resources" it can query and "Tools," which are basically actions it can take. When you connect your favorite LLM client to the MCP server, the first thing the server sends the client is the list of Resources and Tools.
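To make that handshake concrete, here is a rough sketch of the wire exchange. MCP speaks JSON-RPC 2.0, and `tools/list` is the method name from the MCP spec; the `get_weather` tool itself is a made-up example, not a real server.

```python
import json

# Client asks the server what tools it exposes (MCP method: tools/list).
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Server replies with a catalog of tools. Each tool carries a JSON Schema
# describing its arguments, so the LLM knows how to invoke it.
list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "get_weather",  # hypothetical example tool
                "description": "Get the current weather for a city",
                "inputSchema": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            }
        ]
    },
}

print(json.dumps(list_response, indent=2))
```

The key design point: the schema travels with the tool, so the model can discover capabilities at connect time rather than having them hard-coded into the client.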

Those Resources and Tools get fed to the model by the client interface whenever you give it a prompt. The LLM can then use your prompt to choose which actions (Tools) to perform. At least, this is my understanding at a high level.
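When the model picks a tool, the client turns that choice into a `tools/call` request and feeds the result back into the model's context. The method name again comes from the MCP spec; the tool, arguments, and reply below are invented for illustration.

```python
# The model decided to call a (hypothetical) weather tool; the client
# sends a tools/call request on its behalf (MCP method: tools/call).
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "get_weather", "arguments": {"city": "Seattle"}},
}

# The server runs the action and returns content blocks, which the
# client appends to the conversation so the model can use the result.
call_response = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {"content": [{"type": "text", "text": "62°F and cloudy"}]},
}
```

So the model never talks to your data source directly; the MCP server mediates every action, which is what makes the Tool list both discoverable and sandboxable.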

I started wiring a few MCP servers into Claude on my desktop and found that there are currently some limitations, which I will cover over the next few weeks. But I have a feeling those limitations will be quickly overcome.

If you are interested in learning more about AI/ML, MCP, and how to cost-effectively run it on AWS, you should sign up for my free live Tech Talk on it.