Sunday, May 11, 2025

FastAPI-MCP Tutorial for Beginners and Experts

Have you ever come across a situation where you wanted your chatbot to use a tool and then respond? Sounds complicated, right! But now, MCP (Model Context Protocol) gives you a way to connect your LLM to external tools easily, and the LLM will be able to make full use of those tools. In this tutorial, we'll dive into the process of converting a simple web app built with FastAPI into an MCP server using FastAPI-MCP.

FastAPI with MCP

FastAPI is a very simple tool built in Python which lets you build web applications using APIs. It's designed to be easy to use and fast at the same time. Think of FastAPI as a smart waiter who takes your order (HTTP requests), goes to the kitchen (database/server), and then brings back your order (the output) and shows it to you. It's a great tool for building web backends, services for mobile apps, and so on.
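To make the waiter analogy concrete, here is a minimal sketch of a FastAPI app (the endpoint name and greeting are made up for illustration and are not part of the weather app we build later):

from fastapi import FastAPI

app = FastAPI(title="Hello API")

@app.get("/hello")
async def say_hello(name: str = "world"):
    # FastAPI parses the query string, validates the type, and returns the dict as JSON
    return {"message": f"Hello, {name}!"}

# Run with: uvicorn main:app --reload
# Then open http://127.0.0.1:8000/hello?name=FastAPI in a browser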

MCP is an open standard protocol by Anthropic that lets LLMs communicate with external data sources and tools. Think of MCP as a toolkit that provides the right tool for the given task. We'll be using MCP to create a server.

Now, what if these capabilities are given to your LLM? It will make your life much easier! That's why FastAPI-to-MCP integration helps a lot. FastAPI takes care of the services from different sources, and MCP takes care of the context for your LLM. By using FastAPI with an MCP server, we can take any tool deployed over the web, use it as an LLM tool, and make LLMs do our work more efficiently.

In the above image, we can see that there is an MCP server connected to an API endpoint. This API endpoint can be a FastAPI endpoint or any other third-party API service available on the internet.

What is FastAPI-MCP?

FastAPI-MCP is a tool that lets you convert any FastAPI application into something that LLMs like ChatGPT or Claude can understand and use easily. With FastAPI-MCP, you can wrap your FastAPI endpoints so that they become plug-and-play tools in an AI ecosystem built around LLMs.

If you want to know how to work with MCP, read this article on How to Use MCP?

What APIs Can Be Converted into MCP Using FastAPI-MCP?

With FastAPI-MCP, any FastAPI endpoint can be converted into an MCP tool for LLMs. These endpoints include:

  • GET endpoints: Converted into MCP resources.
  • POST, PUT, DELETE endpoints: Converted into MCP tools.
  • Custom utility functions: Can be added as additional MCP tools.

FastAPI-MCP is a very easy-to-use library that automatically discovers and converts these endpoints into MCP. It also preserves the schema as well as the documentation of these APIs.
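As a rough sketch of what gets picked up (the routes and model below are hypothetical, not the endpoints we build later), a GET route and a POST route defined like this would be discovered automatically and exposed over MCP once the app is mounted; the operation_id is optional, but under the library's usual defaults it is assumed here to become the tool's readable name:

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Note(BaseModel):
    text: str

@app.get("/notes/{note_id}", operation_id="get_note")
async def get_note(note_id: int):
    # A read-only GET endpoint: something an LLM could call to fetch data
    return {"note_id": note_id, "text": "example note"}

@app.post("/notes", operation_id="create_note")
async def create_note(note: Note):
    # A POST endpoint: something an LLM could call to perform an action
    return {"created": note.text}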

Hands-on with FastAPI-MCP

Let's look at a simple example of how to convert a FastAPI endpoint into an MCP server. First, we'll create a FastAPI endpoint and then move on to converting it into an MCP server using fastapi-mcp.

Configuring FastAPI

1. Install the dependencies

Prepare your system by installing the required dependencies.

pip install fastapi fastapi_mcp uvicorn mcp-proxy httpx

2. Import the required dependencies

Create a new file named ‘main.py’, then import the following dependencies in it.

from fastapi import FastAPI, HTTPException, Query
import httpx
from fastapi_mcp import FastApiMCP

3. Define the FastAPI App

Let's define a FastAPI app with the name "Weather Updates API".

app = FastAPI(title="Weather Updates API")

4. Defining the routes and functions

Now, we'll define the routes for our app, which determine which endpoint executes which function. Here, we're building a weather update app using the weather.gov API (free), which doesn't require any API key. We just need to hit https://api.weather.gov/points/{lat},{lon} with the right values of latitude and longitude.

We define a get_weather function which takes a state code and a city name as arguments, looks up the corresponding coordinates in the CITY_COORDINATES dictionary, and then hits the base URL with those coordinates.

# Predefined latitude and longitude for major cities (for simplicity)
# In a production app, you could use a geocoding service like Nominatim or the Google Geocoding API
CITY_COORDINATES = {
    "Los Angeles": {"lat": 34.0522, "lon": -118.2437},
    "San Francisco": {"lat": 37.7749, "lon": -122.4194},
    "San Diego": {"lat": 32.7157, "lon": -117.1611},
    "New York": {"lat": 40.7128, "lon": -74.0060},
    "Chicago": {"lat": 41.8781, "lon": -87.6298},
    # Add more cities as needed
}

@app.get("/weather")
async def get_weather(
    stateCode: str = Query(..., description="State code (e.g., 'CA' for California)"),
    city: str = Query(..., description="City name (e.g., 'Los Angeles')")
):
    """
    Retrieve today's weather from the National Weather Service API based on city and state
    """
    # Get coordinates (latitude, longitude) for the given city
    if city not in CITY_COORDINATES:
        raise HTTPException(
            status_code=404,
            detail=f"City '{city}' not found in predefined list. Please use another city."
        )

    coordinates = CITY_COORDINATES[city]
    lat, lon = coordinates["lat"], coordinates["lon"]

    # URL for the NWS API Gridpoints endpoint
    base_url = f"https://api.weather.gov/points/{lat},{lon}"

    try:
        async with httpx.AsyncClient() as client:
            # First, get the gridpoint information for the given location
            gridpoint_response = await client.get(base_url)
            gridpoint_response.raise_for_status()
            gridpoint_data = gridpoint_response.json()

            # Retrieve the forecast data using the gridpoint information
            forecast_url = gridpoint_data["properties"]["forecast"]
            forecast_response = await client.get(forecast_url)
            forecast_response.raise_for_status()
            forecast_data = forecast_response.json()

            # Return today's forecast
            today_weather = forecast_data["properties"]["periods"][0]
            return {
                "city": city,
                "state": stateCode,
                "date": today_weather["startTime"],
                "temperature": today_weather["temperature"],
                "temperatureUnit": today_weather["temperatureUnit"],
                "forecast": today_weather["detailedForecast"],
            }

    except httpx.HTTPStatusError as e:
        raise HTTPException(
            status_code=e.response.status_code,
            detail=f"NWS API error: {e.response.text}"
        )
    except Exception as e:
        raise HTTPException(
            status_code=500,
            detail=f"Internal server error: {str(e)}"
        )

5. Set up the MCP Server

Let's now convert this FastAPI app into MCP using the fastapi-mcp library. The process is very straightforward: we just need to add a few lines of code, and fastapi-mcp automatically converts the endpoints into MCP tools and detects their schema and documentation.

mcp = FastApiMCP(
    app,
    name="Weather Updates API",
    description="API for retrieving today's weather from weather.gov",
)

mcp.mount()

6. Starting the app

Now, add the following at the end of your Python file.

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)

Then go to the terminal and run the main.py file.

python main.py

Now your FastAPI app should start successfully on localhost.
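Before wiring the server up to an LLM, you can sanity-check the /weather route yourself. Here is a small optional script (the file name quick_check.py and the example city are just for illustration, assuming the server is running on port 8000 as above):

# quick_check.py
import httpx

resp = httpx.get(
    "http://127.0.0.1:8000/weather",
    params={"stateCode": "CA", "city": "Los Angeles"},
)
resp.raise_for_status()
print(resp.json())  # today's forecast for Los Angeles, as returned by our endpoint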

Configuring Cursor

Let's configure the Cursor IDE to test our MCP server.

  1. Download Cursor from here: https://www.cursor.com/downloads.
  2. Install it, sign up, and get to the home screen.
Cursor Home Screen
  3. Now go to File in the header toolbar, click on Preferences, and then on Cursor Settings.
Cursor Settings
  4. From the Cursor settings, click on MCP.
Configuring Cursor
  5. On the MCP tab, click on Add new global MCP Server.
     It will open an mcp.json file. Paste the following code into it and save the file.
{
  "mcpServers": {
    "National Park Service": {
      "command": "mcp-proxy",
      "args": ["http://127.0.0.1:8000/mcp"]
    }
  }
}
  6. Back in the Cursor Settings, you should see the following:
Linked MCP Server

If you see this on your screen, it means your server is running successfully and connected to the Cursor IDE. If it's showing errors, try using the restart button in the right corner.

We have successfully set up the MCP server in the Cursor IDE. Now, let's test the server.

Testing the MCP Server 

Our MCP server can retrieve weather updates. We just have to ask the Cursor IDE for the weather at any location, and it will fetch it for us using the MCP server.

Query: Please tell me what today's weather is in San Diego

Prompt Response 1

Query: New York weather?

Prompt Response 2

We can see from the outputs that our MCP server is working well. We just have to ask for the weather details, and it decides on its own whether or not to use the MCP server. In the second output, we asked vaguely "New York weather?"; it was able to infer the context of the query from our previous prompt and used the appropriate MCP tools to respond.

Conclusion

MCP allows LLMs to extend their answering capabilities by giving them access to external tools, and FastAPI offers an easy way to do that. In this guide, we combined both technologies using the fastapi-mcp library. With this library, we can convert any API into an MCP server, which helps LLMs and AI agents get the latest information from those APIs. There is no need to define a custom tool for every new task; MCP with FastAPI handles everything automatically. The introduction of MCP brought a revolution to LLMs, and now FastAPI paired with MCP is changing the way LLMs access these tools.

Harsh Mishra is an AI/ML Engineer who spends more time talking to Large Language Models than actual humans. Passionate about GenAI, NLP, and making machines smarter (so they don't replace him just yet). When not optimizing models, he's probably optimizing his coffee consumption. 🚀☕
