Understanding MCP Prompts


You have now built Tools to give the model hands (actions) and Resources to give the model eyes (context). The final piece of the puzzle is Prompts.

While Tools and Resources are designed for the Model and the Application respectively, Prompts are designed for the User.

What Are Prompts?

An MCP Prompt is a pre-defined template that users can select to start a conversation with an LLM.
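Under the hood, a client discovers the prompts a server offers via the protocol's `prompts/list` request. A minimal sketch of the message shapes (the field names follow the MCP specification; the concrete values here are illustrative, based on the server you will build below):

```python
# Illustrative JSON-RPC exchange for prompt discovery ("prompts/list").
# Field names follow the MCP specification; the values are examples.
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "prompts/list",
}

list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "prompts": [
            {
                "name": "draft_itinerary",
                "description": "Creates a prompt for the LLM to generate a travel plan.",
                "arguments": [
                    {"name": "destination", "required": True},
                    {"name": "days", "required": True},
                ],
            }
        ]
    },
}

# The client can render this list as a menu of templates for the user.
for prompt in list_response["result"]["prompts"]:
    print(prompt["name"])
```

The client turns each entry into a selectable template, with one input field per declared argument.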

The Interaction Flow

It is important to understand that the LLM never accesses Prompts directly. The client application lists the available prompts, the user selects one and fills in its arguments, and the resulting text is sent to the model as an ordinary message. In other words, Prompts are user-controlled, just as Tools are model-controlled and Resources are application-controlled.
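This flow corresponds to the protocol's `prompts/get` request: the client sends the user's chosen prompt name and filled-in arguments, and receives back the message(s) to forward to the LLM. A sketch of the exchange (field names per the MCP specification; values are illustrative):

```python
# Illustrative "prompts/get" exchange. Field names follow the MCP spec;
# the prompt name and argument values are examples.
get_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "prompts/get",
    "params": {
        "name": "draft_itinerary",
        # Argument values are transmitted as strings over the wire.
        "arguments": {"destination": "Tokyo", "days": "3"},
    },
}

get_response = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "messages": [
            {
                "role": "user",
                "content": {
                    "type": "text",
                    "text": "Please create a detailed 3-day itinerary for a trip to Tokyo. ...",
                },
            }
        ]
    },
}

# The client forwards this text to the LLM as an ordinary user message;
# the model itself never issues prompts/get.
text_for_llm = get_response["result"]["messages"][0]["content"]["text"]
```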

Building a Prompt Server

You will now create a server that provides a travel itinerary helper. Ensure you are still inside your lesson_2 directory, then create a file named prompts_demo_server.py with the following code:

from mcp.server.fastmcp import FastMCP


mcp = FastMCP("Prompt Demo")

@mcp.prompt()
def draft_itinerary(destination: str, days: int) -> str:
    """Creates a prompt for the LLM to generate a travel plan."""
    return f"""
    Please create a detailed {days}-day itinerary for a trip to {destination}.
    1. List specific tourist attractions.
    2. Recommend local food.
    3. Keep the budget moderate.
    """

if __name__ == "__main__":
    mcp.run(transport="streamable-http")

Analyzing the Code

  • @mcp.prompt(): This decorator registers the function as a user-facing prompt.
  • Arguments: The function arguments (destination, days) automatically become the form fields the user needs to fill out in the UI.
  • Return Value: The string returned by this function is exactly what will be sent to the LLM.
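Because the prompt function is ordinary Python, you can sanity-check the exact string the LLM will receive by calling it directly. This standalone snippet reproduces the function without the decorator so it runs on its own:

```python
# Same function body as in prompts_demo_server.py, minus the decorator,
# so you can inspect the returned prompt text directly.
def draft_itinerary(destination: str, days: int) -> str:
    """Creates a prompt for the LLM to generate a travel plan."""
    return f"""
    Please create a detailed {days}-day itinerary for a trip to {destination}.
    1. List specific tourist attractions.
    2. Recommend local food.
    3. Keep the budget moderate.
    """

print(draft_itinerary("Tokyo", 3))
```

Checking the output this way is a quick debugging step before wiring the server up to a client.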

Running the Server

Run your prompt server using uv:

$ uv run --with mcp prompts_demo_server.py