This post, “Build Git MCP Server with the OpenAI Agents SDK,” shows how to wire an MCP server into an agent built with the OpenAI Agents SDK.

If you want your Python agent to answer questions about a Git repository—“Who’s the top contributor?”, “What changed last?”—the cleanest way is to plug in a Model Context Protocol (MCP) server. MCP is an open protocol that standardizes how LLM apps connect to tools and data; the OpenAI Agents SDK has first-class support for it, so your agents can discover MCP tools and call them safely. Think of MCP as the USB-C of AI apps: a single, consistent way to plug in new capabilities.

Below is a complete code sample using the official Git MCP server (mcp-server-git) and the Agents SDK. After the code, we’ll cover the packages, key components, server types, and how we scope the server to a specific repository directory.
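
In outline, the script looks something like the sketch below. It is a minimal reconstruction from the components described in the rest of this post (generate_password, get_time, python_tutor_agent, the "MCP Git Example" trace, and directory_path); the prompts, tool bodies, and default model choice are illustrative assumptions rather than the exact listing.

```python
# Illustrative sketch -- component names follow the description below; prompts,
# tool bodies, and the default model are assumptions, not the original script.
import asyncio
import secrets
import string
from datetime import datetime, timezone

from dotenv import load_dotenv
from agents import Agent, Runner, function_tool, trace
from agents.mcp import MCPServerStdio

load_dotenv()  # pick up OPENAI_API_KEY from a local .env file if present


@function_tool
def generate_password(length: int = 16) -> str:
    """Generate a cryptographically secure random password."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))


@function_tool
def get_time() -> str:
    """Return the current UTC time as an ISO-8601 string."""
    return datetime.now(timezone.utc).isoformat()


# Specialist agent reached via handoff for Python questions.
python_tutor_agent = Agent(
    name="Python Tutor",
    instructions="Answer Python programming questions clearly and concisely.",
)

directory_path = "/workspaces/openAIAgent"  # the Git repo the agent is scoped to


async def main() -> None:
    # Launch mcp-server-git as a subprocess and talk to it over STDIO.
    async with MCPServerStdio(
        name="Git MCP Server",
        params={"command": "uvx", "args": ["mcp-server-git"]},
    ) as git_server:
        with trace(workflow_name="MCP Git Example"):
            # Triage agent: local tools plus a handoff to the Python specialist.
            triage_agent = Agent(
                name="Triage Agent",
                instructions=(
                    "Answer directly, call tools when useful, and hand off "
                    "Python questions to the Python Tutor."
                ),
                tools=[generate_password, get_time],
                handoffs=[python_tutor_agent],
            )
            result = await Runner.run(
                triage_agent,
                "What time is it, can you generate a strong password, "
                "and what is a list comprehension in Python?",
            )
            print(result.final_output)

            # Repo-focused agent: every Git tool call should use repo_path=directory_path.
            git_agent = Agent(
                name="Git Agent",
                instructions=(
                    f"You answer questions about the Git repository at {directory_path}; "
                    "use that for repo_path."
                ),
                mcp_servers=[git_server],
            )
            for question in (
                "Who's the top contributor?",
                "Summarize the last change in the repository.",
            ):
                result = await Runner.run(git_agent, question)
                print(result.final_output)


if __name__ == "__main__":
    asyncio.run(main())
```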

What’s the Git MCP server?

mcp-server-git is a standalone MCP server that exposes Git operations (read/search/manipulate) as MCP tools. Agents don’t shell out to git directly; instead, they call these tools through the MCP connection. Because the server is a separate process, it can validate inputs, enforce tool-argument schemas, and give you clean, structured results. It’s actively evolving, so expect tool names and capabilities to grow over time.
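
Before wiring the server into an agent, you can connect to it and print the tools it currently advertises. This is a small sketch using the Agents SDK's STDIO adapter; the exact tool names and descriptions depend on the mcp-server-git version you pull.

```python
import asyncio

from agents.mcp import MCPServerStdio


async def show_git_tools() -> None:
    # Launch mcp-server-git as a subprocess and ask it which tools it exposes.
    async with MCPServerStdio(
        name="Git MCP Server",
        params={"command": "uvx", "args": ["mcp-server-git"]},
    ) as server:
        for tool in await server.list_tools():
            print(f"{tool.name}: {tool.description}")


asyncio.run(show_git_tools())
```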

Why MCP with the Agents SDK?

The Agents SDK discovers MCP servers at runtime, lists the tools they provide, and then forwards tool calls when the LLM decides to use them—this keeps your agent code small and the tool boundary explicit. You can also enable caching of the tool list to reduce latency on repeated runs.
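
Enabling that cache is a constructor flag, sketched here on the STDIO adapter; invalidate the cache if the server's tool set changes between runs.

```python
from agents.mcp import MCPServerStdio

# Cache the tool list so the SDK does not re-fetch it on every agent run.
git_server = MCPServerStdio(
    name="Git MCP Server",
    params={"command": "uvx", "args": ["mcp-server-git"]},
    cache_tools_list=True,
)

# If the server's tools change, drop the cached list:
# git_server.invalidate_tools_cache()
```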

Packages you need

  • OpenAI Agents SDK (openai-agents) provides Agent, Runner, tool decorators, handoffs, tracing, and the MCP client adapters you’re using (MCPServerStdio, etc.).
  • mcp-server-git is the Git MCP server you’re launching. You can run it directly or via uvx as shown here.
  • uv / uvx lets you run tools in an ephemeral environment without globally installing them—handy for CLIs like MCP servers. (Install uv, which provides uvx.)
  • python-dotenv (optional) to load OPENAI_API_KEY and other settings from a .env file, as the snippet demonstrates (the import sketch after this list pulls these pieces together).
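
Pulled together, the Python-side imports and .env loading look roughly like this; uv/uvx and mcp-server-git are not imported, since they only run as a subprocess launched by the MCP adapter.

```python
from dotenv import load_dotenv                           # python-dotenv
from agents import Agent, Runner, function_tool, trace   # openai-agents
from agents.mcp import MCPServerStdio                    # MCP client adapter in the SDK

load_dotenv()  # reads OPENAI_API_KEY (and friends) from a local .env file
```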

Components in the code

  • Two general-purpose tools exposed via @function_tool: generate_password (cryptographically secure) and get_time. The decorator makes them discoverable and schema-validated.
  • A specialist agent (python_tutor_agent) used via handoff—your “triage” agent can delegate Python Q&A to it when appropriate.
  • Tracing: with trace(workflow_name="MCP Git Example") captures a timeline of calls (including MCP tool listing), which is helpful for debugging and evaluation.
  • MCP binding: async with MCPServerStdio(...): launches mcp-server-git as a subprocess over STDIO, one of MCP’s standard transports. The SDK also supports SSE and Streamable HTTP if you host servers remotely.

Types of MCP servers (transports)

MCP recognizes three transport styles, and the Agents SDK mirrors them with MCPServerStdio, MCPServerSse, and MCPServerStreamableHttp. In this example we use STDIO for a simple local subprocess, but the same Agent can switch to remote servers later by changing the transport.
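
A sketch of the remote variants, with hypothetical endpoint URLs; only the transport class and its params change, while the Agent wiring stays the same.

```python
from agents.mcp import MCPServerSse, MCPServerStreamableHttp

# Hypothetical remote endpoints; swap the transport without touching the Agent.
sse_server = MCPServerSse(
    name="Remote Git MCP (SSE)",
    params={"url": "https://example.com/sse"},
)

streamable_http_server = MCPServerStreamableHttp(
    name="Remote Git MCP (Streamable HTTP)",
    params={"url": "https://example.com/mcp"},
)
```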

How the run works

  1. uvx mcp-server-git starts the Git MCP server in a temporary tool environment; the SDK connects over STDIO.
  2. The triage agent answers a multi-part question by calling your local tools and possibly handing off to the Python specialist.
  3. You then create a repo-focused agent that asks two Git questions. Behind the scenes, the SDK has already called list_tools() on the Git server so the LLM knows what it can do; when it chooses to, say, summarize the last change, the SDK forwards a call_tool() to the MCP server and returns structured results. The sketch after this list shows one way to inspect those calls.
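
One way to watch that flow is to inspect the run result afterwards. A small sketch, assuming a git_agent already attached to the Git server as above:

```python
from agents import Agent, Runner


async def show_run_items(git_agent: Agent) -> None:
    # Run one Git question and list what the run produced,
    # including any MCP tool calls and their outputs.
    result = await Runner.run(git_agent, "Summarize the last change in the repository.")
    for item in result.new_items:  # e.g. ToolCallItem, ToolCallOutputItem, MessageOutputItem
        print(type(item).__name__)
    print(result.final_output)
```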

Scoping Git operations to a directory

Notice directory_path = "/workspaces/openAIAgent" and the instruction string: “use that for repo_path.” Most Git MCP servers expect a repo_path argument when you call tools (e.g., log, status, diff). By explicitly telling the agent which path to use, you ensure every tool call is scoped to the intended repository. This is a best practice for both safety (no accidental operations outside your sandbox) and clarity (multi-repo environments). Some Git MCP variants even allow multiple repos; in those cases, you can pass different repo_path values per call.
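
Isolated from the rest of the script, the scoping pattern is just the server binding plus an instruction string that pins the path; the exact wording below is an assumption, but the path matches the example above.

```python
from agents import Agent
from agents.mcp import MCPServerStdio

directory_path = "/workspaces/openAIAgent"  # the only repo this agent should touch

git_server = MCPServerStdio(
    name="Git MCP Server",
    params={"command": "uvx", "args": ["mcp-server-git"]},
)

git_agent = Agent(
    name="Git Agent",
    instructions=(
        f"You answer questions about the Git repository at {directory_path}. "
        f"Whenever you call a Git tool, pass repo_path='{directory_path}'."
    ),
    mcp_servers=[git_server],  # connect the server (e.g. async with) before running
)
```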

If you prefer to avoid relying on instructions, you can also wrap or pre-configure the server to set a default working directory (server-specific) or add light tool filtering so only read-only tools are exposed to the agent. The Agents SDK supports both static and dynamic tool filtering.
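
A sketch of the read-only variant, assuming the SDK's static tool filter helper; the tool names are illustrative, so check the server's actual tool list before pinning them.

```python
from agents.mcp import MCPServerStdio, create_static_tool_filter

# Expose only read-only Git tools to the agent.
read_only_git = MCPServerStdio(
    name="Read-only Git MCP Server",
    params={"command": "uvx", "args": ["mcp-server-git"]},
    tool_filter=create_static_tool_filter(
        allowed_tool_names=["git_status", "git_log", "git_diff", "git_show"],
    ),
)
```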

Why this pattern scales

  • Separation of concerns. Your agent focuses on reasoning; Git operations live behind an MCP boundary with clear inputs/outputs. That keeps your codebase clean and auditable.
  • Portability. You can swap or add servers (Filesystem, Fetch, Databases, etc.) without changing agent logic—just attach another MCP server to the agent, as sketched after this list.
  • Security posture. Because MCP servers are processes with well-defined contracts, you can gate write actions (commit, push) or require explicit flags and prompts before enabling them. Treat repo_path as a guardrail: point it at a throwaway checkout or CI workspace for analysis.
  • Performance knobs. Set cache_tools_list=True to avoid re-listing tools every run, especially for remote servers. Use tracing to spot slow calls and to weigh the STDIO vs SSE vs Streamable HTTP trade-offs if you move off-host.
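
As a sketch of that portability, here is a second, hypothetical filesystem server attached alongside Git without touching the agent's reasoning code; the npm package name follows the reference servers repository and may differ in your setup.

```python
from agents import Agent
from agents.mcp import MCPServerStdio

git_server = MCPServerStdio(
    name="Git MCP Server",
    params={"command": "uvx", "args": ["mcp-server-git"]},
    cache_tools_list=True,  # the performance knob from the list above
)

filesystem_server = MCPServerStdio(
    name="Filesystem MCP Server",
    params={
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-filesystem", "/workspaces/openAIAgent"],
    },
)

agent = Agent(
    name="Repo Assistant",
    instructions="Answer questions about the repo at /workspaces/openAIAgent; use that for repo_path.",
    mcp_servers=[git_server, filesystem_server],  # connect both servers before running
)
```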

Getting this running locally

  1. Install the SDK (pip install openai-agents) and uv (which provides uvx). Then the code can launch mcp-server-git as shown.
  2. Ensure OPENAI_API_KEY is available; the snippet loads .env automatically if present.
  3. Point directory_path at a real Git repo.
  4. Run the script. You’ll see outputs for the triage question, followed by Git-aware answers powered by the MCP server.

That’s it: your agent now speaks Git through MCP—modular, auditable, and ready to grow as you add more servers or move them off-machine.

