MCP: extending AI¶
MCP (Model Context Protocol) is an advanced feature that allows AI models to use external tools, like reading files from your computer or accessing online services. In this chapter we explain what it is, when to use it, and how to configure it.
What is MCP in simple terms¶
Normally, an AI model can only read what you write and respond. It can't "do" anything in the real world: it can't open files, can't search the internet, can't check what's in a folder.
MCP changes this: it allows you to connect "tools" to the model that it can use during the conversation. For example:
- Filesystem tool: the model can read and write files on your computer in enabled folders
- GitHub tool: the model can search for information in repositories
- Custom tools: you can add others depending on your needs, to connect to applications on your PC or to connect to online services
When MCP is active and you ask the model "Using filesystem read the file report.txt in folder X", the model:
- Understands that it needs to use the filesystem tool
- Calls the tool with the file path
- Receives the file content
- Uses it to answer your question
All this happens automatically, without you having to do anything different.
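The steps above amount to a small dispatch loop run by the application on your behalf. The following Python sketch is purely illustrative (the tool name, message format, and dispatch logic are assumptions, not the app's actual implementation):

```python
def read_file(path: str) -> str:
    """Hypothetical filesystem tool: return a file's text content."""
    with open(path, encoding="utf-8") as f:
        return f.read()

# Registry of tools the model is allowed to call (names are illustrative).
TOOLS = {"filesystem.read_file": read_file}

def run_turn(model_reply: dict) -> str:
    """Handle one model reply: execute a requested tool, or pass
    the plain-text answer through unchanged."""
    if "tool_call" in model_reply:
        call = model_reply["tool_call"]
        tool = TOOLS[call["name"]]           # 1. look up the requested tool
        result = tool(**call["arguments"])   # 2. call it with the model's arguments
        return result                        # 3. result is fed back to the model
    return model_reply["content"]
```

The key point is that the model never touches your files directly: it only emits a structured request, and the host application decides whether and how to execute it.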
Compatible models¶
Not all models support MCP. It requires a capability called "function calling" that allows the model to generate structured calls to tools.
Models that work with MCP¶
| Model | MCP quality | Notes |
|---|---|---|
| llama3.1, llama3.2, llama3.3 | Excellent | Native support |
| qwen2.5 (all variants) | Excellent | Also great for Italian |
| mistral, mistral-nemo | Good | Fast and reliable |
| command-r, command-r-plus | Excellent | Great for research |
Models that DON'T work with MCP¶
- llama2 (all versions)
- codellama
- phi, phi2
- gemma (version 1)
If MCP is enabled but the model doesn't support function calling, it will simply ignore the available tools.
Activating MCP¶
MCP is disabled by default. To activate it:
- In the right sidebar, find the MCP switch
- Turn it on (it turns blue)
- Click the MCP settings icon to see available tools
Configured tools appear in the list. Activate the ones you want to make available to the model.
Configuring tools¶
MCP tools are configured in the app/data/mcp-config.json file. The first time, create it by copying the example file:

```shell
copy app\data\mcp-config.json.example app\data\mcp-config.json
```
Then edit it with a text editor:
```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "D:/Documents"],
      "enabled": false,
      "description": "File access"
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_TOKEN": "your-token-here" },
      "enabled": false,
      "description": "GitHub access"
    }
  }
}
```
Filesystem tool¶
Allows the model to read and write files. You must specify which folders it can access:
```json
"args": ["-y", "@modelcontextprotocol/server-filesystem", "D:/Documents", "D:/Projects"]
```
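Put together, a complete filesystem entry restricted to two folders looks like this (the paths are examples; replace them with your own):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "D:/Documents", "D:/Projects"],
      "enabled": true,
      "description": "File access"
    }
  }
}
```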
**Security**: limit access only to the folders you actually need. Don't give access to the entire disk.
GitHub tool¶
Allows the model to search in GitHub repositories. Requires a personal access token:
- Go to GitHub → Settings → Developer settings → Personal access tokens
- Create a new token with necessary permissions
- Insert it in the configuration file
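Step 3 means pasting the token into the `env` field of the `github` entry in mcp-config.json (the token value shown here is a placeholder):

```json
"github": {
  "command": "npx",
  "args": ["-y", "@modelcontextprotocol/server-github"],
  "env": { "GITHUB_TOKEN": "ghp_xxxxxxxxxxxx" },
  "enabled": true,
  "description": "GitHub access"
}
```

Treat this file like a password: anyone who can read it can use your token.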
MCP in action¶
Once one or more MCP servers are activated, you can make requests, preferably mentioning the server you want the model to use (for example: "Using filesystem, list the files in D:/Documents").
Repositories¶
A growing number of MCP servers are available. A good starting point is the official collection at github.com/modelcontextprotocol/servers
Privacy and MCP¶
MCP servers can run locally on your computer, but even then some tools may connect to external services:
- filesystem: completely local, no data leaves
- github: connects to GitHub servers
Before using an MCP tool, consider what data it might transmit.
MCP troubleshooting¶
**"The model doesn't use the tools"**

- Verify that MCP is active (blue switch)
- Verify that the specific tool is activated in the panel
- Try a different model (it might not support function calling)

**"Error during tool execution"**

- Check the application logs for details
- Verify that paths in the configuration file are correct
- For GitHub, verify that the token is valid

**"The tool is slow"**

- This is normal for operations on many files
- Smaller models might take longer to generate correct calls

