Hello explorers and developers,
If you have been following the discussions around chatbots and MCP servers for talking to your family tree, here is another experiment in that category.
If you have a Python environment where you can pip install packages, and that environment has access to the Gramps source code and your trees, then you can try this out:
pip install gramps-ez-mcp
This installs a complete MCP server, ready to run (no Docker, no need for gramps-web-api, etc.).
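Since the server installs as an ordinary console script, you can sanity-check it by running it directly against one of your trees (replace "Gramps Example" with one of your tree names; MCP servers launched this way normally communicate over stdio, so it will just sit waiting for a client until you press Ctrl-C):
gramps-ez-mcp "Gramps Example"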
You will need an API key from an LLM vendor (such as Anthropic or OpenAI), and a way of using an MCP server (such as an AI coding environment like Cursor). Here is a simple one:
pip install ez-mcp-toolbox
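Assuming the toolbox follows the usual convention of reading keys from the standard environment variables, set whichever one matches the vendor in your config before launching:
export OPENAI_API_KEY="sk-..."        # for OpenAI models
export ANTHROPIC_API_KEY="sk-ant-..." # for Anthropic models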
Then create an ez-config.json file, like:
{
  "model": "openai/gpt-4o-mini",
  "model_parameters": {
    "temperature": 0.0
  },
  "mcp_servers": [
    {
      "name": "gramps-ez-mcp",
      "description": "Gramps EZ MCP server for genealogy chats",
      "command": "gramps-ez-mcp",
      "args": ["Gramps Example"]
    }
  ]
}
changing "Gramps Example" to the name of one of your Gramps family trees.
A quick way to get a list of your tree names is:
gramps -l
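Also, since JSON is unforgiving about commas and quotes, it can save some head-scratching to validate the config file before launching; Python's built-in JSON tool will point at any syntax error:
python -m json.tool ez-config.json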
At that point, you are ready to talk to your family tree.
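The exact way you start chatting depends on your MCP client; for ez-mcp-toolbox, check its documentation for the chat entry point. Purely as a hypothetical sketch (the command name and flag below are guesses on my part):
ez-mcp-chatbot --config ez-config.json   # hypothetical entry point; see the toolbox docs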
I'm thinking that using MCP is going to be simpler for Gramps, as it can use a server installed at the OS level rather than requiring a stack to be packaged with Gramps.
Let me know what you think if you try this out.

