Chat with your genealogy?

I am currently researching whether it would be feasible to use the Gramps Web API as a basis for an interface to query (and potentially even update) your genealogy by chatting with an AI agent. Recent developments like GPT-4, function calling that returns structured data, and middleware like LangChain have made such ideas much more realistic in the last year or so.

Imagine if you could write in free form about your family and let the agent build the tree for you - adding people, dates, relations, places, events, and other details as you go. Using LangChain it would probably be possible to program the agent to ask clarifying questions: “Are you referring to your uncle Bob, or your sister's spouse Bob?” or “Was it your grandmother on your father's or mother's side who worked in a Ford factory?”

My question is: Are there developers in this space who would be interested in working on something like this? I write code myself, but mostly do data science and scripting, so it would be a lot more fun to team up with someone who is already invested in Gramps.


:raising_hand_man: Yes, I’d love to collaborate on that.

It would be great to be able to use open source models like Llama 2 instead of GPT. Do you have experience with them?


Perhaps the developer (@Hace ) of ChatWithTree (an addon for desktop Gramps) would have some ideas too.


I’d be glad to participate in this (I work in this area for my day job).

My recommendation would be to create an MCP server that uses LiteLLM to talk to the LLM. LiteLLM allows switching between models and vendors, and can even use Ollama to run models locally.
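To illustrate the vendor-switching point, here is a minimal sketch using LiteLLM's `completion()` API. The helper names, the system prompt, and the model strings are my own illustrative choices, not code from the repo:

```python
# Sketch (not from gramps-ez-mcp): LiteLLM exposes one completion() call
# for many providers, so switching vendors is just a different model string.

def build_messages(question: str) -> list[dict]:
    """Wrap a user question in a chat-style message list (prompt is illustrative)."""
    return [
        {"role": "system", "content": "You answer questions about a Gramps family tree."},
        {"role": "user", "content": question},
    ]

def ask(question: str, model: str = "ollama/llama3") -> str:
    # Deferred import: litellm (and a running model backend) is an assumption here.
    from litellm import completion

    # Same code works with e.g. model="gpt-4o" or a local "ollama/llama3";
    # no other changes are needed to swap vendors.
    response = completion(model=model, messages=build_messages(question))
    return response.choices[0].message.content

# Usage (requires litellm and a configured backend):
#   answer = ask("Who are the children of John Smith?")
```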

In fact I already created this infrastructure here: GitHub - dsblank/gramps-ez-mcp: An easy to use MCP server to talk to a Gramps genealogy family tree

The next steps would be to add Python functions to: gramps-ez-mcp/gramps_ez_mcp/tools.py at main · dsblank/gramps-ez-mcp · GitHub

These functions would allow adding entries to the Gramps database. Feel free to make PRs in that Git repo.

See also: A simpler Gramps MCP Server for chatbots


Thanks for notifying me, @emyoulation

@aerugo

The ChatWithTree addon provides an opportunity to “talk to the database”, whereby a large language model (LLM) interacts with the source data stored in Gramps.

For me, so far it is mainly an exercise to explore the capabilities and to see whether it helps me investigate my own genealogy research a bit further. Also, depending on which model you connect to, it offers opportunities to connect your own ancestry with the knowledge contained within the models themselves.

I was just trying to enhance the responses to provide somewhat more formatted data than the plain text balloons I had at first. There is a pull request up that shows an example response from a model telling me a bit about some distant ancestors who lived in Rotterdam, the Netherlands:

Meanwhile, I am trying to integrate the chat more closely with Gramps itself. That is, whenever a found person is mentioned in the chat, it should be easy to just click on that person to dive into the details.

Processing the text returned by LLMs is not that easy, though, but here is a sneak preview of what I am currently aiming for:

The idea in the code is:

  • Whenever the model uses one of the get_person tools (get_person, get_mother, or get_father), we keep the person in a list.
  • When the model returns its final answer, we process that text and find matches against the (also returned) list of people that were looked up, and turn them into links inside the text balloons that open an EditPerson screen in Gramps immediately - as shown on the right after clicking the person.
  • This way, ChatWithTree becomes a more connected tool: talk to your tree, then edit and update the actual information.
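The text-matching step above can be sketched in a few lines of Python. This is a simplified illustration, not the addon's actual code: the function name, the person dict fields, and the `gramps://` link format are all assumptions (the real addon renders its own GTK markup to open the EditPerson dialog):

```python
import re

def link_people(answer: str, people: list[dict]) -> str:
    """Replace mentions of tracked people in the LLM's final answer
    with clickable link markup.

    `people` stands in for the list collected from get_person /
    get_mother / get_father tool calls; each entry has a display
    name and a handle (field names are illustrative).
    """
    for person in people:
        name = person["name"]
        # Word boundaries keep "Anna" from matching inside "Annabel".
        pattern = re.compile(r"\b" + re.escape(name) + r"\b")
        link = f'<a href="gramps://person/{person["handle"]}">{name}</a>'
        answer = pattern.sub(link, answer)
    return answer
```

A nice property of doing the substitution only over people the model actually looked up is that the chat never links a name the database does not contain.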

This depends a lot on what you are aiming to use these kinds of tools for, of course. For me, I am usually searching for more ancestors to slowly extend the tree, so I am often just searching the web and the (pretty good) Dutch archives for connections, to add more and more people and background data from notary archives and such.

However, extending the data is something I am very careful with, as LLMs can definitely hallucinate things together. I would personally not trust them to extend the tree without my approving every change - your carefully set up data would become a hallucinated mess, I'm afraid. So with the current state of AI, I would tread carefully there :slight_smile:

Having said that: connecting Gramps data to LLMs is possible, and connecting LLM responses as an integrated clickable guide to the Edit popups is also possible now - and a very fun coding exercise in GTK+ and Python as well, by the way.

Update:

Added clickable links in PR: render markdown and add gramps links by MelleKoning · Pull Request #784 · gramps-project/addons-source · GitHub