GrampsChat Addon for 6.0

GrampsChat Addon for Gramps desktop 6.0 is now available. This is a configurable chatbot conversation Gramplet. You can use it with commercial LLM providers (like OpenAI) or with your own LLM server.

The addon will automatically install litellm if your Addon Manager settings are set appropriately.

To use a commercial provider, you need to set two environment variables before starting Gramps. For example, on my Linux machine:

export GRAMPS_AI_MODEL_NAME="gpt-3.5-turbo-0125"
export OPENAI_API_KEY="sk-..."
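Under the hood, the Gramplet makes its requests through litellm, which picks up OPENAI_API_KEY from the environment automatically. A minimal sketch of that kind of call (not the addon’s exact code; the question string is just an example):

from litellm import completion

# litellm reads OPENAI_API_KEY from the environment
response = completion(
    model="gpt-3.5-turbo-0125",
    messages=[{"role": "user", "content": "What is Gramps?"}],
)
print(response.choices[0].message.content)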

You can get an account at openai.com; their available models are listed in their documentation.

WARNING: this could cost you money! Keep reading for a free, DIY server option.

After starting Gramps, select the GrampsChat Addon in the Addon Manager (filters: gramplet, developer, stable).

You can then add it to… ah, just the Dashboard for now. I need to make sure it can be added to any View. TODO #1.

Oh, I guess I need to set a height. TODO #2. Let’s detach it (click the gears icon) to make it bigger.

And ask a question, like “What is Gramps?”

Talk to it, and try different models.

WARNING: as your conversation history gets longer, each request sends more tokens to your provider and can cost you more money. Click “New Conversation” to clear the history.
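To get a feel for how the history inflates token counts, you can count them yourself. A rough sketch using the tiktoken package (my assumption for illustration; the addon does not necessarily count tokens this way):

import tiktoken

# every request re-sends the whole history, so usage grows each turn
enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
history = [
    "What is Gramps?",
    "Gramps is a free, open source genealogy program.",
    "How do I add a person?",
]
total = sum(len(enc.encode(message)) for message in history)
print(f"approximate tokens re-sent with the next question: {total}")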

DIY server

(This section requires some computer expertise and may not be for everyone. It may also require a computer with hefty memory, a capable CPU, and optionally a GPU.)

If you are like me, you don’t want to send your data to some remote server, and you don’t want to pay money to explore these ideas. GrampsChat was designed to also let you use your own LLM server.

First, you’ll need Ollama, a system for hosting your own LLM server.

Installing it usually starts the server automatically. If not, you can start it yourself (in another terminal/console window):

ollama serve
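To confirm the server is actually running, you can query its REST API. A small Python check with the requests library (Ollama’s /api/tags endpoint lists the models you have pulled locally):

import requests

# Ollama listens on port 11434 by default
response = requests.get("http://127.0.0.1:11434/api/tags")
response.raise_for_status()
for model in response.json().get("models", []):
    print(model["name"])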

Then you need to pick a model. Let’s start with a small one and “pull” it:

ollama pull llama3.2

And finally, you need to stop Gramps and reset the environment variables:

export GRAMPS_AI_MODEL_URL=http://127.0.0.1:11434
export GRAMPS_AI_MODEL_NAME=ollama/llama3.2

Note the “ollama/” prefix on the model name; that is how litellm knows to route requests to your local Ollama server. Restart Gramps and ask a question, and you should see the local model respond.

That didn’t send any data to another server, and is completely free.
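If you want to sanity-check the same path outside of Gramps, litellm can call Ollama directly. A minimal sketch, assuming the server and model from the steps above:

from litellm import completion

response = completion(
    model="ollama/llama3.2",            # the "ollama/" prefix selects the Ollama backend
    messages=[{"role": "user", "content": "What is Gramps?"}],
    api_base="http://127.0.0.1:11434",  # your local Ollama server
)
print(response.choices[0].message.content)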

In this thread I’ll describe a bit about how this works, and what it might be useful for. But I encourage you to explore!


I just noticed that llama says that Gramps is the “Gnome Family Relations Manager” :-) Lucky gnomes!


A metered service probably needs “alert” CSS applied to the dialog for pay services. Maybe a red titlebar background? (Using an RGBA transparency to make it pale and work with Light/Dark mode.) Maybe the gramplet background if the titlebar is only visible when undocked.

(The map services for the Geography view should probably be that way too.)

The Place Cleanup addon gramplet pops up an initial Config where users sign up for an account, because that is a metered service.

Is GrampsChat potentially more expensive (both cash- and resource-wise) if you have multiple instances open in different Categories?

SuperTool by @kku is an addon that (as a Tool rather than a Gramplet) has cached content synced to the category. His JSON-fed expandable Help might be a good addition too.

If you are setting an environment variable to a service provider that costs money, then you’ll be aware of that. I don’t think we can say for certain whether you are using a no-cost model or not.

No. The only thing that costs money is the number of words (i.e., “tokens”) that you send to a commercial provider. And that just depends on your current chat history and your question.

When we start getting advanced, and seeding the history with text from your Gramps tree or the Gramps documentation, the number of tokens is going to skyrocket. But that is why we need researchers to explore making a useful system.
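For example, a researcher might seed the conversation with text pulled from the tree before the user’s question. A purely hypothetical sketch (the addon does not do this today, and the person data here is invented):

from litellm import completion

# invented context that would be extracted from a Gramps tree
tree_context = (
    "People: Anna Svensson (b. 1872, Uppsala), "
    "Erik Svensson (b. 1870, Uppsala), married 1895."
)

messages = [
    {"role": "system", "content": "Use this family data:\n" + tree_context},
    {"role": "user", "content": "When did Anna and Erik marry?"},
]
response = completion(model="ollama/llama3.2", messages=messages,
                      api_base="http://127.0.0.1:11434")
print(response.choices[0].message.content)

Every token of that seeded context is re-sent on each turn, which is where the cost explosion comes from.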

If you want to see some serious confusion, use:

ollama pull deepseek-r1:1.5b
export GRAMPS_AI_MODEL_NAME="ollama/deepseek-r1:1.5b"
export GRAMPS_AI_MODEL_URL=http://127.0.0.1:11434

This is a tiny model that will probably run on most laptops. It has a so-called “reasoning” mode in which it generates a bunch of text before it answers your question. Supposedly this gives better answers, but on small models it is wacky.

Ask Gramps: “How many ancestors do I have?” DeepSeek will ramble on and on, arguing with itself about the proper mathematical formula to use. Of course, the correct answer is that it is unknowable. A better question may be “How many humans have ever lived?” And even that is problematic; what counts as being human?

One other point for developers serious about making a useful chatbot for genealogy: if you want to keep track of all of your questions and the models’ responses, you can use Opik.

Get a free Opik API key from comet.com (create an account, then copy the API key). Then:

export OPIK_API_KEY=...

When you start Gramps you might see an error dialog; ignore it. All of your chats will then be logged to Opik.
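Outside of Gramps, the same kind of logging looks roughly like this with the opik Python package; a sketch using its @track decorator (my assumption about how you would wire it up yourself; the addon’s integration may differ):

import os
from opik import track
from litellm import completion

os.environ.setdefault("OPIK_API_KEY", "...")  # your key from comet.com

@track  # records this function's inputs and outputs to Opik
def ask(question: str) -> str:
    response = completion(
        model="ollama/llama3.2",
        messages=[{"role": "user", "content": question}],
        api_base="http://127.0.0.1:11434",
    )
    return response.choices[0].message.content

print(ask("What is Gramps?"))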

Full disclosure: Opik is an open source LLM tracking and evaluation tool created by the company I work for (Comet ML).


Anyone have any luck with this?


Hello,
I have not tested the GrampsChat Addon yet, but I am wondering whether it could be extended with features like OCR support.
I am thinking of something like the mistralai Python library or the requests library.

That goes beyond the chat (conversation) feature, and could be tied to Mistral’s models. So, it was just an idea.

regards,
JérÎme

Let me take this wish a step further: transcription of gothic handwriting, that would be nice.


I tested AI-based Kurrent/SĂŒtterlin and Latin script support via Transkribus.

It also seems possible according to the documentation:

curl https://api.mistral.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $MISTRAL_API_KEY" \
  -d '{
    "model": "pixtral-12b-2409",
    "messages": [
      {
        "role": "user",
        "content": [
          {
            "type": "text",
            "text": "transcribe this"
          },
          {
            "type": "image_url",
            "image_url": "https://ciir.cs.umass.edu/irdemo/hw-demo/page_example.jpg"
          }
        ]
      }
    ]
  }'

=> # Letters Orders and Instructions December 1855\n\nHoag’s Company, if any opportunity offers.\n\nYou are to be particularly exact and careful in these pagineries, that there is no disgrace meet between the Returns and you Pay Roll, or those who will be strict examining into it hereafter.\n\nI am & c.\n\n[Signed]\nEff.
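For anyone who prefers Python (via the requests library mentioned above), here is the same request translated; a sketch using the model name and image URL from the curl example:

import os
import requests

payload = {
    "model": "pixtral-12b-2409",
    "messages": [{
        "role": "user",
        "content": [
            {"type": "text", "text": "transcribe this"},
            {"type": "image_url",
             "image_url": "https://ciir.cs.umass.edu/irdemo/hw-demo/page_example.jpg"},
        ],
    }],
}
response = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": "Bearer " + os.environ["MISTRAL_API_KEY"]},
    json=payload,
)
print(response.json()["choices"][0]["message"]["content"])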

Hey JérÎme! Nice to hear from you again!

I created GrampsChat to be able to use (and easily switch between) language models. You can use ones that cost money, and ones that are free. Unfortunately, models like pixtral-12b-2409 are not supported by the open source stack. But of course, as you have shown, you can easily wrap the request up in Python and make an API call. Luckily, there are some free versions of such models; however, the hardware requirements probably exclude most Gramps users.

In any event, if a user was willing to pay Mistral, you could probably create a nice Gramplet for the Media view.

Is it read with AI??

What do you mean by “read”?
I just copied the documentation section “Transcribe old documents” from the Mistral documentation.

If needed, you can run your own test using “La Plateforme”. I did not test any upload from “Le Chat” (it also sounds good in French), and they seem to provide a time- (or size-) limited feature for testing this AI and image<->text handling.


Hello Doug! Nice to see you again!
You are right; I looked at the Ollama model library, but maybe only Mistral NeMo might be used for a “free” test.

I was just wondering if we might play with their OCR API from within the GrampsChat Addon, e.g., a quick textual prompt like: “transcribe one of our internal media objects (internal URI)”, “find the individuals in {my_local_file}.jpg”, or “find the dates set/listed on media objects (batch)”, etc.
Sure, it could be a nice gramplet for the Media View.


It is not clear what they mean by:

The model is available as pixtral-large-latest on our API, as well as for self-deployment as Mistral Large 24.11 on HuggingFace under the Mistral Research License (MRL) for research, or with a commercial license from Mistral AI for commercial use.

Is the Mistral Research License (MRL) just for the HuggingFace version?


Pekka @PeterPower posted a message on the Finnish Google group for Gramps that has an important observation.

Users who are not fluent in English (but are reading English forums) can have a difficult time with Gramps terminology. The generic Google Translate in the Discourse/Facebook/Reddit forums may use different terms than what a human added to the Weblate glossary.

Could the AI Chat be used with the Weblate glossary to find similar English phrases and show those strings in their language? Or do the opposite, give them a lookup of terms in their native language so they can see the possible English phrases to use in their posting?

This could even be expanded to developers creating a new module/addon. Compare their addon’s “untranslated strings/phrases” against Weblate to find similar (already translated) phrases that could be used. Using those suggested phrases in the addon would reduce the translator burden. Eventually, run it against the whole Weblate glossary to reduce the size of the overall string collection.
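As a toy illustration of the “find similar, already-translated strings” idea (a real system would probably use embeddings; the glossary entries here are invented):

import difflib

# invented stand-in for a Weblate glossary: English -> French
glossary = {
    "Add a new person": "Ajouter une nouvelle personne",
    "Edit the selected family": "Modifier la famille sélectionnée",
    "Merge the selected places": "Fusionner les lieux sélectionnés",
}

new_string = "Add new person"  # an untranslated string from a new addon
matches = difflib.get_close_matches(new_string, list(glossary), n=1, cutoff=0.6)
if matches:
    print("reuse:", matches[0], "->", glossary[matches[0]])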

Sure, wording is important; so are context and style. There should also be a charter, policy, rules, or guidelines for how a word gets translated.

I guess that, as in French or Finnish (Swedish? SĂĄmi?), every locale is full of words that have no one-to-one translation to or from English?

#: ../data/org.gramps_project.Gramps.metainfo.xml.in:7
#: ../gramps/gen/const.py:259
msgid ""
"Gramps is a genealogy program that is both intuitive for hobbyists and "
"feature-complete for professional genealogists."
msgstr ""

French: “Gramps est un programme de gĂ©nĂ©alogie Ă  la fois intuitif pour les amateurs et complet pour les gĂ©nĂ©alogistes professionnels.”

Finnish: “Gramps on sukututkimusohjelma, joka on sekĂ€ intuitiivinen harrastajille ettĂ€ ominaisuuksiltaan kattava ammattilaisille sukututkijoille.”

Swedish: “Gramps Ă€r ett slĂ€ktforskningsprogram som Ă€r bĂ„de intuitivt för hobbyister och funktionskomplett för professionella slĂ€ktforskare.”


Local variations (“dialects”):

“Gramps, c’est un programme de gĂ©nĂ©alogie qui est Ă  la fois intuitif pour les amateurs pis ben complet pour les gĂ©nĂ©alogistes professionnels.”

Switzerland: “Gramps, c’est un programme de gĂ©nĂ©alogie qui est Ă  la fois intuitif pour les amateurs et bien complet pour les gĂ©nĂ©alogistes professionnels, n’est-ce pas !” (“clichĂ©â€)

Martinique: “Gramps sĂ© yon pwogram jĂ©nĂ©alojik ki fasil pou amatĂš, Ă© ki konplĂš pou pwofĂ©syonĂšl yo.”

“Gramps li yon program gĂ©nĂ©alogie ki facile pou les amateurs et ki complet pou les gĂ©nĂ©alogistes professionnels.” (a mix of English, Spanish, French, and African languages)


French: “Gramps est un logiciel de gĂ©nĂ©alogie qui se rĂ©vĂšle Ă  la fois ergonomique pour les nĂ©ophytes et exhaustif en termes de fonctionnalitĂ©s pour les gĂ©nĂ©alogistes chevronnĂ©s.”

French: “Gramps, c’est un peu comme le couteau suisse de la gĂ©nĂ©alogie : assez simple pour que mĂȘme votre grand-tante puisse s’en servir, mais avec tellement de fonctionnalitĂ©s que mĂȘme les pros de la chasse aux ancĂȘtres en restent baba !”

French: “Gramps est comme un arbre gĂ©nĂ©alogique numĂ©rique : ses racines sont assez simples pour que les amateurs puissent les explorer, mais ses branches sont si riches en fonctionnalitĂ©s qu’elles comblent mĂȘme les gĂ©nĂ©alogistes les plus exigeants.”

French: “Gramps, tel un grimoire ancestral, dĂ©voile ses secrets gĂ©nĂ©alogiques avec une simplicitĂ© qui sĂ©duit les novices, tout en recelant une richesse de fonctionnalitĂ©s capable d’émerveiller les gĂ©nĂ©alogistes les plus Ă©rudits.”

French: “Gramps : simple pour les amateurs, complet pour les pros.”

etc.
