GrampsChat Addon for Gramps desktop 6.0 is now available. This is a configurable Chatbot conversation Gramplet. You can use it with commercial LLM providers (like OpenAI) or you can use it with your own LLM server.
The addon will automatically install litellm if your Addon Manager settings allow it.
To use a commercial provider, you need to set these two environment variables before starting Gramps; for example, on my Linux machine:
export GRAMPS_AI_MODEL_NAME="gpt-3.5-turbo-0125"
export OPENAI_API_KEY="sk-..."
You can get an account at openai.com. Their models are listed here.
WARNING: this could cost you money! Keep reading for a free, DIY server.
After starting Gramps, select the GrampsChat Addon (gramplet, developer, stable):
You can then add it to… ah, just the Dashboard. I guess I need to make sure it can be added on any View. TODO #1.
Oh, I guess I need to set a height. TODO #2. Let's disconnect it (click the gears) to make it bigger.
And ask a question, like "What is Gramps?"
Talk to it, and try different models.
WARNING: as your conversation history gets longer, it uses up more tokens on your provider and can cost you money. Click "New Conversation" to clear the history.
DIY server
(This section requires some computer expertise and may not be for everyone. It may also require a computer with some hefty memory, CPU, and optionally a GPU.)
If you are like me, you don't want to send your data to some remote server, and don't want to pay money for exploring these ideas. GrampsChat was designed to also allow you to use your own LLM server.
First, you'll need ollama. This is a system for hosting your own LLM server.
Installing it should start the server automatically. If not, you can start it yourself (in another terminal/console window):
ollama serve
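To check whether the server is actually listening (11434 is ollama's default port), a quick curl works; the fallback echo just makes the command print something either way:

```shell
# Query the local Ollama server's version endpoint; prints a JSON version
# string when the server is up, or a note when it is not.
curl -s http://127.0.0.1:11434/api/version || echo "ollama is not running"
```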
Then you need to pick a model. Let's start with a small model. Then we need to "pull" it:
ollama pull llama3.2
And finally, you need to stop Gramps and reset the environment variables:
export GRAMPS_AI_MODEL_URL=http://127.0.0.1:11434
export GRAMPS_AI_MODEL_NAME=ollama/llama3.2
Note the "ollama/" in front of the model name. You should see something like:
That didn't send any data to another server, and is completely free.
In this thread I'll describe a bit about how this works, and what it might be useful for. But I encourage you to explore!