Gramps Web on OSMC

Hi,
I am eager to finally test Gramps Web, as it seems to be what I have been waiting for for a long time. I have been using Gramps for more than 10 years, and the possibility to sync it with a web version would be just perfect. (I currently use webtrees running on my Synology NAS, which is great, but exchanging data between the two systems via GEDCOM is sometimes not very convenient.)

I want to run it on my home network on my Raspberry Pi Model 3B, which already serves as my Kodi media host using the OSMC image for the RPi, and that is probably also why I am struggling.

I managed to install Docker and docker-compose via SSH on OSMC and then followed the instructions for docker-compose.yml. The installation seemed to run correctly, but when I tried to access Gramps Web from my computer via the Raspberry Pi's IP address, https://192.xx.xx.xx:80, the page of course took me to the Kodi web server, which listens on port 80.
I then changed the port to 8080 in docker-compose.yml, but nothing happens when I call the IP address with that port.
I read in another thread here that David does not recommend using a port other than 80. If that is the case, I would need to get another RPi, which I would like to avoid. In the same thread, a user mentions that the standard installation path of the docker-compose.yml via SSH is the reason it doesn't work for him, so he suggests using /usr/local/… (sorry for not providing a link to the thread, but I cannot find it anymore).
Any ideas what I am doing wrong? Any help would be very much appreciated.

Thanks!

Hi ukulele31, have you checked whether Docker and Gramps Web work at all? You could shut down Kodi first, then restart the containers ('docker-compose up -d', to be sure everything is correctly created) and check with 'ps ax' whether Gramps Web is running on port 80. I also suggest, if not already done, reducing the number of Gunicorn workers. I'm running on an Olimex board (also ARM hardware), which runs dead slow with the default of 8 workers.
Also, when you change the port, you need to recreate the container, since the port mapping is fixed when the container is created. So use 'docker-compose down' and then 'docker-compose up -d' again.
Hope this helps.
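Roughly, the sequence I have in mind is the following. Treat it as a sketch: the Kodi service name and the ports are my assumptions for a typical OSMC setup, so adapt them to yours.

```shell
# Stop Kodi so it releases port 80 (on OSMC the service is called
# "mediacenter" as far as I know - check yours with 'systemctl list-units')
sudo systemctl stop mediacenter

# Recreate the containers so a changed port mapping actually takes effect;
# run this in the directory containing your docker-compose.yml
docker-compose down
docker-compose up -d

# Check what is actually listening on ports 80 / 8080
ss -tlnp | grep -E ':(80|8080)( |$)' || echo "nothing listening on 80/8080"
```

If the last command prints nothing listening, look at 'docker-compose logs' to see why the container did not come up.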

Hi insomniux,

wow, this was exactly the kind of help I was hoping to get here! Thank you very much! :+1:
I followed your recommendations, stopped Kodi, reduced the number of workers to 2, and could then easily reach Gramps Web on my Raspberry Pi under its IP address. I am now already playing around with Gramps Web and the sync tool :grinning:
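For reference, the relevant parts of my docker-compose.yml now look roughly like this. The image name, the container port (5000) and the worker variable follow the standard Gramps Web compose example as I understand it; take this as a sketch rather than my literal file:

```yaml
services:
  grampsweb:
    image: ghcr.io/gramps-project/grampsweb:latest
    ports:
      - "8080:5000"   # host port 8080 -> Gunicorn inside the container
    environment:
      GUNICORN_NUM_WORKERS: 2   # reduced from the default for the RPi 3B
```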

Apparently, there is a way to change Kodi's port, so maybe that is a better option than changing the port of Gramps Web; however, I haven't tried it out yet.
Edit: just changed the port of kodi and now I can use both at the same time on my rpi.

I guess that after some time of using Gramps and Gramps Web in my home environment, the next step will be to make it accessible over the internet; I am hesitating a bit, though. If there are any recommendations on how to do this safely on my RPi, any help is very much appreciated.

Best,
ukulele31

Hi @ukulele31, I missed your post; I had set up my notifications incorrectly.

I am personally serving a Gramps Web instance to the public internet from an RPi that also serves Nextcloud and some other stuff. It works well using acme-companion, as described here: Docker with Let's Encrypt - Gramps Web.
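In case it helps, the skeleton of that setup looks roughly like this. The domain and e-mail are placeholders, and I have trimmed the Gramps Web service down to the proxy-related parts; see the linked docs for the full file:

```yaml
services:
  nginx-proxy:
    image: nginxproxy/nginx-proxy
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - /var/run/docker.sock:/tmp/docker.sock:ro
      - certs:/etc/nginx/certs
      - html:/usr/share/nginx/html

  acme-companion:
    image: nginxproxy/acme-companion
    environment:
      DEFAULT_EMAIL: me@example.com        # placeholder
    volumes_from:
      - nginx-proxy
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - acme:/etc/acme.sh

  grampsweb:
    image: ghcr.io/gramps-project/grampsweb:latest
    environment:
      VIRTUAL_HOST: gramps.example.com     # placeholder domain
      LETSENCRYPT_HOST: gramps.example.com
      VIRTUAL_PORT: 5000                   # port Gunicorn listens on

volumes:
  certs:
  html:
  acme:
```

The proxy discovers the Gramps Web container via the VIRTUAL_HOST variable, and acme-companion obtains and renews the Let's Encrypt certificate for every container that sets LETSENCRYPT_HOST.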

<advertisement>
You are also welcome to try Grampshub if you want to save yourself the hassle :slight_smile:
</advertisement>

Hi @DavidMStraub,

thanks a lot for your reply and the invitation to use Grampshub.
I finally managed to get Gramps Web running on my Raspberry Pi Model 3 with OSMC and accessible to the public internet via a reverse proxy on my Synology DiskStation (I only have a DS 218j, which does not support Docker directly; that's why I had to take the detour via the RPi in the first place).
Once again, thanks for all the effort you put into Gramps Web. This is a real game changer.

Best,
ukulele31


Hi @DavidMStraub ,

after some testing, it seems that my RPi Model 3B is not powerful enough to handle both web servers simultaneously.
Therefore, I would like to get a new RPi that is used only for Gramps Web.

That raises the question of which model to purchase. You mentioned that you are running an instance on an RPi yourself. Which Raspberry Pi are you using for it?
I am hesitating between a

  1. RPi Zero 2 W with a 1 GHz quad-core 64-bit Arm Cortex-A53 CPU but only 512 MB of SDRAM, or a
  2. RPi 4 Model B with a quad-core Cortex-A72 (ARM v8) 64-bit SoC @ 1.8 GHz and 1 GB, 2 GB, 4 GB or 8 GB of SDRAM.

Do you have experience with the memory and CPU usage of Gramps Web? Would the RPi Zero be powerful enough to handle it? Or do you recommend the RPi 4? In the latter case, how much RAM would you recommend?

Thanks a lot for your insights!
Best,
ukulele31

Hi

No idea about the web server, but if you are going for a new Pi, surely you would look at the latest version, the Pi 5. Both the Pi 4 and the Pi 5 cannot run from a 3B power supply, and the 5 cannot run from a 4's, so you will have to buy a new one either way, and both need a cooler.
I just got one as a Christmas present with a web server in mind at some point, but it is running quite well as is.
I did a web server trial a few years ago on a 2B, but needed one Pi for each server.

Phil

Hi Phil,
the power supply is not a relevant issue, since it would need to be purchased anyway along with the Pi, the case and the SD card.
An RPi 5 would be oversized for this project, as I know that Gramps Web runs on an RPi 3.
The question is:
do I need to spend 100 € (RPi 4) or 50 € (Zero) for this project?
Thanks,
ukulele31

Hi,

I am running Gramps Web, Nextcloud & PhotoPrism simultaneously on an RPi 4 with 4 GB of RAM. Idle usage of everything combined seems to be around 1.5 GB. I am using 4 workers for Gramps Web; this matters for memory usage, since every Gunicorn worker consumes a certain amount of memory even when idle, a couple of hundred MB I think. See here: Limit CPU usage - Gramps Web.

Overall, I am very happy with the performance.
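For sizing the RAM of a new Pi, a back-of-the-envelope calculation like this can help. The ~200 MB per worker is my rough observation, not a guaranteed figure, and actual usage varies with the size of your tree:

```shell
# Estimate idle memory for Gramps Web: workers x per-worker footprint
workers=4
per_worker_mb=200          # rough observed idle usage per Gunicorn worker
echo "approx. $((workers * per_worker_mb)) MB idle for Gramps Web alone"
```

With 2 workers that already suggests ~400 MB, which is why a 512 MB RPi Zero would be tight once the OS is included.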

Hi @DavidMStraub,
thanks a lot for your valuable insights! They are very much appreciated.