Parallel processor gzip support?

While linking the gzip site in a wiki document, I noticed there is a gzip fork called pigz (pronounced "pig-zee") that focuses on multi-threaded, parallel-processor compression.

Are slow backup or load times related to gzip CPU load? Would that be enough to warrant the overhead of building pigz support into Gramps? And is this even viable for a Python app on the supported OSes?

(I vaguely recall seeing a discussion of POSIX multi-threading coordination problems with Gramps database record locking. However, compression/decompression seems ideal for threading on multi-core processors.)

We use the Python gzip module for file compression.
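For context, here is a minimal sketch of what optional pigz support could look like: since pigz produces standard gzip streams, a helper could shell out to the pigz binary when it is installed and fall back to the stdlib gzip module otherwise. The `compress_bytes` helper is hypothetical, not Gramps code.

```python
import gzip
import shutil
import subprocess


def compress_bytes(data: bytes, path: str) -> None:
    """Write gzip-compatible output, using pigz if it is installed.

    Hypothetical helper, not Gramps code: pigz emits standard gzip
    streams, so the Python gzip module can still read the result.
    """
    if shutil.which("pigz"):
        # pigz splits the work across cores; -c writes to stdout.
        with open(path, "wb") as f:
            subprocess.run(["pigz", "-c"], input=data, stdout=f, check=True)
    else:
        # Fallback: the single-threaded stdlib module used today.
        with gzip.open(path, "wb") as f:
            f.write(data)
```

Either branch yields a file that `gzip.open(path, "rb")` can read back, so the rest of the application would not need to know which compressor ran.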


Yes, I discovered that a few months ago while intending to manually harmonize some strings in the XML 'description' text data. I was surprised that the .gramps file was gzip-compressed, with an uncompressed document of the same filename and extension inside.

I'm just wondering how much of each backup and archive operation is gzip processing as opposed to I/O. Since pigz is a supported gzip branch, I was indulging a moment of curiosity.
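A rough way to separate the two costs is to time a raw write against a gzip write of the same payload. The payload below is synthetic; a real test would use an actual Gramps XML export, and a fair comparison would also flush OS caches between runs.

```python
import gzip
import os
import tempfile
import time

# Synthetic, repetitive XML-like payload (~10 MB), standing in for a
# real Gramps XML export.
payload = b"<event id='E0001'>repetitive XML-like text</event>\n" * 200_000

tmpdir = tempfile.mkdtemp()

# Time a plain write (I/O only).
t0 = time.perf_counter()
with open(os.path.join(tmpdir, "raw.xml"), "wb") as f:
    f.write(payload)
t_raw = time.perf_counter() - t0

# Time a gzip write (I/O plus compression CPU).
t0 = time.perf_counter()
with gzip.open(os.path.join(tmpdir, "compressed.gramps"), "wb") as f:
    f.write(payload)
t_gzip = time.perf_counter() - t0

print(f"raw write:  {t_raw:.3f}s")
print(f"gzip write: {t_gzip:.3f}s")
```

If the gzip write dwarfs the raw write, the bottleneck is CPU-bound compression and a parallel compressor could help; if the two are close, the time is mostly I/O and pigz would buy little.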

And, if the temporary storage is compressed, a multi-threaded option might offer performance perks.

As a test, try decompressing the backup outside of Gramps and then importing it.


I’ll run some experiments.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.