(Gramps 5.1.3 on macOS 10.13)
How do I bulk move data from one field to another?
I recently discovered that I had been entering surnames into the Surname Prefix field. Looking for a way to avoid manually moving 1500+ pieces of data.
Thanks,
TG
This looks very similar to the experiment in data cleaning I'm doing with the Isotammi SuperTool.
(See yesterday's posting in the Development thread.)
While I'm filling in blank Origin types for surnames by comparing a Surname to the father's surname, you need to fill in blank Surnames where there is a Prefix. You have the extra item of then clearing the Prefix… but the task is similar. Oh… and you're working in the Person view while I'm checking Families.
Maybe… but it looks like it's beyond my capabilities.
Plus, as you wrote, I'd still have to touch each Person record to clear the Prefix.
Actually, it's a bit beyond me at the moment too. But I hope that won't be true in about a week of intensive experimentation.
FYI, clearing the duplicated data out of the Prefix should just be one or two more lines in the script.
It will be a good experiment in adaptation. Might take me a bit to figure it out though.
If you have an XML-capable editor, consider using Gramps to export your entire tree without media, making sure to clear the check mark for compression. Then use your XML editor to make your changes to the .gramps file. Finally, in Gramps, create a new empty tree and import your edited .gramps file.
(I'm not using macOS, so I can't give any advice about an XML-capable editor.)
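For what it's worth, the editing step doesn't strictly need an interactive XML editor either. Here is an untested Python sketch of the idea; it assumes the Gramps XML schema stores each surname as a <surname> element with an optional prefix attribute, so verify that against your own export before trusting it:

import xml.etree.ElementTree as ET

INFILE = "tree.gramps"          # uncompressed export
OUTFILE = "tree-fixed.gramps"   # import this into a new empty tree

tree = ET.parse(INFILE)
root = tree.getroot()
# The namespace URI changes between Gramps versions, so take it from the file:
# root.tag looks like "{http://gramps-project.org/xml/X.Y.Z/}database"
ns = root.tag.split("}")[0].strip("{")
ET.register_namespace("", ns)   # keep it as the default namespace on output

fixed = 0
for surname in root.iter("{%s}surname" % ns):
    prefix = surname.get("prefix", "")
    if prefix and (surname.text is None or not surname.text.strip()):
        surname.text = prefix            # move the value into the surname itself
        del surname.attrib["prefix"]     # and clear the prefix
        fixed += 1

tree.write(OUTFILE, encoding="utf-8", xml_declaration=True)
print(fixed, "surnames fixed")

Note that ElementTree will not preserve the original DOCTYPE or attribute order, so keep the unedited export as a backup in case the rewritten file does not import cleanly.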
Don't trust "Export"! It filters data… It dropped a LOT of my disconnected records (like Sources painfully transcribed from a bibliography and ToDo notes. The drops that REALLY bugged me were Places for my home county: I had entered a complete list of cemeteries and historic place names but just hadn't recorded Events in some of them.) Floaters that haven't linked to the Tree yet are at risk of being ignored during export.
Use the backup which does NOT filter.
When you do the Export, just make sure to clear the check-mark for the Privacy filter, and set the other filters to include all selected people/notes/records.
When I do this I get an identical XML file with both Backup and Export.
The advantage of the Export is that you can turn off compression, so the user doesn't have to figure out how to decompress the file before using an XML editor.
It seems like the Gramps CSV import could be used to update some fields, and perhaps could be enhanced to update others. The wiki doc says: "If you use the 'grampsid' as a way to assign specific ids, be very careful when importing to a current database. Any data you enter will overwrite the data assigned to that grampsid." That seems to be exactly what is needed in this case. In other words, first export the names to a CSV file, make the changes in a spreadsheet, and then import them. I have not tried this myself!
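If you try that route, the spreadsheet step could also be done with a short script. The sketch below is untested and makes assumptions: that the export contains only the people table and that its header includes "Surname" and "Prefix" columns, so check the header row of your own export first:

import csv

with open("people.csv", newline="", encoding="utf-8") as src, \
     open("people-fixed.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        surname = (row.get("Surname") or "").strip()
        prefix = (row.get("Prefix") or "").strip()
        if not surname and prefix:
            row["Surname"] = prefix   # move the misplaced value
            row["Prefix"] = ""        # and clear the old field
        writer.writerow(row)

Given the warning further down the thread that the CSV import overwrites objects rather than updating them, test this on a copy of the tree first.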
Hi. A People-category SuperTool script to fix blank surnames (by moving the value from the Prefix field to the Surname field) would be something like this. I hope it helps:
[title]
set-surname-from-prefix
[category]
People
[initial_statements]
# SuperTool script for the People category
#
# Goes through the selected persons and sets their surname from the "prefix" field (if the surname is empty)
# and clears the prefix
#
counter = [0]
[statements]
for nameobj in nameobjs:
    for surnameobj in nameobj.get_surname_list():
        if surnameobj.get_surname() == "" and surnameobj.get_prefix() != "":
            surnameobj.set_surname(surnameobj.get_prefix())
            surnameobj.set_prefix("")
            print(gramps_id, name, "- surname set to '" + surnameobj.get_surname() + "'")
            counter[0] += 1
[filter]
[expressions]
"names updated:", counter[0]
[scope]
selected
[unwind_lists]
False
[commit_changes]
True
[summary_only]
True
The Gramps CSV import does not update objects; it overwrites them.
So if you have a Place that used to have multiple "enclosed by" references etc., they will all be gone after a CSV import if the new object has the same ID.
Unless someone has changed that in the last six months or so.