Plans to Update Forms Addon Over the 2024 Summer

I understand what you’re saying, but conceptually speaking, there is no big difference between GEDCOM, JSON, and XML, in the sense that they’re all formats that you can use to write information from linked objects to a text file. GEDCOM X can use JSON and XML, and GEDCOM-L is a well-known extension of GEDCOM 5.5.1 that could be further extended to store the things that you mentioned, like farms (properties), their inhabitants, and events that describe the sale of such a property. What I mean is that the low-level GEDCOM syntax does not forbid you to link data in other ways than the ones used for family trees. And for some, these files would be much easier to read than JSON or XML. Storing text styles in GEDCOM is a bit difficult, because there is no standard for that, but when you stick to plain text, including UTF-8, the formats are fully interchangeable.
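To make that concrete, here is a minimal sketch of how such a property sale could be written in plain GEDCOM syntax and read back with a few lines of Python. The _FARM, _SALE, _PROP, _SELR and _BUYR tags are invented for illustration and are not taken from GEDCOM-L or any other standard:

```python
# GEDCOM's low-level syntax is just "level [xref] tag [value]", so nothing
# stops you from linking non-family objects, e.g. a farm and a sale event.
# The custom tags below are hypothetical examples.
sample = """\
0 @F1@ _FARM
1 NAME Demmerlehner
1 PLAC Mitterhaarbach
0 @E1@ _SALE
1 DATE 1765
1 _PROP @F1@
1 _SELR @I1@
1 _BUYR @I2@
"""

def parse_gedcom(text):
    """Turn GEDCOM lines into (level, xref, tag, value) tuples."""
    for line in text.splitlines():
        parts = line.split(" ", 2)
        level = int(parts[0])
        if len(parts) > 1 and parts[1].startswith("@"):
            xref = parts[1]
            rest = parts[2] if len(parts) > 2 else ""
            tag, _, value = rest.partition(" ")
        else:
            xref = None
            tag = parts[1] if len(parts) > 1 else ""
            value = parts[2] if len(parts) > 2 else ""
        yield level, xref, tag, value

for record in parse_gedcom(sample):
    print(record)
```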

In other words, if you want, it’s perfectly possible to use a GEDCOM extension for an OrtsFamilienBuch like yours.

I fully agree. My problem with GEDCOM is not that it is per se BS (even if it appears a bit clumsy and stone age compared with XML). My problem is that

(1) the implementations of GEDCOM import and export routines you meet “in the wild” are very often problematic, leading to the situation that an export from one software package cannot be read by the import routine of another one, though both claim to be GEDCOM. And GEDCOM X or extensions or whatever else do not really improve that situation. After some years dealing with those problems, I simply got fed up and vowed never to use GEDCOM again. Maybe that was a bit impulsive, but

(2) I use many event types and roles that are simply not supported by GEDCOM, so if I used GEDCOM, I would automatically lose a significant part of my work.

And no, I don’t deal with an Ortsfamilienbuch. The usual Ortsfamilienbuch is based on the primary events of a life (birth, marriage, death) that are mostly available from church registers. In the area I’m specialising in, using those primary events is very often not possible before 1700/50 for several reasons. A lot of people tried nevertheless and created nonsense. Therefore, even if you find a database or publication claiming to contain information equivalent to an Ortsfamilienbuch, it will be mostly nonsense. What I’m doing instead is combining the information from church registers with extensive information from seigneurial registers going back to before 1600. This means that I can recreate the life and the social network of those farmers in the 17th century in much more detail. I’d call it “historical sociology”, “historical demography”, or “historical ethnography”, but definitely not Ortsfamilienbuch.

If you can read a bit of German, just look into the “Historischer Atlas von Bayern” for the old district of Griesbach. On page 61 you’ll find the farm name “Demmerlehner” (which is an old version of Demlehner) located in Mitterhaarbach. The farm names you find on this page and all other pages reflect the situation around 1760, and I’m trying to geolocate all those farms (which is not always trivial), to build a history of the farm owners (which is far from trivial), their families, and how they were interconnected with each other on the social level. Of course you can compile an Ortsfamilienbuch from this information simply by throwing away a lot of information.

2 Likes

I get it, and parsing XML is often much easier. I worked with C# and .NET in my last job, and found that it’s much easier to process Gramps XML with those tools; they would probably also work well with whatever else you store in XML.
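For anyone who wants to try the same from Python, a rough sketch along these lines should work; it assumes the usual gzip-compressed .gramps export, and the file name here is just an example:

```python
# Rough sketch: a .gramps file is (usually) gzip-compressed XML, so the
# standard library is enough to pull data out of it.  Element names are
# matched by local name to avoid hard-coding the namespace version.
import gzip
import xml.etree.ElementTree as ET

def local(tag):
    """Strip the '{namespace}' prefix from an element tag."""
    return tag.rsplit("}", 1)[-1]

with gzip.open("example.gramps", "rb") as f:   # path is an example
    root = ET.parse(f).getroot()

for elem in root.iter():
    if local(elem.tag) == "person":
        # print the Gramps ID and internal handle of every person record
        print(elem.get("id"), elem.get("handle"))
```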

This sounds to me a lot like the program Evidentia; it may have the features you are looking for: Evidentia Screenshots – Evidentia

The drop-down list is already incorporated into Gramps. Every time you add a new role to an event, it gets added to the list of roles you have previously used, which you can then simply select to set again. If the list of roles is implemented here, it would simply allow selection via this already included feature in addition to the creation of new roles. The purpose would purely be to increase speed of entry.

Ha, I don’t really use Facebook anymore either. This was the software I had created, although it isn’t really fleshed out for the type of thing you are trying to do: GitHub - rentheroot/Census2Ged: Converts census transcriptions made in Genscriber to gedcom files

That being said, I get what you are wanting with the connections between people. GEDCOM actually does have an association (ASSO) tag which can be used to create associations between people. Personally, though, I make use of roles to connect people with one another. This program doesn’t work anymore because of the amount of time that has passed, but back in high school I did make another tool that could be developed into something useful for that type of research. See here: FamGenealogy : Visualizing Shared Events in Your Tree
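For reference, here is a small sketch of what that association structure looks like in GEDCOM 5.5.1, generated from Python; the IDs and the “Godfather” value are just examples:

```python
# Sketch of GEDCOM 5.5.1's association structure: inside an individual
# record, an ASSO pointer plus a RELA line names another person and the
# relationship between them.
def association(indi_xref, other_xref, relation):
    return "\n".join([
        f"0 {indi_xref} INDI",
        f"1 ASSO {other_xref}",
        f"2 RELA {relation}",
    ])

print(association("@I1@", "@I2@", "Godfather"))
```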

The size of nodes in the network graphs is determined by how many connections are made between people using roles.

If graphing shared events is something people are interested in, maybe after I have updated the Forms gramplet I will try and make a new one that incorporates this functionality.

1 Like

There is some confusion here with the word “Role”.
So if I create a Census Event using the Form Addon, each Person added to the Event via the Form gets the Role “Primary”. In the Form I have a Field/Attribute named “Role” which contains the likes of “Head”, “Wife”, “Son”, “Daughter”.
These are two completely different, and to me acceptable, uses of the word “Role” for different purposes. The reason being that a Census has nothing to do with Families; it is about the group of individuals residing in the one property on the one night and their relationship to one another: “Head”, “Servant”, “Boarder”, “Son”. For each of them the Census is a unique event, hence the Role “Primary”.
So it would need the creation of a CensusRole with a completely different drop-down list for the Form Addon.
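A rough way to picture the two uses side by side, as plain data (the field names are illustrative, not the Form Addon’s internal names):

```python
# Illustration only: the same census entry carries two different "roles".
# The event-reference role is "Primary" for everyone named on the schedule,
# while the Form attribute called "Role" holds the household relationship
# exactly as written on the census.
census_entry = {
    "event": "1881 Census of England and Wales",   # example event
    "person": "John Smith",                        # example person
    "event_ref_role": "Primary",                   # Gramps event reference role
    "form_attributes": {
        "Role": "Head",                            # relationship to head of household
        "Age": "42",
        "Occupation": "Farmer",
    },
}
```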
phil

1 Like

To state the obvious: the many problems with GEDCOM import/export implementations can be understood as a direct consequence of the clumsiness of GEDCOM. And maybe the definition of GEDCOM is not precise enough, so different people understand it differently. Compared with that, parsing the Gramps XML file is something even I as a hobby software engineer am able to do.

If I understand the screenshots correctly, Evidentia is not a genealogy package but a tool to define hypotheses and link documents as evidence to those hypotheses. And it appears to be limited to GEDCOM exports, which is a KO criterion for me, as I described in this thread. I should probably mention another KO criterion: I never use software for something important that is the product of a single individual. It’s simply too risky …

So the size is proportional to the number of events an individual is connected to in any role?

I understand what you’re saying, but in reality, the GEDCOM issue is a red herring, and so is the database for that matter. It is very easy to write software that can convert GEDCOM to JSON or XML, without data loss.
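As a rough illustration (not a complete converter, and with made-up sample data), a lossless conversion only needs to preserve the level structure and the raw line content:

```python
# Minimal sketch of a lossless GEDCOM-to-JSON conversion: every GEDCOM line
# becomes a node with its raw content and nested child lines, so nothing is
# thrown away even for tags the converter does not understand.
import json

def gedcom_to_tree(text):
    root = {"children": []}
    stack = [(-1, root)]
    for line in text.splitlines():
        if not line.strip():
            continue
        level_str, _, rest = line.strip().partition(" ")
        level = int(level_str)
        node = {"line": rest, "children": []}
        while stack[-1][0] >= level:        # climb back up to the parent level
            stack.pop()
        stack[-1][1]["children"].append(node)
        stack.append((level, node))
    return root

sample = """0 @I1@ INDI
1 NAME John /Demlehner/
1 BIRT
2 DATE 12 MAR 1701
2 PLAC Mitterhaarbach"""

print(json.dumps(gedcom_to_tree(sample), indent=2))
```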

And in a way, this so-called clumsiness is a design feature. GEDCOM was created long before formats like JSON and XML were invented, and that made it into an industry standard, in the same way that Windows became one, even though one might say that Unix was much better at the time.

The standard’s vagueness is a feature too, because it accommodates a variety of programs, with different data models, and that’s what the industry likes.

I know this, because I’ve seen that even though GEDCOM X has been available for years, no one has accepted it as the new universal file format. It’s too much work, and has no real advantage for anyone in the industry, partly because it might make migration to another program too easy.

Programs like Ancestral Quest, Legacy, and RootsMagic use GEDCOM X to exchange information with FamilySearch, for the simple reason that there is no other way, and FamilySearch is a monopolist, in practical terms. Ancestry has its own protocol, spoken by FTM and RootsMagic, and MyHeritage has one too, spoken by their desktop program, Family Tree Builder.

And although parsing XML is easier, it is way more important to create a better model for the way that we relate data, like FamilySearch does online. That’s what enables me to reject evidence, if I want to.

And there we have the reason not to use FamilySearch: if the evidence was so good (even though false) that unknown persons put it on “your” tree, then there is something fundamentally wrong with the system.

No doubt they have been a huge assistance in the field of accessibility of records, but that is where it should start and stop.

phil

1 Like

There is nothing wrong with the system, because it was designed to work that way, as a shared tree that I, as a user, am very happy with.

It’s also irrelevant, because I’m only using it to illustrate a mechanism that can be used to associate persons in your tree with evidence, like here, for the marriage of my paternal grandparents:

The essence here is that the persons on the left and right are independent objects in the database, which probably live in different tables. They all have unique IDs and URLs, and this model gives me a chance to attach persons as a source for persons in the tree, and to detach them too, when I want.

I can do exactly the same on Ancestry, where my tree is private, like here:

And IMO, this is way better than using the Forms Gramplet, because I can see the source and tree persons side by side, and I can also perform easy searches in both the evidence and conclusion domain.

This is very much like the concept of matching up information recorded on index cards that I described in my earlier post. We don’t have the concept of an evidence person (sometimes called a persona) in Gramps yet. This is why the Forms store the information in event reference attributes instead. The user links to a person when they enter the data.

We could consider creating a Persona table to store this index card information.
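As a purely hypothetical sketch of what such an “index card” record might hold (none of these names exist in Gramps today):

```python
# Hypothetical sketch only - not an existing Gramps table or API.  One way to
# picture an "index card" persona record: the name, role and attributes exactly
# as written in the source, a link to the citation, and an optional link to the
# conclusion person in the tree (empty until the user matches the two).
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Persona:
    citation_handle: str                    # which source extract this came from
    name_as_written: str                    # e.g. "Jno. Smith", uncorrected
    role_as_written: str = ""               # e.g. "Head", "Boarder"
    attributes: dict = field(default_factory=dict)   # Age, Occupation, ...
    place_as_written: str = ""              # e.g. "Manchester", untyped text
    matched_person_handle: Optional[str] = None      # link into the real tree

card = Persona(
    citation_handle="C0123",
    name_as_written="Jno. Smith",
    role_as_written="Head",
    attributes={"Age": "42", "Occupation": "Weaver"},
    place_as_written="Manchester",
)
```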

3 Likes

Enno, I can see where you are coming from, but this will only work with an online tree; my main tree is offline, and I certainly do not work on Ancestry to produce a Tree that I can export as GEDCOM to bring into GRAMPS. Lesson learnt: too many people select Manchester, Jamaica rather than Manchester, Lancashire. Also too many transcription errors on all websites.
So I download all the raw data from Ancestry, FamilySearch, FMP and then produce my own transcription (via Forms). My Ancestry Tree only has DOB, DOD and location by County/Country for each event for any individual, and is public to encourage DNA Matches and Hints, nothing else. I originally had hopes for the message system, but too high a proportion never respond.
So how would you get the base census data into GRAMPS so you could produce a similar look and feel?
phil

Hello Phil,

The answer is in Nick’s reply right above yours, which may imply that any person data stored in event reference attributes now, and any new data entered in an improved Forms Gramplet, would be stored in that new person(a) table, which can hold data as found in the source record, without any further interpretation. This means that it may contain embedded events, which store place names as they are found in the source, so that you just see Manchester there as text, without implying which Manchester that is, just like it works in Gramps now.

The source itself can be linked to a specific location, because you often know where the source was created.

With such a table, you can also register whether an extracted person is already linked to a person in the tree or not.

In this situation, persons still have roles, like before.

Regards,

Enno

I feel I am still not explaining properly how I work. I will need to think of how best to present this.
phil

I am hoping these 3 images show how my screen is set up when I am entering Census Data




Image 4 is the bit clipped out of Image 2

You will notice in Image 3 the downloads, in this case from Ancestry, for the Raw Census Data.
Hope this is helpful
phil

1 Like

I am so glad that some other people have come to this forum who advocate for this.

I have tried to do that since version 5.0 was released as beta.

There is so little that needs to be changed in Gramps to make it possible to support both document-based and event-based research…

And YES, I know that the reprogramming does take a lot more work than it seems when thinking about the “small changes”.

These two features would put Gramps in a league of its own:

  • Event-based research with main/sub events and events for places
  • Document based research

And with a feature to at least read BibLaTeX and “CSL-JSON” or “citeproc-json”… Gramps would become a tool that could be used for a lot more than genealogy…

In addition, add an export to one or two open data formats like JSON-LD and a network graph format…

Gramps would go from being great software to become an amazing tool for historical social research…


I actually use Obsidian and Foam for VS Code for my research now, and use Aeon Timeline for timelines; sadly, that means I have to register data 3 or 4 times, so less and less goes into Gramps.

You should take a look at Obsidian for your research; it is actually kind of easy to link people, places, items, documents, etc. with simple wiki-links, and it has a lot of addons that can be helpful.

I do mostly research on the Norwegian Mercantile Fleet between 1920 and 1930/45, but I also do some Norwegian farm research, where I try to find everyone that has owned or lived on those farms.

What I do is that I make a Markdown Note for each object, e.g., one note for a person, one for a place, one for a company, one for a ship etc., and I create Notes for any Events.

In addition, Obsidian has support for Zotero, so I store all my documents and sources in Zotero; then I can either just make a link to that document in Zotero, or I can add it to Obsidian and create markdown notes for it, with information, transcripts etc.
I can easily link all these notes together with wiki-links and build up a network graph for a graphical view; in addition, I can use one of the map addons to add geo-locations.

It also supports md-tables with wiki-links inside the table, as well as YAML, so it is possible to add nearly any type of structured data in addition to metadata.

The good thing with Markdown, even the extended versions you can use in Obsidian, is that it is just plain text, so if the software gets deprecated, you can still read the information.

AND it stores all your information locally…

The only software I have found that actually has a real approach to being a document-based research tool for genealogy (and some other research) is Clooz 4.
At least among those being actively developed.

Gedcom is, as always, a limitation in this niche. Gedcom is and will always be an LDS-based lineage-linked export/import format, nothing else; it doesn’t help much to call it Gedcom X and base it on XML, it will still be a format that is focused on the work and information that LDS wants and collects.