[Batch Import] csv2sql, csv2json, or csv2xml

Hello.
I want to try to import a dataset related to architecture and buildings, which shares some record types with a genealogical database (events, dates, places, sources, repositories, citations, notes, etc.).

Here are some samples of the dataset:

As the target database is SQL-based, I thought it would be possible to generate a simple "bridge", but I do not have access to this SQL DB.
So I would need a translator from CSV to XML, JSON, or SQL. The source project is based on some core Wikipedia (MediaWiki) modules, but I have not checked whether a converter already exists that could be adapted.
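As a starting point, here is a minimal sketch of the CSV-to-JSON step using only the Python standard library (the sample headers and values are invented for illustration; they are not from the actual dataset):

```python
import csv
import json
import io

def csv_to_json(csv_text, delimiter=","):
    """Convert CSV text to a list of dicts, one per row, keyed by the header."""
    reader = csv.DictReader(io.StringIO(csv_text), delimiter=delimiter)
    return [dict(row) for row in reader]

# Hypothetical sample row, just to show the shape of the output.
sample = "Titre,Ville,Pays\nMaison Kammerzell,Strasbourg,France\n"
rows = csv_to_json(sample)
print(json.dumps(rows, ensure_ascii=False, indent=2))
```

The same `csv.DictReader` loop could just as easily emit XML or SQL `INSERT` statements instead of JSON.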

Looking at the current snapshots, the CSV fields look like:
Titre: an address or a place
Adresse complète: complete (full) address
Adress: address
Numéro de rue: street number
Ville: city
Pays: country
Coordonnées: geographical coordinates
Image principale: main image
Événement: events (with date and type)
Personne: individual or company
Inscription: a specific mark, like a classed (classified?) monument
Langue: language
Source: source or repository
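A first conversion step could be a simple header-remapping table; note that the English target keys below are my own invention for illustration, not existing Gramps field names:

```python
# Hypothetical mapping from the French CSV headers to internal keys.
FIELD_MAP = {
    "Titre": "title",
    "Adresse complète": "full_address",
    "Adress": "address",
    "Numéro de rue": "street_number",
    "Ville": "city",
    "Pays": "country",
    "Coordonnées": "coordinates",
    "Image principale": "main_image",
    "Événement": "event",
    "Personne": "person",
    "Inscription": "inscription",
    "Langue": "language",
    "Source": "source",
}

def remap_row(row):
    """Rename a CSV row's keys; unknown columns are kept unchanged."""
    return {FIELD_MAP.get(key, key): value for key, value in row.items()}

row = remap_row({"Titre": "Maison Kammerzell", "Ville": "Strasbourg"})
print(row)
```

A mapping like this would then have to be matched against whatever schema the chosen import path (Gramps CSV, Gramps XML, or SQL) actually expects.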
The Description field will hold the content, with HTML markup or wiki syntax, including sources via a ref tag with or without attributes.
e.g.,
<ref name="seyboth">{{source|Seyboth Das Alte Strassburg (Livre)}} - Seyboth\, Adolphe\, ''Das alte Strassburg\, vom 13. Jahrhundert bis zum Jahre 1870\; geschichtliche Topographie nach den Urkunden und Chroniken\, bearb. von Adolph Seyboth''\, Strasbourg\, J.H.E. Heitz (Heitz & Mündel)\,1890\, p.11</ref>

<gallery>
Fichier:....jpg
Fichier:...pdf
</gallery>

or

[[Fichier:...]]
[[Media:...]]

The image and media items should go into the new Description field.
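Pulling the sources and media file names out of such a description could be roughed out with regular expressions, as sketched below; a real wiki-markup parser (e.g. mwparserfromhell) would be more robust, and the sample text here is invented:

```python
import re

# <ref ...>...</ref> citations, [[Fichier:...]]/[[Media:...]] links,
# and <gallery>...</gallery> blocks of file names.
REF_RE = re.compile(r"<ref[^>]*>(.*?)</ref>", re.DOTALL)
LINK_RE = re.compile(r"\[\[(?:Fichier|Media):([^\]|]+)")
GALLERY_RE = re.compile(r"<gallery>(.*?)</gallery>", re.DOTALL)

def extract_parts(description):
    """Return (sources, media file names) found in a wiki-style description."""
    refs = REF_RE.findall(description)
    media = LINK_RE.findall(description)
    for block in GALLERY_RE.findall(description):
        for line in block.splitlines():
            line = line.strip()
            if line.startswith("Fichier:"):
                media.append(line[len("Fichier:"):])
    return refs, media

text = '<ref name="seyboth">{{source|...}}</ref> [[Fichier:maison.jpg]]'
refs, media = extract_parts(text)
```

The extracted sources could become Gramps Source/Citation records, and the file names could become Media objects linked to the place.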

For now, I just wonder what would be the simplest (or most direct) way to import such a dataset into a Gramps DB?

Best regards,
Jérôme