Hi Tim,
Just wondering: in the case of the NBN, after the initial import from a dump file, if they then change just a few records, are you planning to download everything again and overwrite all 20 million records in GBIF's database? Or are you also interested in some sort of incremental harvesting?
Best Regards, -- Renato
Locally generated / localised DwC index files? (If you have rich data behind an LSID, then this file acts as an index that allows searching of that rich data using DwC fields.)
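To make that concrete, here is a minimal sketch of what such a locally generated index file could look like - assuming a tab-delimited layout where each row carries the record's LSID plus a handful of DwC terms (the file name, the LSID and the particular fields are only illustrative):

    import csv

    # Illustrative selection of DwC terms; a real index would expose whatever
    # fields the provider wants to make searchable.
    FIELDS = ["identifier", "scientificName", "decimalLatitude",
              "decimalLongitude", "eventDate"]

    records = [
        {"identifier": "urn:lsid:example.org:occurrence:12345",  # hypothetical LSID
         "scientificName": "Turdus merula",
         "decimalLatitude": "51.507",
         "decimalLongitude": "-0.128",
         "eventDate": "2007-05-01"},
    ]

    # Write the flat index; the rich record stays behind the LSID resolver.
    with open("dwc_index.txt", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS, delimiter="\t")
        writer.writeheader()
        writer.writerows(records)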
I would like to see the data file accompanied by a compulsory metafile that details rights, citation, contacts etc. Whether this file needs the data generation timestamp I am not so sure either, and the HTTP header approach does sound good. It means you can craft the metafile once and then just CRON the dump generation... This would be for institutions with IT resources - e.g. the UK NBN with 20M records.
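On the HTTP header side, the idea would be for the harvester to do a conditional GET, so an unchanged dump is never pulled down again. A rough sketch, assuming the dump sits at a stable URL (made up here) and the server sets Last-Modified whenever the CRONned dump is regenerated:

    import urllib.request
    import urllib.error

    DUMP_URL = "http://data.example.org/nbn/occurrence-dump.txt.gz"  # hypothetical URL
    last_seen = "Tue, 15 May 2007 10:00:00 GMT"  # Last-Modified remembered from the previous harvest

    req = urllib.request.Request(DUMP_URL, headers={"If-Modified-Since": last_seen})
    try:
        with urllib.request.urlopen(req) as resp:
            # 200 OK: the dump has changed, so stream it to disk and reload it.
            print("Dump changed; new Last-Modified:", resp.headers.get("Last-Modified"))
    except urllib.error.HTTPError as e:
        if e.code == 304:
            # 304 Not Modified: nothing to harvest this time round.
            print("Dump unchanged since last harvest")
        else:
            raise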
For Joe Bloggs with a data set, if we include this in the wrapper tools, then it is easy to rewrite the metafile seamlessly anyway, so they don't need to care.
Cheers,
Tim