i’ve not tried to do it myself, but the first thing i would try is to just update observation.latitude and observation.longitude.
even though the old API is mostly deprecated, it's still similar to the v1 API, and sometimes when the documentation for the v1 API is lacking (especially for non-GET requests), the documentation for the old API can be informative.
UPDATE: i tried what i described above, and it works. here’s the curl command i used from my Windows command line (with JWT redacted):
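the exact command isn't reproduced here, but a sketch of that kind of request looks like this (the endpoint and payload shape follow the v1 API docs; the header format and the `ignore_photos` flag are assumptions, so double-check them before sending anything):

```shell
JWT="REDACTED"        # your JWT, from https://www.inaturalist.org/users/api_token
OBS_ID=12345          # hypothetical observation id
# ignore_photos is assumed here to keep existing photos from being touched on update
BODY='{"observation":{"latitude":12.345,"longitude":-6.789},"ignore_photos":1}'

# dry run: prints the command; drop the leading "echo" to actually send it
echo curl -s -X PUT "https://api.inaturalist.org/v1/observations/${OBS_ID}" \
  -H "Authorization: ${JWT}" \
  -H "Content-Type: application/json" \
  -d "$BODY"
```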
I have to update (round slightly) the coordinates for 8,000+ obs.
It’s impossible to do that by exporting to CSV / editing the CSV / reimporting the CSV, so I have to resort to API calls and a bit of bash scripting.
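A minimal sketch of such a loop, assuming a hypothetical input format of `observation_id,latitude,longitude` per line and rounding to 3 decimals (roughly 100 m); the actual PUT call is left as a dry-run print:

```shell
# round a coordinate to 3 decimal places (~100 m precision)
round3() { printf '%.3f' "$1"; }

# the id,lat,lon lines below are made-up sample data standing in for the real CSV
while IFS=, read -r id lat lon; do
  printf 'would PUT obs %s -> %s,%s\n' "$id" "$(round3 "$lat")" "$(round3 "$lon")"
done <<'EOF'
1001,48.858370,2.294481
1002,40.689247,-74.044502
EOF
```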
earlier, you mentioned trying to update an obscured observation, and your example script seems to set geoprivacy to open.
if you’re trying to replace the system’s standard obscuring function with “manual” obscuring, be aware that removing the system’s obscuring makes your image metadata visible to other users again. so if your images have GPS coordinates attached, those coordinates will once again be visible to other iNat users.
Well, I’ll also have to find a programmatic way to expunge that info or, failing that, to download & reupload the photos without their GPS metadata :) edit: it’s actually as simple as downloading all the original images (their files having been cleaned of metadata upon initial upload), then reuploading them!
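For the “expunge the metadata programmatically” route, here is a sketch assuming exiftool is installed locally (the filename is hypothetical; it only prints the command when exiftool or the file is missing):

```shell
PHOTO="IMG_0001.jpg"   # hypothetical filename
# exiftool's -gps:all= deletes all GPS tags; -overwrite_original skips backup copies
CMD="exiftool -gps:all= -overwrite_original"
if command -v exiftool >/dev/null 2>&1 && [ -f "$PHOTO" ]; then
  $CMD "$PHOTO"
else
  echo "would run: $CMD $PHOTO"
fi
```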
Too bad the iNat software lacks the ability to round coordinates or sanitize image metadata on demand…
it looks like all your photos are licensed CC BY-NC and are hosted in the AWS Open Data bucket. but if they weren’t, you would have wanted to be careful about how much data you download, to stay within the recommended limits. every “original” sized photo is roughly 2MB, so with thousands of observations, you’re dealing with at least tens of GBs.
for future folks reading this who have not licensed their photos (or if iNat is no longer participating in the AWS Open Data program): you will want to be extra observant of the limits, because your photos will not be hosted in the AWS Open Data bucket.
I see no mention of specific (non-)limits for ‘AWS Open Data’ files on that page: do you know what those limits are?
Anyway, even with some rate limiting in place to spread the operation over 48 hours, updating 8,000 obs (while downloading/reuploading ~20,000 images of 1.5 Mpix each) should represent less than 24 GB per day. In any case, I’ll test and monitor the bash script on a few obs before processing the whole batch. Hopefully I’ll be able to lift the rate limit later, when similarly processing smaller future batches of new observations.
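For reference, that pacing works out to about one observation every 21 seconds; a quick sketch of the arithmetic (numbers taken from the estimate above):

```shell
# spread 8,000 observations evenly over a 48-hour window
OBS=8000
WINDOW_S=$((48 * 3600))        # 172800 seconds
INTERVAL=$((WINDOW_S / OBS))   # integer seconds between updates
echo "sleep ${INTERVAL}s between observations"
```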