What fields must be edited through API to alter coordinates of an observation?

Using the API at https://api.inaturalist.org/v1/ (authenticated), I am trying to edit the coordinates (latitude and longitude) of an obscured observation.

My API requests (using PUT /observations/{id}) are clearly being processed: I was able to successfully edit the ‘accuracy’ value of the obs.

However when trying to edit the coordinates, even though the request appears to succeed (status 200), the coordinates do not change on the resulting webpage for this obs.
My request body looks like this:

  {
    "ignore_photos": true,
    "observation": {
      "private_location": "42.251,8.947",
      "private_geojson": {
          "type": "Point",
          "coordinates": [8.947, 42.251]
      }
    }
  }
I also tried editing the two fields in separate requests… to no avail.

Am I missing something? Did I hit a bug?

i’ve not tried to do it myself, but the first thing i would try is to just update observation.latitude and observation.longitude.

even though the old API is mostly deprecated, it still is similar to the v1 API, and sometimes when the documentation for the v1 API is lacking (especially for non-GET requests), the documentation for the old API can be informative.

UPDATE: i tried what i described above, and it works. here’s the curl command i used from my Windows command line (with JWT redacted):

curl "https://api.inaturalist.org/v1/observations/170190090" -X "PUT" -H "Content-Type: application/json" -H "Accept: application/json" -H "Authorization: redacted" -d "{\"ignore_photos\":1, \"observation\":{\"latitude\":3, \"longitude\":-3}}"

just curious – why are you trying to edit observation locations via the API?


Thanks! It looks like using the separate latitude and longitude fields works as intended.

For the record:

  {
    "ignore_photos": true,
    "observation": {
      "latitude": 42.251,
      "longitude": 8.947,
      "positional_accuracy": 123,
      "geoprivacy": null
    }
  }

does the job: it updates the coordinates and accuracy in a single operation while removing the obscuration. Now on to updating thousands of obs :stuck_out_tongue_closed_eyes:

I have to update (round a bit) the coordinates of 8,000+ obs.
That is impossible to do by exporting to CSV, editing the CSV, and reimporting it, so I have to resort to API calls and a bit of bash scripting.
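
For anyone attempting the same, the loop I have in mind looks roughly like this. It is only a sketch, untested against the live API: the `obs.txt` file, the helper names, and the 2-second delay are my own assumptions; the JSON body is the one shown earlier in the thread.

```shell
#!/bin/sh
# Sketch: batch-update observation coordinates via the iNat v1 API.
# Assumptions: a JWT in $JWT, and an obs.txt with one
# "id latitude longitude accuracy" record per line.
export LC_ALL=C   # make printf use a dot as the decimal separator

API="https://api.inaturalist.org/v1/observations"

# round a coordinate to 3 decimals (~100 m): the "round a bit" step
round3() { printf '%.3f' "$1"; }

# build the PUT body from this thread (geoprivacy null removes obscuration)
build_payload() { # args: lat lon accuracy
  printf '{"ignore_photos":true,"observation":{"latitude":%s,"longitude":%s,"positional_accuracy":%s,"geoprivacy":null}}' \
    "$1" "$2" "$3"
}

update_observation() { # args: id lat lon accuracy
  curl -sS -X PUT "$API/$1" \
    -H "Content-Type: application/json" \
    -H "Accept: application/json" \
    -H "Authorization: $JWT" \
    -d "$(build_payload "$(round3 "$2")" "$(round3 "$3")" "$4")"
  sleep 2   # crude rate limiting between calls
}

# while read -r id lat lon acc; do
#   update_observation "$id" "$lat" "$lon" "$acc"
# done < obs.txt
```

The actual loop is left commented out so the script can be dry-run first; uncomment it once the payload looks right on a couple of test observations.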

hmmm… ok.

earlier, you mentioned trying to update an obscured observation, and your example script seems to set geoprivacy to open.

if you’re trying to replace the system’s standard obscuring with “manual” obscuring, be aware that removing the system’s obscuring makes your image metadata visible to other users again: if your images have GPS coordinates attached, other iNat users will be able to see them.


Well, I’ll also have to find a programmatic way to expunge that info or, failing that, to download & reupload the photos without their GPS metadata :) edit: it’s actually as simple as downloading all the original images (their files were stripped of metadata upon initial upload), then reuploading them!
Too bad the iNat software lacks the ability to round coordinates or sanitize image metadata on demand… :disappointed:
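
A sketch of how I’d construct the download URLs for the originals. NB: the bucket host and path layout below are assumptions on my part; the authoritative photo URLs come back in the photo records from GET /v1/observations/{id}, so I’ll verify against those before batch-downloading.

```shell
#!/bin/sh
# ASSUMED open-data layout: photos/{photo_id}/original.{ext}
# Check against the "url" field in your own photo records first.
OPEN_DATA="https://inaturalist-open-data.s3.amazonaws.com"

original_url() { # args: photo_id extension
  printf '%s/photos/%s/original.%s' "$OPEN_DATA" "$1" "$2"
}

# e.g.: curl -sS -o 12345.jpeg "$(original_url 12345 jpeg)"
```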


/thread can be closed

it looks like all your photos are licensed CC BY-NC and are hosted in the AWS Open Data bucket. but if they weren’t, you would want to be careful about how much data you download, to stay within the recommended limits. every “original”-sized photo is roughly 2MB, so with thousands of observations you’re dealing with tens of GBs.

for future folks reading this who have not licensed their photos (or in case iNat is no longer participating in the AWS Open Data program): you will want to be extra observant of the limits, because your photos will not be hosted in the AWS Open Data bucket.


I see no mention of specific (non-)limits for ‘AWS Open Data’ files on that page: do you know what those limits are?

Anyway, even with some rate limiting in place so as to spread the operation over 48 hours, updating 8,000 obs (while downloading/reuploading ~20,000 images of 1.5 Mpix each) should amount to less than 24GB per day. In any case, I’ll test and monitor the bash script on a few obs before processing the whole batch. Hopefully I’ll be able to lift the rate limit later, when processing smaller batches of new observations the same way.
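
The pacing itself is simple arithmetic; a minimal sketch (the helper name is mine):

```shell
#!/bin/sh
# spread N requests evenly over H hours: delay between calls, in seconds
pace_seconds() { # args: total_requests hours
  echo $(( $2 * 3600 / $1 ))
}

# 8,000 observations over 48 hours -> one request every ~21 s
# sleep "$(pace_seconds 8000 48)" between calls
```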

as i understand it, there are no download limits for images hosted in the AWS Open Data bucket. below are snippets from a thread where i posed the question, and staff responded:


This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.