API parameter for annotations with negative score

Platform(s): API
URLs: https://api.inaturalist.org/v1/observations, https://api.inaturalist.org/v2/observations

Description of need:
I have made 300,000+ annotations, mostly life stages of shield bugs. I do this very carefully, but I presumably still make mistakes. Only the observer can remove an incorrect annotation; others can merely vote against it, and I receive no notification of either action. I would therefore like a way to find annotations I made that people disagree with. Given the large number of annotations, I do not think the usual method of downloading all observations through the API and filtering them locally would be practical, but correct me if I am wrong.

Feature request details:
I propose adding a parameter annotation_max_score, which would ideally work in conjunction with annotation_user_id. For example (in observation_query_builder.js):

  if ( req.inat.annotation_users || params.annotation_max_score || params.annotation_max_score === 0 ) {
    const initialFilters = [];
    if ( req.inat.annotation_users ) {
      const annotationUserIDs = _.map( req.inat.annotation_users, "id" );
      initialFilters.push(
        esClient.termFilter( "annotations.user_id", annotationUserIDs )
      );
    }
    if ( params.annotation_max_score || params.annotation_max_score === 0 ) {
      // an annotation's net vote score must be at or below the given maximum
      initialFilters.push(
        { range: { "annotations.vote_score_short": { lte: params.annotation_max_score } } }
      );
    }
    
    const nestedQuery = {
      nested: {
        path: "annotations",
        query: {
          bool: {
            filter: initialFilters
          }
        }
      }
    };
    searchFilters.push( nestedQuery );
  }
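To illustrate what the snippet above would produce, here is a self-contained sketch. The esClient.termFilter helper is inlined as a plain terms filter, and annotation_max_score is the proposed, not-yet-existing parameter; field names mirror the proposal, not a confirmed schema:

```javascript
// Sketch: build the nested Elasticsearch query the proposal describes.
// buildAnnotationFilter and its inputs are illustrative only.
function buildAnnotationFilter( params ) {
  const initialFilters = [];
  if ( params.annotation_user_id ) {
    // stand-in for esClient.termFilter( "annotations.user_id", ids )
    initialFilters.push( { terms: { "annotations.user_id": [params.annotation_user_id] } } );
  }
  if ( params.annotation_max_score || params.annotation_max_score === 0 ) {
    // keep annotations whose net vote score is at or below the threshold
    initialFilters.push(
      { range: { "annotations.vote_score_short": { lte: params.annotation_max_score } } }
    );
  }
  return {
    nested: {
      path: "annotations",
      query: { bool: { filter: initialFilters } }
    }
  };
}

// e.g. for ?annotation_user_id=1&annotation_max_score=0
const query = buildAnnotationFilter( { annotation_user_id: 1, annotation_max_score: 0 } );
console.log( JSON.stringify( query, null, 2 ) );
```

Because both filters sit inside one nested query, they must match on the same annotation, which is exactly the "in conjunction" behaviour described above.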

Even better would be the ability to also select which annotation term to filter on, e.g. by adding a new annotation_term_id parameter (rather than reusing term_id, to keep backwards compatibility), but that is perhaps too much to ask.

If I understand correctly, if a parameter for this is added to the V2 API I can handcraft a URL to get the query to work on the Explore page, and the same for the V1 API and the Identify page.
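For concreteness, such a handcrafted URL might look like the sketch below. annotation_max_score is the proposed parameter and does not exist yet, and the user id is a placeholder; only the general URL shape is meant to be accurate:

```javascript
// Build a hypothetical query URL. annotation_max_score is proposed,
// not real; annotation_user_id is an existing parameter; "12345" is a
// placeholder user id.
const params = new URLSearchParams( {
  annotation_user_id: "12345",
  annotation_max_score: "-1", // proposed: annotations with a net-negative vote score
  per_page: "200"
} );
const url = `https://api.inaturalist.org/v2/observations?${params.toString()}`;
console.log( url );
```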

Remember to vote for your own feature request.


you could do it this way. it would just take a lot of requests to get 300,000 observations.

i think a better request would be to add an annotation_disagreement sort of field and related filter parameter that would allow you to find observations that had any disagreement on annotations. using that to get a smaller set of observations, you could then more easily do your own sort of client side filtering to get exactly what you wanted.
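That client-side step could be a small script along these lines. This is a sketch with mock data standing in for an API page; the field names (annotations, user_id, vote_score) assume the shape of the v1 observations response and should be checked against actual API output:

```javascript
// Sketch of client-side filtering: from a page of observations, keep
// those with at least one annotation by the given user whose net vote
// score is negative, i.e. a disagreed-with annotation.
function disagreedAnnotations( observations, userId ) {
  return observations.filter( obs =>
    ( obs.annotations || [] ).some( a =>
      a.user_id === userId && a.vote_score < 0
    )
  );
}

// mock data standing in for a page from /v1/observations
const page = [
  { id: 1, annotations: [{ user_id: 42, vote_score: -1 }] },
  { id: 2, annotations: [{ user_id: 42, vote_score: 2 }] },
  { id: 3, annotations: [] }
];
console.log( disagreedAnnotations( page, 42 ).map( o => o.id ) ); // [ 1 ]
```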


See related forum discussions as well:

https://forum.inaturalist.org/t/searching-for-disagreed-annotations/21763

https://forum.inaturalist.org/t/how-to-find-my-incorrect-annotations/38048


I would like to check this regularly. If order_by=updated_at works for annotation votes, I could try using that to avoid downloading all 300,000 observations every time (perhaps a developer can weigh in), but I would prefer not to have to write a script for this, especially since the Elasticsearch backend already seems able to handle this kind of query.

To clarify, the API parameter that I propose would also work on its own, without annotation_user_id. Additionally, I chose annotation_max_score since it is more flexible than a yes/no switch and since annotation_min_score is already used as a parameter internally.

However, in spite of generally preferring more flexible parameters, I do not see much need to filter for observations where one annotation is disagreed with while a different annotation was made by a particular user. Hence, I would suggest that the annotation_max_score/disagreement and annotation_user_id parameters apply in conjunction to the same annotation. In my opinion, this would greatly lower the barrier for people to check their annotations, without external tools or custom scripts.

i don’t think it works. when i checked the other day, the observation was reindexed, but the update date for the observation did not change.

i personally don’t think this is good design, but that’s me.

unfortunately, i think this is the consequence of the voting design for annotations. i don’t think i would have implemented annotations this way, but it is what it is at this point. there are many variations of disagreements that people might be interested in. so you just need to allow people to define for themselves what they’re looking for.