I am currently working on a project that involves analyzing iNaturalist data, and as part of this, I’ve been looking at identification patterns across different observations. One of the key aspects I’m examining is the number of ID agreements associated with each observation.
For my analysis, I downloaded a dataset from the iNaturalist website. While reviewing the data, I noticed that some observations marked as Research-grade have 0 agreements and 0 disagreements. This seems incorrect because, according to the definition of Research-grade, an observation should have a community taxon supported by at least two people. After further investigation, I found that some of these cases can be explained by taxonomic changes or by the community taxon being assigned at a lower rank. However, there are still some observations that I cannot explain.
I came across a few discrepancies while working with the iNaturalist dataset:
This observation (https://www.inaturalist.org/observations/248065238) is marked as Research-grade, yet according to the dataset, it has 0 agreements and 0 disagreements. This seems contradictory because, by definition, a Research-grade observation should have at least two people agreeing on the identification.
I’m wondering how these discrepancies occur. Are some IDs simply not counted in the export, like the observer’s own ID? Is this due to how iNaturalist processes the data, or are there other factors involved?
If anyone has insights into this, I’d really appreciate your help!
Thanks in advance!
When you go to an Observation, you have the option to agree (by hitting the Agree button), the option to type the name yourself, or the option to go in, explore the suggestions, research a little, and come at it that way.
Any of these can produce an RG Observation (if the IDs are the same and at species level), but perhaps only hitting the button triggers an Agreement for your purposes?
(I will always type the name myself if I am adding a comment at the same time.)
edit to add: These show due diligence on the part of the identifiers in not hitting the button if they felt they needed to look a little more closely.
This shows two agreements… with the person who suggested the correct ID. Think of it as someone saying “I think it is ABC!” And then two more people saying “Agree!” So three total, but one made the initial suggestion, then two more agreed.
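The counting described above can be sketched in a few lines. This is purely illustrative; the taxon label “ABC” mirrors the example in the post, and nothing here is taken from iNaturalist’s actual code:

```python
# Three people submit the same ID "ABC": the first is the initial
# suggestion, and the two that follow count as agreements with it.
ids = ["ABC", "ABC", "ABC"]

initial_suggestion = ids[0]
agreements = sum(1 for taxon in ids[1:] if taxon == initial_suggestion)

print(len(ids), agreements)  # 3 IDs in total, 2 of them agreements
```

So an observation can show “2 agree” while three accounts in total have put forward the same name.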
So the system actually differentiates between “agreeing” IDs that came from the “Agree” button and ones that came from manually typing in the same ID that someone else previously suggested? I didn’t realize there was any difference in how these were recorded.
The fish has 2 people agreeing. Tick.
The nudibranch has 3 agreeing.
Not sure where or how you see that?
The dragon is a mystery I cannot explain. Three people agree at species level; the fourth ID is only at family. If all 4 agreed, it should be at family, or, as we see, 3 agree at species.
I often come across that - the label on the box does not match the contents.
Presuming that hitting ‘agree’, typing the name yourself (or hitting Compare and clicking Suggest Taxon ID from there), and clicking a machine-vision suggestion are all counted differently, as Lucy is saying, it sounds like: Bleeker’s Parrotfish is one machine-vision suggestion + one typed/selected suggestion; the Bus Stop Nudibranch is one machine-vision suggestion + one typed/selected suggestion + one agreement; the Sumatra Forest Dragon has one typed/selected suggestion + two agreements.
Where “agreement” in your terminology means “provides an ID that matches the Research-grade outcome”, but in iNat’s technical terminology means “hit the Agree button specifically”.
Looks like https://www.inaturalist.org/observations/248065238 wasn’t updated properly on the backend. That last ID (since withdrawn) from mbeukema a few hours ago forced the system to reindex the observation and it now has the proper number of agreeing and disagreeing IDs.
Whether someone used the Agree button or not doesn’t affect this count.
Ah, OK, was theorizing, which is why I used a question mark when I said
I had noted that the initial species-level ID said “improving”, so I thought perhaps their dataset excluded those and only included “supporting” IDs, and perhaps even only those made with the Agree button. Thanks for explaining it @tiwane.
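Given tiwane’s explanation, one way to sanity-check an export is to recount agreements from the observation’s identification records yourself, ignoring both the “improving”/“supporting” category and how the ID was entered. The sketch below uses fabricated records whose field names are modeled on the public iNaturalist API (`GET https://api.inaturalist.org/v1/observations/{id}`); treat the exact fields and the exclusion of the observer’s own ID as assumptions to verify against a real response:

```python
# Sketch: recount "agreeing" IDs from API-style identification records.
# All data below is made up for illustration; real records would come
# from the iNaturalist API or export.
observation = {
    "taxon_id": 47115,          # hypothetical community taxon
    "user_login": "observer1",  # the observer
    "identifications": [
        # the observer's own ID (assumed not to count as an agreement)
        {"user": "observer1", "taxon_id": 47115, "current": True, "category": "improving"},
        # an ID typed in manually
        {"user": "identifier_a", "taxon_id": 47115, "current": True, "category": "supporting"},
        # an ID made with the Agree button: recorded no differently
        {"user": "identifier_b", "taxon_id": 47115, "current": True, "category": "supporting"},
        # a withdrawn ID should not count
        {"user": "identifier_c", "taxon_id": 47115, "current": False, "category": "supporting"},
    ],
}

# Count current IDs by other users that match the observation taxon,
# regardless of category or how they were entered.
agreements = sum(
    1
    for ident in observation["identifications"]
    if ident["current"]
    and ident["taxon_id"] == observation["taxon_id"]
    and ident["user"] != observation["user_login"]
)
print(agreements)  # 2
```

If a recount like this disagrees with the export’s agreement column, a stale index (as in the observation tiwane reindexed) is one possible explanation.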
I have often added observations of mounted specimens to fill a taxonomic spot that was empty. The specimens were verified by a specialist, with a photograph of the “ID by xxx” label included alongside several shots of important characters. These observations rarely get anyone else agreeing or disagreeing, so with zero additional IDs they can’t make it to the higher grade. There should be a way to make these postings count.