Does the number of “observations unknown after 30 days” in Site Stats include
- observer opt-out of community ID
- missing date or location
- multiple species and other data quality issues
- copyright infringement
Thanks!
Patti
My understanding is that “observations unknown after 30 days” is the total of all unknowns after 30 days, regardless of the issue. About 1.2 or 1.3 million of those (I’ll have to go back and look at a prior post) have no data issues except for being cultivated, OR they simply haven’t been reviewed yet.
The number of unreviewed unknowns is well under a million, and the ratio of unknowns to all observations on iNat has dropped to 0.019 for the first time in about a year. EDIT: presently about 355,124 unreviewed, never-ID’d unknowns.
EDIT: here’s the post I was looking for https://forum.inaturalist.org/t/make-captive-cultivated-not-automatically-no-id-needed/112/196?u=asteroidowl
If that’s the case, then the minimum value of those two stats will always increase, since the number of unidentifiable observations grows each day (each day brings new opt-outs, observations with no evidence of an organism, etc.).
Would it be more interesting to follow those stats for verifiable observations?
Yes, the minimum value will be increasing, but the overall ratio gives a better idea of how significant the situation is because it accounts for the increase in observations.
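To make the point above concrete, here is a minimal sketch with hypothetical numbers (not actual iNat figures) showing how the absolute count of unknowns can keep rising while the ratio to all observations falls:

```python
# Hypothetical month-by-month figures, chosen only to illustrate the trend:
# the unknown count grows, but total observations grow faster, so the ratio drops.
totals = [10_000_000, 12_000_000, 15_000_000]    # all observations
unknowns = [250_000, 270_000, 285_000]           # never-ID'd unknowns

for total, unk in zip(totals, unknowns):
    print(f"{unk:>9,} unknowns / {total:>10,} total = ratio {unk / total:.4f}")
# The ratio falls (0.0250 -> 0.0225 -> 0.0190) even as the count rises.
```

This is why tracking the ratio, rather than the raw minimum, gives a fairer picture on a platform whose total observation count grows daily.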
And does the inclusion mean that iNat believes there is value in eventually adding IDs to all of the observations in that category?
I can’t speak for iNat, but I can see value in eventually IDing 3 of the 4 categories listed in the OP. Opt-outs, missing dates/locations, and multiple-species observations can all be solved if the observer fixes the issue.
I would think the best way to divide the unidentified obs up would be:
-Verifiable
-Captive/cultivated
-Data-deficient, but could be verifiable if observer fixed something (ID-less opt-outs, missing date/location, multiple species)
-Inherently unidentifiable (no evidence of organism, no media, copyright infringement)
I see potential value in all these categories except for the last one. And as has been said a bunch of times before on the forum, I think splitting Casual into Captive, Data-deficient, and Frass would be immensely helpful for identification and data analysis purposes.
to clarify, do you mean iNat the org (staff, board), or iNat as in all of us who participate on the platform?
I meant the iNat organization, or the group who designed the statistics.
I think you’re great, but your response is about a different issue. I am only asking about the stats board.
I think there is, coincidentally, another discussion happening on the questions to which you’re pointing. I really don’t want to mix the two.
Thanks for understanding.
My perspective is that iNat doesn’t really have a position: people upload observations, and iNat isn’t judging whether they are good or not (or by whose definition they would be good or not). iNat provides the place to host those observations, but I don’t think it’s independently telling any of us we must ID everything.
Not sure if that helps.
Not quite… I am asking iNat to weigh in on the question. I’m sorry it seems hard to understand; I guess I’m not making myself clear.
The dashboard is a reflection of what iNaturalist wants to track. I’m asking why they chose to include unverifiable observations in those “unidentified” numbers, to gain some insight. I’m hoping to hear from iNaturalist staff, or someone else involved in creating the new dashboard metrics for identifying.