Observers deleting observations when identifiers disagree with them

2023 stats, thank you.

2% added only IDs - taxon specialists I presume.
11% like me both observe and ID.

But 87% observe … then wait … for the OTHER 13% to ID for them. Roughly nine out of ten wait for The One. The One is very, very tired.

4 Likes

In no way am I diminishing how tiring that behavior is (I think I fall into Observer and subpar Identifier, yikes, will do better), but how many of those were one-day accounts? Or one-time observers?

I am beyond grateful for Identifiers, but I think that statistic may be an oversimplification.

1 Like

Most of the observers were probably single-observation people. Judging by the numbers, the statistics still work using the 2020 breakdown (above). At that time about 10-12% of users had placed an ID (across all years to that point), and for 2023 it was 12.57%, so presumably the observer statistics are also comparable. In the previous breakdown, the top 4,000 observers posted more than half of all observations.

1 Like

For identifiers at the time, the top 1,000 identifiers made double the identifications of all the other identifiers combined.

Well, it’s an option, but not the only one. Situation: an old observation, RG for a long time, now tipped out of RG by a disagreeing ID, with no explanation forthcoming. Other options? The observer can tag friends or people on the leaderboard for the species he thinks it is, asking them to take a look and add their votes. Or the observer can look into it himself and decide whether agreeing with the new ID is right or not. Just deleting the observation wastes the time the observer took to post it in the first place and the time the identifiers took to ID it. It also removes what could have been a useful data point.

I’m glad that they usually have responded. We’re all busy and it’s easy to overlook these questions even if we’re still on iNaturalist. Recently I was away from my computer for a few days and came back to over 800 notifications! Plus new posts from people I follow, which are not included in the count. That took a while to clear! I can’t be sure I got them all, due to iNat’s clunky way of handling notifications. (Which they may change, I hope.) To anybody whose question I didn’t respond to: I apologize. Please try again.

6 Likes

Just for fun, I decided to take a look at the numbers on the Explore page. https://docs.google.com/spreadsheets/d/1By9MOz6ED1RFK3PG0Io7VIqzDCDx3cofWHB4RKIpVAM/edit?gid=0#gid=0
Every year has two lines: one for the observations uploaded during that year, and one for all observations dated to that year (whenever they were uploaded).

Note that while the identifier numbers climbed from 2017 to 2020, the ratio of identifiers to observers fell every year. For 2021-2023, note the rather disturbing trend of fewer identifiers every year. Also worth noting that the percentage of observations reaching Research Grade declines every year.

I do realize that there are other factors at play, but the fact is: every year we have fewer identifiers.

To add to that, note that the number of RG observations goes up every year (in spite of the ratio to observations going down). One way to interpret that: every year fewer people (from 2020 to 2023, identifiers decreased from 125K to 119K) are doing more work (over the same period, Research Grade observations rose from 14 million to 24 million).
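Put as a rough back-of-the-envelope calculation, using only the round figures quoted above rather than exact totals from the spreadsheet:

```python
# Rough per-identifier workload, from the approximate figures quoted above
# (not exact API totals).
identifiers = {2020: 125_000, 2023: 119_000}
rg_observations = {2020: 14_000_000, 2023: 24_000_000}

for year in (2020, 2023):
    per_identifier = rg_observations[year] / identifiers[year]
    print(f"{year}: ~{per_identifier:.0f} RG observations per identifier")

# 2020: ~112 RG observations per identifier
# 2023: ~202 RG observations per identifier
```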

Might be worth thinking about the next time you’re wondering why you’re not getting a response.

7 Likes

Now I’m curious: of those 280, how many have in turn reviewed (and identified) other people’s observations? If we go by Diana’s suggested ratio, they should each review at least 250 others’ observations, for a total of 70,000 observations reviewed.

1 Like

I don’t know. Looking over the list, I see several top identifiers and a number of moderate identifiers, but I don’t see that many with the “we wish” ratio of 2:1 IDs to observations. Some, though.

2 Likes

this analysis seems constructed in a way that results in non-meaningful information.

the number of identifiers associated with observations from a given year is not the same thing as identifiers who made identifications in a given year.

for example, i think your identifier count associated with observations from 2017 would include not just identifications made in 2017 but also identifications made in all subsequent years. so if you’re counting identifiers like that, it’s no wonder that your identifier counts from previous years are relatively higher than those from current years.

to do a proper analysis of observers and identifiers, you probably need to compare:

  • observations created in a given year vs identifications made in a given year, or
  • observers associated with observations created in a given year vs identifiers associated with identifications made in a given year.

to get the proper identification numbers, you need to query the API via /v1/identifications and /v1/identifications/identifiers, or you can use my pages https://jumear.github.io/stirfry/iNatAPIv1_identifications and https://jumear.github.io/stirfry/iNatAPIv1_identifications_identifiers.
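for illustration, here’s a minimal python sketch of that kind of query. it assumes that d1/d2 on these endpoints filter identifications by the date the identification was made, and that total_results on /v1/identifications/identifiers is the count of distinct identifiers; both assumptions are worth double-checking against the api docs at https://api.inaturalist.org/v1/docs before trusting the output.

```python
# minimal sketch: count identifications and distinct identifiers per year
# via the iNaturalist API. parameter semantics (d1/d2 = identification date,
# own_observation, current) are assumptions; verify against the API docs.
import requests

API = "https://api.inaturalist.org/v1"

def yearly_id_stats(year):
    params = {
        "d1": f"{year}-01-01",       # start of the year (assumed: date the ID was added)
        "d2": f"{year}-12-31",       # end of the year
        "own_observation": "false",  # exclude people confirming their own observations
        "current": "true",           # only currently active identifications
        "per_page": 0,               # totals only, no individual records
    }
    ids = requests.get(f"{API}/identifications", params=params, timeout=30).json()
    idrs = requests.get(f"{API}/identifications/identifiers", params=params, timeout=30).json()
    return {
        "year": year,
        "identifications": ids["total_results"],
        "identifiers": idrs["total_results"],
    }

for year in range(2017, 2024):
    print(yearly_id_stats(year))
```

the stirfry pages linked above wrap the same endpoints in a browser interface, so a script like this is only needed if you want to tabulate several years at once.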

2 Likes

I’ve changed my IDing priorities
After location and kingdom, my priorities are:

  • users with <1000 observations
  • new users, in whom I can hopefully instill good habits such as marking captive/cultivated vs wild and giving an initial ID.
  • users with between 1,000 and 5,000 observations, as long as I can see increasing ID action
  • users with > 5,000 observations provided their ID numbers exceed their observations.

I do make allowances when I see from their bio that they are significantly contributing to the naturalist community in other ways.
I’m still never short of observations to ID.

3 Likes

Yes, unfortunately I did not have a way to check how many IDs were made in the individual years, but is that really relevant? Everyone knows that it is not uncommon for IDs to come in at a significantly later date. Also, isn’t this showing the exact same info that the iNat blog post showed?

As such, the point still stands: every year from 2017 to 2020 we had more identifiers, but the increase was always at a lower rate than the increase in observers. From 2021 to 2023 there were progressively fewer identifiers every year. In spite of that, there are more RG observations every year.

Yes, many of those 2017 identifiers made their IDs in later years, and more recent years have had less time to accumulate IDs, but there are still 25 million RG observations from 2023, which is more than any previous year even with less time.

run the numbers the right way, and you’ll see.

I’ll try and figure out the links you sent.

For some of us analog types, the magic of the Forum may still elude us.

I was referring to multiple dissenting opinions coming up years after the earlier opinions that led to an RG designation. Sorry if I was unclear.

Regardless of the quantity, dissenting IDs are not a good reason to delete an observation. That action removes data that otherwise has the potential to be useful - if not immediately, perhaps years later. There is no benefit to deleting an observation, unless it has been irreparably corrupted in some way and you intend to reupload it. If you’re concerned that an observation has been repeatedly misIDed, I recommend posting it on the iNaturalist Discord server with a request to clarify the ID. Many users there are happy to help with that type of situation.

7 Likes

No shame there, just explaining why I only quoted enough to indicate which bit of your message I was responding to without repeating the entire argument you’d made.

In that case, then, are you sure they were wrong? And if you are, why not tag in more experts to weigh in with an opinion instead of just deleting it?

Taxonomy isn’t a thing carved into stone tablets; our understanding is actively evolving - and I’ve seen plenty of cases where respected experts will go back to old obs they had ID’d and dissent from their own original ID years later, because something that everyone thought was one species in a particular location has been revealed by a more recent study to be a different one.

8 Likes

I have a similar situation which I am finding rather frustrating. There is a user involved with a geopark project who has been uploading observations with sometimes very obviously wrong IDs, and a second user also involved with the project who has been confirming them.

A while ago I attempted to suggest this is strongly discouraged according to iNat guidelines. It did not go over well. I was told that it was their project and they could do what they wanted and this was nobody’s business but their own. The observation in question was later deleted.

Recently I have encountered a recurring pattern of the first user deleting observations and reuploading them in response to getting a disagreeing ID. I find this very concerning. It creates unnecessary work for IDers (they reuploaded the observations with the same original wrong ID, which I corrected a second time before these observations, too, were deleted) and it results in a lack of transparency about the IDing process, because it hides mistakes and any discussion surrounding those mistakes.

They have an observation now which I tentatively ID’d as a fairly uncommon species, and, naturally, both users withdrew their IDs and agreed with mine, which I am very uncomfortable with (even if I were to withdraw my ID now, the observation would still be RG). I suspect that if I ask them to reconsider agreeing with IDs they cannot confirm, this will result in the observation being deleted, erasing the history of initial wrong IDs along with it.

I am reluctant to refer this to someone higher up the moderation chain, since I fear it would be perceived by the users as escalation/attacking them and of course there is no record of any of the past exchanges because the observations in question were deleted.

3 Likes

Perhaps you can pre-empt this in future by IDing up a taxon level and putting the species ID in a comment. Then your record stands, and so does the record of their bad behaviour. But if it is a pattern, I would report it (especially if you recognise it as a second attempt to avoid your ID).

If it’s a project, is it possible to contact the person in charge? Just thinking that if it’s school related, some teacher or professor would be very unhappy with their behavior.

If you flag their observation then there will be a record, even if they delete the observation later.

5 Likes