Hello. I've been wondering if it is possible to start a Collection/Umbrella project with the help of Artificial Intelligence. I know about a project called Global Roadkill that seems to use AI (if you upload a photo of a dead animal lying on concrete or on a road, it will most likely be added to that project). I think this can be problematic, since not all animals in that situation are roadkill (I once uploaded a photo of a snake killed by a person and left on a road. I made clear in the notes that this was not roadkill, but it was added to the project anyway), but I think it is a cool option. I want to start a project to gather data about bird eggs from El Salvador, and, instead of searching through all the bird observations of my country (+17,000), I'd like to know if there is a way to use AI to search for photos of eggs and include them in this project. I'd really appreciate your help. Thanks in advance.
Hi Guillermo, I’m not sure, but using machine learning to automatically assign observations to a project could be considered a violation of iNaturalist’s terms. More about this here: https://www.inaturalist.org/pages/machine_generated_content
But if, for example, you're building a system that generates a list of observations that might be eggs for subsequent human-powered annotation / addition of those to projects, I would think that would be okay.
I don’t know anything about the AI aspect, but since Egg is a Life Stage annotation, you could set up a collection project that automatically adds all the bird observations from El Salvador with Life Stage = Egg. Unfortunately, this does require that those observations already be annotated and it looks like there are currently only 6 that meet the criteria: https://www.inaturalist.org/observations?place_id=7563&subview=table&taxon_id=3&term_id=1&term_value_id=7
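As a side note, the same filters in that explore URL can be expressed against iNaturalist's public API. This is a minimal sketch, assuming the `api.inaturalist.org/v1/observations` endpoint accepts the same query parameters the website uses (the filter values below are taken directly from the URL above):

```python
from urllib.parse import urlencode

# Filters copied from the explore URL above:
# place_id=7563 (El Salvador), taxon_id=3 (Aves),
# term_id=1 (Life Stage annotation), term_value_id=7 (Egg)
params = {
    "place_id": 7563,
    "taxon_id": 3,
    "term_id": 1,
    "term_value_id": 7,
    "per_page": 200,  # page size; fetch pages until results run out
}
url = "https://api.inaturalist.org/v1/observations?" + urlencode(params)
print(url)
```

Fetching that URL would return the same six observations the collection project currently picks up, which is an easy way to check programmatically how many annotated egg observations exist as people add more.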
So like bouteloua said, maybe you could use AI to generate a set of bird observations with possible eggs, add Life Stage = Egg to the ones that really do have eggs, and have them collected by your project?
just to clarify, the key isn’t the human-powered annotation… it’s the human review, right? or am i thinking about that wrong?
for example, suppose i have my AI present me a page of photos from 1000 observations that it thinks has eggs. i go through and look at them all and exclude 50 of these observations. if i then run a script to annotate the 950 observations as having eggs, that’s still acceptable, right?
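The review step described above is simple to express in code. This is a hypothetical sketch (the function name and IDs are illustrative, not any real iNaturalist tooling): the AI proposes candidate observation IDs, a human excludes the false positives, and only the survivors would then be passed to an annotation script.

```python
def confirmed_for_annotation(candidate_ids, excluded_ids):
    """Return the AI-suggested candidates that survived human review."""
    excluded = set(excluded_ids)
    return [obs_id for obs_id in candidate_ids if obs_id not in excluded]

candidates = [101, 102, 103, 104]  # observations the AI flagged as eggs
rejected = [103]                   # excluded during the manual look-through
to_annotate = confirmed_for_annotation(candidates, rejected)
print(to_annotate)  # [101, 102, 104]
```

The key point, matching the quoted guideline, is that every ID reaching `to_annotate` was individually looked at by a person before any mass annotation happens.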
I’d consider that type of individual review “human-powered” (even if the actual annotating was done en masse). But probably best to confirm with staff on any large-scale project. From that page:
It’s ok for humans to use machines as tools for arriving at their choice or to facilitate posting this content, as long as there is human involvement/oversight creating the content/decisions about each individual observation, identification, or comment.
We were just talking about this on another thread. Basically @jeanphilippeb has a script that runs the computer vision on unknowns, and tested out the idea of grouping some unknowns into projects based on what the AI thinks they are.
I doubt the project uses AI. I expect either a person looks for the observations and adds them to the project, or the person adds some kind of annotation and then the project collects everything with that annotation.
Edit: if it is this project, it is a “traditional” project, meaning it doesn’t do anything automatically, and all the observations are added by a person by hand. Probably the person did not read your notes.
Yes, I have created a project that includes El Salvador with Life Stage = Egg. And, as you mentioned, not all observations are annotated, but so far it seems like it will be my only option.
I honestly do not know for sure, but the project has thousands of observations, and I do remember seeing some that were unlikely to be roadkill… anyway. It seems like it is not possible at the moment, so all observations of eggs need to be added manually. Thanks.