Didn’t want to put this in bug reports as it might just be how the AI works, but I’ve noticed recently (perhaps since the last CV model update?) some odd suggestions coming up for species ID. See the following examples:
Anyone else seeing this (e.g. octopus, centipede, rust seem a bit off, the octopus in particular considering location is well inland…)? It’s on the desktop web iNat when uploading.
Sorry about that, I’ve added links to the actual observations.
The process I go through is: Upload button → Choose files → group select within a folder (all same date) → Select All → add location → individually select IDs. So all of these should have location set by the time I select for species ID.
My attempt resulted in the suggestion that the Australian heteropteran was a brightly colored South American bird!
And that the ant might be a North American fern…even though there’s not a speck of green in the ant image.
It isn’t very safe to assume the AI has any anthropomorphic perspectives at all. We have a long way to go before AI can rationalise a decision in human terms.
Of course, it’s not a human brain. But I don’t think we were making that assumption. I’ve used it for years and have never seen it suggest something so far astray (in my experience, from my location).
Yeah, that’s my observation (ha). It’s never been spot on and I wouldn’t expect it to be, but I don’t remember it previously being so far off as to guess an octopus for an ant (especially for a reasonable-quality image)…
Do you have the “only nearby suggestions” filter turned on?
I’m not a developer, but the screenshots in the original post make me think it’s less of a problem with the model than with the pipeline to the model. Like the right info for that observation is not being sent to the model or something. But that’s just a guess.
Not exactly “of late,” but the most hilariously wrong suggestions I saw were when I uploaded a photo of some local cattle to the website, and CV suggested: apricot, donkey, and rock pigeon. At least a donkey is a four-legged farm animal. As I mentioned in the observation notes, the Seek app was able to make a correct ID (from the phone’s Camera Roll, not in-person), but somehow the website wasn’t. Anyway, I checked just now, and it appears to have been fixed since then.
Yesterday I had a faded daisy, grey and pointed.
CV also suggested snapping turtle … and I can see the pixels for the pointy grey shape are in both pictures.
Again, I would like an option to downvote so we could add human input to the next CV.
I have a feeling it’s something like that too; perhaps the metadata (or higher-res image version) being uploaded has not yet replicated to the DB used for the CV guess? I think I’ve also seen the recommendations change around a bit when going back and forth between certain observations while uploading.
@tiwane I do have it defaulted to only nearby (i.e. I can see “Include suggestions not seen nearby”).
Also, going back to these observations, if I check suggestions now they seem ok (no octopuses!).
I noticed something along the same lines when uploading observations for Gunnera tinctoria, which is a pretty distinctive plant. (For context, there are at least 10 observations of this species at pretty much exactly these coordinates, going back at least 2 years, and several are RG.)
This is the first observation I uploaded: https://www.inaturalist.org/observations/132455150.
The initial suggestions get the genus right, but then suggest some plants that bear very little resemblance:
Same story with the second observation: https://www.inaturalist.org/observations/132455153
When limited to “seen nearby” it gives wildly wrong suggestions; when expanded to all, it offers better ones.
I talked to @alex about these and I was incorrect, this is working as designed right now. @graysquirrel’s examples sum it up pretty well.
The actual visual model seems to be working fine, but if you have “Seen nearby only” selected and the top visual match hasn’t been seen nearby, it won’t show that match in “Here are our top suggestions” section. The “We’re pretty sure this is…” section is purely based on visual similarity.
Gunnera tinctoria has been seen nearby spatially, but not temporally (e.g. in July, August, and September). So sea kale, opium poppy, etc. are the closest visually similar matches that have been seen nearby in both the spatial and temporal sense. More on seen nearby here: https://www.inaturalist.org/pages/help#cv-seen-nearby
When you remove the “Seen nearby only” filter, the model shows you Gunnera tinctoria as the top visually similar result. This is why we have the “Include suggestions not seen nearby” option: to make up for these kinds of holes in the observation data used for seen nearby. As Alex mentioned in his vision model blog post, we’re working on a better way to use geospatial info for suggestions, but this is how it currently works.
When/if @graysquirrel’s observations get to RG and we update the seen nearby data, these observations will fill in some of those temporal gaps.
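To make the behavior described above concrete, here’s a rough sketch of the filtering logic in Python. All names (`VisualMatch`, `suggestions`, the scores) are illustrative assumptions for this thread, not iNaturalist’s actual code; the point is just that the “We’re pretty sure” pick ignores the nearby filter while the “top suggestions” list applies it:

```python
# Hypothetical sketch of the "Seen nearby only" suggestion filtering
# described above. Names and scores are made up for illustration.
from dataclasses import dataclass

@dataclass
class VisualMatch:
    taxon: str
    visual_score: float   # similarity score from the vision model
    seen_nearby: bool     # seen nearby both spatially AND temporally

def suggestions(matches, nearby_only=True):
    ranked = sorted(matches, key=lambda m: m.visual_score, reverse=True)
    # "We're pretty sure this is..." is purely visual: top match, no filter.
    pretty_sure = ranked[0].taxon
    # "Here are our top suggestions" drops taxa not seen nearby
    # when the filter is on.
    top = [m.taxon for m in ranked if m.seen_nearby or not nearby_only]
    return pretty_sure, top

matches = [
    VisualMatch("Gunnera tinctoria",  0.92, seen_nearby=False),  # temporal gap
    VisualMatch("Crambe maritima",    0.41, seen_nearby=True),   # sea kale
    VisualMatch("Papaver somniferum", 0.38, seen_nearby=True),   # opium poppy
]

pretty_sure, top = suggestions(matches, nearby_only=True)
# The best visual match is named in "pretty sure" but missing from the
# filtered list until "Include suggestions not seen nearby" is used.
```

Under these assumptions, the filtered list contains only sea kale and opium poppy even though Gunnera tinctoria is the strongest visual match, which matches what @graysquirrel saw.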
Is this option available in the Identify Modal, or only in the Upload Modal? Is this the default (or only) option for the Identify Modal also? “By default, iNaturalist only displays suggested taxa that are visually similar and have been seen nearby, if visually similar taxa have been seen nearby.”
For Compare/Identify you can filter by taxon, and then you see a list of suggestions labeled “visually similar” or “visually similar / seen nearby.” But there isn’t a way to remove taxa not seen nearby.