AI-powered summary of suggested species in Seek

Platform(s), such as mobile, website, API, other:

Description of need:
When scanning and finding a species, I often find myself wanting more information about the result. There is the short Wikipedia summary, but it is too sparse and, to be honest, rather dry. Most of the time I have to open Google or similar to find more information about the subject outside the app.

I wish there were a more interesting read about the species I just scanned, including interesting facts, uses, rarity, unique characteristics, etc.

Having this extra summary would make the application a lot more engaging for both newcomers and experienced users.

Feature request details:
Instead of, or perhaps next to, the Wikipedia section, there could be a simple AI-generated summary for the scanned species. This could simply be the response of a GPT prompt tailored to Seek.
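To make the idea concrete, here is a minimal sketch of what such a tailored prompt could look like. The function name and the prompt wording are entirely hypothetical, not anything Seek actually does; the key idea is grounding the model in the Wikipedia extract so it summarizes rather than invents.

```python
# Hypothetical sketch: build a grounded summary prompt for a scanned species.
# Nothing here reflects Seek's actual design; it only illustrates the request.

def build_species_prompt(scientific_name: str, common_name: str, wiki_extract: str) -> str:
    """Assemble a prompt asking an LLM for an engaging species summary,
    grounded in the supplied Wikipedia extract to limit made-up facts."""
    return (
        "You are writing a short, engaging species summary for a nature app.\n"
        f"Species: {scientific_name} ({common_name})\n"
        "Use ONLY the source text below. If a detail (edibility, rarity, uses) "
        "is not in the source, say it is unknown rather than guessing.\n\n"
        f"Source:\n{wiki_extract}\n\n"
        "Summary (3-4 sentences, interesting facts, unique characteristics):"
    )

prompt = build_species_prompt(
    "Harmonia axyridis",
    "multicoloured Asian lady beetle",
    "Harmonia axyridis is a large lady beetle species...",
)
```

The resulting string would then be sent to whatever model the app chose; the "use ONLY the source text" instruction is the part meant to address the hallucination concerns raised below.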

Hi, welcome to the forum @kamerat! I approved this request, but its likelihood of being implemented is probably low due to the need to pay for or train a new model as well as the high error rate of existing models. I’m just a volunteer here in the forum though so staff will weigh in eventually.


If the species is so rare/understudied that it does not have a fleshed out Wikipedia page, any AI summary would probably be making things up. I recommend becoming a contributor to Wikipedia to flesh out stub articles about insects, plants, or whatever it is you’re looking at. It would be a service to all of us.


AI makes a lot of promises, but it is unreliable as a source of accurate information. I am constantly looking up specific information about animals for my job and personal knowledge, and to use Virginia Opossums as an example, the AI-generated response that Google implemented mixed in a lot of information that actually related to the possums of Australia.

As natev said though, there are many organisms with little information known. Insects are a good example, as many lack common names and are known only by their scientific name.

Another example is Seek, Merlin, Picture This, and other ID apps that identify through sound or photo algorithms. They are good, but they are not perfect. I have had Seek identify an organism as one thing, only for experts on iNaturalist to disagree with Seek, and frankly I trust their word over an algorithm’s.


I don’t know what Seek shows at the moment. Is it the full Wikipedia article or just the first paragraph (which indeed would often be boring)?

If it’s only the first paragraph for space reasons, one could consider replacing that with an AI summary of the full Wikipedia page, prompting it to concentrate on some interesting areas. The models hallucinate a lot (like: orders of magnitude) less when doing such summaries than when asked without any source material given.

I don’t have a strong opinion about whether this would be worth the effort, just pointing out the possibility.
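As a sketch of what "summarize the full page" would involve, the full plain-text article can be pulled from the public MediaWiki Action API (the TextExtracts `prop=extracts` query) and then fed to a summarizer as source material. This is only an illustration of the possibility being discussed, not anything Seek implements.

```python
# Sketch: fetch a full plain-text Wikipedia article via the MediaWiki Action
# API, so a summarizer has the whole page to work from, not just one sentence.
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def extract_url(title: str) -> str:
    """Build a MediaWiki query URL for the full plain-text extract of a page."""
    params = {
        "action": "query",
        "prop": "extracts",    # TextExtracts extension, enabled on Wikipedia
        "explaintext": 1,      # plain text instead of HTML
        "format": "json",
        "titles": title,
    }
    return API + "?" + urllib.parse.urlencode(params)

def fetch_extract(title: str) -> str:
    """Fetch the article text (network call; requires internet access)."""
    with urllib.request.urlopen(extract_url(title)) as resp:
        pages = json.load(resp)["query"]["pages"]
        return next(iter(pages.values())).get("extract", "")
```

Whether the extra API traffic and LLM cost would be worth it is exactly the trade-off debated in the rest of this thread.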

I’m not a fan of LLMs (“AI”) in part because of their use of resources (water, energy). Some LLMs may be worse than others about making things up or getting information incorrect–though that’ll always be a problem for all of them. Out of curiosity, I used Phind on a species that I know has little written about and was pleased that the LLM behind Phind seemed to not make stuff up:

“Pachybrachis subvittatus is a species within the genus Pachybrachis, which belongs to the family Chrysomelidae, commonly known as leaf beetles. This classification places it within the order Coleoptera, encompassing beetles, and further within the class Insecta, indicating it is an insect. The phylum Arthropoda and kingdom Animalia categorize it as an animal. The species was named and described by John L. LeConte in 1880. Unfortunately, the provided sources do not offer detailed information on the specific characteristics, habitat, or behavior of Pachybrachis subvittatus beyond its taxonomic placement.”


The iNat app only shows the first sentence, it seems.


Harmonia axyridis is a large lady beetle or ladybug species that is most commonly known as the harlequin, Asian, or multicoloured Asian lady beetle.

Compare to AI generated summary:

Harmonia axyridis, commonly known as the multicoloured Asian lady beetle, is a fascinating species with an exceptionally wide range of color forms. Originally native to eastern Asia, it has been introduced to North America and Europe for aphid and scale insect control. This large lady beetle is now well-established in various regions, including North America, where it’s sometimes playfully called the “Halloween beetle” due to its tendency to invade homes during October. Its diverse color patterns and voracious appetite make it a remarkable biocontrol agent!

I don’t think this would be particularly useful. “AI” (an LLM, in this case) has no mechanism for fact-checking or validating information. It only assembles a statistically likely sequence of words. The information provided by an LLM about any given species would be prone to hallucinated details, made-up facts, or information misattributed from other articles, as we’ve seen with Google’s recent AI feature. This could be mostly harmless, or it could lead to real problems, such as if the AI decides that “this mushroom is edible” is a common series of words in mushroom Wikipedia articles and puts it into the summary of an inedible or deadly species.

And as someone else in the thread stated, any organism with a Wikipedia article that isn’t fleshed out enough to be “interesting” would be especially prone to the AI just making things up, as it has very little to pull from.

The potential drawbacks of this would much outweigh any advantages, I think.


But Seek only shows the first sentence (as I found out by now). So the OP’s point that these are boring also applies to species that have multi-page Wikipedia articles.

It’s been said by others in the thread, but AI generated summaries are likely to contain incorrect information since LLMs essentially function like a more advanced version of predictive text.

For some species, there’s enough misinformation that you’re likely to get that appearing in the generated summary purely because it’s so widespread, while for other species you may get misinformation because of examples like the one @hawkparty gave, or because they have a similar name to another species or to some other unrelated thing.

I’m not sure what a better solution would be since part of the issue seems to be Seek only showing a small part of the article, but part of it is also that some species have very little written about them if they have a Wikipedia page at all. I mostly just don’t think using a tool that’s likely to present people with misinformation is a good idea on an app that’s designed for helping people learn more about the world around them.


I completely agree; this could pose a real risk to iNaturalist if incorrect information is spread.

I think that iNaturalist’s short blurb of information as pulled from Wikipedia inspiring someone to google the organism is actually the website working as intended. Not all information about every organism can be hosted on iNat.


It’s interesting that we once considered Wikipedia the least reliable source of information, and discouraged its use, because anyone could write anything they wanted (and sometimes did). LLMs come along and now the only reliable source of information is Wikipedia (which still could be written by anyone writing anything they wanted, including misinformation). I once maintained a Wiki page for a couple of species, and every so often someone would go in and change a few words to make the entry say the opposite of what I had written.


I would not be in favor of this change. This forum (and many other places) are full of examples of erroneous information presented by LLMs. Just presenting an AI-based summary is even more of a problem than interacting with an LLM, as the user wouldn’t be able to ask the LLM to clarify/fact check its information (which sometimes leads to it correcting itself). As others have noted, when knowledge about a specific taxon isn’t widely available, LLMs often lean towards making up info based on similar taxa rather than just saying “we don’t know much about that” which would be the most correct answer.

Users are free to pull up an LLM or Google with an added 5-10 sec of time without iNat appearing to approve of the results by including them directly in-app.

The cost of presenting misinformation could be high both immediately (learning something is edible/nonvenomous when it’s not) or in the long run (if iNat presents inaccurate info and its reputation declines and loses users).

Lastly, LLM results aren’t free. If iNat were to include them in an app resulting in 1000s of calls to the LLM, it would certainly cost money which I’d rather a non-profit spend on most anything else. As others have also mentioned, calls to LLMs also require power and, at large scales, contribute to climate change, etc. I’m a fan of more observing, less AIing.

Edit addition: Also, isn’t Seek designed to work offline? AI-based summaries wouldn’t be available offline presumably, so this would further restrict the value of such a feature.


I was pretty interested in having an AI summary instead of the uninformative snippet…

Until it blatantly promoted releasing invasive species into the wild. It seems that sometimes AI “blindness” can be just as harmful as “hallucinations”.


I think another thing to consider is that generally, there are very few circumstances where you’d need to have that much information on a species right there and then.

Maybe if you’re foraging you’d want to know the uses for something, whether it’s edible, whether it’s okay to pick or if it’s a protected species, etc., but for something like that I would really hope the person had learnt this information beforehand from a reputable source, rather than trusting the computer vision to correctly identify the species and then trusting AI to generate that information, which it may not do correctly.

Sure, having information right there for you to look at immediately is very convenient (if the information is accurate), but sometimes the experience of trying to find information yourself by looking through various sources and piecing things together is an interesting and valuable experience in itself.

Being able to research and find trustworthy sources is a skill that I worry may be on the decline with the advent of AI chatbots you can ask about a subject and get a quick answer from, and maybe that curiosity and desire to learn more about nature is the perfect opportunity to practice that skill.