Human obs used for cyberbullying

seems like bullying or adjacent behaviors could be accomplished in several ways in iNaturalist:

  • via image (ex. image of a person in a compromising situation)
  • via text (ex. “[name] is a [pick your word of choice]”)
  • via audio recording (ex. spoken version of “[name] is a [pick your word of choice]”)
  • via juxtaposition of observation elements (ex. human in photo but identified as an animal)

if you’re planning to fully address bullying, it seems like you would need workflows to handle all of the above, not just the hiding of images.

it seems to me like even beyond these, it would be helpful to have specific functionality in the existing flags workflow / UI, perhaps with its own “escalate” category, that directly escalates the observation to help@inat. although it might drive up the work of iNat staff a little, it seems like it would allow someone to more quickly remove truly offensive / inappropriate items. it wouldn’t have to be a situation where only staff gets the flag. moderators could also see these and attempt minor resolutions if they get to it first, but they wouldn’t then have to separately e-mail iNat staff or whatever the current process is.
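just to make the idea concrete, here’s a rough sketch in python of how an “escalate” category could route a flag to staff and moderators at the same time. everything here (the category names, the notify targets, the function) is made up for illustration and doesn’t reflect how iNat’s codebase actually works:

```python
# hypothetical sketch only -- categories, recipients, and names are invented
# for illustration and do not reflect iNaturalist's actual implementation.

from dataclasses import dataclass, field
from enum import Enum, auto


class FlagCategory(Enum):
    SPAM = auto()
    COPYRIGHT_INFRINGEMENT = auto()
    OTHER = auto()
    ESCALATE = auto()  # the proposed new category


@dataclass
class Flag:
    observation_id: int
    category: FlagCategory
    reason: str
    notified: list = field(default_factory=list)


def route_flag(flag: Flag) -> Flag:
    """ESCALATE goes to staff *and* moderators at once, so whoever sees it
    first can act, with no separate e-mail to staff required."""
    if flag.category is FlagCategory.ESCALATE:
        flag.notified = ["staff", "curators/moderators"]
    else:
        flag.notified = ["curators/moderators"]
    return flag


# ex. a derogatory caption on a photo of a person gets escalated directly
flag = route_flag(Flag(12345, FlagCategory.ESCALATE, "derogatory caption on a photo of a minor"))
print(flag.notified)  # ['staff', 'curators/moderators']
```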

it seems like at least some curators are already effectively hiding pictures by hacking the copyright infringement flow.

although this helps address inappropriate images, the underlying image record is still there, and as long as that record exists, it’s not difficult to access the underlying image file. i think staff are currently the only ones who can remove the image record.

but then even when staff remove the underlying image record, the image files still remain on the media servers, accessible to all. so if bad actors linked directly to the image files, removing the image records would still not be enough to totally resolve the problem, and any complete solution would need to address this as well.
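to make that concrete: anyone who saved the direct link can trivially check whether the file is still being served even after the record is gone. the URL below is a made-up placeholder, not a real iNat media path:

```python
# illustration only: the URL is a made-up placeholder, not a real iNat media path.
import requests

MEDIA_URL = "https://static.example.org/photos/123456789/original.jpg"  # placeholder

resp = requests.head(MEDIA_URL, timeout=10)
if resp.status_code == 200:
    print("image file is still publicly reachable even though the record was removed")
else:
    print(f"image file no longer served (HTTP {resp.status_code})")
```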

3 Likes

What happens when a photo of a nude human or parts of a nude human or two nude humans, etc., is posted? Surely it goes away very quickly. Is that really worse than a picture of a school-child, especially a picture accompanied by malicious comments? Whatever erring is done, it should almost certainly be done on the side of protecting children.

3 Likes

you’d be surprised at how long the process can take, as i found out while trying to do something similar to what you’re describing, though with probably a college-aged subject (but possibly younger). as described in the bottom half of my above post, a curator hacked the copyright infringement flow to hide the photo, but it took at least many days, if not weeks, for staff to remove the observation and image record, and the image file is probably still out there, accessible to anyone who knows the link to it.

3 Likes

Entries for H. sapiens could still be made, just not observations of actual people. We could flag them for removal if that happens.

2 Likes

Thank you for killing the flagged comments. I flagged the picture. I wrote to help. I admit defeat.

I wish iNat wasn’t set up to enable the bullies and blame the victim.
Too much emphasis on treating the bullies with kindness. Setting iNat up as a playground for bullies.

If iNat guidelines enforced Engage with Nature, and hid or deleted obs of humans - there would not have been a target for those comments. iNat deliberately chooses to offer targets.

We now have 3 long threads running in parallel.

4 Likes

All of this. I’ve just been following this whole thread quietly because frankly I don’t have the emotional energy for another fight but like…

Why anyone thinks it should be okay to tolerate bullying because it MIGHT infringe on someone just goofing around completely baffles me.

4 Likes

Social media uses, and abuses, people who were taught to be kind and polite. That is why scams work - “but, but, that is My Friend!!” Be kind is the iNat rule. My social media 101 - 3 strikes and you are out. These conversations can be interesting if both sides are open to learning from each other. My social media 202 - keep heated discussions public, where we can look out for each other.

As they said on the other thread - if 2 people disagree, sometimes one is right and the other is wrong. Then ‘equal’ treatment is misguided.

3 Likes

This is quite alarming to me - I totally agree with the concerns you had about this in the original thread. iNat could theoretically be used as, pretty much, unmoderated photo hosting (is this also the case for audio content?), and that could easily be abused. I don’t know how much actually illegal content gets uploaded here, but I would assume on a site this large it’s at least some, so I would hope in that case there’s some way for the devs to pull the AWS copy down…

I rather expect that most of the people posting the content this thread concerns would not be tech-savvy enough to access that permanent copy, so obscuring the photo as proposed is probably good enough (although hopefully the obs can be un-obscured by curators in the cases discussed where human obs are of some value). But I can easily imagine other types of bad actors who could do quite a bit with that.
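For what it’s worth, actually pulling the stored copy down doesn’t seem technically hard - something roughly like the sketch below, assuming the media sits in an S3 bucket. The bucket name and key layout are invented here, and obviously only staff/devs with the right credentials could run anything like this:

```python
# Hedged sketch: bucket name and key layout are invented, and this assumes the
# media files live in S3. Only staff/devs with credentials could do this.
import boto3

s3 = boto3.client("s3")


def remove_media_copies(bucket: str, photo_id: int) -> None:
    """Delete every stored size of a photo (original, large, medium, ...)."""
    prefix = f"photos/{photo_id}/"  # hypothetical key layout
    listing = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    for obj in listing.get("Contents", []):
        s3.delete_object(Bucket=bucket, Key=obj["Key"])


remove_media_copies("example-inat-media-bucket", 123456789)
```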

2 Likes

yes, this is also true for audio

I agree that we need to deal with the photos of humans, preferably by hiding most of them. These photos don’t help, and some of them do represent bullying. Unfortunately, banning them won’t work. If iNaturalist doesn’t allow human photos to be posted, or automatically deletes or hides photos labeled human, people will just post them under other names, where they’ll be harder to find. We have to find simple ways to deal with them after they’re posted.

The discussion here is about hiding photos; I don’t think eliminating human as an ID option for uploads is being seriously considered.

1 Like

I have been looking at flags to get a better sense of how they are currently handled. In the process I have seen that flags on human observations are resolved by instructing the flagger to ID as human instead of flagging. I even saw at least one where there was a derogatory comment on a picture of a teen (I think at school), and it was resolved by a staff member (as opposed to a curator) by just IDing it as human, which I think is an indication that the current policy is inadequate. I am not attacking the staff; my intent is only to point out that the current policy is inadequate.

4 Likes

Something I wonder is, why is it even possible to ID to Homo neanderthalensis? Exactly 100% of the ~207 Homo neanderthalensis IDs are trolling, ‘jokes’, harassment or some combination. Given that ‘recent evidence of organism’ DQA is officially supposed to be within give or take 100 years (certainly not more than 10000), there can’t ever be any qualifying Homo neanderthalensis IDs.

Edit: on further review, only 203 of the 207 actually fall into that category; the other 4 are pictures of museum exhibits.
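For anyone who wants to re-check that count, something like this against the public API should get it. The taxon id below is a placeholder (look it up on the taxon page first), and I’m assuming the /v1/identifications endpoint accepts a taxon_id filter:

```python
# Rough count check via the public API. The taxon id is a placeholder, and the
# taxon_id filter on /v1/identifications is assumed rather than verified here.
import requests

NEANDERTHALENSIS_TAXON_ID = 0  # placeholder: substitute the real taxon id

resp = requests.get(
    "https://api.inaturalist.org/v1/identifications",
    params={"taxon_id": NEANDERTHALENSIS_TAXON_ID, "per_page": 1},
    timeout=30,
)
print(resp.json()["total_results"])  # should be roughly the ~207 cited above
```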

3 Likes

For flags on inappropriate IDs, I resolve those once the community ID is correct, and I’ll typically comment something about making sure to ID things correctly and not make joke identifications. Once they’re overruled, there’s not a lot to do, especially when the accounts become inactive, which a lot of them do.

Inappropriate notes are a different beast. I personally hate dealing with them because staff needs to step in, and, as lazy as it is, emailing them each time is a hassle, so I usually ignore those flags unless I made the flag myself or am particularly bothered by whatever the content was - like super clearly offensive stuff. I suspect a lot of curators/users do the same and don’t want to go through the effort of emailing staff each time, especially when it’s schoolkids making goofy jabs at one another. It’s a gray area, I think - it feels like more work than the content is inappropriate, if that makes sense. Or I’m just uber lazy and suck at curator stuff. I guess realistically I could just send an email with a link to the flag and some generic outline, but I don’t usually feel compelled to do that. And I think that’s why so many sit there: there are more steps than is convenient for dealing with that type of thing.

Or maybe I’m just lazy.

3 Likes

No - you are a good curator. I rely on you.

We need a better way (as evidenced by 3 long threads of unhappy iNatters).

2 Likes

I think it is unavoidable when that is the only category that hides an observation. Yesterday someone uploaded a couple of AI-generated images of beetles. Technically, the terms of service for services like DALL-E explicitly say that you do own the images generated, so in a literal sense it is not copyright infringement. But the content is inappropriate for inat, and ‘copyright infringement’ is the flag category closest to what was happening.

Also, technically, the same would be true for pictures that are CC0, which are inappropriate for inat but not legally infringement. Curators can’t reasonably be expected to check and adjudicate that, so in that case we mean ‘doesn’t belong to you’ more than ‘copyright infringement’ per se.

1 Like

@anon83178471 The argument I am making is that when the notes are inappropriate and the photo is of a person, that makes the photo also inappropriate, because it may be being used for harassment.

I do not think we can reliably tell what is an innocent joke and what is harassment without knowing more about the social context in which the photo was taken, so I think all human photos with derogatory captions should be removed. But I do not think you are lazy - the staff do the same thing you do. The workload of handling every single case would be too much for the staff under the current workflow, which is why I am saying we need different policies to handle this better.

It is a policy issue where curators do not have the tools to handle this, not a laziness issue.

2 Likes

Why would CC0 be inappropriate for inat?

What I mean is that posting Wikipedia page cover photos is inappropriate for inat even though they are CC0.

2 Likes

Right, but that is because the location and date are inaccurate and they are not yours, not because they are CC0. I thought you were saying that if I take a picture and make it CC0 instead of my usual CC-BY-NC, I can’t put it on inat.