While importing some images from iNaturalist to Wikimedia Commons, I noticed that most of the images have been downsampled to 2048 pixels wide (as the “original” size). While I’m sure that iNaturalist doesn’t want to store hundreds of thousands of 20 megapixel photos, 2048 is pretty small, and it’s a shame that we’re losing the original higher resolution versions, especially when they are freely licensed and could be used elsewhere. Even a modest increase to 3000 would make a significant difference when reusing images in print, for example.
High-res images being available is a great thing, but they sure do hurt people who are rural and/or have a limited monthly data allowance and no options for any other internet service. It really restricts how much you can view and enjoy iNat when loading large images uses up precious data allowance.
I understand the cap is partly to avoid iNat becoming an image hosting service, with the idea that people can link to higher-res versions (e.g. on flickr). I guess there are tradeoffs? A related policy is that iNat supports animated gifs but not video – and encourages links to youtube/vimeo/etc where it could be helpful.
Yes! Rural here with no cable and terribly slow satellite internet available. I’m now doing unlimited data on my phone and using a hotspot, but that is restricted down to 600kb/s. I get so frustrated by huge files, especially when a file is huge because it wasn’t cropped, so I can’t ID from the size that shows up on the observation and have to view full size. Then I’m waiting ages for the top half of a photo that is just vegetation to load before I can see the subject of the observation.
I’d love to have higher-res photos on iNat, but IMO the infrastructure and data connection issues supersede the image quality one.
Also, @zygy don’t forget to vote for your own request. :-)
How should this work?
- You upload a picture to iNaturalist.
- iNaturalist checks whether it is freely licensed and, if so, stores the original file in Wikimedia Commons.
- An observation field is updated to store the relation with the original file in Wikimedia Commons.
- iNaturalist resizes the original file to 2048 pixels wide and stores it in its own database.
- If the observation becomes Research Grade, the tags of the Wikimedia Commons file linked in point 3 are updated with the taxon name.
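The steps above could be sketched roughly as follows. This is only an illustration of the proposed flow, not real iNaturalist or Wikimedia code: the function names, the `FREE_LICENSES` set, and the storage callbacks are all assumptions.

```python
# Sketch of the proposed upload flow. Everything here is hypothetical:
# the callbacks stand in for whatever storage/API layer would really be used.

FREE_LICENSES = {"CC0", "CC-BY", "CC-BY-SA"}  # assumed set of "free" licenses

def resize(image, max_width):
    # Placeholder: real code would actually downscale the pixel data.
    return {"data": image["data"], "width": min(image["width"], max_width)}

def process_upload(photo, store_commons, store_local, set_field):
    """Handle a new photo upload (steps 1-4 of the proposal)."""
    if photo["license"] in FREE_LICENSES:
        # Step 2: mirror the freely licensed original to Wikimedia Commons.
        commons_id = store_commons(photo["original"])
        # Step 3: record the link in an observation field.
        set_field(photo["observation_id"], "commons_file", commons_id)
    # Step 4: keep only a 2048px-wide copy in iNat's own storage.
    store_local(resize(photo["original"], max_width=2048))

def on_research_grade(observation, get_field, tag_commons_file):
    """Step 5: when the observation reaches Research Grade, tag the
    linked Commons file with the taxon name."""
    commons_id = get_field(observation["id"], "commons_file")
    if commons_id is not None:
        tag_commons_file(commons_id, observation["taxon_name"])
```

One open design question with this flow is failure handling: if the Commons upload fails, the local resize should probably still proceed, with the mirroring retried later.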
Yes. Please stop compressing images from Trusted users.
So many quality images are being unnecessarily ruined. Images from trusted users should not be compressed.
To be clear, I’m not suggesting displaying full resolution images on the observation pages. I’m suggesting that a higher resolution version be retained and made available specifically for people who want to download the higher resolution version from the photo page or API.
A lot of my time on iNaturalist is spent zooming in on photos in a separate browser window to try to make out barely visible or entirely invisible features. It was a bit disappointing to realize that iNaturalist is deliberately deleting a lot of the information that I’m squinting to try to make out.
Of course, I understand that server space and bandwidth aren’t free. There may be better solutions, though. For instance, if an image is already on flickr… why not use the version hosted there? When I click on an image to make it bigger, just show me the full resolution version that’s already hosted. Yes, I can get to it from iNaturalist as-is, but it’s a lot of clicks and page loads for each image. And if I’m going through observations from other users, it’s a lot of clicks just to tell if there are larger versions.
Or maybe make access to full resolution images a paid feature. I’m sure some people would be irritated, but if the cost of server space and bandwidth is the limiting factor, get people to pay for it. I would. And since no one can download full resolution images now… who would be harmed?
With regard to download times for people with slower internet connections, a user setting along the lines of “display maximum images at … x … pixels” would probably be a good solution. Making data access faster by deleting a lot of the data is not a good solution.
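A per-user size cap could be implemented client-side by picking the largest named size variant that fits the user's setting. The variant names and pixel widths below are illustrative assumptions, not iNaturalist's actual values:

```python
# Assumed named size variants, smallest to largest; the exact widths
# are illustrative, not iNat's real ones.
VARIANTS = [("small", 240), ("medium", 500), ("large", 1024), ("original", 2048)]

def variant_for(max_width):
    """Return the largest variant that fits the user's "display images
    at no more than N pixels" setting, falling back to the smallest
    variant when the cap is below every available size."""
    fitting = [name for name, width in VARIANTS if width <= max_width]
    return fitting[-1] if fitting else VARIANTS[0][0]
```

A user on slow satellite internet might set a 600px cap and get the medium variant everywhere, while users on fast connections see originals, without anyone's data being deleted server-side.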