With the new app, the first goal has been to update and improve the core functionality of making observations, getting notifications, and searching for observations. I think that’s mostly been accomplished, with some caveats that still need work. It’s always going to be an evolving process.
We’ve done some design sketches and thoughts about how an Identify screen would look and work, and I’d love to have the functionality. I do think it would need to be designed for efficiency but shouldn’t be too efficient. It’s so much easier (at least in my experience) to make mistakes with quick gestures or button taps on a mobile device, and I already make plenty with my keyboard and mouse when using the web-based Identify page.
Even as it is I do think the new app is a big improvement for identifying over the previous iNaturalist Classic for iOS, though. I can use the Explore filters to only see Needs ID observations, for example, and I can make Data Quality Assessment votes. I also like that if I don’t feel comfortable with a suggested species, I can quickly go to its taxon page, tap on its genus or family or something, then select that as my ID, all without typing in the search field (I hate typing on my phone).
Thank you for making these points! I appreciate it, and didn’t recognize those improvements.
I’ll fumble and see if I get those things to work right for me. To share my experience from beginning to familiarize myself with this new app, just a moment ago:
I searched eastern North America for Pholcus and found a non-Pholcus observation. I selected it, it took ~4-5 seconds to load, and then, after I searched for and selected the ‘Araneoid spiders’ taxon suggestion, I received a “Something went wrong” error.
But indeed, I am still hopeful this is more efficient than the last app iteration!
I do hope that feature comes some day. I’d love to be able to ID efficiently on the go. (Could get like 5-10 additional hours in every week, IDing while on trains/trams/buses.) :D
The most important things are probably a quick way to flick between observations (probably just swiping left/right?), an immediately accessible taxon-search bar, and a quick way to agree with an ID (checkmark button?) or mark something as reviewed (double-tap? swipe up?).
I like these ideas! I’m just concerned about a 1-tap agree. I still think there needs to be a confirmation step here just because touch screens are so…touchy.
I agree. But there will probably be some people who complain that they can’t agree in one tap. I think a good compromise might be having the agree button just enter the taxon you’re agreeing with into the search bar, which then still has to be submitted. So two taps, but no pop-ups.
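The two-tap compromise above can be sketched as a tiny state model: the first tap only pre-fills the search field, and nothing is sent until the second tap. This is a hypothetical sketch, not the app’s actual code; all names here (`IdentifyScreen`, `tap_agree`, `tap_submit`) are made up for illustration.

```python
# Sketch of the proposed two-tap "agree" flow: the first tap only
# pre-fills the taxon search field; the second tap actually submits.
# All names here (IdentifyScreen, tap_agree, ...) are hypothetical.

class IdentifyScreen:
    def __init__(self):
        self.search_field = ""   # taxon currently typed or pre-filled
        self.submitted_ids = []  # IDs actually sent to the server

    def tap_agree(self, suggested_taxon: str) -> None:
        # Tap 1: no pop-up, no submission -- just fill the field.
        self.search_field = suggested_taxon

    def tap_submit(self) -> None:
        # Tap 2: submit whatever is in the field, then clear it.
        if self.search_field:
            self.submitted_ids.append(self.search_field)
            self.search_field = ""

screen = IdentifyScreen()
screen.tap_agree("Pholcus phalangioides")  # field pre-filled only
assert screen.submitted_ids == []          # nothing sent yet
screen.tap_submit()                        # now the ID goes out
print(screen.submitted_ids)
```

The point of the intermediate state is that a stray tap on Agree is harmless: it only changes the search field, which the user can still edit or abandon.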
To add, responsivity isn’t just about mobile devices. It’s also about adapting to different devices: when we zoom in or out, text doesn’t disappear or truncate, there’s no loss of functionality, and the layout adjusts to the page, no matter whether I’m using a giant screen or a small one, or how much zoom I’m using.
ETA for those who don’t know exactly what responsivity means:
It’s basically this if you want to take a read: Reflow.
let users enlarge text and other related content without having to scroll in two dimensions to read. (…)
Users who need to enlarge (zoom) text to read it benefit significantly when it reflows. Regardless of the size of the text, it continues to wrap within the visible viewport. A whole line of text is visible, making it easier for the eye to track from the end of one line to the start of the next. Users only need to scroll in the direction of reading to read additional lines of text.
Modern websites and applications commonly employ responsive web design best practices to adjust or relocate sections of content to fit within smaller viewports. Neither adjusting nor relocating content is considered a loss of information or functionality, so long as users are still able to access the content. When content can fit within smaller viewports, it not only helps those using mobile devices to read content. It also helps people who need to resize or zoom in a web interface on larger devices, as is the intent of this success criterion.
A super-advanced, fully featured, mobile-capable responsive app platform that handles the whole suite of iNat features equally well on all device platforms would be cool.
That’s typically a long-term, multi-million-dollar dev/UI/UX effort by deep teams at billion-dollar companies, though.
Could the React web app ultimately handle any-device responsiveness? Maybe so, but it sounds like a big job given how many different pages, layouts, and features there are.
As an alternative or intermediate solution, the iNat API seems advanced and accessible enough that someone with reasonable proficiency could implement their own interface to a favorite feature, with a basic UI suiting their needs.
For example, I was pretty excited to learn I could access the API (responsibly) to make a mini-app for exploring species and ‘top faves’ on a more mobile-responsive map.
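To give a feel for what such a mini-app involves, here is a minimal sketch of the read-only side: building a query URL against the public iNaturalist API (`GET /v1/observations`) and turning one observation record into a map point. Parameter and field names (`quality_grade`, `swlat`/`nelat` bounding box, `geojson`, `faves_count`) are my reading of the public API docs, and the taxon ID is a placeholder, so double-check against the docs before relying on this.

```python
# Hedged sketch of a read-only iNaturalist API query. Parameter and
# field names are my reading of https://api.inaturalist.org/v1/docs;
# the taxon_id below is a placeholder, not a real lookup.
from urllib.parse import urlencode

API_BASE = "https://api.inaturalist.org/v1/observations"

def build_explore_url(taxon_id: int, swlat: float, swlng: float,
                      nelat: float, nelng: float, per_page: int = 50) -> str:
    """URL for Needs-ID observations of a taxon inside a bounding box."""
    params = {
        "taxon_id": taxon_id,
        "quality_grade": "needs_id",
        "swlat": swlat, "swlng": swlng,
        "nelat": nelat, "nelng": nelng,
        "per_page": per_page,
    }
    return f"{API_BASE}?{urlencode(params)}"

def to_map_point(obs: dict) -> tuple[float, float, int]:
    """(lat, lon, faves) for plotting; GeoJSON stores [lng, lat]."""
    lng, lat = obs["geojson"]["coordinates"]
    return (lat, lng, obs.get("faves_count", 0))

# Placeholder taxon ID; rough eastern-North-America bounding box.
url = build_explore_url(taxon_id=12345, swlat=25.0, swlng=-95.0,
                        nelat=50.0, nelng=-65.0)
print(url)
```

Keeping the URL-building and response-parsing separate from any actual network call makes it easy to stay within the API’s rate-limit guidance and to test the logic offline.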
The docs list write/submit/put-level API functions (including OAuth?) as well, which I’m guessing might support people writing their own custom ‘identify’ interfaces, etc. (possibly with user-level API keys implemented to monitor/throttle abuse/DoS weaknesses etc…)
The DuoNat app created by @portioid is very inspiring; one could imagine something like it acting as an “Identify app” (or module) that plays much better on mobile devices than existing options: https://duo-nat.web.app/
Yeah, something like this, although it’s more of a practice tool (and only with common names, not Latin names).
I was thinking about some features for a possible new Identify mode (if one is ever implemented). It could be used in the potential ID app or even on the mobile website.
Maybe the filter settings could come first (a map for location, iconic taxa or a specific taxon, etc.), followed by something like a swiping mechanism (vertical swiping to move between observations, horizontal swiping to move between the pictures of one observation).
In the lower screen area there could be a field where someone can type ID suggestions/comments, and on the right side a tool button that, for example, pops up annotations when touched.
Maybe this time the annotations (e.g. plant phenology) could appear as symbols on a sidebar (I would really love that). This would make the identifying process simpler (the simpler the better), faster, and more efficient.
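The gesture scheme sketched in the last few posts can be written down as a small interaction model: vertical swipes move between observations, horizontal swipes move between photos of the current observation, and a sidebar symbol toggles an annotation. This is purely a sketch of the proposed behavior; the class and all its names are hypothetical.

```python
# Sketch of the proposed Identify-mode gestures: vertical swipes move
# between observations, horizontal swipes move between photos, and a
# sidebar symbol toggles an annotation. All names are hypothetical.

class IdentifyMode:
    def __init__(self, n_observations: int, photos_per_obs: int):
        self.obs_index = 0
        self.photo_index = 0
        self.n_obs = n_observations
        self.n_photos = photos_per_obs
        self.annotations: set[str] = set()

    def swipe(self, direction: str) -> None:
        if direction == "up":       # next observation, photos reset
            self.obs_index = min(self.obs_index + 1, self.n_obs - 1)
            self.photo_index = 0
        elif direction == "down":   # previous observation
            self.obs_index = max(self.obs_index - 1, 0)
            self.photo_index = 0
        elif direction == "left":   # next photo of this observation
            self.photo_index = min(self.photo_index + 1, self.n_photos - 1)
        elif direction == "right":  # previous photo
            self.photo_index = max(self.photo_index - 1, 0)

    def tap_annotation(self, symbol: str) -> None:
        # Sidebar symbol toggles an annotation, e.g. "flowering".
        self.annotations ^= {symbol}

mode = IdentifyMode(n_observations=3, photos_per_obs=4)
mode.swipe("up")                   # second observation
mode.swipe("left")                 # its second photo
mode.tap_annotation("flowering")   # one-touch phenology annotation
print(mode.obs_index, mode.photo_index, mode.annotations)
```

Clamping the indices (rather than wrapping around) mirrors how most photo viewers behave at the ends of a list, and resetting the photo index on a vertical swipe keeps each observation starting at its first picture.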