Currently the project is set up to fetch fresh data every four hours, to keep the load on the iNat API (v2) low, since I don't need a real live feed (GitHub Deploy Workflow). The project is mainly meant for our upcoming post-event analysis. The plan is to add a few more analytics, but free time is rare and limited (and it's much more fun to be outside on the iNat hunt in summer ;-) ).
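For anyone wanting to set up something similar: a four-hour refresh in GitHub Actions is just a `cron` trigger in the workflow file. A minimal sketch (not the project's actual workflow, names are illustrative):

```yaml
on:
  schedule:
    - cron: "0 */4 * * *"   # every four hours, on the hour (UTC)
  workflow_dispatch:         # also allow manual runs from the Actions tab
```

Note that GitHub runs scheduled workflows on a best-effort basis, so the actual interval can drift a bit.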
Nevertheless, maybe my source code can help fellow iNatters who want to create something similar, as a starting point.
Very nice to see someone else working on tools like this, and in Austria too! Is there any reason why data from Wien isn’t being included?
For the past few years I've been running some little Python scripts to generate comparatively rudimentary analyses, similar to your “Unique Taxa mit Research-Grade”. I also have a tool, which I will start running on Tuesday, that highlights high-priority observations to identify to species level (example).
I've been running these manually, so using GH Actions is a very nice solution for automating them. My scripts use a combination of API data and exported CSV files to reduce load on the API; that was necessary during development so I could re-run them constantly without hammering the API. Now that they're stable, I could probably switch them completely over to API data and run them once or twice a day.
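For anyone curious what "highlighting observations that still need a species-level ID" looks like against the API: a minimal sketch of building such a query against the public v1 observations endpoint (the project slug here is a made-up placeholder, and v2 additionally wants a `fields` parameter):

```python
from urllib.parse import urlencode

API_URL = "https://api.inaturalist.org/v1/observations"

def needs_id_url(project_slug, per_page=200):
    """Build a query URL for observations in a project that are not yet
    research grade, i.e. candidates for further identification."""
    params = {
        "project_id": project_slug,   # placeholder: your CNC project slug
        "quality_grade": "needs_id",  # observations still needing IDs
        "per_page": per_page,         # v1 allows up to 200 per page
        "order_by": "created_at",
    }
    return f"{API_URL}?{urlencode(params)}"

# Example (hypothetical project slug):
# needs_id_url("city-nature-challenge-2025-vienna")
```

What counts as "high priority" on top of that (e.g. observations stuck at genus level, or with few identifiers) is extra filtering logic on the returned JSON.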
I simply forgot about Vienna :O, thanks for the reminder (it is now added to the analysis)!
For local development I have a few hacks included so I don't need to fetch all the data every time. Additionally, the API download is saved as a CSV for playing around.
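The save-the-download-as-CSV trick is easy to generalize: read from the CSV if it exists, otherwise hit the API and cache the result. A minimal sketch (function and file names are illustrative, not from either project):

```python
import csv
from pathlib import Path

def load_observations(csv_path, fetch_fn):
    """Return observation rows from a local CSV cache if present;
    otherwise call fetch_fn() (e.g. an API download) and cache the
    result so repeated local runs don't touch the API at all."""
    path = Path(csv_path)
    if path.exists():
        with path.open(newline="", encoding="utf-8") as f:
            return list(csv.DictReader(f))
    rows = fetch_fn()
    if rows:
        with path.open("w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=rows[0].keys())
            writer.writeheader()
            writer.writerows(rows)
    return rows
```

Deleting the CSV forces a fresh fetch, which is handy when switching between "develop offline" and "check against live data".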
Yeah, GH Actions are great, especially if you make use of the cache, as the first run took quite a while (roughly 30 minutes) to install all the packages.
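For a Python workflow, the simplest way to get that caching is the built-in pip cache of the setup-python action (a sketch of one step, assuming a standard `requirements.txt` layout):

```yaml
- uses: actions/setup-python@v5
  with:
    python-version: "3.12"
    cache: "pip"   # reuses the pip download cache between runs
```

After the first run populates the cache, dependency installation usually drops to a small fraction of the cold-start time.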
That is brilliant! Thanks, Hannes, for creating that website! I'll add the link to the citynaturechallenges.at website, if that is OK, so people can check it out.