Use Wikidata for place names and Wikipedia descriptions

Just to be clear, I have no idea whether the query engine optimizes anything. If not, there must be a good reason one hasn't been added in all the years the site has been active.

A second option, if you have specific needs, is the “request a query” option.

If someone does help, they are likely an experienced user of the tool.


Maybe here


  • Where? Online, using BigBlueButton. Please register here (free of charge; a valid email address is needed). The link to the call will then be sent to you.
  • When? For 24 hours, from October 28th at 17:00 UTC until October 29th at 17:00 UTC
  • What? Discussing what makes you enthusiastic about Wikidata
  • Who can join? Everyone! The call will be moderated by various people (see schedule below)
  • Ground rules: all participants must comply with the code of conduct. No recording allowed. No screenshots allowed without express consent from all participants. Showing video is not mandatory; sound-only is perfectly OK.
  • Contacts: in case of technical problems (e.g. trouble joining the call), feel free to contact Lea Lacroix (WMDE). During the meetup, you can write a private message to the person taking care of moderation (having “(moderation)” in their nickname, also mentioned in the table below). If you want to report a problem with a participant’s behaviour, you can contact (see details)

In Wikidata I changed the text IDs to numerical IDs, for example


I think it’s better to have the numerical IDs stored in Wikidata to use them in iNat search URLs and in Wikidata queries.

SELECT DISTINCT ?item ?itemLabel ?pic ?coord ?iNatlink WHERE {
  ?item wdt:P31 wd:Q1221156.
  ?item wdt:P7471 ?iNat.
  ?item wdt:P625 ?coord.
  OPTIONAL { ?item wdt:P18 ?pic. }
  # The URL prefix was lost in the original post; the usual iNat
  # place URL pattern below is an assumption.
  BIND(URI(CONCAT("https://www.inaturalist.org/places/", STR(?iNat))) AS ?iNatlink)
  SERVICE wikibase:label { bd:serviceParam wikibase:language "de,en". }
}
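Just to illustrate why the numeric IDs are handy in search URLs: once the numeric place ID is stored, the iNat place link is plain string concatenation. A minimal Python sketch, where the `/places/` URL pattern is my assumption rather than something stated in the post:

```python
# Sketch: turn a numeric iNaturalist place ID into a place URL.
# The "https://www.inaturalist.org/places/" prefix is assumed from
# the usual iNat URL layout; adjust if the site differs.

INAT_PLACES_PREFIX = "https://www.inaturalist.org/places/"

def inat_place_url(place_id: int) -> str:
    """Build the place URL from a numeric place ID."""
    return f"{INAT_PLACES_PREFIX}{place_id}"
```

This mirrors what the `BIND(URI(CONCAT(...)))` line in the query does on the Wikidata side.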



Should an iNaturalist user support this proposal,
does it make iNaturalist better?


This returns an empty zip file for me.

I have been matching iNaturalist place ids with Wikidata using OpenRefine, but I get the id-to-place-name mappings using the API. I am getting close to the point where I would consider this data scraping. Basically, I take a place id from an iNaturalist observation and then iterate from x below to y above that place id, where all ids between x and y point to places in a single country. I then let OpenRefine do the reconciliation.

This works quite well (for those places that do exist on Wikidata; I do not create entities for those that do not match), so if a zip file is available, I could continue doing this without having to hammer the API.
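The windowing approach described above can be sketched roughly like this. All function names here are made up for illustration, and `fetch_name` stands in for whatever iNat API call resolves an id to a place name:

```python
# Sketch of the "iterate around a known place id" approach: take a
# seed id from an observation, walk a window of ids around it, and
# (in real use) fetch each id's name from the iNat API to build a
# table that OpenRefine can reconcile against Wikidata.

from typing import Callable, Iterable, Optional

def id_window(seed_id: int, below: int, above: int) -> list[int]:
    """All candidate place ids from seed_id - below to seed_id + above."""
    return list(range(seed_id - below, seed_id + above + 1))

def rows_for_openrefine(
    ids: Iterable[int],
    fetch_name: Callable[[int], Optional[str]],
) -> list[tuple[int, str]]:
    """Pair each id with its fetched place name, skipping missing ids.

    fetch_name(place_id) should return the place name or None; in
    practice it would call the iNat places API (rate-limited!).
    """
    rows = []
    for pid in ids:
        name = fetch_name(pid)
        if name is not None:
            rows.append((pid, name))
    return rows
```

The resulting (id, name) rows can be exported as CSV and loaded into OpenRefine for the actual reconciliation step.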


I had a version from 15 November 2020
Download link
You are right, the file is empty.

This is the old one (1 item, 14.6 MB)


Sorry, the cron job we had to generate the file was referring to a hard-coded db host name and we changed our db host in early Feb 2021, so the cron job broke. I regenerated the file and am updating the cron job so this won’t happen again.


Thank you. I hope the 120,210 places (Nov 2020) get an update, but I have no idea how easy it is to match. It seems Western Europe is not ready yet.
2894 results in 1170 ms
4752 results in 5711 ms (2 weeks)
(Open Refine. See:

SELECT ?place ?coords ?placid WHERE {
  ?place wdt:P7471 ?placid ;
         wdt:P625 ?coords .
}
Open Refine. See:

Is there a reference table for PLACE_TYPE? A beginner’s guide to Mix’n’Match, a tool used for adding catalog data to Wikidata, guides the user through their choices and demonstrates multiple methods of using the tool.


Someone above did Germany, and Andra did Suriname and Mongolia with Open Refine. See: the Amazon rainforest. But it should be automated in some way, with tracking or logging; I have no idea how to write a bot or how to automate it. I am afraid someone who knows how it works should do it. Maybe it was CHRISTIANSW

Map of places i/37Sx (Query)
3369 results in 6743 ms
4594 results in 8628 ms (Tuesday 23 March)
4752 results in 5711 ms (2 weeks)
Mix’n’Match



Yes, I did much work on Germany, but I don’t know how to use a bot. Maybe it’s helpful to use Mix’n’match:

It’s so much fun to explore iNat via Wikidata:


Thank you. For me Mix’n’Match and Open Refine are just complicated; I have no idea how they work.
3869 results in 173 ms
Monday: 4594 results in 6108 ms
4752 results in 5711 ms (2 weeks)

Is there a legend for Place_Type?


Is there someone who can give the translation of PLACE_TYPE in the file?
6 could be a street or a place
9 could be a place (village, town, city)
100 could be “Nature Reserve”
33 could be “Estate”
Place_Type = 7, 20, 33, 100, 101, 1011

And is there a difference between Open Refine and Mix’n’Match?

place_type

Type of place to retrieve
Allowed values: Aggregate | Airport | Colloquial | Continent | Country | County | Drainage | Estate | Historical County | Historical State | Historical Town | Intersection | Island | Land Feature | Local Administrative Area | Miscellaneous | Nationality | Open Space | Point of Interest | Region | State | Street | Street Segment | Supername | Town | Undefined | Zone

You can also get place_id lists using the search URLs, as described here.

For example, the following URL returns JSON data for all the places in Oregon (ancestor_id=10) that are open space (place_type=open+space) and have the word “Park” in their description (q=Park):

Each JSON block starts with the place_id and the place name. Unfortunately, you seem to be interested in California (ancestor_id=14), which has a LOT more places defined than Oregon. The JSON interface limits you to only 200 items per search (per_page=200 max). To go beyond that, you need to get the first page, save it, then increment the page counter (page=1, page=2, page=3, …) until one returns fewer than 200 items.
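The paging loop just described can be sketched like this. It is a rough Python sketch: `fetch_page` stands in for an HTTP GET of the places.json search URL with the page parameter appended, which is not shown here.

```python
# Sketch of the paging loop: request page=1, 2, 3, ... of a
# places.json search and stop once a page returns fewer than
# per_page items. fetch_page is injected so the paging logic stays
# testable; in real use it would perform the HTTP request with the
# same query parameters plus the page number.

from typing import Callable

PER_PAGE = 200  # the per-page maximum mentioned above

def all_places(
    fetch_page: Callable[[int], list[dict]],
    per_page: int = PER_PAGE,
) -> list[dict]:
    """Collect every result by incrementing the page counter."""
    results = []
    page = 1
    while True:
        batch = fetch_page(page)
        results.extend(batch)
        if len(batch) < per_page:  # last (partial or empty) page
            break
        page += 1
    return results
```

Saving each page to disk before moving on, as the post suggests, is a sensible addition so an interrupted run can resume.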

All the options for places.json searches are described here.


The complication is not so much with the tools, but more with the data. The most challenging part is to reconcile the place names with the items in Wikidata. For example, in the Netherlands it is not easy to distinguish a place, a municipality, a province, or a recombined municipality. I saw the same complexity in Peru, where names are even shared by rivers, places, and provinces.

At first I thought the latitude and longitude could help, but then I ran into places like “Texel”, which is both a municipality and an island, and “Schiermonnikoog”, which in Wikidata has the status of both a municipality and an island.

Based on the data I got, it is not easy to distinguish what type of place names we are dealing with. I suspect that the place_type could help here, but I have yet to understand what the numbers relate to. Is there a data dictionary for the place name table?

Once identification of the place type is possible, combining that with latitude and longitude should make adding the place ids straightforward.
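A minimal sketch of the coordinate half of that idea, assuming a plain haversine distance and an illustrative 25 km cutoff. All names here are hypothetical, and the place-type check still has to happen separately:

```python
# Sketch: disambiguate same-named Wikidata candidates by distance to
# the iNat place's coordinates. The 25 km threshold is illustrative.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two lat/lon points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))

def closest_candidate(place_lat, place_lon, candidates, max_km=25.0):
    """candidates: iterable of (qid, lat, lon); returns best qid or None.

    Returns None when no candidate is within max_km. Note this alone
    cannot resolve Texel-style cases where municipality and island
    share both a name and a location; that needs the place type.
    """
    best = None
    best_d = max_km
    for qid, lat, lon in candidates:
        d = haversine_km(place_lat, place_lon, lat, lon)
        if d <= best_d:
            best, best_d = qid, d
    return best
```

In an OpenRefine workflow this would serve as a post-reconciliation sanity check rather than the primary matcher.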

I must admit that it was fun trying to add place ids to Wikidata using Open Refine. I had to get local awareness to make the right choices, which felt a bit like a virtual trip to the region. As you might notice, some countries like Chile, Panama, Mongolia, etc. are better covered than others; I virtually travelled to those regions :D. The downside is that this approach does not scale.


If it is a bad idea to post the video, one can just take it away without any problems.
Wikipedia Weekly Network - Live editing Wikipedia: Biodiversity edition #3 #EarthDay. Honoring #EarthDay, users Siobhan Leachman, Andra Waagmeester, Mike Dickison, Albin Larsson and Jan Ainali are joined by iNaturalist co-founder Ken-ichi to discuss and showcase how iNaturalist is used in Wikimedia. We’ll start with the new property iNaturalist place ID in Wikidata and see where our discussion leads us. Join in the chat to ask questions during the session.


Live on Facebook? A WOEID (Where On Earth IDentifier) is a unique 32-bit reference identifier, originally defined by GeoPlanet and now assigned by Yahoo!, that identifies any feature on Earth.


If anyone is still looking for this, it’s here, and admin levels are here. Note that while pretty much all places at the continent, country, state, and county levels will have their admin_level set correctly, many towns and parks will not have admin_levels.


Can you give an idea of how many iNat places have a woeid set? Spot-checking suggests it’s not all that many, but if I’m wrong, that might be nice to have in the export.

About 5174 places have a WOEID set. FWIW, these were mostly originally imported from Yahoo’s now defunct GeoPlanet API and subsequently assigned a boundary, mostly by users (I assume). I’ll include them in the export, with the caveat that I have no idea if they’re accurate.



Welcome! Enjoy your stay!

Is there somebody who knows how to handle Wikidata?