Yes, good catch, that’s definitely another way the Locality Notes can get filled in, i.e. you search for a place by name (geocoding), we get coordinates and a bounding box and then ask Google for some text that describes those coordinates (reverse geocoding).
Are we able to change how the accuracy value is calculated?
I’m guessing the bounding box returned by the services is unrelated to the bounding box associated with the iNat place. Assuming there is nothing bizarre about the iNat place bounding box, then it would definitely be a matter of asking Google to change the bounding box returned for that particular “geocoding”. So the question now becomes “how do we ask Google to do this? What do we ask them to change it to? Would it be sufficient to just ask them to review it and then they would of their own accord establish a bounding box that would fix the OPs problem?”
@rayray, see the link provided above by @pisum: https://support.google.com/maps/answer/3094088?co=GENIE.Platform%3DDesktop&hl=en
Tell them the text being used to find the location, and supply a screenshot of the iNat place map showing the iNat boundary. Ask them to look at the bounding box they are returning.
If you don’t like what you see, you can just edit it.
Hey… Not saying we should, just trying to establish if that’s possible. OP has raised a problem they are facing, I’m just trying to help. I can butt out…
Sorry, wasn’t trying to be mean, just trying to answer your question. To expand, yes you can change the accuracy value. We could change how we derive the accuracy from Google’s bounding box (e.g. use a circumcircle vs. an incircle for the bounding box), but I’m sure that would just cause more problems. Allowing the user to choose that methodology seems like overkill to me, especially considering the user has absolute control over the size of the accuracy circle, or whether to include it at all.
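To make the circumcircle-vs-incircle distinction concrete, here is a hypothetical sketch (the function name, the planar equirectangular approximation, and the metres-per-degree constant are all my own assumptions, not iNat's actual code) of deriving an accuracy radius from a geocoder bounding box:

```python
import math

def bbox_accuracy_m(sw_lat, sw_lng, ne_lat, ne_lng, method="circumcircle"):
    """Approximate an accuracy radius (metres) from a geocoder bounding box.

    Illustrative only: treats the box as planar, which is fine for the
    small boxes a geocoder typically returns.
    """
    mid_lat = math.radians((sw_lat + ne_lat) / 2)
    m_per_deg = 111_320  # rough metres per degree of latitude
    height = (ne_lat - sw_lat) * m_per_deg
    width = (ne_lng - sw_lng) * m_per_deg * math.cos(mid_lat)
    if method == "circumcircle":
        # circle through the box's corners: half the diagonal
        return math.hypot(width, height) / 2
    # incircle: largest circle that fits inside the box: half the shorter side
    return min(width, height) / 2
```

The incircle radius is always smaller (by a factor of up to √2 for a square box), so switching derivations would shrink every accuracy circle, not just the ones that marginally breach a place boundary.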
Ok, sorry for reacting like that. I was a little frustrated…
OP has raised a problem they are having, and we explored “quick fix solutions” at the start of the thread. I believe they have enough to band-aid around the problem for now.
It’s likely to impact others and also to re-occur for this user, and I think it would be good to attempt a more “heart of the problem” fix. Even if that is just a roadmap to a better and quicker solution for those that encounter it in future.
An analogy: I’m driving down the road, and I encounter potholes. I drive around them. I find out other people are having the same problem. We report it to the council, they look into it and notice that it happens only with roads that are sealed by one particular company. The company responds with “our work and materials are compliant, just have the drivers go around them”. An investigation could reveal that one particular aggregate used in mixing asphalt binds less than others, and because there is no way to determine if that particular aggregate is in play, a slight adjustment upwards to the bitumen ratio in the specifications solves the problem for all roading companies, regardless of what aggregate they are using, and let’s assume it doesn’t introduce any other problems. If this is the case, then it makes sense to tweak the specification.
When you look at how marginally it breaches the bounding box, it occurs to me that a tweak to the calculation of the accuracy value might solve most of these cases. Once calculated, adjust it down by, say, 5%. The accuracy value calculated in this way is already a fairly arbitrary figure, so what would it matter if it were 5% less? That is the only reason I raised the question of “is it something we can change” (i.e. how it is calculated), because if it isn’t, then we don’t even need to consider that option as a possible solution. For instance, there might be a reason it is calculated that way that precludes it from being calculated any other way.
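The “shave a fixed percentage off” idea above amounts to a one-line adjustment. A hypothetical sketch (function name and the 5% default are assumptions taken from the suggestion, not anything iNat actually does):

```python
def adjusted_accuracy(raw_accuracy_m: float, shrink: float = 0.05) -> float:
    """Shave a fixed fraction off the geocoder-derived accuracy radius,
    so circles that only marginally breach a place boundary fall back
    inside it. The 5% default is the arbitrary figure suggested above.
    """
    return raw_accuracy_m * (1.0 - shrink)
```

So a 1000 m accuracy circle that pokes roughly 3% past a place boundary would come back as 950 m and fit, while one that overshoots by 10% would still breach, which is exactly the limitation raised in the reply below.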
Yes, we can go to Google and say “adjust this one please”, “oh, and this one too…”, but what if the fix is as simple as tweaking the formula used to approximate the accuracy value?
I appreciate the suggestion, but this could go on ad infinitum. At least the current approach is arbitrary only in what Google gives us. Adding further arbitrary adjustments might fix the problem for this observation, but not for another, which needs 6% less, or the next, which needs 7%. I don’t think this kind of edge case warrants that much fine tuning.