Good question - we actually ran a benchmark recently, and even with Maps/Search grounding it is pretty suboptimal from an end user's perspective: https://news.ycombinator.com/item?id=47366423 For instance, we found this: "The pattern extends to navigation: Gemini’s own Maps grounding makes it worse at giving directions"
Second, the Google Maps API has pretty strict licensing terms - you have to delete data within 30 days and attribute it properly. There are reasons why OpenAI and other companies are not using their APIs.
That is a very interesting insight, thanks for sharing it. We realize that some countries might have poor maps data, and that example is very helpful - we will take a look.
I looked into their APIs not long ago, which is why I said they were non-existent. If you'd like to work with me, I am open to work. If you simply want to talk, just drop me an email. I will also check out your dev tool and see if I can apply it here.
Fair point - one doesn't instantaneously dethrone Google three months after starting. Here are some benefits of our API. Google's API is recall-focused, like any search - it shows you all possible places even if the one you are looking for doesn't exist. It is your job to figure out whether any of the returned results match the one you were looking for. Also, we have more convenient terms of use - you don't have to delete data after 30 days and don't have to attribute it when displaying.
Aren't you just scraping Google's data though? Also I looked up a place that doesn't exist and it responded with: "existence_status":"not_exists","open_closed_status":"open"
No, we are not - it is against Google's terms of service.
open_closed_status is an indicator showing whether there is sufficient evidence of a closure. If a place doesn't exist, then there is likely no evidence of anything related to it, including closure. It could be confusing, but we will iterate on that.
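A minimal sketch of how a client could collapse the two signals into one label, given the semantics described above. The field names and values come from the response quoted earlier in the thread; the precedence rule (existence_status wins over open_closed_status) is my own assumption about how a consumer would want to read them:

```python
# Field names/values from the API response quoted above; the precedence
# rule (existence wins over open/closed) is an assumption, not the
# vendor's documented behavior.

def effective_status(record: dict) -> str:
    """Collapse the two independent signals into one client-facing label."""
    if record.get("existence_status") == "not_exists":
        # No evidence the place exists, so the closure signal is moot.
        return "not_found"
    if record.get("open_closed_status") == "closed":
        return "permanently_closed"
    return "open"

print(effective_status({"existence_status": "not_exists",
                        "open_closed_status": "open"}))  # -> not_found
```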
Ok. I don't believe you but I respect the hustle. My only other note is the repo doesn't have any code in it and your github handle has zero history. At least vibe code a python client and put it in there if you want people to take you seriously.
We have models which take all of this into account when producing the verdict. For enterprise clients we emit a calibrated confidence score; with the public API we decided to start simpler. Also, we are not using Google data. I’m not a lawyer, but doing that for any maps-related company is simply against Google’s terms of use.
Interesting. Can you elaborate more on that? I was wondering the same thing with other providers, if I can scrape data from their platforms. On what grounds do you think it might be okay?
On the grounds that there's no law that says someone can put some instructions on their website and you have to follow those instructions.
You should consult a lawyer just in case there is such an obscure law, but I don't think there is. Browsing a website doesn't create a contract between you and the website operator, even if the operator says it does.
That is our vision of where we want to be. There is a lot of information about places on the public web which you can analyze and cross-reference. We started solving this problem with a validation API which can tell you whether a business or point of interest still exists at its current location.
Appreciate you thinking about it, but I think there's some misunderstanding - we don't track people or movements. VOYGR validates places - for instance, we can answer questions like: is this business still open?
We are not providing opening times yet - we just check whether a place is permanently closed or not. But it is in the works under our experimental enrichment API (which is not yet open to the public).
I started scraping restaurant websites in Zürich and extracted and hand-checked opening hours in the OpenStreetMap format. The goal is to build a corpus for evaluation purposes which maps website texts to correct opening hours strings for all restaurants in Switzerland. Maybe you can use it to benchmark your own hours-extraction system... https://github.com/wipfli/opening-hours/
I don't know how well Haiku handles OSM opening hours syntax, but with Kimi K2.5 I got better results when I asked it to provide opening-hour ranges for every day of the week, and then constructed the opening hours string manually.
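The "per-day ranges, then construct the string manually" step could look roughly like this. The day abbreviations and `Day-Day HH:MM-HH:MM` layout follow the OSM opening_hours syntax; the input shape (a dict of day -> list of "HH:MM-HH:MM" ranges) is my assumption about what the model would be asked to emit:

```python
# Collapse per-day time ranges into a compact OSM opening_hours string,
# merging runs of consecutive days that share identical hours.
# Input shape (dict of OSM day abbreviation -> list of ranges) is an
# assumed intermediate format, not something from the linked repo.

OSM_DAYS = ["Mo", "Tu", "We", "Th", "Fr", "Sa", "Su"]

def build_opening_hours(per_day: dict) -> str:
    parts = []
    i = 0
    while i < len(OSM_DAYS):
        ranges = per_day.get(OSM_DAYS[i])
        if not ranges:  # closed / unknown that day
            i += 1
            continue
        # Extend the run while the following days have identical ranges.
        j = i
        while j + 1 < len(OSM_DAYS) and per_day.get(OSM_DAYS[j + 1]) == ranges:
            j += 1
        days = OSM_DAYS[i] if i == j else f"{OSM_DAYS[i]}-{OSM_DAYS[j]}"
        parts.append(f"{days} {','.join(ranges)}")
        i = j + 1
    return "; ".join(parts)

hours = {
    "Mo": ["11:30-14:00", "18:00-22:00"],
    "Tu": ["11:30-14:00", "18:00-22:00"],
    "We": ["11:30-14:00", "18:00-22:00"],
    "Th": ["11:30-14:00", "18:00-22:00"],
    "Fr": ["11:30-14:00", "18:00-23:00"],
    "Sa": ["18:00-23:00"],
}
print(build_opening_hours(hours))
# -> Mo-Th 11:30-14:00,18:00-22:00; Fr 11:30-14:00,18:00-23:00; Sa 18:00-23:00
```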
Appreciate you sharing this project - democratizing this data is indeed a very important step. Interesting that you settled on Haiku - did you get a chance to check flash-2.5-lite or gpt-5-nano performance?