
Mapping out the future of map-making - how Overture collaboration could unseat Google's location dominance

George Lawton, April 19, 2024
Google pioneered the location intelligence industry about twenty years ago, which allowed it to dominate the field. Overture Maps Foundation, a collaboration between traditional mapping vendors and Google Cloud competitors, has introduced a new open map data set to give everyone else a leg up.


Manually curated map data was traditionally considered the main event for legacy map makers. But everything changed when Google pioneered map-making innovations in the early 2000s and gave them away for free (to consumers, at least). It could be argued that its early lead in location intelligence was an essential ingredient driving search and other revenues valued at $305 billion in 2023. In contrast, the digital map market was estimated at $19.1 billion in 2021. 

Now, legacy map makers, Google’s cloud competitors, and others are starting to recognize that collaborating to give away base map data for free is table stakes for the much larger opportunities in location intelligence and analytics services. The Overture Maps Foundation has recently released the beta version of its global open map data set. Mapping service and application providers can begin kicking the tires of Overture’s data for use in various commercial mapping applications and geospatial analysis services. A production version is expected this summer.

Overture is a Joint Development Foundation project and an affiliate of the Linux Foundation. The data is free to use and remix for various commercial offerings. Leading Overture members include Amazon, Meta, Microsoft, Esri, Navis, and TomTom. Location vendors Mapbox and Here Technologies are not members.

I spoke with Marc Prioleau, Executive Director of the Overture Maps Foundation, about current progress and the challenges of stewarding competitors to create better opportunities for all. A mapping industry veteran since the 1990s, Prioleau has had a front-row seat to the industry's evolution and its expansion into location services in the 2000s.

Prioleau says Overture began releasing alpha versions of its data in April 2023. The new beta release represents a level of stability that promises to put map data into products used by billions of people. He explains:

There's always been data, and there have even been some ways of doing open data, but much of the data is proprietary. And so what that means is I either need to license the data or I need to work on someone else's system. So, if I build something in a proprietary system, I'm on their system. I think what we're really trying to unlock is that maps have become sort of this fundamental, horizontal platform across mobility. And we think that'll continue out into the Metaverse and the AR (Augmented Reality) world. But it's become a horizontal layer that cuts across so many applications.

And so if people want to build on that, they want to build on their systems. They don't want to build on someone else's system. And the hard part that's really preventing them from doing that has not been the software. It's really the data because mapping the data is the super expensive, super hard, super time-consuming piece of it. And so that was really the idea when we started. Let's take that super hard part and see if we can join up and build that as a community and make it open. And if you can do that, then the competitive dynamics or the market dynamics of building mapping applications change because you made the hard part a little easier. And now you can really focus on the software application.

Agreeing on the foundation

All the members agreed on the strategic reasons for building a common map layer. One of Prioleau's hardest challenges in the early days was stewarding members toward areas where they could cooperate while leaving open opportunities for competitive innovation to create business value. In the beginning, members would carve out those boundaries differently. Prioleau says:

The hard part was that you need to get people to think about maps and what they're doing in maps and break the data into two pieces. In one piece is what we're calling the base layers, which is a part that everyone needs. But you don't get a competitive advantage from it. Everyone knows there's a road out here. So, how do we cooperate on building the parts where they are common? That's a big task, and then it still leaves room for us to differentiate between different things that mean different things to us and add value.

Here are the base layers that are part of the initial release that have been formatted in the Overture Schema and assigned Global Entity Reference System (GERS) identifiers:

  • Places of Interest: This includes data on nearly 54 million places worldwide with associated confidence scores that allow users to build map-based local discovery tools or power local search engines. 
  • Buildings: Data on 2.3 billion unique building footprints allows developers to attach information to buildings to support new use cases in property management, risk assessment, economic development, and 3D visualization.
  • Transportation: Represents roads, footpaths and other travel infrastructure in a normalized schema for easy use in map applications.
  • Administrative Boundaries: This is a global open dataset of national and regional administrative boundaries, including regional names in over 40 different languages.
  • Base: Land and water data provide contextual layers to help complete display maps when needed.
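As a toy illustration of how the Places layer's confidence scores might be used to power a local-discovery feature, consider the sketch below. The record fields and score values are assumptions for this example, not Overture's exact schema:

```python
# Toy sketch: filtering Overture-style Places records by confidence score.
# Field names (id, name, confidence) and the 0.8 threshold are illustrative,
# not the actual Overture schema.

def filter_places(places, min_confidence=0.8):
    """Keep only places whose confidence score meets the threshold."""
    return [p for p in places if p["confidence"] >= min_confidence]

places = [
    {"id": "gers-0001", "name": "Corner Cafe", "confidence": 0.95},
    {"id": "gers-0002", "name": "Closed Bakery", "confidence": 0.40},
    {"id": "gers-0003", "name": "Book Shop", "confidence": 0.88},
]

# Only the cafe and the book shop clear the 0.8 threshold; the low-confidence
# bakery record is dropped from local-search results.
trusted = filter_places(places)
```

In a real pipeline, a local search engine would apply this kind of cut before ranking results, tuning the threshold to trade recall against the risk of surfacing stale places.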

Another big improvement is an updated schema that allows developers to ingest and use map data in a standard, documented, interoperable way to simplify development. Also, Prioleau says the new GERS format makes it easy to connect open or proprietary data to the base layers:

That says there's an identifier for everything on the map, whether it's a place, an address, a building, or a road segment. And if you want to conflate data, which is the word where they take different data and attach it to it, you can conflate it through that identifier. So, this road segment has an identifier. If you want to attach speed limits, traffic signals, potholes, or whatever you want, you can conflate it to that segment through this identifier. And that's a really big idea because that can be open data, or it can be proprietary data and  probably is proprietary data in more cases than not.
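The conflation Prioleau describes amounts to a join keyed on the shared identifier: every feature carries a GERS id, and anyone can attach their own open or proprietary attributes to it. A minimal sketch, with made-up ids and attribute data, might look like:

```python
# Sketch of GERS-style conflation: attaching external attributes to road
# segments via a shared identifier. Ids and attribute values are invented.

base_segments = {
    "seg-a1": {"name": "High Street"},
    "seg-b2": {"name": "Mill Lane"},
}

# Third-party overlay data keyed by the same identifiers.
speed_limits = {"seg-a1": 30, "seg-b2": 20}
pothole_reports = {"seg-b2": 3}

def conflate(base, *overlays):
    """Merge named overlay dicts onto base features by shared id."""
    merged = {gid: dict(attrs) for gid, attrs in base.items()}
    for attr_name, overlay in overlays:
        for gid, value in overlay.items():
            if gid in merged:
                merged[gid][attr_name] = value
    return merged

roads = conflate(base_segments,
                 ("speed_limit", speed_limits),
                 ("potholes", pothole_reports))
# roads["seg-b2"] now carries both a speed limit and a pothole count.
```

The point of the shared identifier is that the overlay providers never need to see, or agree on, each other's data: each join is independent.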

The base layer of data has grown considerably since Prioleau started thirty years ago. Maps used to just tell you where to go; now, they can tell you what lane to be in. Companies like Meta add value by building connections between businesses and their members' likes, reviews, and photos of them. Delivery firms can figure out where to park, while ride-share firms can figure out the safest place to drop people off and pick them up. Governments can make sense of land use and zoning, while insurance companies can benefit from better tools for predicting flood risk. Now, much of this kind of rich data is free for vendors and open-source developers to build on without paying the Google Maps API tax or succumbing to its conditions.

The elephant out of the room

All of this brings us to the elephant in the room, or, in this case, the company outside the Overture consortium. Google was the first search vendor to dive wholeheartedly into providing a free map service. In the early days, Google paid significant licensing fees to map vendors, which limited its location search ambitions. Then, in the early 2000s, through a combination of internal development and acquisitions, it pioneered a new process for building maps using cars that drove around collecting street-level imagery of roads.

Translating camera footage into maps was expensive then, and it was unclear whether the approach would scale. However, Google engineers found ways to automate many aspects of the process that were much cheaper than traditional surveying. Once they figured out how to apply to maps the same DevOps approaches they had been using to write applications, they quickly gained the upper hand against traditional map makers, who still relied on expensive surveyors and manual processes, and who sometimes inserted fake roads to identify copyright thieves. Later, the Google blue dot showing your location debuted as part of Steve Jobs' famous demo of the first iPhone. The rest is history.

If you want to dive deeper, it's worth reading Never Lost Again by Bill Kilday, who helped lead the effort at Google before transitioning to Niantic (an Overture member). When I mention this to Prioleau, he quickly pulls the book from his shelf, complete with dozens of colored bookmarks. He frames the period it chronicles as the seminal Google innovation that transformed mapmaking from a manual enterprise into an automated digital thread for dynamically updating a digital twin of the world:

Google Maps started in roughly 2004. Before that, all the maps were built using surveys. And so people would get in vans, drive around, annotate the world, and digitize it. But you didn't update anything until the surveyors went back. The growth of digital mapping has pretty closely covered a time when the Internet came into being, and mobile came into being. Mobile is important for mapping. Your maps were more useful because you're out in the world using things and knowing where you and things are. 

But it also became important because now, as people use the maps, they can give feedback about them. Whether that's explicit feedback, like this place is here, or passive feedback like I'm driving down the road, you can infer that there's a road there and traffic. So mobile was a big thing. Cloud Compute was huge because we can process all that data when we take it back. And now we're getting into AI, although AI has been used in mapping for a while.

And so I think the way maps were built fundamentally differs from when Google started. People looked around and said, ‘Hey, the real value in our mapping apps is not knowing there's a road here, or there's a place here, that building is there.’ That stuff's reasonably static. It's hard to do because the world's a big place, but that's not where any of us make our competitive advantage. Our competitive advantage is all the data we attach to that, and so that drives a logic which says, ‘So, let's all just work together to make that base layer really good.’

This is kind of an analogy to the barcode as a way that I can unambiguously say that thing, that building, that road segment, that place is this. And now I can attach data to that. I can also attach data from many different areas to that. When I describe that to people in the industry, everyone gets it. Sometimes, I think we invented this. And sometimes, we just wrote down what everyone was thinking because everyone gets it. That's the way the industry has gone. We're helped because it's not as hard to maintain that map data as it used to be. I'm not saying it's not hard, but you have ways of doing it. Now, if you see many cars going down a road, you're pretty well assured that roads are still there. If one day, no one's going down that road, that road probably went away, or if they're driving across the field, there's perhaps a new road or subdivision there. And so those are ways we can maintain the map that is much different than it was 20 years ago.
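The maintenance heuristic Prioleau sketches (many traces mean the road is still there, none mean it may be gone, traces across a field suggest a new road) could be caricatured as a simple classifier. The thresholds and labels below are invented for illustration:

```python
# Caricature of trace-based map maintenance: classify a road segment from
# the count of recent GPS traversals. Thresholds and labels are invented.

def classify_segment(known_road, recent_traces):
    """Guess a segment's status from observed traversal counts."""
    if known_road and recent_traces == 0:
        return "possibly removed"    # mapped road, but no one drives it
    if not known_road and recent_traces > 10:
        return "possible new road"   # steady traffic where nothing is mapped
    if known_road:
        return "confirmed"           # traffic agrees with the map
    return "no evidence"             # too few traces to say anything
```

A production system would of course weigh trace quality, seasonality, and sensor noise rather than a raw count, but the shape of the inference is the same: passive observation replaces the survey van.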

Building trust

But not everything is smooth sailing. Prioleau estimates that about 25% of places data changes every year. The pandemic helped catalyze people's interest in getting their addresses correct, owing to the increase in home deliveries. At the same time, mapmakers have struggled with a surge of bad actors finding creative ways to misplace competitors' listings. Businesses needed a simpler way to trust that their business information was correct. So, Overture started developing a framework that correlates multiple signals to build this trust.

Ideally, map data should be a digital representation of the physical world. One of the big concerns within the mapping community is improving clarity about how to get to ground truth. In the old days, highly trained and trusted surveyors helped build this foundation. But how do today's maps make sense of feedback from ‘Jake474’?

One of the things Overture is working on is looking at the data it collects from people who interact with places regularly. They map places, pick up or drop off packages, and post social media updates about them. Prioleau argues:

If you aggregate all these, this combination gives you a pretty good picture of whether that place is still here today. And so that's a way that you can build trust in the result by not being overly dependent on social media posts that might go away for one reason or another. If you can look at it from ten different ways, now, all of a sudden, maybe we trust that because the community is giving different looks at it.
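One hedged way to picture the aggregation Prioleau describes: treat each independent kind of signal (a delivery, a check-in, a social post) as weak evidence, and only trust a place confirmed from several different angles. The signal names and the threshold of three are assumptions for this sketch:

```python
# Illustrative aggregation of independent signals into a trust decision.
# Signal kinds and the threshold are invented for this sketch.

def is_trusted(signals, required_kinds=3):
    """Trust a place only if several *different* kinds of signals confirm it."""
    distinct = {s["kind"] for s in signals if s["confirms"]}
    return len(distinct) >= required_kinds

signals = [
    {"kind": "delivery",    "confirms": True},
    {"kind": "checkin",     "confirms": True},
    {"kind": "social_post", "confirms": True},
    {"kind": "social_post", "confirms": True},  # duplicate kinds count once
]
```

Counting distinct kinds rather than raw posts captures the point of the quote: ten looks from one source can vanish overnight, while agreement across unrelated sources is much harder to fake or lose.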

The other aspect of trust is supporting people's reasons for contributing data in the first place: contributors want to know whether they are getting a better experience or giving something back to their communities.

A third aspect lies in helping companies appreciate that contributing to the commons lets them focus on new opportunities rather than investing in processes that provide no competitive advantage. Overture launched with four companies that often compete across their various offerings. Prioleau observes:

You're going to need to have those companies all trust each other to go work on it. And, you know, we do not just have Amazon; we have AWS and Microsoft and Azure, who are competitors. However, those people needed to see the value of working together in certain areas to improve their whole business situation. So we started, and we very purposely didn't expand it until we had the thing established because it's hard enough trying to get four big companies to trust each other. If you had 28 companies, we would still be talking about it now.

My take

Map-making is a fussy affair. We can all agree that better maps mean a better understanding of the world around us. But people all interpret the world differently. Take Google Maps, for example; it mostly helps me get around but gets confused when navigating among tall buildings, where I imagine it is confounded by GPS signals bouncing off the structures in strange ways. And for some unfathomable reason, it insists my house is 200 yards down the road, despite multiple efforts to correct this oversight.

Google also insists I live right next to a river, an eight-foot-wide body of water my American sentiments would call a stream. It's mostly OK, since Amazon and the Royal Mail always seem to know where to drop packages. But this morning, I noticed a package meant for someone on the opposite side of our block, sent via a third-tier firm that has not quite figured out this incredibly important distinction.

It seems important that competitive companies are figuring out how to build a more trusted layer of ground truth and consensus processes to help accelerate trust in this data. Standardization will not be easy, particularly where competition is involved. But I hope they succeed. If open map data achieves takeoff, maybe at least my friends and delivery drivers will face fewer challenges finding my home. 
