If you’ve ever scanned a PokéStop or recorded AR footage in Pokémon GO, you may have helped train the navigation “brains” of real-world delivery robots. Niantic Spatial — the geospatial AI offshoot formed after Niantic sold Pokémon GO to Scopely — is now working with Coco Robotics to use player-captured imagery to improve robot navigation in dense cities where GPS can struggle.
This is one of those stories that sounds like sci-fi clickbait until you look at the specifics: billions of images, centimeter-level positioning, and a very real fleet of pizza-hauling bots already operating in multiple cities. It’s also a flashing neon reminder that AR games don’t just gamify your neighborhood — they can quietly become infrastructure for the next wave of machine navigation.
What’s Happening: Niantic Spatial + Coco Robotics, Powered by Pokémon GO Scans
Niantic revealed in March 2026 that it has teamed up with Coco Robotics, a startup building last-mile delivery robots, to improve how those robots understand and move through urban environments. Coco’s robots are described as roughly the size of a large suitcase or flight case, and are designed for food and grocery delivery on busy city streets.
The key point: Coco is leaning on Niantic Spatial’s “spatial AI” and its Visual Positioning System (VPS) — tech that uses imagery of the real world (and the metadata attached to it) to determine location based on surroundings, not just satellite coordinates.
That matters because GPS alone can be unreliable in cities, where signals can bounce off buildings and create positioning errors. Niantic’s pitch is that a robot (or AR glasses, or other machines) can navigate more accurately when it can “recognize” the world visually and match it to an extremely detailed map model.
Niantic Spatial CTO Brian McClendon summed up the ambition bluntly in an interview: “I’m very focused on trying to re-create the real world.” He also described the system’s precision: “We had a million-plus locations around the world where we can locate you precisely… We know where you’re standing within several centimeters of accuracy and, most importantly, where you’re looking.”
That last clause — where you’re looking — is the line that should make anyone who’s ever casually tapped “yes” on an AR scanning prompt sit up a little straighter.
The Data: 30 Billion Images and a Map Built From Hot Spots
The scale here is staggering. Niantic Spatial has trained its model on 30 billion images captured in urban environments. Those images come from players scanning locations and recording footage while playing Pokémon GO and Ingress — both games are explicitly cited as sources for the dataset used to build the city models.
Why is this data so valuable? Because Pokémon GO naturally funnels people to the same real-world locations — gyms, PokéStops, landmarks, popular public spaces — and encourages repeated visits. That creates a dense, multi-angle, multi-condition dataset: the same statue in sunlight, rain, nighttime, different seasons, different camera heights, different phone models, different crowds.
In other words, it’s not just “a lot of photos.” It’s a structured visual dataset of the exact places robots (and future AR devices) are most likely to need reliable positioning.
Niantic’s VPS approach is also positioned as a way to locate someone (or something) from their surroundings, rather than relying solely on GPS coordinates. That’s a major conceptual shift: instead of asking “where am I on a satellite map,” the system asks “what am I looking at, and where does that place exist in the model?”
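To make that conceptual shift concrete, visual positioning can be sketched as matching what a camera currently sees against a database of geo-referenced visual features, then reading position off the best match. The toy sketch below illustrates only the core idea with a nearest-neighbor lookup — every name, descriptor, and coordinate is hypothetical, and a real VPS like Niantic’s matches hundreds of features per frame and solves for full camera pose, not just a point location:

```python
import numpy as np

# Hypothetical map: visual "signatures" of known landmarks, each paired
# with a known world position. A production VPS stores vastly more
# features per location, built from millions of player-captured images.
map_descriptors = np.array([
    [0.9, 0.1, 0.0],   # e.g. a statue's visual signature
    [0.0, 0.8, 0.2],   # a storefront sign
    [0.1, 0.1, 0.9],   # a mural
])
map_positions = np.array([
    [34.0522, -118.2437],   # illustrative lat/lon for each landmark
    [34.0525, -118.2440],
    [34.0519, -118.2431],
])

def localize(query_descriptor):
    """Return the position of the stored landmark whose descriptor is
    closest to what the camera sees, plus the match distance. Real
    systems match many features at once and recover full 6-DoF pose,
    which is how "where you're looking" falls out of the same math."""
    distances = np.linalg.norm(map_descriptors - query_descriptor, axis=1)
    best = int(np.argmin(distances))
    return map_positions[best], float(distances[best])

# A camera frame whose features most resemble the mural:
position, score = localize(np.array([0.15, 0.05, 0.85]))
```

The point of the sketch: no satellite signal is consulted at any step. Position comes entirely from recognizing surroundings against a prebuilt model, which is why the approach keeps working in urban canyons where GPS multipath errors are worst.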
And yes, Pokémon GO had explicit in-game mechanics that fed this machine. A Field Research feature introduced in 2020 rewarded players for taking photos and scans of their surroundings in exchange for items and encounters. That’s the gamification engine: you get loot; the system gets training data.
Opt-In, Terms, and the Privacy Reality Check
There’s an important clarification that’s easy to lose in the outrage cycle: multiple reports emphasize that not all Pokémon GO players contributed to this mapping dataset. The imagery used for this kind of model comes from players who opted into scanning locations and capturing AR mapping footage.
It’s also noted that Niantic’s terms and conditions state that images “are banked as mapping data.” In other words, there is disclosure — but it’s disclosure in the way tech companies love: technically present, often ignored, and rarely understood in terms of downstream consequences.
Another key detail: the reporting stresses that Niantic did not secretly harvest everything on your phone. The dataset being discussed is tied to intentional capture while actively playing, particularly scanning features that players agreed to use, and it’s described as being for publicly accessible locations.
Still, “opt-in” doesn’t automatically equal “informed.” A lot of players likely thought they were helping improve AR features, add PokéStops, or enhance gameplay. Fewer probably imagined they were contributing to a commercial-grade visual positioning system that could guide autonomous machines through cities.
And that’s the tension at the heart of this story: Pokémon GO turned real-world exploration into play. Now that play is being repurposed into industrial navigation. Even if the data collection was disclosed, the meaning of that collection changes when the output becomes a product sold to robotics partners.
The Robots: Coco’s Fleet and Where They Operate
Coco Robotics isn’t theoretical. The company is described as having a fleet of around 1,000 delivery robots. They’re built to carry meaningful loads — reported as up to eight extra-large pizzas or four grocery bags — and they’re deployed in Los Angeles, Chicago, Jersey City, Miami, and Helsinki.
That last city is notable because it underscores this isn’t just a single-market experiment. Coco is operating in the US and Europe, and Niantic Spatial’s mapping ambitions are explicitly global.
Niantic Spatial CEO John Hanke framed the partnership as the start of something bigger, describing it as the beginning of a broader vision for a virtual simulation of the world that updates as the world changes — and that could gather more mapping data from robots using the system.
Hanke also delivered the quote that will live forever in AR history: “It turns out that getting Pikachu to realistically run around and getting Coco’s robot to safely and accurately move through the world is actually the same problem.”
It’s a great line because it’s true in a very specific engineering sense: both require robust spatial understanding, localization, and navigation in messy real-world conditions. It’s also a reminder that “game tech” and “real-world automation tech” have been converging for years — AR just made the bridge obvious.
Why This Matters: Pokémon GO Was Always Bigger Than a Game
Let’s be honest: Pokémon GO has always been a Trojan horse for something larger than catching monsters. It’s an AR game, yes — but it’s also a system that incentivizes mass participation in building a machine-readable layer of the real world.
Niantic Spatial’s stated long-term goal is to enable machines, robots, and AR glasses to understand and navigate the physical world with centimeter-level precision. That’s not a “nice-to-have.” That’s foundational tech for:
- autonomous delivery and last-mile robotics
- persistent AR experiences that actually stick to the world
- wearable AR devices that can anchor content reliably
- any application that needs accurate localization in dense cities
And it’s also foundational tech for less comfortable applications, which some coverage explicitly raises: once you can build a model that can locate someone within centimeters and infer “where you’re looking,” you’re talking about capabilities that could be attractive to major logistics platforms — and potentially far beyond that.
The story lands now with extra force because Niantic no longer owns Pokémon GO. The game was part of a $3.5 billion sale to Scopely in early 2025, and Niantic’s post-sale identity has shifted toward Niantic Spatial and geospatial AI. That context matters: the company that built the game is now leaning into the data and mapping layer as a core business.
So if you’re wondering why this is surfacing so loudly in 2026, it’s because the corporate arc is becoming visible. The game was the funnel; the map is the asset.
Community Reaction: Surprisingly Chill… With an Asterisk
One of the more interesting notes in the coverage is that the reaction from players appears, at least in some corners, surprisingly positive — or at least not furious. There’s a sense of “well, we already knew scanning was for mapping,” and the idea of helping “little robots deliver pizza” feels benign compared to the darker theories people have floated for years about AR mapping.
That said, even the more optimistic takes carry the same caveat: there’s no guarantee delivery robots are the only use case for a model trained on tens of billions of urban images.
And that’s the real issue. Pizza delivery is the friendly demo. The underlying capability is the story.
What Remains Unknown
A lot is now clearer — but the most important consumer questions still don’t have satisfying public answers.
- Exactly what data is being sold or licensed, and under what terms? The partnership is public, but the commercial structure and scope aren’t fully detailed.
- How is the imagery stored, retained, and protected long-term? Details have not yet been confirmed publicly in the reporting summarized here.
- Which other companies are using Niantic Spatial’s model or VPS? Coco is the named robotics partner, but broader customer lists and deployments haven’t been fully disclosed.
- How will future data collection work now that Niantic no longer runs Pokémon GO? Niantic has indicated future images will be collected via an opt-in third-party service through Niantic Spatial, but specifics are still limited.
- What does “opt-in” look like today for players, and how clearly is downstream usage explained? The existence of opt-in scanning is noted, but the clarity of consent in practice remains an open question.
Niantic Spatial’s tech is undeniably impressive — and if it genuinely makes robots safer and more reliable in chaotic city environments, that’s a tangible benefit. But the bigger takeaway is unavoidable: AR gaming has matured into a data engine, and the line between “playing outside” and “building machine infrastructure” is now thin enough to step over without noticing.



