
From Physical to Digital: Tech-Led Innovation in Mapping and Navigation

Learn about trailblazing trends in mapping and navigation technologies and new opportunities for competitive product development

Updated: September 12, 2023 · Published: August 03, 2023

Mapmaking is a centuries-old tradition. In the past, adventurous trailblazers journeyed into the wild to record landscape features, roads, and settlements.

Today, most geographic information is collected with technology — dashcams, drones, and satellites. Thanks to these advances, we can create highly detailed, feature-rich maps and receive real-time navigation instructions.

And progress in navigation and digital mapping is only picking up speed.

Innovation in digital mapping and navigation: New opportunities

The location-based services (LBS) market is expected to grow tenfold between 2021 and 2031. Much of this growth will come from the mobility sector.

The newest generation of advanced driver assistance systems (ADAS) — and, soon, fully autonomous driving systems — demand real-time access to accurate data, dynamic routing capabilities, and self-healing maps.

Although mapping and navigation data is easier to procure now than it was ten years ago, gathering it is still labor-intensive. Google Maps development was a multi-year, multi-billion-dollar effort led by a global team of engineers. Most of the road features in the early product version were “hand-massaged by a human,” as an article in The Atlantic describes it.

Even though most corners in the world today have already been recorded in public and proprietary geographic information systems (GIS), maps still require regular maintenance. Data accuracy and freshness are the two main challenges in the mobility industry, followed by coverage (the physical world keeps evolving too).

…And then there’s the biggest challenge, of course — delivering dynamic map updates and competitive navigation products.

That said, a good industry challenge is an ample opportunity for new product engineering.

Here are the six most interesting innovations in the navigation and mapping space that are driving future competitiveness and market growth.

Enriching mapping data with AI

Satellite imagery was a breakthrough for map creation. The wrinkle, however, is that most mapping software cannot work directly with satellite photos.

Visual data first needs to be codified into comprehensive navigation datasets in a suitable format such as the Navigation Data Standard (NDS). Then map owners must keep it up to date. Both processes are cost- and labor-intensive, making them great use cases for AI in mapping.

AI algorithms improve the speed and precision of digital map building, making it possible to update maps more regularly and to map new areas faster. They can classify objects in satellite images — buildings, roads, vegetation — to create enriched 2D digital maps as well as multi-layer 3D map models.
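To make the idea concrete, here is a minimal sketch of how classified satellite pixels can be turned into georeferenced map features. The segmentation logic and the tile georeference below are stand-ins for illustration only; a production pipeline would use a trained segmentation network and real imagery metadata.

```python
# Minimal sketch of turning a classified satellite tile into map features.
# The segmentation model here is a stand-in (hypothetical `segment_tile`);
# in practice this would be a trained CNN such as a U-Net.
import numpy as np

CLASSES = {0: "background", 1: "building", 2: "road", 3: "vegetation"}

def segment_tile(tile: np.ndarray) -> np.ndarray:
    """Hypothetical per-pixel classifier: returns a class ID per pixel."""
    # Stand-in logic: classify by brightness so the example runs end to end.
    gray = tile.mean(axis=-1)
    mask = np.zeros(gray.shape, dtype=np.uint8)
    mask[gray > 200] = 1                      # bright roofs -> building
    mask[(gray > 90) & (gray <= 200)] = 2     # asphalt tones -> road
    mask[gray <= 90] = 3                      # dark areas -> vegetation
    return mask

def pixels_to_lonlat(rows, cols, origin_lon, origin_lat, deg_per_px):
    """Map pixel indices to WGS84 coordinates for a north-up tile."""
    lon = origin_lon + cols * deg_per_px
    lat = origin_lat - rows * deg_per_px
    return np.stack([lon, lat], axis=-1)

tile = np.random.randint(0, 256, size=(256, 256, 3), dtype=np.uint8)
mask = segment_tile(tile)
rows, cols = np.nonzero(mask == 2)                       # road pixels
road_points = pixels_to_lonlat(rows, cols, 103.80, 1.35, 1e-5)
print({CLASSES[c]: int((mask == c).sum()) for c in CLASSES})
print("sample road vertex:", road_points[0] if len(road_points) else None)
```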

With precise maps, you can delight users with better ETA predictions, detailed fuel or energy usage estimates, and richer point-of-interest information. For example, Grab — Asia’s biggest ride-hailing company — uses proprietary AI models to detect undocumented road segments and update maps promptly. With always up-to-date location insights, Grab improves dispatch management for ride booking and deliveries as well as ETA estimates.

Apart from facilitating the collection of mapping data, AI can also help with generating such data.

Researchers from MIT and the Qatar Computing Research Institute (QCRI) recently released RoadTagger — a neural network that can automatically predict the road type (residential or highway) and number of lanes even with visual obstructions present (such as a tree or building).

The model was tested on occluded roads from digital maps of 20 US cities. It correctly predicted the number of lanes with 77% accuracy and predicted road types with 93% accuracy. The team is now training the model to predict other road features such as parking spots and bike lanes.
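The core idea can be illustrated with a toy sketch (not the actual RoadTagger implementation): attribute evidence from visible road segments is propagated along the road graph so that occluded segments inherit plausible values from their connected neighbors. The graph, evidence values, and propagation rule below are invented for the example.

```python
# Simplified, illustrative take on RoadTagger's core idea (not the authors' code):
# per-node visual evidence is propagated along the road graph so that occluded
# segments inherit attributes from their connected, visible neighbors.
import numpy as np

# Hypothetical road graph: node -> list of neighboring node IDs.
graph = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}

# Per-node lane-count "evidence" from a CNN on the aerial patch.
# NaN marks nodes where trees or buildings occlude the road.
cnn_lanes = np.array([2.0, np.nan, np.nan, 2.0])

def propagate(evidence, graph, steps=10):
    est = np.where(np.isnan(evidence), np.nanmean(evidence), evidence)
    for _ in range(steps):
        for node, neighbors in graph.items():
            if np.isnan(evidence[node]):          # only update occluded nodes
                est[node] = np.mean([est[n] for n in neighbors])
    return np.round(est).astype(int)

print(propagate(cnn_lanes, graph))   # occluded nodes inherit 2 lanes
```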

That said, sensor data collection from connected vehicles isn’t going anywhere. OEMs are increasingly relying on their fleets to collect new insights for digital map creation, and this process is becoming easier with advances in machine learning.

HERE recently presented UniMap — a new AI-driven technology for faster sensor data processing and map creation. The new solution can effectively extract map features in 2D and 3D formats, then combine them with earlier map versions. Thanks to this unified map content data model, new digital maps can become available in 24 hours. Companies can also easily augment created maps with their own or crowdsourced data to gain comprehensive insights.

NDS.Live format

NDS.Live is the new global standard for map data in the automotive ecosystem, promoting the transition from offline to hybrid/online navigation. It holds lots of promise: minimizing the complexities of supporting different data models, storage formats, interfaces, and protocols with one flexible specification.

NDS.Live is not a database; it is a distributed map data system

Conventional onboard navigation systems are designed, developed, and integrated with proprietary databases, which become obsolete with every new product generation. NDS.Live provides an opportunity to embed map data from multiple vendors and enables support of continuous high-frequency updates (including in real time).

Key characteristics of NDS.Live

  • Unified access to online services that offer basic navigation functionality like search or routing
  • Ability to process data from multiple vendors and sources (cloud, edge, vehicle)
  • Shared language for the entire mapping ecosystem of local map cache data, ECU live data, OEM fleet data, and map provider data
  • Access to real-time information on road conditions, traffic, points of interest, EV charging station availability, indoor parking, and more
  • Modular design, meaning that standard components can be developed and used independently
  • Ability to organize different data types (static, dynamic, live) as containers, with one configuration per service
  • Access to highly detailed data for ADAS and autonomous driving


Source: NDS Association — What is NDS.Live?
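As a rough illustration of this modular, container-based organization, the sketch below shows how a client might pull a static layer and a live layer for the same tile from different services and merge them. The service names, layers, and payloads are hypothetical and do not reproduce the actual NDS.Live interface definitions.

```python
# Illustrative sketch only: how a client might consume modular, container-based
# map data in the spirit of NDS.Live. Service names, layers, and payloads are
# hypothetical, not the actual NDS.Live interface definitions.
from dataclasses import dataclass

@dataclass
class TileContainer:
    tile_id: int
    layer: str          # e.g. "road-geometry" (static) or "traffic" (live)
    version: int
    payload: dict

def fetch_container(service_url: str, tile_id: int, layer: str) -> TileContainer:
    """Stand-in for a smart-layer request; a real client would call the
    service over the protocol defined by the registry configuration."""
    fake_backend = {
        "road-geometry": {"lanes": 2, "speed_limit_kph": 50},
        "traffic": {"flow": "congested", "delay_s": 180},
    }
    return TileContainer(tile_id, layer, version=42, payload=fake_backend[layer])

static = fetch_container("https://maps.example.com/smart", 545370, "road-geometry")
live = fetch_container("https://live.example.com/traffic", 545370, "traffic")
merged = {**static.payload, **live.payload}    # one tile, data from multiple sources
print(merged)
```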

NDS.Live was co-developed by global OEMs and tech leaders, including Intellias. Our team has contributed to the development of the NDS.Live standard by extending the NDS.Live Certification Kit with custom tests to ensure top map quality.

NDS.Live is now available for in-vehicle applications, companion applications, cloud applications, and vehicle-related services, supporting navigation, driver assistance, and autonomous driving systems. Daimler, HERE, Denso, Renault, and TomTom are among those who have already adopted it.

For example, second-generation Mercedes-Benz User Experience (MBUX) systems are powered by NDS.Live. The distributed map data system provides fresh information for the driver assistance system, which gets visualized as augmented reality (AR) instructions on the head-up display (HUD). Using an in-car voice assistant, drivers can also ask the system to provide information on visible landmarks and nearby points of interest.

NDS.Live can help massively improve the navigation experience for EVs and regular connected vehicles. It also helps OEMs deploy value-added subscriptions for assisted driving and navigation.


3D and HD map generation

Three-dimensional (3D) maps enable accurate rendering of physical objects in a three-dimensional form. High-definition (HD) maps feature detailed information about road features (lane placements, road boundaries) and terrain type (severity of curves, gradient of the road surface).

Both types of maps are essential for launching advanced ADAS features and, ultimately, ushering in the era of autonomous driving.

3D maps define how the vehicle moves and help it interpret the information it receives from onboard sensors. Since most sensors have a limited range, HD maps assist by providing the navigation system with extra information on road features, terrain, and other traffic-relevant objects.
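To picture the kind of content involved, here is a hypothetical lane-level data model and one simple way an ADAS function might use it. The field names and the comfort threshold are our own illustration, not any vendor's schema.

```python
# A minimal, hypothetical data model for lane-level HD map content, illustrating
# the kind of detail (lane boundaries, curvature, slope) ADAS functions consume.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class LaneSegment:
    lane_id: str
    centerline: List[Tuple[float, float]]    # ordered (lon, lat) vertices
    width_m: float
    curvature_1_per_m: float                 # signed curvature along the segment
    slope_percent: float                     # road surface gradient
    boundary_left: str = "solid"
    boundary_right: str = "dashed"
    successors: List[str] = field(default_factory=list)

segment = LaneSegment(
    lane_id="A12-L1",
    centerline=[(11.5761, 48.1372), (11.5769, 48.1375)],
    width_m=3.5,
    curvature_1_per_m=0.002,
    slope_percent=1.8,
    successors=["A12-L2"],
)

# An ADAS function can pre-condition speed for the curve ahead, assuming a
# comfortable lateral acceleration of ~0.3 g: v = sqrt(a_lat / curvature).
advisory_speed_kph = min(
    130, (0.3 * 9.81 / max(segment.curvature_1_per_m, 1e-6)) ** 0.5 * 3.6
)
print(f"advisory speed for {segment.lane_id}: {advisory_speed_kph:.0f} km/h")
```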

The bottleneck of both HD and 3D mapping is collecting and rendering data. In the case of 3D maps, you need to:

  • Capture video in real time from multiple cameras
  • Plan for interference due to vibration, temperature, and hardware issues
  • Then repeat the process for billions of kilometers of roads across the globe

Rather than doing this huge task alone, mobility players and OEMs join forces. For example, HERE and Mobileye partnered to crowdsource HD mapping data collection, with VW joining later. Mobileye developed a compact, high-performance computer vision system-on-chip called EyeQ™. Installed by over 50 OEMs across 300 vehicle models, the system supplies Mobileye with ample visual data it can then render into maps with the help of partners.

TomTom, in turn, teamed up with Qualcomm Technologies to crowdsource HD mapping insights from its users. Qualcomm provides the underlying cloud-based platform for making and maintaining HD maps from various sources, including swarms of connected vehicles.

To deploy HD and 3D maps, companies will need to invest in a high-performance computing platform for data storage, model setup, and simultaneous localization and mapping (SLAM) processing. In addition, they’ll require a robust pipeline for map compilation and subsequent distribution.

Intellias has been supporting global OEMs and location technology companies in deploying and maintaining future-ready HD and 3D maps.

In one case, our team has developed a new cloud-based platform for HD map creation and distribution. We’ve established a process for collecting data from onboard cameras, sensors, and GPS devices, further augmented by core maps. All data can be consumed in an NDS format, which avoids vendor lock-in and ensures interoperability across systems.

Incoming location intelligence is rapidly aggregated, processed, and filtered with the help of machine learning models. This way, the company receives live multi-layer 3D maps that enable precise vehicle positioning. We also developed an edge perception stack that leverages onboard sensor systems to detect road features in real time and deliver self-healing live HD maps.

In another case, we’ve helped a client develop a system for delivering autonomous driving-ready HD maps. It has allowed our client to become the first company to offer HD map coverage of most European and Asia-Pacific countries.

Our engineers have also developed a proprietary streaming perception stack for the real-time collection and processing of data from multiple sources. It’s the backbone for providing accurate lane-level information, speed data, and real-time road intelligence on traffic, weather conditions, and more.

All HD maps are packaged as a geographically tiled and functionally layered data service, suitable for direct-to-car and OEM cloud consumption. Additionally, we were involved in all phases of 3D map development, from gathering source data to creating and publishing autonomous driving maps.
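As a simplified illustration of geographic tiling, the sketch below derives a quadtree-style (Morton-ordered) tile ID from WGS84 coordinates so that a vehicle or cloud client can request only the tiles along its route. It conveys the idea of tiled delivery but is not the exact NDS tile numbering scheme.

```python
# Simplified sketch of geographic tiling for map distribution: a quadtree-style
# tile ID derived from WGS84 coordinates. Illustrative only; not the exact
# NDS tile numbering scheme.
def tile_id(lon: float, lat: float, level: int) -> int:
    """Interleave longitude/latitude bits (Morton order) up to `level`."""
    x = int((lon + 180.0) / 360.0 * (1 << level))   # column index
    y = int((lat + 90.0) / 180.0 * (1 << level))    # row index
    tid = 0
    for bit in range(level):
        tid |= ((x >> bit) & 1) << (2 * bit)
        tid |= ((y >> bit) & 1) << (2 * bit + 1)
    return tid

# A client requests only the tiles along its route at the layer it needs:
print(tile_id(13.405, 52.52, level=13))   # a tile covering central Berlin
```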

Autonomous driving simulations

Autonomous vehicles require extensive road and track tests to pass safety checks. Manufacturers also need to simulate near-crash events without putting anyone in danger.

Enter advanced driving simulations.

Hyper-realistic virtual worlds can be much safer testbeds for autonomous vehicles (AVs), especially as virtualization technology improves.

A group of researchers recently released an open-source, data-driven simulation engine for building photorealistic environments for AV training. The engine can simulate complex sensor types including 2D RGB cameras and 3D Lidar, as well as generate dynamic scenarios with several vehicles present. With the new engine, users can simulate complex driving tasks such as overtaking and following.

Waymo takes a similar approach of using real-world data collected from vehicle cameras and sensors to create highly detailed virtual testbeds. The Waymo team has built virtual replicas of several intersections complete with identical dimensions, lanes, curbs, and traffic lights. During simulations, Waymo algorithms can be trained to perform the most challenging interactions thousands of times, using the same or different driving conditions and different vehicles from its fleet.

To perfect the algorithm’s performance, the team uses a fuzzing technique. During training sessions, engineers vary the speed of other vehicles, traffic light timing, and the presence or absence of zig-zagging joggers and casual cyclists. Once the Waymo algorithm learns the trick of driving through a specific intersection with a flashing yellow arrow, the “skill” becomes part of the knowledge base, shared with every vehicle across the fleet. And another round of simulations begins.
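A minimal sketch of this kind of scenario fuzzing, with invented parameters and a stand-in simulator, might look like this:

```python
# Illustrative scenario "fuzzing" in the spirit described above: randomize the
# parameters of a simulated intersection run to cover many near-miss variations.
# The scenario fields and value ranges are invented for the example.
import random

def fuzz_scenario(seed: int) -> dict:
    rng = random.Random(seed)
    return {
        "other_vehicle_speed_kph": rng.uniform(20, 90),
        "yellow_arrow_duration_s": rng.uniform(2.0, 6.0),
        "jogger_present": rng.random() < 0.3,
        "cyclist_present": rng.random() < 0.4,
        "rain_intensity": rng.choice(["none", "light", "heavy"]),
    }

def run_simulation(scenario: dict) -> bool:
    """Stand-in for the real simulator: returns True if the planner completed
    the run without a safety violation."""
    return scenario["other_vehicle_speed_kph"] < 85 or scenario["rain_intensity"] == "none"

failures = [s for s in (fuzz_scenario(i) for i in range(1000)) if not run_simulation(s)]
print(f"{len(failures)} scenarios flagged for review and retraining")
```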

The new generation of high-fidelity 3D environments can be built with data from different sensor types to effectively convey all details of the material world to the algorithm. Existing 3D visual databases already include realistic details for traffic signs, pavement markings, and road textures. With machine learning and deep learning algorithms, complex ADAS/AD scenarios can simulate close to real-life conditions.

Digital twins of road infrastructure

While OEMs leverage dashcam data collection for building better navigation systems, transportation managers use the same intelligence to digitize road infrastructure.

A digital twin is an interactive, virtual representation of physical assets or systems such as a smart traffic light network or smart parking facilities. Powered by real-time data, digital twins of road infrastructure can enable advanced urban planning scenarios:

  • Dynamic traffic light signal optimization to reduce congestion (see the sketch after this list)
  • Traffic flow analysis for better road regulations
  • Prioritized public and service transport management
  • Accurate traffic predictions to optimize planning, signage, construction work schedules, etc.
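As a toy example of the first use case above, the sketch below reallocates green time across intersection approaches based on live queue lengths streamed from a digital twin. The timings and thresholds are invented for illustration.

```python
# Toy sketch of one digital-twin use case: adjusting green time per approach
# from live queue lengths. Cycle length and minimum green time are invented.
def green_splits(queues: dict[str, int], cycle_s: int = 90, min_green_s: int = 10) -> dict[str, int]:
    """Allocate the cycle's flexible green time proportionally to observed queues."""
    total = sum(queues.values()) or 1
    flexible = cycle_s - min_green_s * len(queues)
    return {
        approach: min_green_s + round(flexible * q / total)
        for approach, q in queues.items()
    }

# Live queue estimates streamed from the twin (vehicles per approach):
print(green_splits({"north": 24, "south": 18, "east": 6, "west": 2}))
```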

Moreover, digital twins could help streamline site inspections and offer continuous monitoring of road network conditions. UK authorities have an ambitious plan to create a digital twin of the UK’s strategic road network (SRN) in the next five years. The team has already completed a full 3D scan of motorways using drones, which was the largest mapping exercise in the country to date.

Digital twins of global road systems will also be essential for accelerating the era of (semi-)autonomous mobility. Smart road infrastructure (and its digital counterparts) can be used to distribute real-time navigational instructions to connected vehicles — and even new versions of 3D maps.

Low latency is crucial for autonomous driving. Yet 3D map generation on the edge requires substantial computing power. Moreover, vehicles cannot store all mapping data on their route and need to constantly receive over-the-air updates.

A group of researchers has proposed placing compact map distribution devices on roadside edges to facilitate point cloud data (PCD) map delivery on the go. The results show that autonomous vehicles can perform self-localization while downloading PCD maps. With such a system, AVs can receive fresh maps for each new destination instead of storing enormous map datasets onboard.
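The delivery pattern can be sketched as a small rolling cache that requests PCD tiles for the road just ahead from a roadside edge unit and evicts tiles the vehicle has already passed. Endpoints, tile IDs, and cache size below are hypothetical.

```python
# Sketch of the delivery pattern described above: an AV keeps only a small
# sliding window of point cloud (PCD) map tiles fetched from roadside edge
# units. All identifiers and payloads are invented for illustration.
from collections import OrderedDict

class RollingPcdCache:
    def __init__(self, max_tiles: int = 8):
        self.max_tiles = max_tiles
        self.tiles = OrderedDict()   # tile_id -> point cloud payload

    def fetch_from_edge(self, tile_id: int) -> bytes:
        """Stand-in for a request to the nearest roadside edge unit."""
        return f"pcd-tile-{tile_id}".encode()

    def ensure(self, tile_ids: list[int]) -> None:
        for tid in tile_ids:
            if tid not in self.tiles:
                self.tiles[tid] = self.fetch_from_edge(tid)
            self.tiles.move_to_end(tid)           # mark as recently used
        while len(self.tiles) > self.max_tiles:   # drop tiles far behind the car
            self.tiles.popitem(last=False)

cache = RollingPcdCache()
route_ahead = [1201, 1202, 1203, 1204]
cache.ensure(route_ahead)            # localization can run against these tiles
print(list(cache.tiles))
```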

AR in HUD navigation products

The latest vehicles come with an upgraded human-machine interface (HMI) design, featuring new hardware and software elements that allow for augmented reality (AR) navigation.

AR in head-up displays can deliver all standard information from static displays (driving speed, status of the ADAS system, fuel or charge levels), alongside dynamic routing instructions including information on traffic signs, speed limits, construction work alerts, and ETAs.

Audi released HUD AR products for its Q4 e-tron and Q4 Sportback e-tron range. The AR content screen is equivalent to a 70-inch (diagonal) display. Below it, a flat digital window in the near-field area displays driving speed, traffic signs, the status of the driver assist system, and navigation symbols. When activated, the driver sees turning arrows from the navigation system, starting points, and destinations superimposed on the real-world environment and updated dynamically. These appear at a floating distance of ten meters from the driver.
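Conceptually, anchoring an AR instruction to the road ahead comes down to projecting a point in the vehicle frame onto the HUD's virtual image plane. The pinhole-style sketch below illustrates this with invented display parameters and the ten-meter anchoring distance mentioned above.

```python
# Simplified pinhole-style projection of a navigation waypoint into HUD screen
# coordinates, illustrating how AR arrows are anchored to the road ahead.
# Focal length, display resolution, and the anchoring distance are illustrative.
import numpy as np

def project_to_hud(point_vehicle_m: np.ndarray,
                   focal_px: float = 1200.0,
                   screen_px: tuple[int, int] = (1920, 720)) -> tuple[int, int]:
    """point_vehicle_m: (x forward, y left, z up) relative to the driver's eye."""
    x, y, z = point_vehicle_m
    u = screen_px[0] / 2 - focal_px * (y / x)   # lateral offset -> horizontal pixel
    v = screen_px[1] / 2 - focal_px * (z / x)   # height offset -> vertical pixel
    return int(round(u)), int(round(v))

# A turn arrow anchored 10 m ahead, 1.5 m to the right, at road level (-1.2 m):
print(project_to_hud(np.array([10.0, -1.5, -1.2])))
```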

Overall, AR navigation systems can help drivers make better decisions on the road. A recent comparative study found that drivers using AR-augmented HUDs made fewer errors and drove faster on average than those using conventional HUDs. Participants also rated AR HUD instructions as more useful and easier to understand. There was only one downside to the interface: drivers couldn’t anticipate maneuvers. However, this factor can be addressed with better system design.

Indeed, drivers don’t always understand why their vehicle behaves in a certain way (for example, why it slows down when in cruise control mode). A new generation of AR navigation systems could visualize how and why ADAS makes certain decisions to instill peace of mind.

The global AR head-up display market in the automotive sector is set to reach $315 billion by 2029.

Exactitude Consultancy

The next advance in navigation will be holographic displays, offering AR instructions in 3D. Advances in Lidar (light detection and ranging) technologies already allow for projecting ultra-HD holographic representations of road objects in real time into the driver’s field of view. Such systems can enable shorter obstacle visualization times and reduce driving-related stress, according to Tech Xplore.

AR HUD systems can also be deployed to create novel infotainment experiences for passengers, especially as cars gain greater autonomy. Huawei’s AR-HUD solution, for example, already delivers a cinema-like in-vehicle experience for video consumption and true-to-life video calls.

Finding your way towards better navigation

Technology-assisted navigation brings a new degree of safety, ease, and convenience to the physical world. High-definition, enriched digital maps are the backbone of multiple new technology use cases in mobility, from real-time route optimization and commercial fleet management to better urban transportation planning and transportation sector decarbonization.

Software-defined connected vehicles have also massively increased the potential for collecting road intelligence and developing embedded navigational software. The imminent arrival of autonomous vehicles further magnifies the need for effective HD map creation and distribution. Companies that accelerate their efforts in digital mapping development will future-proof their growth potential for the long term.


Intellias is a technology partner to global OEMs and location-based service providers, delivering class-leading innovation in mapping and navigation. Let’s build the future of mobility together.
