Why centralized sensor fusion is the future of autonomous vehicles | Automotive News

2022-08-20 07:15:07 | By Sisi Xu

Most automated vehicles today rely on sensor fusion — i.e., the combination of data from sensors that use radar, lidar and cameras to collect environmental information. As AV giants have demonstrated, sensor fusion boosts self-driving vehicle performance and produces safer outcomes.

But not all sensor fusion is alike. While many AV manufacturers rely on high-level sensor fusion, only deep, centralized sensor fusion provides AVs with the information they need to make optimal driving decisions.

Here, we'll further explain the differences between high-level and centralized sensor fusion — and why centralized fusion will prove indispensable.

Fusing radar, lidar and camera sensors maximizes the quality and quantity of collected data to produce a cohesive environmental image.

Recognizing these advantages, many AV manufacturers prefer high-level sensor fusion over single-sensor systems. In this model, object data collection, processing, fusion and classification all occur at the sensor level. However, because these "smart" sensors pre-filter all sensory information before it reaches the vehicle's central computer, they strip away context necessary for fully informed AV driving decisions.

Centralized sensor fusion avoids this outcome. Here, radar, lidar and camera sensors send low-level raw data to a central processor in the vehicle. This approach maximizes the AV's access to information, enabling better decision-making than high-level fusion provides.
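The contrast between the two architectures can be sketched in a few lines of code. This is an illustrative toy model, not a real AV software stack; all class and function names here are assumptions.

```python
from dataclasses import dataclass

@dataclass
class RawDetection:
    """Low-level sensor return: position plus raw signal strength."""
    x: float
    y: float
    intensity: float

def smart_sensor(detections):
    """High-level fusion: the sensor pre-filters and classifies locally,
    forwarding only object-level results. Weak returns are discarded
    before anything downstream ever sees them."""
    return [{"label": "object", "x": d.x, "y": d.y}
            for d in detections if d.intensity > 0.5]

def central_processor(radar_raw, camera_raw):
    """Centralized fusion: every raw return, weak or strong, reaches
    the central processor, which can fuse before filtering."""
    return radar_raw + camera_raw

radar = [RawDetection(1.0, 2.0, 0.4), RawDetection(3.0, 1.0, 0.9)]
camera = [RawDetection(1.1, 2.1, 0.6)]

# The smart sensor silently drops the weak radar return...
print(len(smart_sensor(radar)))               # 1
# ...while the central processor still sees all three detections.
print(len(central_processor(radar, camera)))  # 3
```

The weak radar return discarded by the smart sensor is exactly the kind of context that, in the centralized model, can still corroborate a marginal camera detection.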

Today's AVs already centralize camera data processing. But when it comes to radar data, centralized processing has proved impractical. High-performance radars often require several hundred antennas and processing channels, which increases the volume of data generated. As such, local processing presents a more cost-efficient option.

AI perception software enhancements, however, can boost radar resolution and performance without requiring additional physical antennas. Raw radar data from fewer channels can then be transported to a central processor at little cost. When AVs fuse raw, AI-enhanced radar data with raw camera data, they fully leverage the strengths of both complementary sensing modalities to build a complete image of their environment; the fused result can detect objects that neither sensor could detect on its own.
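A minimal sketch of why raw-data fusion can "see" what neither sensor sees alone: two independent, sub-threshold detections of the same object combine into a confident one. The probabilities and threshold below are invented for illustration.

```python
def fuse(p_radar: float, p_camera: float) -> float:
    """Combine independent detection probabilities:
    P(detected) = 1 - P(both sensors miss)."""
    return 1.0 - (1.0 - p_radar) * (1.0 - p_camera)

DETECTION_THRESHOLD = 0.6  # assumed decision threshold

# Each sensor alone is below threshold, so an object-level (high-level)
# fusion pipeline would never report this object at all.
p_radar, p_camera = 0.45, 0.5
p_fused = fuse(p_radar, p_camera)

print(p_fused)                          # 0.725
print(p_fused > DETECTION_THRESHOLD)    # True
```

In a high-level architecture, each smart sensor would have suppressed its weak detection locally, leaving nothing to fuse centrally.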

Radar upgrades can cost-effectively improve AV performance at scale. Conventional low-cost radars can fall below $50 per sensor in mass production, an order of magnitude below lidar's target price point. Combined with ubiquitous low-cost camera sensors, AI-enabled radar delivers affordable accuracy, which is essential to mass-commercialized AV production. Lidar's capabilities largely overlap with those of AI-enabled camera/radar fusion, but if lidar costs eventually come down, it could complement camera and radar as a redundant modality in Level 4 and Level 5 systems.

With high-level sensor fusion, edge processing limits each smart sensor's size, power and resource distribution, constraining overall AV performance. Additionally, high-volume data processing can quickly exhaust the vehicle's power and reduce its range.

Algorithm-first central processing architecture, on the other hand, enables what we call deep, centralized sensor fusion. Taking advantage of the most advanced semiconductor technology nodes, this tech optimizes AV performance by dynamically distributing processing capabilities across all sensors, enabling increased performance across different sensors and directions depending on the driving scenarios. With access to high-quality, low-level raw data, central processors can make more intelligent — and more accurate — driving decisions.
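The dynamic distribution described above can be sketched as a shared compute budget reallocated per driving scenario, something fixed-capacity smart sensors cannot do. The sensor names, weights and budget below are hypothetical.

```python
COMPUTE_BUDGET = 100  # total processing units shared by all sensors (illustrative)

# Assumed per-scenario allocation weights -- not real tuning data.
SCENARIO_WEIGHTS = {
    "highway": {"front_radar": 5, "front_camera": 3, "side_cameras": 1, "rear_radar": 1},
    "parking": {"front_radar": 1, "front_camera": 2, "side_cameras": 5, "rear_radar": 2},
}

def allocate(scenario: str) -> dict:
    """Split the shared central budget across sensors in proportion to
    the scenario's weights. With per-sensor edge processors, each
    sensor's capacity would instead be fixed at design time."""
    weights = SCENARIO_WEIGHTS[scenario]
    total = sum(weights.values())
    return {sensor: COMPUTE_BUDGET * w / total for sensor, w in weights.items()}

print(allocate("highway")["front_radar"])   # 50.0 -- long-range forward sensing dominates at speed
print(allocate("parking")["side_cameras"])  # 50.0 -- near-field sensing dominates at low speed
```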

AV manufacturers can use low-power radar and camera sensors along with bleeding-edge, algorithm-first, application-specific processors. The result: optimal perception and path-planning performance within a minimal power envelope, which significantly increases each AV's range while lowering battery costs.

AVs require a diversity of data to make the right driving decisions — and only deep, centralized sensor fusion can deliver access to the breadth of data needed for optimal AV performance and safety. In our ideal model:

1. Low-power, AI-enhanced radar and camera sensors are locally connected to embedded processors along an AV's periphery.

2. Embedded processors send raw detection-level object data to a central domain processor.

3. Using AI, the central domain processor analyzes the combined data to identify objects and make driving decisions.
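The three steps above can be sketched as a minimal pipeline. The function names, data shapes and decision logic are stand-ins for illustration, not a real AV stack.

```python
def embedded_processor(raw_returns):
    """Steps 1-2: a peripheral embedded processor packages raw
    detection-level data and forwards it, unfiltered, to the
    central domain processor."""
    return [{"sensor": s, "range_m": r} for s, r in raw_returns]

def central_domain_processor(detections):
    """Step 3: the central domain processor analyzes the combined
    data to identify objects and choose a driving action
    (placeholder nearest-object logic)."""
    nearest = min(d["range_m"] for d in detections)
    return "brake" if nearest < 10.0 else "cruise"

# One radar and one camera detection of the same nearby object:
raw = [("radar", 8.2), ("camera", 8.5)]
print(central_domain_processor(embedded_processor(raw)))  # brake
```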

Centralized sensor fusion can improve on existing high-level fusion architecture. But industry news signals a potential step backward. In 2021, Tesla announced its shift to Tesla Vision, which uses only cameras and neural net processing to inform driving decisions.

AV manufacturers should avoid following in Tesla's footsteps. Powerful and reliable AVs depend on sensor fusion. To reap the benefits, AV manufacturers must invest in algorithm-first central processors and AI-enabled radar and camera sensors. With these efforts, AV manufacturers can usher in the next phase of AV development and adoption.

