Ambarella advances 4D radar processing on CV3 SoC

Ambarella is advancing 4D radar perception through CV3 processing.


In Brief:

  • Ambarella’s Oculii radar software runs on the CV3-AD central AI domain controller family.
  • The software supports adaptive 4D radar perception with high angular resolution, dense point clouds, and long-range detection.
  • The architecture centralises radar, camera, and AI processing for autonomous systems.

Ambarella is advancing 4D imaging radar perception through its Oculii AI radar software running on the CV3-AD central AI domain controller system-on-chip family.

The Oculii software provides adaptive 4D radar perception for advanced driver assistance and autonomous systems. It adapts radar waveforms to the surrounding environment, supporting angular resolution of 0.5 degrees, dense point clouds, and detection ranges beyond 500m while using fewer antennas than competing 4D imaging radar approaches.
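To put the 0.5-degree figure in context, a standard first-order approximation relates a radar array's angular resolution to its aperture: theta ≈ lambda / D radians. The sketch below (generic physics, not Ambarella's design parameters) estimates the virtual aperture implied by 0.5 degrees at 77GHz, which illustrates why imaging radars rely on large MIMO virtual arrays rather than equally many physical antennas.

```python
import math

# Standard approximation: angular resolution theta ~ wavelength / aperture.
# All figures below are generic back-of-envelope values, not vendor specs.
C = 3.0e8            # speed of light, m/s
FREQ = 77e9          # 77 GHz automotive radar band

wavelength = C / FREQ                 # ~3.9 mm
theta_rad = math.radians(0.5)         # target resolution, 0.5 degrees

aperture_m = wavelength / theta_rad   # required (virtual) array aperture
print(f"wavelength: {wavelength * 1e3:.1f} mm")
print(f"aperture for 0.5 deg: {aperture_m * 1e2:.1f} cm")
```

The resulting aperture is tens of centimetres, far wider than a practical physical antenna panel, which is why MIMO virtual-array techniques (and, per Ambarella, adaptive waveform software) are used to reach such resolutions with fewer physical elements.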

The software is available for licensing and can be supplied pre-loaded for operation on Ambarella’s CV3-AD SoC family. Ambarella also provides visualisation software that runs on Windows PCs, allowing radar perception output from the SoC to be viewed during development and evaluation.

The company’s centrally processed radar architecture includes Oculii Eagle-C and Falcon-C 77GHz imaging radar head reference designs. These stream raw ADC radar data to Ambarella’s SoC for central processing, supporting high angular resolution and elevation information across wide fields of view.

4D radar is gaining attention because radar remains resilient in poor weather, darkness, dust, and glare, yet conventional radar has historically lacked the spatial detail offered by cameras and lidar. Imaging radar improves angular resolution, elevation sensing, and object separation, allowing radar to contribute richer environmental information to planning and safety systems.

The central processing model changes the radar system architecture. Many automotive radar systems process data at the sensor head before sending object-level outputs to the central controller. Ambarella’s approach shifts more processing to the central SoC, where raw radar data can be combined with camera and other sensor inputs at a lower level.

That approach can improve fusion quality, but it also increases demand on bandwidth, timing, compute, and software integration. Raw sensor data must be moved, synchronised, processed, and fused quickly enough to support safety-critical decisions.
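The bandwidth pressure is easy to see with a rough estimate. The figures below are assumptions chosen for illustration (channel count, sample rate, resolution, and duty cycle are not taken from Ambarella's reference designs): a raw ADC stream scales as channels × sample rate × bits per sample.

```python
# Illustrative, assumed parameters for one imaging radar head --
# not a vendor interface specification.
rx_channels = 16          # assumed receive channels
sample_rate_hz = 25e6     # assumed ADC sample rate per channel
bits_per_sample = 16      # assumed ADC word size
duty_cycle = 0.5          # assumed fraction of time spent sampling

bits_per_s = rx_channels * sample_rate_hz * bits_per_sample * duty_cycle
gbps = bits_per_s / 1e9
print(f"raw ADC stream: {gbps:.1f} Gbit/s per radar head")
```

Even these modest assumptions yield several gigabits per second per sensor head, which is why object-level outputs have traditionally been computed at the edge, and why a raw-data architecture demands high-bandwidth links and a central SoC sized to absorb multiple such streams.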

The same pattern applies beyond automotive systems. Industrial autonomy, mobile robotics, logistics vehicles, mining equipment, agricultural machines, and security systems all face perception problems in environments where cameras alone can struggle. Radar can provide velocity, range, and resilience, while AI processing can interpret complex scenes in real time.

Autonomous perception is becoming less about adding isolated sensors and more about processing their data together. Radar, cameras, lidar, ultrasonics, inertial sensing, and maps each provide different forms of confidence and failure behaviour. The engineering task is to combine them with enough timing precision, compute efficiency, and error handling to support reliable decisions.
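One concrete face of that timing problem is aligning sensors that free-run at different rates. The sketch below uses hypothetical 30Hz camera and 20Hz radar timestamps (not any particular product's data) to show the residual skew a fusion stage must tolerate or correct, for instance by interpolation or hardware-triggered capture.

```python
# Hypothetical free-running sensor clocks: 30 Hz camera, 20 Hz radar.
camera_ts = [i / 30.0 for i in range(30)]   # camera frame times, seconds
radar_ts = [i / 20.0 for i in range(20)]    # radar frame times, seconds

def nearest(ts, candidates):
    """Return the candidate timestamp closest to ts."""
    return min(candidates, key=lambda c: abs(c - ts))

# Pair each radar frame with its nearest camera frame.
pairs = [(r, nearest(r, camera_ts)) for r in radar_ts]
worst_skew = max(abs(r - c) for r, c in pairs)
print(f"worst radar-camera skew: {worst_skew * 1e3:.1f} ms")
```

Even in this idealised case the worst pairing is misaligned by roughly 17ms, during which a vehicle at motorway speed moves about half a metre, which is why centralised fusion architectures put so much weight on synchronisation and timestamping.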

Ambarella’s Oculii and CV3-AD architecture follows that shift towards centralised AI domain controllers handling multiple data streams closer to the source. The approach increases pressure on SoC performance, memory bandwidth, software architecture, and sensor interface design, while offering a path towards more capable autonomous perception systems.

