Keysight, Samsung, NVIDIA unify AI-RAN validation

Keysight and Samsung are validating AI-RAN from lab to air. At MWC Barcelona, the two companies, together with NVIDIA, are demonstrating a single workflow for dataset generation, AI/ML training, and benchmarking, using a PUSCH channel-estimation use case.

In Brief:

  • Keysight and Samsung are demonstrating an end-to-end AI-RAN validation workflow with NVIDIA at MWC Barcelona 2026, running 2–5 March.
  • The workflow targets physical-layer AI, using PUSCH channel estimation to combine data generation, model training, and repeatable benchmarking in one automated loop.
  • The demo ties Keysight’s AI-RAN Simulation Toolset into NVIDIA’s Aerial Testbed and Omniverse-based digital twin, alongside commercial radio hardware including Analog Devices’ Titan O-RU platform.

Keysight Technologies and Samsung Electronics are using Mobile World Congress 2026 in Barcelona to show what AI-native RAN development looks like when the tooling stops behaving like three separate projects. The companies are demonstrating an end-to-end AI-RAN testing and validation workflow with NVIDIA that links dataset generation, AI/ML training, and performance benchmarking into a single, automated pipeline.

The demo is built around a physical-layer use case: PUSCH channel estimation. That choice is pointed. As more of the RAN stack is pushed toward software-defined implementations, the temptation is to treat AI as a scheduling, optimisation, or “operations” layer. Channel estimation sits much closer to the physics: multipath, fading, interference, and the mess of real propagation conditions that can make models look brilliant in one dataset and fragile in the next. Keysight’s argument is that validation has to be repeatable and comparable before any of those models have a route into live networks.
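
To make the physics concrete: the classic baseline that any learned channel estimator is benchmarked against is a least-squares estimate over known pilot symbols. The sketch below is illustrative only, not the demo's actual signal chain; the channel model, pilot scheme, and noise level are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Known QPSK pilot symbols on n_sc subcarriers (illustrative pilot scheme).
n_sc = 64
pilots = (rng.choice([1, -1], n_sc) + 1j * rng.choice([1, -1], n_sc)) / np.sqrt(2)

# A simple frequency-selective channel: a few multipath taps in the time
# domain, transformed into a per-subcarrier frequency response.
taps = np.array([1.0, 0.5 + 0.3j, 0.2 - 0.1j])
H = np.fft.fft(taps, n_sc)

# Received pilots with light additive noise.
noise = 0.05 * (rng.standard_normal(n_sc) + 1j * rng.standard_normal(n_sc))
Y = H * pilots + noise

# Least-squares estimate: divide out the known pilots, subcarrier by subcarrier.
H_ls = Y / pilots

# Estimation error against the true channel response.
mse = float(np.mean(np.abs(H_ls - H) ** 2))
```

A learned estimator earns its place by beating this baseline across many such channel realisations, which is exactly where dataset diversity, and the fragility the article describes, starts to matter.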

Keysight is using its AI-RAN Simulation Toolset to orchestrate the workflow: generate realistic scenarios and datasets, train models, and then benchmark them in a controlled environment that can be rerun, audited, and compared across approaches. In practice, the goal is to collapse the usual hand-offs between simulation teams, ML teams, and system test teams into something closer to a continuous integration loop for RAN algorithms.
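
The "continuous integration loop" framing can be sketched in a few lines. Everything below is a hypothetical stand-in, not the Keysight or NVIDIA APIs: the stage functions are toy implementations whose only purpose is to show why seeding every stage makes a run rerunnable and comparable.

```python
import numpy as np

def generate_dataset(seed: int, n: int = 256) -> dict:
    """Stand-in scenario generator: noisy observations of known channel taps."""
    rng = np.random.default_rng(seed)
    h_true = rng.standard_normal(8)
    x = rng.standard_normal((n, 8))
    y = x @ h_true + 0.1 * rng.standard_normal(n)
    return {"x": x, "y": y, "h_true": h_true}

def train_model(data: dict) -> np.ndarray:
    """Stand-in trainer: least-squares fit of the channel taps."""
    h_hat, *_ = np.linalg.lstsq(data["x"], data["y"], rcond=None)
    return h_hat

def benchmark(model: np.ndarray, data: dict) -> float:
    """Stand-in benchmark: mean-squared error against the known truth."""
    return float(np.mean((model - data["h_true"]) ** 2))

# Because every stage is seeded, rerunning the loop reproduces the same
# datasets, models, and scores -- the property the article calls repeatable
# and comparable validation.
results = {}
for seed in (1, 2, 3):
    data = generate_dataset(seed)
    model = train_model(data)
    results[seed] = benchmark(model, data)
```

In a real pipeline each stage would be a simulation campaign, a training job, and a hardware-in-the-loop test rather than three functions, but the contract is the same: identical inputs must yield identical, auditable outputs.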

Under the hood, the demonstration integrates the Keysight toolchain with NVIDIA’s Aerial Testbed — an over-the-air AI-RAN research testbed — and the NVIDIA Aerial Omniverse Digital Twin, with compute spanning platforms including NVIDIA GH200 and NVIDIA DGX Spark. Commercial radio hardware in the loop includes Analog Devices’ Titan O-RU platform, a reference design built around the ADRV906x RU SoC and aimed at accelerating O-RAN radio unit development.

Soma Velayutham, Vice President, AI & Telecoms, NVIDIA, said: “As AI moves deeper into the radio access network, validating complex algorithms across diverse, real-world conditions becomes even more critical. By leveraging NVIDIA AI Aerial platforms into a unified workflow, Keysight is simplifying the end-to-end data pipeline, essential for training, validating, and deploying AI-native 5G and 6G networks.”

The practical value is less about a single “best” channel estimator and more about the discipline of proving one. A lab workflow that can regenerate scenarios, retrain models, and re-run benchmarks makes it harder for teams to accidentally optimise for yesterday’s dataset, or to ship models that cannot be reproduced outside a single environment.
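
One simple mechanism behind that discipline is pinning a benchmark result to a fingerprint of the exact data it was measured on. The snippet below is a minimal sketch of the idea, with an assumed record format and a toy metric, not any vendor's tooling.

```python
import hashlib
import numpy as np

def fingerprint(arr: np.ndarray) -> str:
    """Short content hash of a dataset's raw bytes."""
    return hashlib.sha256(arr.tobytes()).hexdigest()[:16]

seed = 42
dataset = np.random.default_rng(seed).standard_normal((128, 4))

# Record the seed and dataset hash alongside the score, so two runs are only
# compared when they provably used the same inputs.
record = {
    "seed": seed,
    "dataset_hash": fingerprint(dataset),
    "score": float(np.mean(dataset ** 2)),  # stand-in benchmark metric
}

# Regenerating from the same seed reproduces the same fingerprint; a drifted
# dataset (yesterday's data, a changed generator) is flagged immediately.
regenerated = np.random.default_rng(seed).standard_normal((128, 4))
same_inputs = fingerprint(regenerated) == record["dataset_hash"]
```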

Charlie Zhang, Executive Vice President, Samsung Research America, said: “Our partnership with Keysight and NVIDIA is revolutionising network deployment by delivering integrated AI-RAN validation, bridging the gap between theoretical models and real-world implementation. This collaboration empowers the industry to confidently adopt AI-driven RAN solutions, ensuring robust and commercially viable foundations for the future of 6G networks.”

Balaji Raghothaman, Chief Technologist-6G, Keysight, said: “AI in the RAN only delivers value when it can be validated with confidence. Working with NVIDIA and Samsung, we’re demonstrating a streamlined, automated workflow that unifies data generation, AI/ML training, and benchmarking, helping operators and vendors accelerate deployment, reduce risk, and make more informed decisions as they introduce AI-driven RAN capabilities.”


Stories for you


  • Dymax brings fast UV curing to APEX EXPO

    Dymax will demo UV curing workflows at APEX EXPO 2026. The booth programme centres on PCB-level adhesives, coatings, and maskants, pairing automated dispensing with UV/LED curing demonstrations aimed at reliability challenges including outgassing, component stability, and environmental protection.


  • Teledyne wideband limiter hardens radar ESM fronts

    Teledyne releases a wideband limiter to protect EW receiver front-end hardware. The passive 0.1–20 GHz module is built to absorb high-power RF before it reaches sensitive electronics, with SMA connectors and fast recovery aimed at retrofit protection in R-ESM and EW systems.