Radxa has launched the AICore DX-M1M, a compact M.2 2242 AI acceleration module that delivers up to 25 TOPS of INT8 performance while consuming just 3W of power. Built around the DeepX DX-M1M neural processing unit, the module targets industrial robotics, autonomous mobile robots, edge servers, drones, and AIoT devices. It connects via PCIe Gen3 x2 and works with both x86 and Arm systems, including the Raspberry Pi 5 and Radxa's own ROCK series single-board computers.
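To put those headline numbers in context, here is a back-of-envelope sketch of the efficiency and host-link bandwidth they imply. These are derived figures, not vendor measurements; the 128b/130b encoding overhead used for the PCIe estimate is standard for Gen3.

```python
# Back-of-envelope figures derived from the headline specs above.

TOPS_INT8 = 25   # peak INT8 throughput (TOPS)
POWER_W = 3      # stated power draw (W)

# Efficiency in TOPS per watt.
efficiency = TOPS_INT8 / POWER_W
print(f"Efficiency: ~{efficiency:.2f} TOPS/W")   # ~8.33 TOPS/W

# PCIe Gen3 runs at 8 GT/s per lane with 128b/130b encoding,
# so usable payload rate is 8e9 * (128/130) bits/s per lane.
LANES = 2
per_lane_gbs = 8e9 * (128 / 130) / 8 / 1e9       # GB/s per lane
link_gbs = per_lane_gbs * LANES
print(f"Host link: ~{link_gbs:.2f} GB/s per direction")  # ~1.97 GB/s
```

At roughly 8 TOPS/W, the module sits comfortably in fanless-adjacent territory, which is why Radxa can get away with a thermal pad and enclosure rather than mandatory forced-air cooling in some deployments.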

The module integrates 1GB of LPDDR4X memory at 4266 MT/s and 1Gbit of QSPI NAND or NOR flash on a 4.2 × 2.2 cm (1.65 × 0.87 in) board. It operates reliably from -25°C (-13°F) to 65°C (149°F) without throttling, with thermal protection engaging between 65°C (149°F) and 85°C (185°F). The M.2 2242 form factor fits directly into compatible slots, and an included adapter allows mounting in M.2 2280 slots.
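The memory transfer rate translates into peak bandwidth as sketched below. Note that the bus width is an assumption on our part (Radxa's spec does not state it), so treat this as an illustrative upper bound rather than a confirmed figure.

```python
# Peak DRAM bandwidth = transfer rate x bus width.
# ASSUMPTION: a 32-bit (4-byte) bus, typical for a single
# LPDDR4X package; the actual width is not published.

MTS = 4_266_000_000   # 4266 MT/s as transfers per second
BUS_BYTES = 4         # 32-bit bus (assumed)

bandwidth_gbs = MTS * BUS_BYTES / 1e9
print(f"Peak memory bandwidth: ~{bandwidth_gbs:.1f} GB/s")  # ~17.1 GB/s
```

Even under that optimistic assumption, on-module memory bandwidth far exceeds the ~2 GB/s available over the PCIe Gen3 x2 link, which is why keeping model weights resident in the module's 1GB of LPDDR4X matters for sustained inference throughput.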

Software support comes through the DEEPX DXNN SDK, which handles model compilation, optimization, and hardware-accelerated inference. The toolkit supports PyTorch, ONNX, TensorFlow, and Keras models via the DX-COM compiler, converting them to the DXNN format for execution on the NPU. Developers get access to pre-compiled models for face detection, image classification, object detection, image denoising, semantic segmentation, and pose estimation, plus GStreamer plugins for real-time video processing. The stack runs on Windows 10/11 and Ubuntu Linux 24.04, 22.04, and 20.04 LTS, with Docker support for simplified deployment. DeepX maintains an open-source Linux PCIe driver for the NPU that integrates with the kernel build system, and Radxa provides installation documentation for both the driver and runtime environment on Debian-based distributions.
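Since the Linux PCIe driver integrates with the kernel build system, installation follows the standard out-of-tree module workflow. The repository URL and module name below are illustrative assumptions, not confirmed values; consult Radxa's installation documentation for the actual ones.

```shell
# Sketch of an out-of-tree kernel module build on a Debian-based
# distro. Repository URL and module name are ASSUMED for
# illustration; use the paths from Radxa's docs.

# Install a toolchain and headers matching the running kernel.
sudo apt install build-essential "linux-headers-$(uname -r)"

# Fetch the driver source (URL assumed).
git clone https://github.com/deepx-ai/npu-linux-driver.git
cd npu-linux-driver

# Build against the running kernel with the kbuild system.
make -C "/lib/modules/$(uname -r)/build" M="$PWD" modules

# Load the module (name assumed) and confirm the NPU enumerates.
sudo insmod ./dx_npu.ko
lspci | grep -i deepx
```

Building against the exact headers of the running kernel is the step that most often trips people up on Raspberry Pi 5 and other Arm boards, where the packaged headers can lag behind the installed kernel.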

The Radxa AICore DX-M1M is priced at $85 (€78) and requires active cooling or a metal enclosure with thermal pads for sustained performance. More details are available on the Radxa product page and getting started guide.