What Is a Companion Computer and Why Do You Need One?
A flight controller — even a sophisticated one running PX4 or ArduPilot — is a real-time embedded system with limited computational resources. The STM32H7 processor in a Pixhawk 6C runs at 480 MHz with 1MB of RAM. That's enough for sensor fusion, PID control, and mission execution, but not for running computer vision algorithms, neural networks, or complex situational awareness pipelines.
A companion computer is a second processor onboard the drone, connected to the flight controller via MAVLink. The companion computer handles high-level decision-making and computationally intensive tasks, while the flight controller handles real-time attitude control and safety.
Typical companion computer applications:
- Object detection and avoidance (YOLO, OpenCV)
- Visual odometry (ORB-SLAM, Intel RealSense T265 tracking camera)
- Precision landing on AprilTags or landing pads
- Payload control (cameras, servo mechanisms, release mechanisms)
- Telemetry aggregation and logging to cloud services
- On-device AI inference (person detection, crop analysis, infrastructure inspection)
- ROS 2 mission scripting and dynamic re-planning
Hardware Options
Raspberry Pi 4 / Pi 5
The Raspberry Pi 4 is the most common companion computer for drone projects. It is well-documented, has a large community, and uses standard interfaces.
Raspberry Pi 4 (4GB):
- CPU: 1.8 GHz quad-core Cortex-A72
- RAM: 4 GB
- Interfaces: USB 3.0 (×2), USB 2.0 (×2), UART, SPI, I2C, GPIO
- Power consumption: 3–7W typical in-flight
- Weight: 46g
- Best for: MAVROS, mission scripting, basic computer vision
Raspberry Pi 5 (8GB):
- CPU: 2.4 GHz quad-core Cortex-A76
- RAM: 8 GB
- Power consumption: 5–12W
- Weight: 51g
- Best for: More demanding computer vision, ROS 2 Humble, real-time inference at 30fps
The Pi 4 runs ROS 2 Humble on Ubuntu Server 22.04 LTS (arm64 image); ROS 2 Foxy pairs with Ubuntu 20.04 but reached end-of-life in 2023. Install the official Ubuntu ARM image rather than Raspberry Pi OS for best ROS 2 compatibility.
NVIDIA Jetson Series
Jetson modules are purpose-built for AI inference on the edge. The GPU provides massive parallelism for neural network inference tasks that would be impractical on the Pi.
| Module | CPU | GPU | RAM | Power | Weight | Best For |
|---|---|---|---|---|---|---|
| Jetson Nano (B01) | 1.4 GHz A57 ×4 | 128-core Maxwell | 4 GB | 5–10W | 140g (devkit) | Entry-level AI inference |
| Jetson Orin Nano | 1.5 GHz A78AE ×6 | 1024-core Ampere | 8 GB | 7–15W | Custom | YOLO v8, depth estimation |
| Jetson Orin NX 16GB | 2 GHz A78AE ×8 | 1024-core Ampere | 16 GB | 10–25W | Custom | Production AI payloads |
| Jetson AGX Orin | 2.2 GHz A78AE ×12 | 2048-core Ampere | 64 GB | 15–60W | Heavy | Maximum performance |
The Jetson Orin Nano on a SEEED reComputer carrier board (approximately 50g) is a popular balance of capability and weight for research builds. The Jetson Nano B01 on its developer kit is too heavy (140g) and too large for typical UAV integration — use a carrier board like the Waveshare Nano C or similar.
Khadas VIM4 / Orange Pi 5
These ARM-based SBCs offer competitive CPU/GPU performance at lower cost than Jetson.
Orange Pi 5: Rockchip RK3588S, 4-core Cortex-A76 + 4-core Cortex-A55, NPU 6 TOPS, 8–16 GB RAM. Runs Ubuntu 22.04, ROS 2 Humble. Excellent value for MAVROS setups that don't require CUDA.
Khadas VIM4: Amlogic A311D2, 4-core Cortex-A73 + 4-core Cortex-A53, 8 GB RAM. Strong media encoding capability — good for onboard video recording alongside autonomous operation.
MAVLink Connection: UART vs USB vs Ethernet
The companion computer communicates with the flight controller via MAVLink protocol. Physical interface options:
UART (Serial)
UART is the most reliable and lowest-latency connection for MAVLink. Most flight controllers expose one or more UART ports as telemetry connections (TELEM1, TELEM2 on Pixhawk).
Connect the Raspberry Pi's UART TX/RX pins (GPIO 14 and 15, BCM numbering) to the FC's telemetry UART, crossed over: Pi TX to FC RX, and Pi RX to FC TX. Set the same baud rate on both ends (typically 57600 or 921600 bps).
Raspberry Pi UART configuration:
# Enable UART on Pi 4/5, add to /boot/config.txt:
enable_uart=1
# Disable the console serial (so UART is free for MAVLink):
# In /boot/cmdline.txt, remove: console=serial0,115200
# Test UART connection:
sudo apt install minicom
minicom -b 921600 -D /dev/serial0
Voltage levels matter: the Raspberry Pi GPIO operates at 3.3V logic. Pixhawk UART ports are also 3.3V. Check your specific FC — some output 5V on UART, which will damage the Pi. Use a logic level shifter if needed.
USB
USB is easier to set up and handles higher data rates, but can add slightly more latency than a raw UART because transfers are scheduled and buffered by the USB stack. Connect the FC's USB-C/micro-USB port to the Pi's USB port.
The FC appears as /dev/ttyACM0 or /dev/ttyACM1 on Linux. MAVProxy or MAVROS connects to this device path.
# Check that device is present:
ls -la /dev/ttyACM*
# Add user to dialout group for serial access:
sudo usermod -aG dialout $USER
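Device enumeration order isn't guaranteed, so a startup script shouldn't hard-code /dev/ttyACM0. A minimal sketch of a port-picking helper (pick_fc_port is a hypothetical name, not a library function):

```python
import glob

def pick_fc_port(candidates):
    """Return the lowest-numbered /dev/ttyACM* path, or None if none exist."""
    acm = sorted(p for p in candidates if p.startswith("/dev/ttyACM"))
    return acm[0] if acm else None

if __name__ == "__main__":
    # Scan the actual device tree on the companion computer:
    print("Flight controller candidate:", pick_fc_port(glob.glob("/dev/ttyACM*")))
```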
Ethernet / UDP
For high-bandwidth applications (multiple camera streams, dense logging), Ethernet via UDP is the preferred connection. The Pixhawk 6C supports Ethernet natively. Connect via an Ethernet switch or direct cable. Configure both devices on the same subnet and use UDP MAVLink on port 14550.
UDP has the highest bandwidth but requires more configuration and is not appropriate for critical control links (use as secondary only).
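A quick way to confirm MAVLink traffic is reaching the companion computer is to bind the port and check the first byte of each datagram: MAVLink v2 frames start with 0xFD, v1 frames with 0xFE. A rough sketch (listen_once is illustrative, not a library call):

```python
import socket

MAVLINK_V1_MAGIC = 0xFE
MAVLINK_V2_MAGIC = 0xFD

def mavlink_version(datagram: bytes):
    """Return 1 or 2 for a MAVLink frame, None for anything else."""
    if not datagram:
        return None
    if datagram[0] == MAVLINK_V2_MAGIC:
        return 2
    if datagram[0] == MAVLINK_V1_MAGIC:
        return 1
    return None

def listen_once(port: int = 14550):
    """Bind the standard MAVLink GCS port and classify the first datagram."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    data, addr = sock.recvfrom(512)
    print(f"{len(data)} bytes from {addr}, MAVLink v{mavlink_version(data)}")
```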
MAVROS Setup on ROS 2
MAVROS (MAVLink ROS) provides ROS 2 topics and services as a bridge to the MAVLink-connected flight controller. It translates between ROS 2 message types and MAVLink messages in both directions.
Install ROS 2 Humble and MAVROS
# Install ROS 2 Humble (Ubuntu 22.04):
sudo apt install software-properties-common
sudo add-apt-repository universe
sudo apt update && sudo apt install curl -y
curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.asc | sudo tee /etc/apt/trusted.gpg.d/ros.asc
echo "deb http://packages.ros.org/ros2/ubuntu $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/ros2.list
sudo apt update
sudo apt install ros-humble-ros-base -y
# Install MAVROS and geographic libraries:
sudo apt install ros-humble-mavros ros-humble-mavros-extras -y
wget https://raw.githubusercontent.com/mavlink/mavros/master/mavros/scripts/install_geographiclib_datasets.sh
sudo bash ./install_geographiclib_datasets.sh
Launch MAVROS
# Source ROS 2:
source /opt/ros/humble/setup.bash
# Launch MAVROS connected via serial UART at 921600 baud:
ros2 launch mavros px4.launch fcu_url:=serial:///dev/serial0:921600
# Or via USB:
ros2 launch mavros px4.launch fcu_url:=serial:///dev/ttyACM0:115200
# Verify connection — should show MAVLink heartbeat:
ros2 topic echo /mavros/state
A successful connection shows connected: True in the state topic. You can then read telemetry, send commands, and control flight modes via ROS 2 topics.
Key MAVROS Topics
| Topic | Direction | Description |
|---|---|---|
| /mavros/state | Subscribe | Connection status, flight mode |
| /mavros/imu/data | Subscribe | IMU data (accelerometer, gyro) |
| /mavros/global_position/global | Subscribe | GPS position (lat/lon/alt) |
| /mavros/local_position/pose | Subscribe | Local EKF position (X/Y/Z) |
| /mavros/battery | Subscribe | Battery voltage and percentage |
| /mavros/setpoint_position/local | Publish | Send position setpoints |
| /mavros/setpoint_velocity/cmd_vel | Publish | Send velocity commands |
| /mavros/cmd/arming | Service call | Arm/disarm vehicle |
| /mavros/set_mode | Service call | Change flight mode |
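As a sketch of how these topics are consumed, a minimal rclpy node subscribed to /mavros/state might look like the following (StateWatcher and describe are illustrative names; assumes ros-humble-mavros is installed and the environment is sourced):

```python
def describe(connected: bool, armed: bool, mode: str) -> str:
    """Summarize a mavros_msgs/State message as one log line."""
    link = "FCU link up" if connected else "FCU link DOWN"
    arm = "armed" if armed else "disarmed"
    return f"{link} | {arm} | mode={mode}"

try:
    import rclpy
    from rclpy.node import Node
    from mavros_msgs.msg import State

    class StateWatcher(Node):
        def __init__(self):
            super().__init__("state_watcher")
            # QoS depth 10 is fine for the low-rate state topic:
            self.create_subscription(State, "/mavros/state", self.on_state, 10)

        def on_state(self, msg):
            self.get_logger().info(describe(msg.connected, msg.armed, msg.mode))

    def main():
        rclpy.init()
        rclpy.spin(StateWatcher())
except ImportError:
    pass  # rclpy/mavros not installed (e.g. running off-target)
```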
DroneKit Python: Simpler Scripting
For mission scripting without the complexity of ROS 2, DroneKit-Python provides a higher-level API over MAVLink. It connects directly to the FC via a serial or UDP connection without requiring ROS.
from dronekit import connect, VehicleMode, LocationGlobalRelative
import time

# Connect to vehicle via serial:
vehicle = connect('/dev/serial0', wait_ready=True, baud=921600)

# Print current state:
print(f"Mode: {vehicle.mode.name}")
print(f"Armed: {vehicle.armed}")
print(f"GPS: {vehicle.gps_0}")
print(f"Battery: {vehicle.battery}")

# Arm and take off to 10m:
vehicle.mode = VehicleMode("GUIDED")
vehicle.armed = True
while not vehicle.armed:
    time.sleep(1)
vehicle.simple_takeoff(10)  # meters

# Wait until altitude reached:
while True:
    if vehicle.location.global_relative_frame.alt >= 9.5:
        break
    time.sleep(0.5)

# Move to a waypoint:
target = LocationGlobalRelative(47.3977419, 8.5456058, 10)
vehicle.simple_goto(target)
time.sleep(20)

# Land:
vehicle.mode = VehicleMode("LAND")
vehicle.close()
DroneKit is simpler than MAVROS for straightforward mission automation, but the project is no longer actively maintained, has weaker support for newer PX4 versions, and lacks the real-time sensor stream capabilities of MAVROS.
Computer Vision Pipeline
A typical computer vision pipeline on a companion computer uses:
- Camera capture: USB camera (e.g., Logitech C920), CSI camera (Raspberry Pi Camera Module 3), or depth camera (Intel RealSense D435, Stereolabs ZED)
- Frame processing: OpenCV for classical computer vision, TensorFlow Lite or ONNX Runtime for inference
- MAVLink command output: Based on detection results, send position/velocity setpoints to flight controller
import cv2
import numpy as np
from dronekit import connect

# Initialize camera:
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

# Connect to FC:
vehicle = connect('/dev/serial0', wait_ready=True, baud=921600)

while True:
    ret, frame = cap.read()
    if not ret:
        continue
    # Example: detect red objects with HSV thresholding
    # (red wraps around hue 180; OR in a second mask for ~170-180 if needed):
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        if cv2.contourArea(largest) > 1000:
            M = cv2.moments(largest)
            cx = int(M["m10"] / M["m00"])
            cy = int(M["m01"] / M["m00"])
            # cx, cy = pixel coordinates of target center
            # Convert to real-world coordinates and send MAVLink setpoint...
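The conversion hinted at in the last comment can be approximated from the camera's horizontal field of view with a pinhole model. A sketch (the 62.2 degree HFOV is the Raspberry Pi Camera v2 spec; substitute your camera's value):

```python
import math

def pixel_to_bearing(cx, frame_width=640, hfov_deg=62.2):
    """Horizontal angle (degrees) from image center to a target pixel,
    using a pinhole camera model."""
    # Focal length in pixels, derived from the horizontal field of view:
    fx = (frame_width / 2) / math.tan(math.radians(hfov_deg / 2))
    return math.degrees(math.atan((cx - frame_width / 2) / fx))

# A target at the image center needs no yaw correction;
# one at the right edge sits at half the HFOV to the right.
```

The resulting bearing can feed a yaw-rate or velocity setpoint; vertical angle follows the same formula with the vertical FOV and frame height.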
For neural network inference on Jetson, use TensorRT for maximum throughput. A YOLOv8n model on the Jetson Orin Nano runs at 60+ FPS.
Power Requirements and Thermal Management
Power Budget
Companion computers draw significant power. Budget carefully:
| Computer | Idle Power | Peak Load Power | Recommended Supply |
|---|---|---|---|
| Raspberry Pi 4 (4GB) | 3.4W | 7.6W | 5V/3A (15W) BEC |
| Raspberry Pi 5 (8GB) | 4.5W | 12W | 5V/5A (25W) BEC |
| Jetson Nano (devkit) | 5W | 10W | 5V/4A (20W) BEC |
| Jetson Orin Nano | 7W | 15W | 5V/5A or direct 5V |
| Orange Pi 5 | 3W | 8W | 5V/3A (15W) BEC |
Use a dedicated BEC regulator for the companion computer — never share with ESC BEC outputs that also power servos or high-current devices. Voltage droop from shared rails causes brownouts and hard crashes.
Thermal Management
At full load, a Raspberry Pi 5 can reach 80°C without airflow. In flight, the aircraft's forward motion provides some cooling, but this may not be sufficient during stationary hover or slow flight.
Options:
- Active cooling: small 25×25mm 5V fan, 1–3g, effective even at low RPM
- Passive heatsink: sufficient for Pi 4 in most drone applications below 30°C ambient
- Throttling: Linux throttles the CPU at 80°C; monitor the SoC temperature with vcgencmd measure_temp
Mount the companion computer where it receives airflow. Avoid mounting it directly under a battery, which traps heat. A bottom-plate mount exposed to underframe airflow is often sufficient.
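That monitoring is easy to automate. A small sketch that parses the vcgencmd output format (temp=48.9'C) and warns before the throttle point (parse_temp is an illustrative helper):

```python
import re
import subprocess

def parse_temp(output: str) -> float:
    """Extract degrees Celsius from vcgencmd output like "temp=48.9'C"."""
    m = re.search(r"temp=([\d.]+)", output)
    if m is None:
        raise ValueError(f"unexpected vcgencmd output: {output!r}")
    return float(m.group(1))

def check_soc_temp(warn_at: float = 75.0):
    """Read the SoC temperature and warn before the 80C throttle threshold."""
    out = subprocess.run(["vcgencmd", "measure_temp"],
                         capture_output=True, text=True).stdout
    t = parse_temp(out)
    if t >= warn_at:
        print(f"WARNING: SoC at {t:.1f} C, approaching throttle threshold")
    return t
```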
Frequently Asked Questions
Do I need a companion computer for autonomous waypoint missions?
No. Flight controllers running ArduPilot or PX4 can execute pre-planned waypoint missions autonomously without a companion computer. You upload the mission via QGroundControl and it executes on the FC. A companion computer is needed only for dynamic replanning, computer vision, external sensor integration, or tasks that exceed the FC's computational capacity.
What is the minimum RAM for running MAVROS?
MAVROS itself uses approximately 200–400 MB RAM. A full ROS 2 Humble desktop install uses 1–2 GB. The Raspberry Pi 4 with 4 GB is the practical minimum for comfortable MAVROS + basic computer vision. The 2 GB Pi 4 variant is marginal and may struggle under load.
Can I use a Raspberry Pi Zero 2 W as a companion computer?
The Pi Zero 2 W (1 GHz quad-core A53, 512 MB RAM) can run DroneKit or pymavlink scripts for basic applications: logging, telemetry relay, or simple mission scripts. It cannot comfortably run ROS 2 with MAVROS or any computer vision pipeline due to the RAM limitation. At 10g it's excellent for weight-constrained builds where only lightweight scripting is needed.
How does the companion computer affect flight time?
A companion computer's draw comes straight out of the flight battery: each watt costs about 1 Wh per flight hour, roughly 0.07 Ah from a 4S pack at 14.8 V nominal. A Raspberry Pi 4 consuming 5W average reduces a 30-minute flight on a 4S 3000mAh battery to approximately 28 minutes. Plan your battery capacity accordingly or use a separate small LiPo for companion power (common in long-endurance builds).
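That estimate is straightforward to reproduce. A sketch of the arithmetic, assuming a 4S nominal voltage of 14.8 V and the 30-minute baseline from the example:

```python
def flight_minutes(capacity_wh: float, total_power_w: float) -> float:
    """Ideal flight time in minutes: pack energy divided by average power draw."""
    return capacity_wh / total_power_w * 60

pack_wh = 14.8 * 3.0          # 4S 3000mAh at nominal voltage = 44.4 Wh
base_w = pack_wh / (30 / 60)  # power implied by a 30-minute baseline = 88.8 W

baseline = flight_minutes(pack_wh, base_w)       # 30 minutes
with_pi = flight_minutes(pack_wh, base_w + 5.0)  # ~28.4 minutes with a 5W Pi 4
```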
What's the best way to handle safe shutdown of the companion computer before landing?
Write a script that monitors the MAVLink HEARTBEAT message and initiates a clean shutdown (sudo shutdown -h now) once the armed flag in base_mode clears after landing. Alternatively, monitor the flight controller's mode: when it transitions to LAND and then disarms, trigger shutdown. Never just cut power to a running Linux system; unclean shutdowns corrupt the SD card over time. A hardware watchdog combined with a read-only root filesystem (overlayfs) eliminates this concern entirely for production deployments.
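A disarm watcher takes only a few lines with pymavlink. A sketch, with the actual shutdown call commented out for safety and a hypothetical 2-second debounce:

```python
import time

MAV_MODE_FLAG_SAFETY_ARMED = 0x80  # armed bit in HEARTBEAT.base_mode

def is_armed(base_mode: int) -> bool:
    """True if the HEARTBEAT base_mode has the armed bit set."""
    return bool(base_mode & MAV_MODE_FLAG_SAFETY_ARMED)

def watch_for_disarm():
    from pymavlink import mavutil
    conn = mavutil.mavlink_connection("/dev/serial0", baud=921600)
    conn.wait_heartbeat()
    was_armed = False
    while True:
        hb = conn.recv_match(type="HEARTBEAT", blocking=True, timeout=5)
        if hb is None:
            continue
        armed = is_armed(hb.base_mode)
        if was_armed and not armed:
            time.sleep(2)  # debounce: confirm this isn't a transient glitch
            # subprocess.run(["sudo", "shutdown", "-h", "now"])
            print("Vehicle disarmed after flight; would shut down here")
            return
        was_armed = armed
```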
Which companion computer is best for a 5" long-range build?
Weight is the dominant constraint on a 5" build. The Raspberry Pi Zero 2 W (10g) handles DroneKit scripting and lightweight logging. The Raspberry Pi 4 2GB (45g) handles MAVROS and moderate computer vision. Anything heavier (Jetson, Pi 5) typically requires upgrading to a 7" or larger frame. For pure MAVLink telemetry forwarding without computation, a small ESP32 (4g) running MAVLink-router is the lightest option. See the MAVLink telemetry guide for ESP32 Wi-Fi bridge setup details.
How do I store maps and offline data on the companion computer?
For operations in areas with poor connectivity, pre-download map tiles to QGroundControl's cache directory on the companion computer before flight. ArduPilot and PX4 support fully offline mission execution — the GCS only needs a map for visualization, not for mission execution. Store missions as .waypoints or .plan files on the companion computer and load them via MAVLink at flight time. For terrain-following missions, pre-download the terrain elevation database to the FC's SD card as well. Log all telemetry locally on the companion computer as a backup to the FC's dataflash log.
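Since .plan files are plain JSON, a pre-flight script can validate a stored mission before loading it. A sketch assuming the standard QGroundControl layout (fileType at the top level, mission items under mission.items):

```python
import json

def count_waypoints(plan_text: str) -> int:
    """Count mission items in a QGroundControl .plan JSON document."""
    plan = json.loads(plan_text)
    if plan.get("fileType") != "Plan":
        raise ValueError("not a QGC .plan file")
    return len(plan["mission"]["items"])

# Minimal example document: one takeoff (MAV_CMD 22) and one waypoint (16):
example = '{"fileType": "Plan", "mission": {"items": [{"command": 22}, {"command": 16}]}}'
```

A zero-waypoint result before takeoff is a cheap way to catch a missing or truncated mission file while the drone is still on the bench.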
Related Articles
PX4 vs ArduPilot: The Complete Comparison for Drone Developers
A thorough comparison of PX4 and ArduPilot flight firmware — architecture, hardware support, use cases, community, and which to choose.
MAVLink Telemetry Setup and Optimization Guide
Complete guide to MAVLink telemetry for drones — protocol overview, hardware options, GCS setup, link budget, latency tradeoffs, encrypted telemetry, and debugging.
PX4 Setup Guide: Complete Step-by-Step for Beginners
A complete beginner's guide to setting up PX4 autopilot — hardware selection, QGroundControl installation, firmware flashing, sensor calibration, RC setup, flight modes, and first flight.