[ PROJECT: 003 ] [ MODE: DETAIL_VIEW ]

Crop Weed Detection

May 2025

YOLOv8 · OpenCV · ROS2 · Computer Vision · Edge AI

Accuracy

97%

Throughput

12 FPS

Dataset

3000 samples

System Architecture

Overview

A real-time precision agriculture system that detects crop weeds from an onboard camera feed using a fine-tuned YOLOv8 model. Deployed on the ROSMaster X3 ground robot, it runs inference at 12 FPS with 97% accuracy and publishes detection events as ROS2 topics for downstream actuation.

Problem Statement

Manual weed identification and removal is labor-intensive and imprecise. Herbicide over-application damages crops and the environment. This project targets autonomous weed detection aboard a ground robot navigating crop rows — enabling precise, per-plant interventions. The key constraint is edge deployment: the model must run at acceptable frame rates on an embedded ARM processor without a dedicated GPU.

Dataset & Training

A custom dataset of 3000 labeled images was assembled from the CropWeed Field Image Dataset and augmented with brightness jitter, random crop, and horizontal flip to improve robustness to lighting and field conditions. YOLOv8-nano was chosen for edge deployment. Fine-tuning ran for 100 epochs on an NVIDIA GPU, reaching 97% mAP@0.5 on the held-out test split.
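The brightness-jitter and horizontal-flip augmentations can be sketched in plain Python. This is an illustrative toy over nested-list grayscale images, not the project's actual pipeline (which presumably used a library such as Albumentations or the Ultralytics built-ins); note that flipping an image also requires mirroring the box x-coordinates so labels stay aligned.

```python
import random

def brightness_jitter(img, max_delta=30, rng=random):
    """Shift every pixel by one random offset in [-max_delta, +max_delta], clamped to 0-255."""
    delta = rng.randint(-max_delta, max_delta)
    return [[min(255, max(0, px + delta)) for px in row] for row in img]

def horizontal_flip(img):
    """Mirror each row left-to-right."""
    return [list(reversed(row)) for row in img]

def flip_box_x(box, width):
    """Mirror an (x1, y1, x2, y2) label box to match a horizontally flipped image."""
    x1, y1, x2, y2 = box
    return (width - x2, y1, width - x1, y2)
```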

ROS2 Integration

The inference pipeline was wrapped as a ROS2 lifecycle node subscribing to the robot's camera topic (/camera/rgb/image_raw). Each frame is preprocessed (resize to 640×640, normalize) and passed to the ONNX-exported model. Detections are published as a custom DetectionArray message containing bounding boxes, class IDs, and confidence scores. A second visualization node overlays detections onto the raw feed for operator monitoring.
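The shape of the custom message can be mocked with dataclasses. The field names below are assumptions inferred from the description (the real definition would be a ROS2 `.msg` file compiled by the IDL generator, not Python classes); the sketch shows how a downstream node might filter a `DetectionArray` down to weed detections.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Detection:
    box_xyxy: Tuple[float, float, float, float]  # (x1, y1, x2, y2) in pixels
    class_id: int
    confidence: float

@dataclass
class DetectionArray:
    stamp_ns: int                                   # frame timestamp, nanoseconds
    detections: List[Detection] = field(default_factory=list)

    def weeds(self, weed_id: int) -> List[Detection]:
        """Keep only the weed-class detections for downstream actuation."""
        return [d for d in self.detections if d.class_id == weed_id]
```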

Edge Deployment on ROSMaster X3

The ROSMaster X3 runs a Rockchip RK3588S SoC. The YOLOv8-nano ONNX model was quantized to INT8 and executed via the RKNN SDK, achieving 12 FPS — sufficient for row-speed robot navigation. CPU and memory usage stay below roughly 40%, leaving about 60% headroom for the navigation and communication stacks.
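The RKNN toolkit performs calibration-based quantization internally, but the core idea behind the 4× size reduction (32-bit floats down to 8-bit integers) can be illustrated with a minimal symmetric per-tensor scheme:

```python
def quantize_int8(weights):
    """Symmetric per-tensor INT8 quantization: map floats onto [-127, 127]."""
    m = max(abs(w) for w in weights)
    scale = m / 127.0 if m > 0 else 1.0
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the INT8 codes."""
    return [qi * scale for qi in q]
```

Each weight is stored as one signed byte plus a shared float scale, so storage drops by roughly 4× at the cost of a small rounding error bounded by half a quantization step.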

Downstream Actuation

Detection events trigger a ROS2 action server that logs the GPS coordinates of each detected weed (via GNSS fusion). In the full system concept, this feeds a sprayer controller for targeted herbicide application. The current implementation outputs a CSV weed map for post-mission analysis, with the actuation interface stubbed for hardware integration.

weed_detector.py
def detect(self, frame):
    # Run inference; keep only detections above 0.5 confidence
    results = self.model.predict(frame, conf=0.5)[0]
    for box in results.boxes:
        cls = int(box.cls.item())
        if cls == WEED_ID:
            self.flag(box.xyxy)  # bounding box as (x1, y1, x2, y2)
Project Demo (video)

Tools & Stack

YOLOv8 · Python 3.11 · OpenCV · ROS2 (Humble) · ROSMaster X3 · ONNX · RKNN SDK · PyTorch · NumPy

Key Outcomes

  • 97% mAP@0.5 on held-out test set of 600 images

  • 12 FPS real-time inference on ROSMaster X3 ARM processor

  • Full ROS2 pipeline from camera feed to weed map output

  • INT8 quantization reducing model size by 4× vs float32