Research · Complete · 2025

ARES

Autonomous drone system that executes natural language mission commands

Next.js · ROS2 · LangGraph · Ollama · Python · PX4 · YOLO

Full mission execution in simulation; agentic takeoff demonstrated on real hardware

Video Demo

Northrop Grumman gave my UCF senior design group a research problem: build an agentic drone system that accepts natural language mission commands and executes them autonomously. Something like "fly around this room and identify all the red boxes" and the drone just does it, building a map as it goes.

How the pieces fit together

I led the software architecture. The core idea: use a locally-hosted LLM through Ollama and LangGraph to parse natural language commands into structured ROS2 action sequences that Nav2 and PX4 could actually execute. The LLM handles the translation from intent to action plan. Nav2 handles the navigation execution. PX4 talks to the flight controller. RTAB-Map handles real-time SLAM as the drone flies, continuously building a 3D spatial model of its environment.
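To make that pipeline concrete, here is a minimal sketch of the intent-to-plan stage, assuming a single-node LangGraph StateGraph that calls a local model through Ollama. The model name, prompt, and action schema are illustrative placeholders, not our exact implementation:

```python
import json
from typing import List, TypedDict

from langchain_ollama import ChatOllama
from langgraph.graph import END, StateGraph


class MissionState(TypedDict):
    command: str      # raw natural language from the operator
    plan: List[dict]  # structured actions for the ROS2 layer


# Locally hosted model served by Ollama; name and params are illustrative.
llm = ChatOllama(model="llama3.1", format="json", temperature=0)

# Hypothetical action schema for illustration only.
PROMPT = """You are a drone mission planner. Convert the command into a JSON
object {{"actions": [...]}} using only these action types:
takeoff(altitude_m), navigate_to(x, y, z), explore(region), detect(object), land().
Command: {command}"""


def plan_mission(state: MissionState) -> MissionState:
    """LLM node: translate operator intent into an ordered action list."""
    reply = llm.invoke(PROMPT.format(command=state["command"]))
    return {"command": state["command"], "plan": json.loads(reply.content)["actions"]}


graph = StateGraph(MissionState)
graph.add_node("plan", plan_mission)
graph.set_entry_point("plan")
graph.add_edge("plan", END)
planner = graph.compile()

if __name__ == "__main__":
    out = planner.invoke({"command": "fly around this room and identify all the red boxes",
                          "plan": []})
    print(out["plan"])  # e.g. [{"action": "takeoff", "altitude_m": 1.5}, ...]
```

Each action in the resulting plan then becomes a goal for the corresponding ROS2 action client: Nav2 for navigation, the PX4 interface for takeoff and landing.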

Custom ROS2 nodes handled image processing and ran a YOLO model for object detection. A WebSocket bridge connected the ROS ecosystem to a React dashboard so a human operator could see live video, the SLAM feed, and telemetry simultaneously. The human-in-the-loop design was intentional: the system was autonomous but the operator could monitor and intervene.
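As a sketch of what one of those nodes looks like, here is a minimal rclpy subscriber that runs an ultralytics YOLO model on incoming frames and publishes detections as JSON. The topic names and weights file are placeholders, not our actual configuration:

```python
import json

import rclpy
from cv_bridge import CvBridge
from rclpy.node import Node
from sensor_msgs.msg import Image
from std_msgs.msg import String
from ultralytics import YOLO


class YoloDetector(Node):
    """Subscribes to camera frames, runs YOLO, publishes detections as JSON."""

    def __init__(self):
        super().__init__("yolo_detector")
        self.bridge = CvBridge()
        self.model = YOLO("yolov8n.pt")  # weights file is illustrative
        # Topic names below are placeholders for the ZED2i feed and a detections topic.
        self.create_subscription(Image, "/zed2i/left/image_rect_color", self.on_image, 10)
        self.pub = self.create_publisher(String, "/detections", 10)

    def on_image(self, msg: Image):
        # Convert the ROS image to OpenCV, run inference, publish labels + confidences.
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        result = self.model(frame, verbose=False)[0]
        detections = [
            {"label": result.names[int(box.cls)], "conf": float(box.conf)}
            for box in result.boxes
        ]
        self.pub.publish(String(data=json.dumps(detections)))


def main():
    rclpy.init()
    rclpy.spin(YoloDetector())
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```

Publishing detections as plain JSON keeps the WebSocket bridge simple: the dashboard can relay the message payload to the browser without extra translation.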

Building hardware from scratch with no hardware background

None of us were electrical or computer engineers, so designing and building a custom drone from a bare frame was a genuinely steep learning curve. Parts had long lead times, and our first drone kit sold out while on order. We had real crashes during testing that set us back weeks. And the Jetson Orin Nano, the ZED2i stereo camera, the Pixhawk flight controller, and everything needed to run the full ROS2 stack add up to a lot of weight to get airborne stably.

The simulation environment in Gazebo worked reliably throughout, but the sim-to-reality gap was real. Behavior that was consistent in simulation sometimes fell apart on hardware in ways that were hard to reproduce and diagnose. That gap is a known problem in robotics and we hit it hard.

What we got to

Fully working simulation with complete mission execution, and agentic autonomous takeoff demonstrated on real hardware. We couldn't get full missions flying in the real world before the semester ended, primarily because the drone kept ending up overweight once all the sensors were mounted, and we ran out of time to retune the flight controller for the actual weight distribution. Northrop Grumman was impressed with the depth of the simulation work and the research quality. We produced a 120-page paper and presented it to UCF faculty and Northrop Grumman leadership, and at a science fair.