Gaze-Driven In-Flight Drone Control Uses Lightweight Wearable Eye-Tracking System

Researchers from the University of Pennsylvania, New York University, and the U.S. Army Research Laboratory have developed a gaze-driven method of controlling unmanned aerial vehicles. The deep learning system uses NVIDIA GPUs to enable in-flight drone control: the pilot simply looks in the direction they want the drone to travel.

The project uses Tobii Pro Glasses 2 – a lightweight, non-invasive, wearable eye-tracking system that also includes an inertial measurement unit (IMU) and a high-definition camera – to detect the position of the operator's eyes. A portable NVIDIA Jetson TX2, which pairs a CPU with a GPU, processes the data. With this system in place, the glasses detect the drone's position, track the operator's eye movements, combine the location and orientation data of those two elements, and estimate the distance between the operator and the UAV from the drone's apparent size.
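That last step, estimating distance from apparent size, can be illustrated with a standard pinhole-camera calculation. The sketch below is not the authors' implementation; the drone width, focal length, and function name are illustrative assumptions.

```python
# Minimal sketch of monocular distance estimation from apparent size,
# assuming a pinhole camera model. All constants are illustrative
# assumptions, not values from the paper.

DRONE_WIDTH_M = 0.25      # assumed physical width of the drone (meters)
FOCAL_LENGTH_PX = 1100.0  # assumed focal length of the HD camera (pixels)

def estimate_distance(bbox_width_px: float) -> float:
    """Estimate operator-to-drone distance from the drone's apparent
    width in the image: distance = f * W_real / w_pixels."""
    return FOCAL_LENGTH_PX * DRONE_WIDTH_M / bbox_width_px

# Example: a 55-pixel-wide detection implies roughly 5 m of separation.
print(f"{estimate_distance(55.0):.2f} m")
```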

“To compute the 3D navigation waypoint, we use the 2D gaze coordinate provided from the glasses to compute a pointing vector from the glasses, and then randomly select the waypoint depth within a predefined safety zone,” the research paper explains. “Ideally, the 3D navigation waypoint would come directly from the eye tracking glasses, but we found in our experiments that the depth component reported by the glasses was too noisy to use effectively. In the future, we hope to further investigate this issue in order to give the user more control over depth.”
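In code, the approach the authors describe might look like the following sketch: back-project the 2D gaze coordinate through the camera intrinsics to obtain a pointing ray from the glasses, then draw the waypoint depth randomly from a predefined safety zone instead of trusting the glasses' noisy depth signal. The intrinsics matrix `K` and the safety-zone bounds are illustrative assumptions, not values from the paper.

```python
import numpy as np

K = np.array([[1100.0,    0.0, 960.0],   # assumed camera intrinsics
              [   0.0, 1100.0, 540.0],   # (focal lengths and principal
              [   0.0,    0.0,   1.0]])  # point are illustrative)

SAFETY_DEPTH_M = (1.0, 3.0)  # assumed depth bounds of the safety zone (m)
rng = np.random.default_rng()

def gaze_to_waypoint(gaze_px):
    """Back-project a 2D gaze pixel into a unit pointing ray from the
    glasses, then place the 3D waypoint at a randomly drawn depth
    inside the safety zone."""
    ray = np.linalg.inv(K) @ np.array([gaze_px[0], gaze_px[1], 1.0])
    ray /= np.linalg.norm(ray)           # unit pointing vector
    depth = rng.uniform(*SAFETY_DEPTH_M)
    return ray * depth                   # waypoint in the camera frame

print(gaze_to_waypoint((1010.0, 500.0)))
```

Note that the resulting waypoint is expressed in the glasses' camera frame; in a full pipeline it would still need to be transformed into the drone's navigation frame using the combined location and orientation data described above.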

In the future, the researchers expect their system to enable people with very little drone experience to fly drones safely and effectively in situations where a dedicated drone pilot may not be available.