As part of my Master’s thesis, I have developed a robust data pipeline that transmits a live video feed from a drone to a SLAM system for feature detection and map construction. The system is also packaged with Docker for efficient deployment and portability.
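To give a feel for the drone-to-SLAM side of the pipeline, here is a minimal sketch of a frame sender. It assumes the feed is compressed to JPEG and streamed over a plain TCP socket with OpenCV; the actual transport, camera interface, and edge-server address in my pipeline differ, and the host shown is hypothetical.

```python
import socket
import struct

import cv2  # OpenCV, assumed available on the drone-side companion computer


def stream_frames(host: str, port: int, camera_index: int = 0) -> None:
    """Capture frames from the onboard camera and stream them as JPEG over TCP."""
    cap = cv2.VideoCapture(camera_index)
    sock = socket.create_connection((host, port))
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # Compress each frame so the uplink to the edge server stays within budget.
            ok, buf = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 80])
            if not ok:
                continue
            data = buf.tobytes()
            # Length-prefixed framing: 4-byte big-endian size, then the JPEG payload.
            sock.sendall(struct.pack(">I", len(data)) + data)
    finally:
        cap.release()
        sock.close()


if __name__ == "__main__":
    stream_frames("192.168.1.10", 5000)  # hypothetical edge-server address
```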

I have also incorporated network analysis into Edge-SLAM. By monitoring the latency and throughput of the live video feed in real time, the SLAM system can dynamically switch between edge processing and client processing. This enables Edge-SLAM to continue mapping even under suboptimal connectivity, making it a resilient and adaptable solution for autonomous drone navigation in extreme outdoor environments.
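The switching decision itself can be summarized in a few lines. The sketch below is only illustrative: the thresholds (LATENCY_LIMIT_MS, THROUGHPUT_FLOOR_MBPS) and the simple offload-or-fall-back rule are hypothetical placeholders, not the exact policy used in my system.

```python
# Hypothetical thresholds; real values would be tuned per deployment.
LATENCY_LIMIT_MS = 150.0
THROUGHPUT_FLOOR_MBPS = 4.0


class ProcessingModeSelector:
    """Choose where SLAM processing runs based on measured link quality."""

    def __init__(self) -> None:
        # Start on the edge server; fall back to on-board (client) processing
        # whenever the link degrades.
        self.mode = "edge"

    def update(self, latency_ms: float, throughput_mbps: float) -> str:
        """Update the mode from the latest latency/throughput sample and return it."""
        if latency_ms > LATENCY_LIMIT_MS or throughput_mbps < THROUGHPUT_FLOOR_MBPS:
            self.mode = "client"
        else:
            self.mode = "edge"
        return self.mode


# Example: a sample taken during poor connectivity triggers client processing.
selector = ProcessingModeSelector()
print(selector.update(latency_ms=220.0, throughput_mbps=2.5))  # -> "client"
```

In practice a rule like this would also add hysteresis so the system does not oscillate between modes when the link quality hovers around a threshold.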

Results

The testing setup included an Aurelia X6 Standard drone, a ZED depth camera for live depth data, and a Jetson AGX Xavier for SLAM computation. The ZED camera and the Jetson were mounted underneath the drone.

The Aurelia X6 Standard drone with the ZED depth camera and Jetson AGX Xavier used for testing.

Testing was carried out in the Engineering Building courtyard at Binghamton University. The videos below show the features and the drone location detected by our SLAM algorithm in real time.

Dense point cloud generated in a feature-rich environment by our SLAM algorithm.

Our SLAM algorithm detects the ground and the trees around the drone, even when obstacles directly in front of it are sparse.

Feature detection between walls and trees.