Autonomous Navigation and Path Planning

I am developing network-aware, edge-based SLAM techniques and tailoring them for deployment on small drones with limited computational power. I have built a robust data pipeline that transmits a live video feed from the drone to a SLAM system for feature detection and map construction. The system is also packaged for Docker, making deployment efficient and portable.
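As an illustration of the client-side sender such a pipeline needs, here is a minimal sketch; the edge-server address, port, JPEG quality, and length-prefixed framing are hypothetical choices for the example, not the actual pipeline's protocol:

```python
import socket
import struct
import cv2  # OpenCV for camera capture and JPEG encoding

EDGE_HOST = "192.168.1.50"  # hypothetical edge-server address
EDGE_PORT = 9000            # hypothetical port

def stream_frames():
    """Capture frames from the onboard camera and push them to the edge server."""
    cap = cv2.VideoCapture(0)
    sock = socket.create_connection((EDGE_HOST, EDGE_PORT))
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # JPEG-compress to keep per-frame payloads small over the radio link
            ok, buf = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 80])
            if not ok:
                continue
            data = buf.tobytes()
            # Length-prefix each frame so the receiver can delimit messages
            sock.sendall(struct.pack("!I", len(data)) + data)
    finally:
        cap.release()
        sock.close()

if __name__ == "__main__":
    stream_frames()
```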

I have also incorporated network analysis into Edge-SLAM. By monitoring the latency and throughput of the live video feed in real time, the SLAM system can dynamically switch between edge processing and client processing. This lets Edge-SLAM continue mapping even under suboptimal connectivity, making it a resilient and adaptable solution for autonomous drone navigation in extreme outdoor environments.
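One way to realize this switching logic is sketched below; the 200 ms latency cutoff, 2 Mbps throughput floor, and five-sample hysteresis window are illustrative assumptions, not the deployed values:

```python
# Illustrative thresholds; the real system's cutoffs would be tuned empirically
LATENCY_LIMIT_MS = 200.0      # switch to client processing above this latency
THROUGHPUT_FLOOR_MBPS = 2.0   # ...or below this throughput
HYSTERESIS_SAMPLES = 5        # require N consecutive samples before switching

class ModeSelector:
    """Chooses between edge and client SLAM processing from link measurements."""

    def __init__(self):
        self.mode = "edge"
        self._streak = 0

    def update(self, latency_ms: float, throughput_mbps: float) -> str:
        link_bad = (latency_ms > LATENCY_LIMIT_MS
                    or throughput_mbps < THROUGHPUT_FLOOR_MBPS)
        want = "client" if link_bad else "edge"
        if want == self.mode:
            self._streak = 0          # link agrees with current mode; reset counter
        else:
            self._streak += 1         # count consecutive samples favoring a switch
            if self._streak >= HYSTERESIS_SAMPLES:
                self.mode = want      # switch only after sustained evidence
                self._streak = 0
        return self.mode
```

The hysteresis window keeps the system from thrashing between modes when link quality hovers near the thresholds.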

Results

The testing setup consisted of an Aurelia X6 Standard drone, a ZED depth camera for live depth data, and a Jetson AGX Xavier for SLAM computation. The ZED camera and the Jetson were mounted under the drone.

The Aurelia X6 Standard drone with the ZED depth camera and Jetson AGX Xavier used for testing.

Testing was carried out in the Engineering Building courtyard at Binghamton University. The videos below show the features and the drone location detected by our SLAM algorithm in real time.

Dense point cloud generated in a feature-rich environment by our SLAM algorithm.

Our SLAM algorithm is able to detect the floor and the trees around the drone, even when obstacles in front of the drone are sparse.

Feature detection between walls and trees.

Robot Task Planning with LLMs

During my undergraduate research, I explored the ability of large language models such as GPT-3 to interpret grounded instructions for everyday tasks that robots can execute. Through prompt engineering and parameter optimization, I demonstrated that LLMs can efficiently understand and break everyday tasks down into executable actions without retraining or fine-tuning.
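A minimal sketch of this prompting pattern is below, assuming the legacy GPT-3-era OpenAI completions API (openai-python < 1.0); the prompt wording and the goto/pick/place action vocabulary are illustrative, not those used in the paper:

```python
import openai  # openai-python < 1.0; reads the OPENAI_API_KEY environment variable

# Illustrative few-shot prompt grounding the model in a fixed action vocabulary
PROMPT_TEMPLATE = """\
Break the task into robot actions, one per line, using only:
goto(<location>), pick(<object>), place(<object>, <location>).

Task: bring the mug from the kitchen to the desk
Actions:
goto(kitchen)
pick(mug)
goto(desk)
place(mug, desk)

Task: {task}
Actions:
"""

def plan(task: str) -> list[str]:
    """Ask GPT-3 to decompose a grounded instruction into executable actions."""
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=PROMPT_TEMPLATE.format(task=task),
        temperature=0.0,   # deterministic decoding for repeatable plans
        max_tokens=128,
    )
    text = response["choices"][0]["text"]
    # Each non-empty line is one executable action string
    return [line.strip() for line in text.splitlines() if line.strip()]
```

Because the action vocabulary is stated in the prompt itself, the model's output can be parsed and dispatched directly, with no retraining or fine-tuning.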

To read more about the implementation, experiments, and results, please check out the paper here.