PX4 + ROS2 + Visual SLAM + VIO + Obstacle Avoidance + Autonomous Return-Home
This repository contains the full software stack for a fully autonomous drone capable of navigating, mapping, avoiding obstacles, detecting targets, and returning to its launch position without GPS.
Designed for research-grade missions, ISRO-style No-GPS challenges, DARPA-style underground navigation, and indoor autonomous flight.
- GPS-Denied Navigation
- Visual SLAM + VIO Fusion
- Autonomous Exploration
- Obstacle Detection & Avoidance
- Visual Landing Pad Detection
- No-GPS Return-to-Home (Keyframe Homing)
- Fully ROS2-based Modular Architecture
- PX4 Offboard Flight Control
- Simulation Ready (PX4 SITL + Gazebo/Ignition)
Visual SLAM:
- Frontend: feature extraction + tracking
- Backend: bundle adjustment + loop closure
- Output topics: /slam/pose, /slam/map (subscriber sketch below)
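
As a concrete example of how downstream nodes consume the SLAM output, here is a minimal rclpy subscriber sketch, assuming /slam/pose is published as a geometry_msgs/PoseStamped (the message type is an assumption, not specified above):

```python
# Minimal consumer of the SLAM pose topic.
# Assumption: /slam/pose is a geometry_msgs/msg/PoseStamped.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped


class SlamPoseListener(Node):
    def __init__(self):
        super().__init__('slam_pose_listener')
        self.create_subscription(PoseStamped, '/slam/pose', self.on_pose, 10)

    def on_pose(self, msg: PoseStamped):
        p = msg.pose.position
        self.get_logger().info(f'SLAM pose: x={p.x:.2f} y={p.y:.2f} z={p.z:.2f}')


def main():
    rclpy.init()
    rclpy.spin(SlamPoseListener())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```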
State estimation (VIO fusion):
- Fuses IMU + SLAM + optical flow + barometer
- Publishes /odom and /tf: a smooth, drift-corrected pose (tf2 lookup sketch below)
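
Downstream nodes usually read the fused pose through tf2 rather than subscribing to /odom directly. A minimal lookup sketch, assuming the fusion stack broadcasts a map → base_link transform (the frame names are assumptions):

```python
# Look up the drone's fused pose from the TF tree.
# Assumption: the VIO fusion stack broadcasts map -> base_link.
import rclpy
from rclpy.node import Node
from tf2_ros import Buffer, TransformListener


class PoseLookup(Node):
    def __init__(self):
        super().__init__('pose_lookup')
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)
        self.create_timer(0.1, self.query)  # poll at 10 Hz

    def query(self):
        try:
            t = self.tf_buffer.lookup_transform('map', 'base_link', rclpy.time.Time())
        except Exception as exc:  # transform not yet available
            self.get_logger().warn(f'TF not ready: {exc}')
            return
        tr = t.transform.translation
        self.get_logger().info(f'base_link in map: ({tr.x:.2f}, {tr.y:.2f}, {tr.z:.2f})')


def main():
    rclpy.init()
    rclpy.spin(PoseLookup())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```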
Perception:
- Depth-based obstacle detection
- Optical flow for drift control
- AprilTag / landing pad tracking (detector sketch below)
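
For the tag-tracking step, OpenCV's ArUco module ships AprilTag dictionaries. A minimal detection sketch, assuming OpenCV ≥ 4.7 and tag36h11 markers; the actual detector and parameters used by apriltag_detector/ may differ:

```python
# Detect AprilTag (36h11) markers in a grayscale frame with OpenCV's ArUco module.
# Assumptions: OpenCV >= 4.7 and tag36h11 markers on the landing pad.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread('landing_pad.png', cv2.IMREAD_GRAYSCALE)  # placeholder image path
corners, ids, _rejected = detector.detectMarkers(frame)
if ids is not None:
    for tag_id, quad in zip(ids.flatten(), corners):
        cx, cy = quad[0].mean(axis=0)  # centre of the 4 corner points
        print(f'tag {tag_id} at pixel ({cx:.1f}, {cy:.1f})')
```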
Mapping:
- Voxel map / OctoMap generation
- Consumed by the local planner for obstacle avoidance (voxel-grid sketch below)
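
To make the planner/map coupling concrete, here is a toy NumPy voxel grid showing the world-to-index conversion and the occupancy query a local planner would issue; resolution, bounds, and names are illustrative, not taken from voxel_map/:

```python
# Toy voxel occupancy grid: insert obstacle points, query occupancy.
# Illustrative only; resolution and bounds are made up.
import numpy as np

RES = 0.2                          # metres per voxel
ORIGIN = np.array([-10.0, -10.0, 0.0])
SHAPE = (100, 100, 25)             # 20 m x 20 m x 5 m volume

grid = np.zeros(SHAPE, dtype=bool)

def to_index(point):
    """World coordinates (m) -> integer voxel index."""
    return tuple(((np.asarray(point) - ORIGIN) / RES).astype(int))

def mark_occupied(points):
    for p in points:
        grid[to_index(p)] = True

def is_free(point):
    return not grid[to_index(point)]

mark_occupied([(1.0, 2.0, 1.0), (1.2, 2.0, 1.0)])
print(is_free((1.0, 2.0, 1.0)))   # False: blocked
print(is_free((5.0, 5.0, 1.0)))   # True: free space
```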
Planning:
- Global planner → A* / D* Lite (A* sketch below)
- Local planner → MPC / DWA
- Trajectory smoothing → polynomial / MPC
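
A compact grid A* sketch showing the shape of the global-planning step; the real global_planner/ package (and its D* Lite / RRT* variants) is not reproduced here:

```python
# Grid A*: 4-connected, unit step cost, Manhattan heuristic.
import heapq

def astar(grid, start, goal):
    """grid[r][c] == 1 means blocked; returns a list of cells or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start)]
    came_from, g = {}, {start: 0}
    while open_set:
        _, cost, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols) or grid[nxt[0]][nxt[1]]:
                continue
            if cost + 1 < g.get(nxt, float('inf')):
                g[nxt] = cost + 1
                came_from[nxt] = cur
                heapq.heappush(open_set, (g[nxt] + h(nxt), g[nxt], nxt))
    return None

world = [[0, 0, 0, 0],
         [1, 1, 0, 1],
         [0, 0, 0, 0]]
print(astar(world, (0, 0), (2, 0)))  # routes around the blocked middle row
```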
Mission manager:
- Flight FSM (sketch below)
- Exploration logic
- Fail-safe handling
- Visual return-home
- Battery-aware landing
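
A minimal sketch of how the flight FSM, battery monitor, and return-home trigger could fit together; states and thresholds here are illustrative, not the actual mission_manager/ logic (which is documented in docs/mission_fsm.md):

```python
# Toy flight FSM: TAKEOFF -> EXPLORE -> RETURN_HOME -> LAND,
# with a battery-triggered early return. Thresholds are illustrative.
from enum import Enum, auto

class FlightState(Enum):
    TAKEOFF = auto()
    EXPLORE = auto()
    RETURN_HOME = auto()
    LAND = auto()

class MissionFSM:
    LOW_BATTERY = 0.30  # assumed return-home threshold (fraction of capacity)

    def __init__(self):
        self.state = FlightState.TAKEOFF

    def step(self, altitude, battery, exploration_done, at_home):
        if self.state == FlightState.TAKEOFF and altitude >= 1.5:
            self.state = FlightState.EXPLORE
        elif self.state == FlightState.EXPLORE and (exploration_done or battery < self.LOW_BATTERY):
            self.state = FlightState.RETURN_HOME
        elif self.state == FlightState.RETURN_HOME and at_home:
            self.state = FlightState.LAND
        return self.state

fsm = MissionFSM()
print(fsm.step(altitude=1.6, battery=0.9, exploration_done=False, at_home=False))  # EXPLORE
print(fsm.step(altitude=2.0, battery=0.2, exploration_done=False, at_home=False))  # RETURN_HOME
```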
PX4 bridge:
- Offboard mode control
- Setpoint publishing
- Arm / takeoff / land API (offboard sketch below)
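
A minimal offboard sketch using MAVROS, one of the bridge options above: it streams position setpoints, then requests OFFBOARD mode and arming. Topic and service names follow standard MAVROS conventions and may differ from the setpoint_api/ wrapper in this repo:

```python
# Stream position setpoints, then switch to OFFBOARD and arm (MAVROS, ROS2).
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped
from mavros_msgs.srv import CommandBool, SetMode


class OffboardDemo(Node):
    def __init__(self):
        super().__init__('offboard_demo')
        self.sp_pub = self.create_publisher(PoseStamped, '/mavros/setpoint_position/local', 10)
        self.arm_cli = self.create_client(CommandBool, '/mavros/cmd/arming')
        self.mode_cli = self.create_client(SetMode, '/mavros/set_mode')
        self.ticks = 0
        self.create_timer(0.05, self.tick)  # 20 Hz setpoint stream

    def tick(self):
        sp = PoseStamped()
        sp.header.stamp = self.get_clock().now().to_msg()
        sp.pose.position.z = 2.0  # hover 2 m above the local origin
        self.sp_pub.publish(sp)

        self.ticks += 1
        if self.ticks == 40:  # ~2 s of setpoints, then request OFFBOARD + arm
            self.mode_cli.call_async(SetMode.Request(custom_mode='OFFBOARD'))
            self.arm_cli.call_async(CommandBool.Request(value=True))


def main():
    rclpy.init()
    rclpy.spin(OffboardDemo())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

PX4 rejects OFFBOARD requests unless setpoints are already streaming, which is why the mode switch is delayed by about two seconds here.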
Top-level layout:

```
no_gps_drone/
├── README.md
├── docker/
├── environment/
├── src/
│   ├── slam/
│   ├── perception/
│   ├── state_estimation/
│   ├── mapping/
│   ├── planning/
│   ├── mission_manager/
│   ├── px4_bridge/
│   ├── control/
│   └── simulation/
├── launch/
├── config/
├── data/
├── tests/
└── docs/
```
Full folder structure:

```
no_gps_drone/
│
├── README.md
├── LICENSE
├── .gitignore
├── docker/
│   ├── Dockerfile.dev
│   ├── Dockerfile.sim
│   ├── docker-compose.yml
│   └── entrypoint.sh
│
├── environment/
│   ├── ros2.repos              # vcs import repos (ORB-SLAM3, mavlink, perception libs)
│   ├── requirements.txt        # Python requirements
│   └── setup_instructions.md
│
├── src/
│   ├── slam/
│   │   ├── orb_slam3_ros/
│   │   ├── rtabmap_ros/
│   │   └── vio_fusion/         # VIO/IMU fusion wrapper (EKF2 alternative)
│   │
│   ├── perception/
│   │   ├── apriltag_detector/
│   │   ├── optical_flow/
│   │   ├── obstacle_depth/
│   │   └── landing_pad_detector/
│   │
│   ├── state_estimation/
│   │   ├── ekf_fusion/
│   │   ├── imu_preintegration/
│   │   └── tf_manager/
│   │
│   ├── mapping/
│   │   ├── octomap_server/
│   │   ├── voxel_map/
│   │   └── occupancy_grid_tools/
│   │
│   ├── planning/
│   │   ├── global_planner/       # A* / D* Lite / RRT*
│   │   ├── local_planner/        # MPC / DWA / APF
│   │   ├── trajectory_optimizer/ # polynomial, Bezier, or MPC smoothing
│   │   └── path_follower/        # converts path -> waypoints -> commands
│   │
│   ├── mission_manager/
│   │   ├── autonomous_flight_node/
│   │   ├── return_home_manager/
│   │   ├── keyframe_homing/
│   │   ├── failsafe_manager/
│   │   ├── battery_monitor/
│   │   └── mission_api.srv
│   │
│   ├── px4_bridge/
│   │   ├── microRTPS_agent/
│   │   ├── px4_msgs/
│   │   ├── mavros_plugins/
│   │   └── setpoint_api/
│   │
│   ├── control/
│   │   ├── attitude_controller/
│   │   ├── velocity_controller/
│   │   └── landing_controller/
│   │
│   ├── utils/
│   │   ├── transforms/
│   │   ├── logging_tools/
│   │   ├── calibration/
│   │   └── math_lib/
│   │
│   └── simulation/
│       ├── gazebo_worlds/
│       ├── px4_sitl_launcher/
│       ├── sensor_emulators/
│       ├── fake_vio/
│       ├── fake_apriltag/
│       └── challenge_worlds/
│
├── launch/
│   ├── full_system.launch.py
│   ├── slam_only.launch.py
│   ├── perception.launch.py
│   ├── planning.launch.py
│   ├── mission.launch.py
│   ├── return_home_test.launch.py
│   └── sim_world.launch.py
│
├── config/
│   ├── cameras/
│   │   ├── calibration.yaml
│   │   ├── stereo_params.yaml
│   │   └── rectification.yaml
│   ├── ekf/
│   │   ├── ekf_params.yaml
│   │   └── noise_models.yaml
│   ├── planners/
│   │   ├── global_planner.yaml
│   │   ├── local_planner.yaml
│   │   └── mpc.yaml
│   ├── slam/
│   │   ├── orb_slam3.yaml
│   │   └── rtabmap.yaml
│   ├── mission/
│   │   └── mission_params.yaml
│   └── px4/
│       ├── fw_params.params
│       ├── ekf2_no_gps.params
│       └── vision_yaw_fusion.params
│
├── data/
│   ├── bags/
│   │   ├── flight1/
│   │   └── slam_debug/
│   ├── logs/
│   │   ├── test_runs/
│   │   └── errors/
│   ├── maps/
│   │   ├── octomap/
│   │   └── voxel/
│   └── keyframes/
│
├── tests/
│   ├── hardware_tests/
│   │   ├── imu_noise_test.md
│   │   ├── camera_latency_test.md
│   │   └── system_id/
│   │
│   ├── simulation_tests/
│   │   ├── slam_relocalization_test.md
│   │   ├── return_home_test.md
│   │   ├── obstacle_avoidance_test.md
│   │   └── landing_accuracy_test.md
│   │
│   └── unit_tests/
│       ├── test_slam_utils.cpp
│       ├── test_planner.py
│       └── test_mission_node.cpp
│
└── docs/
    ├── architecture.md
    ├── sensors_and_calibration.md
    ├── mission_fsm.md
    ├── return_home_algorithm.md
    ├── failsafe_modes.md
    ├── simulation_setup.md
    └── evaluation_metrics.md
```
The full module-by-module explanation is in docs/architecture.md.
Installation:

```bash
git clone https://github.com/your-name/no_gps_drone.git
cd no_gps_drone

sudo apt install python3-colcon-common-extensions \
    ros-humble-navigation2 \
    ros-humble-slam-toolbox \
    ros-humble-tf2-tools

vcs import < environment/ros2.repos
pip install -r environment/requirements.txt

colcon build --symlink-install
source install/setup.bash
```
Start PX4 SITL from your PX4-Autopilot checkout (in a separate terminal):

```bash
cd PX4-Autopilot
make px4_sitl gazebo
```

Then launch the full stack:

```bash
ros2 launch no_gps_drone full_system.launch.py
```
The drone uses keyframe-based visual homing:
- Capture keyframes during the outbound flight
- Store their positions + feature descriptors
- For return-to-home (RTH), match the live camera feed against the stored keyframes
- Use reprojection + homography to estimate the direction home
- The global planner generates the RTH path
- The local planner avoids obstacles along the way
- Autonomous landing at the return position
Detailed in: docs/return_home_algorithm.md.
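
A minimal sketch of the keyframe-matching step using OpenCV ORB features and a RANSAC homography; the actual keyframe_homing/ node, its thresholds, and the conversion from homography to a flight direction are not shown here:

```python
# Match the live frame against a stored keyframe and estimate a homography.
# Illustrative only: thresholds and the keyframe store are made up.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def match_keyframe(live_gray, keyframe_gray, min_matches=25):
    """Return the 3x3 homography mapping keyframe -> live view, or None."""
    kp1, des1 = orb.detectAndCompute(keyframe_gray, None)
    kp2, des2 = orb.detectAndCompute(live_gray, None)
    if des1 is None or des2 is None:
        return None
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H if inliers is not None and inliers.sum() >= min_matches else None
```

In the full pipeline, the stored pose of the best-matching keyframe, rather than the raw homography, would typically seed the global planner's RTH path.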
Simulation test scenarios:
- SLAM drift test
- Relocalization test
- Obstacle avoidance test
- No-GPS return-home test
- Landing accuracy test

Run:

```bash
ros2 launch no_gps_drone sim_world.launch.py
```
Recommended hardware:
- PX4 flight controller (Pixhawk 6C / CUAV X7 / Holybro Durandal)
- Stereo camera (Intel RealSense D455 / ZED 2 / MYNT EYE)
- IMU (built-in or external)
- Companion computer (Jetson Orin Nano / Xavier NX / Raspberry Pi 5)
- Optical flow sensor (optional)
- LiDAR or depth camera (optional)
Software stack:
- ROS2 Humble / Iron
- PX4 / MAVROS / microRTPS
- ORB-SLAM3 or RTAB-Map
- Nav2 stack
- FastDDS
- OpenCV / Eigen / g2o / Ceres
PRs, issues, and feature requests are welcome.
Follow the coding standards in docs/contribution_guidelines.md.
For queries, reach out at: bibinnbiji924@gmail.com