https://photos.app.goo.gl/hEGGxqTiPUGiFu7Q6
HKUST ELEC5660
Project Overview:
Project 1 Control and Planning
- Phase 1: Quadrotor trajectory tracking control. A simulator implementing the dynamics model of a quadrotor is given. You need to implement a controller that outputs thrust and body moments to command the quadrotor to track pre-defined trajectories.
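As an illustration only, the inner loops of such a controller might look like the sketch below: a PD altitude loop producing total thrust and a PD roll loop producing a body moment. The mass, inertia, and all gains here are assumed placeholder values, not the simulator's parameters.

```python
# Minimal sketch of PD thrust/moment loops (all constants are assumptions).
m, g = 0.5, 9.81            # mass [kg], gravity [m/s^2] (assumed values)
Ixx = 2.5e-3                # roll inertia [kg m^2] (assumed value)
kp_z, kd_z = 10.0, 4.0      # altitude PD gains (tuning assumptions)
kp_r, kd_r = 80.0, 10.0     # roll PD gains (tuning assumptions)

def thrust(z, vz, z_des, vz_des=0.0, az_des=0.0):
    """Total thrust: PD feedback on altitude plus gravity feed-forward."""
    return m * (g + az_des + kd_z * (vz_des - vz) + kp_z * (z_des - z))

def roll_moment(phi, phi_rate, phi_des):
    """Body moment about x from PD feedback on the roll angle."""
    return Ixx * (kp_r * (phi_des - phi) - kd_r * phi_rate)

F = thrust(z=1.0, vz=0.0, z_des=1.0)   # hovering exactly at the setpoint
M = roll_moment(0.0, 0.0, 0.0)         # level attitude, no corrective moment
```

At the setpoint the feedback terms vanish, so the thrust reduces to the gravity feed-forward m*g and the moment is zero.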
- Phase 2: Optimization-based trajectory generation. You need to implement a trajectory generator that connects pre-defined waypoints with a trajectory meeting smoothness requirements, then use the controller from Phase 1 to track it.
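One common building block for such a generator is a polynomial segment with continuity constraints at its endpoints. The sketch below solves for a quintic segment with zero velocity and acceleration at both ends; it is an illustrative special case, not the full optimization-based (e.g. minimum-snap) formulation.

```python
import numpy as np

def tvec(t, k):
    """Row vector of the k-th derivative of [1, t, t^2, ..., t^5] at time t."""
    v = np.zeros(6)
    for i in range(k, 6):
        coef = 1.0
        for j in range(k):
            coef *= (i - j)          # falling factorial from differentiation
        v[i] = coef * t ** (i - k)
    return v

def quintic_segment(p0, p1, T):
    """Coefficients c of p(t) = sum_i c_i t^i on [0, T] with
    p(0)=p0, p(T)=p1 and zero velocity/acceleration at both ends."""
    A = np.array([tvec(0, 0), tvec(0, 1), tvec(0, 2),
                  tvec(T, 0), tvec(T, 1), tvec(T, 2)])
    b = np.array([p0, 0.0, 0.0, p1, 0.0, 0.0])
    return np.linalg.solve(A, b)

c = quintic_segment(0.0, 1.0, 2.0)   # one segment from 0 to 1 over 2 seconds
p_mid = tvec(1.0, 0) @ c             # position halfway through the segment
```

Stacking one such constraint system per segment, plus continuity between segments, and minimizing a derivative cost gives the usual QP formulation.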
- Phase 3: Path planning + trajectory generation + control. Grid maps containing obstacles and start and end locations are provided. You need to implement an A* pathfinder that searches for the shortest path with a safety guarantee. Then, you should smooth the resulting path with the Phase 2 trajectory generator and track it with your controller.
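A minimal version of the A* search could look like the following sketch: 4-connected neighbours on a binary occupancy grid with a Manhattan-distance heuristic (the actual assignment map format and connectivity may differ).

```python
import heapq

def astar(grid, start, goal):
    """A* on a binary occupancy grid (1 = obstacle); returns a list of
    (row, col) cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # admissible
    open_heap = [(h(start), 0, start, None)]   # (f, g, cell, parent)
    came, gbest = {}, {start: 0}
    while open_heap:
        f, g, cur, parent = heapq.heappop(open_heap)
        if cur in came:
            continue                           # already expanded via a better path
        came[cur] = parent
        if cur == goal:                        # reconstruct by walking parents
            path = []
            while cur is not None:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and not grid[nxt[0]][nxt[1]]:
                ng = g + 1
                if ng < gbest.get(nxt, float("inf")):
                    gbest[nxt] = ng
                    heapq.heappush(open_heap, (ng + h(nxt), ng, nxt, cur))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))   # must detour around the wall in row 1
```

A safety margin is typically added by inflating obstacles in the grid before searching, so the shortest path keeps a minimum clearance.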
- Phase 4: In this lab assignment, you will learn how to manually and autonomously control the drone. You need to set up the development environment and fly your drone in autonomous control mode with a motion capture system called OptiTrack, which provides highly accurate position feedback of the drone. You will use it to verify the controller and trajectory planning algorithms you developed in the previous phases.
Project 2 Visual Estimator
- Phase 1: PnP-based localization on marker map. You are provided with images containing AR markers and a tag detector that calculates the 3D positions of those markers. You need to implement PnP-based localization to compute the camera pose corresponding to each image.
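One classical way to solve PnP, sketched below under the assumption of noise-free correspondences, is the Direct Linear Transform: stack two linear equations per 3D-2D pair, take the null vector of the system as the projection matrix, then recover rotation and translation. This is a minimal illustration; a practical solver (e.g. OpenCV's solvePnP) adds robustness and refinement.

```python
import numpy as np

def dlt_pnp(X, x, K):
    """DLT pose from >= 6 non-coplanar 3D-2D correspondences.
    X: Nx3 world points, x: Nx2 pixels, K: 3x3 intrinsics.
    Returns (R, t) such that camera-frame points are R @ X + t."""
    xn = np.linalg.solve(K, np.c_[x, np.ones(len(x))].T).T  # normalized coords
    A = []
    for (Xw, Yw, Zw), (u, v, _) in zip(X, xn):
        Ph = [Xw, Yw, Zw, 1.0]
        A.append(Ph + [0.0] * 4 + [-u * p for p in Ph])     # u-equation
        A.append([0.0] * 4 + Ph + [-v * p for p in Ph])     # v-equation
    P = np.linalg.svd(np.asarray(A))[2][-1].reshape(3, 4)   # null vector
    if np.linalg.det(P[:, :3]) < 0:                         # fix sign ambiguity
        P = -P
    s = np.linalg.det(P[:, :3]) ** (1.0 / 3)                # recover scale
    R_raw, t = P[:, :3] / s, P[:, 3] / s
    U, _, Vt = np.linalg.svd(R_raw)                         # project onto SO(3)
    return U @ Vt, t

# Synthetic check against a known pose (assumed intrinsics and points).
K = np.array([[400.0, 0, 320], [0, 400.0, 240], [0, 0, 1.0]])
th = 0.2
R_true = np.array([[np.cos(th), -np.sin(th), 0],
                   [np.sin(th),  np.cos(th), 0],
                   [0, 0, 1.0]])
t_true = np.array([0.1, -0.2, 3.0])
X = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1],
              [1, 1, 0.5], [0.5, 0.2, 1.2], [1.2, 0.7, 0.3]])
Xc = (R_true @ X.T).T + t_true
x = (K @ (Xc / Xc[:, 2:]).T).T[:, :2]
R_est, t_est = dlt_pnp(X, x, K)
```

With the marker map, the tag detector supplies the 3D points and the detected corners supply the 2D points, one PnP solve per image.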
- Phase 2: Visual odometry in a markerless environment. You need to implement a PnP-based estimator that computes the camera's incremental motion. The provided images contain no AR markers, so you need to perform feature detection and matching and use the matched features for single-keyframe-based pose estimation.
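The matching step can be sketched as mutual nearest-neighbour matching of binary descriptors by Hamming distance; in a real pipeline the descriptors would come from a detector such as ORB, and the matches would feed the PnP solver against landmarks triangulated from the keyframe. The descriptors below are random placeholders for illustration.

```python
import numpy as np

def mutual_matches(d1, d2):
    """Mutual nearest-neighbour matching of bit-packed binary descriptors.
    d1: MxB, d2: NxB uint8 arrays; returns a list of (i, j) index pairs."""
    b1 = np.unpackbits(d1, axis=1).astype(np.int32)
    b2 = np.unpackbits(d2, axis=1).astype(np.int32)
    dist = (b1[:, None, :] != b2[None, :, :]).sum(-1)   # Hamming distances
    fwd = dist.argmin(1)                                # best match 1 -> 2
    bwd = dist.argmin(0)                                # best match 2 -> 1
    return [(i, j) for i, j in enumerate(fwd) if bwd[j] == i]

rng = np.random.default_rng(0)
d2 = rng.integers(0, 256, (5, 32), dtype=np.uint8)      # keyframe descriptors
d1 = d2[[3, 0, 4]]                                      # current frame: reordered copies
matches = mutual_matches(d1, d2)
```

The mutual (cross-check) criterion discards one-sided matches, which is a cheap first filter before geometric verification.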
Project 3 EKF Sensor Fusion
- Phase 1: Sensor fusion of IMU and PnP localization on marker using EKF. You need to implement the IMU process model and the measurement model of the PnP pose estimator, and integrate them into an EKF-based sensor fusion method.
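The predict/update structure can be sketched in one toy dimension: the IMU acceleration drives the process model, and the PnP pose acts as a direct position measurement. All noise values are assumptions, and the real filter tracks the full pose, velocity, and IMU biases rather than this 2-state example.

```python
import numpy as np

dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])   # state [position, velocity]
B = np.array([[0.5 * dt**2], [dt]])     # how IMU acceleration enters
Q = 1e-4 * np.eye(2)                    # process noise (assumed)
H = np.array([[1.0, 0.0]])              # PnP measures position only
R = np.array([[1e-2]])                  # measurement noise (assumed)

def predict(x, P, a):
    """Propagate state and covariance with the IMU acceleration input a."""
    x = F @ x + B @ np.array([a])
    return x, F @ P @ F.T + Q

def update(x, P, z):
    """Correct with a PnP position measurement z."""
    y = np.array([z]) - H @ x                 # innovation
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    return x + K @ y, (np.eye(2) - K @ H) @ P

x, P = np.zeros(2), np.eye(2)
for _ in range(200):                    # stationary drone: a = 0, z = 0
    x, P = predict(x, P, 0.0)
    x, P = update(x, P, 0.0)
```

With consistent zero inputs and measurements the estimate stays at the origin while the position uncertainty shrinks well below its initial value.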
- Phase 2: Sensor fusion of keyframe-based visual odometry with the Phase 1 filter using an augmented-state EKF. Since the odometry measurement relates the current pose to a past keyframe pose, the filter state is augmented with a clone of that keyframe pose.
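The augmentation step, often called stochastic cloning, can be sketched as follows: when a new keyframe is taken, the current pose block of the state is duplicated, and the covariance is expanded with a Jacobian so the clone starts perfectly correlated with the original. The 2-D "pose" and state layout here are toy assumptions.

```python
import numpy as np

def clone_pose(x, P, pose_dim=2):
    """Append a copy of the leading pose block to the state and expand
    the covariance accordingly (stochastic cloning)."""
    n = x.size
    J = np.vstack([np.eye(n),                         # keep the existing state
                   np.hstack([np.eye(pose_dim),       # duplicate the pose part
                              np.zeros((pose_dim, n - pose_dim))])])
    return J @ x, J @ P @ J.T

x = np.array([1.0, 2.0, 0.5])          # [pose(2), velocity(1)] (toy layout)
P = np.diag([0.1, 0.2, 0.3])
x_aug, P_aug = clone_pose(x, P)
```

A later odometry measurement is then written as a function of both the current pose and the cloned keyframe pose, so the standard EKF update applies unchanged.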
- Phase 3: In this lab assignment, you need to integrate the whole system onboard. Using the information from the IMU and camera, your visual estimator and the EKF compute the state of the quadrotor. The drone will use this state for feedback control and execute trajectories computed by your path planner and your trajectory generator.