Autonomous Quadrotor Project Update

As a general tech/robot nerd and a human being living in 2017, I can't help but notice that drones (more formally, UAVs: Unmanned Aerial Vehicles) are everywhere. From birthday and holiday gifts to defense applications to (someday) delivering your Amazon Prime packages (which, as an avid Prime user, I'm excited about), drones have reached the mainstream and seem to be here to stay.

Previously, their applications were limited mostly to military and defense uses, from intelligence gathering to self-guided flying bombs, for lack of a better word. But that research helped mature what is now a massive sector of technology and a hot topic of research. Labs like the GRASP Lab at Penn have shown time and time again how far you can push aerial robotics, and people see applications ranging from surveying agricultural infrastructure to first-responder swarms and multi-robot control. If you're interested in more information about drones, you can find more at this link.

Thanks to the holidays, I got a Parrot Bebop as a gift. These little pieces of hardware are pretty impressive: they come with an SDK as well as a slew of sensors, such as:

  • 3-axis magnetometer
  • 3-axis gyroscope
  • 3-axis accelerometer
  • Optical-flow sensor: a vertical stabilization camera (every 16 milliseconds, an image of the ground is taken and compared to the previous one to determine the Bebop's speed)
  • Ultrasound sensor (measures the flight altitude up to 8 meters)
  • Pressure sensor
  • GPS

That's a fantastic suite for such a tiny form factor. The Android/iOS app is also pretty solid for flying around and trying to take some cool videos and pictures.
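That optical-flow scheme (comparing each downward image to the previous one to estimate speed) can be illustrated with a toy block-matching search. This is just a sketch of the principle in Python/NumPy, not the Bebop's actual algorithm; the frame size and shift range are arbitrary:

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=3):
    """Brute-force integer-shift search between two frames.

    A toy version of what a downward-facing optical-flow sensor does:
    find the (dy, dx) that best aligns the current frame with the
    previous one; dividing by the frame interval gives a velocity.
    """
    h, w = prev.shape
    m = max_shift
    ref = prev[m:h - m, m:w - m]  # central patch of the previous frame
    best, best_err = (0, 0), float("inf")
    for dy in range(-m, m + 1):
        for dx in range(-m, m + 1):
            cand = curr[m + dy:h - m + dy, m + dx:w - m + dx]
            err = float(np.mean((ref - cand) ** 2))  # mean squared difference
            if err < best_err:
                best_err, best = err, (dy, dx)
    return best
```

With the Bebop's 16 ms frame interval, a pixel shift like this would be scaled by altitude and camera intrinsics to get a ground-plane velocity.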

With such capable hardware, I decided it would be a cool challenge to try and add as much autonomy to this platform as I could. I’m interested in trying to implement things such as:

  • Monocular SLAM (Simultaneous Localization and Mapping)
    • Probably with AR Tags for the first run through.
  • Keypoint Tracking
  • Obstacle Recognition and Avoidance
  • State Estimation through implementation of a Kalman Filter/Sensor Fusion
  • Path Planning with a built-up map.
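To give a flavor of the state-estimation item, here's a minimal one-dimensional Kalman filter in Python. It fuses a constant-velocity prediction with noisy position measurements, the same predict/update pattern a full sensor-fusion stack on the Bebop would use; the noise values here are made up for illustration:

```python
def kalman_1d(measurements, dt=1.0, q=0.01, r=0.5):
    """Minimal 1D constant-velocity Kalman filter.

    State is [position, velocity]; q is process noise, r is measurement
    noise (both illustrative). Returns the filtered position estimates.
    """
    x, v = measurements[0], 0.0            # initial state
    p00, p01, p11 = 1.0, 0.0, 1.0          # covariance entries (symmetric)
    out = []
    for z in measurements:
        # Predict: constant-velocity motion model
        x = x + dt * v
        p00 = p00 + dt * (2 * p01 + dt * p11) + q
        p01 = p01 + dt * p11
        p11 = p11 + q
        # Update: blend the prediction with the position measurement z
        k0 = p00 / (p00 + r)               # Kalman gains for [pos, vel]
        k1 = p01 / (p00 + r)
        y = z - x                          # innovation
        x, v = x + k0 * y, v + k1 * y
        p00, p01, p11 = (1 - k0) * p00, (1 - k0) * p01, p11 - k1 * p01
        out.append(x)
    return out
```

The real thing would run this in more dimensions over the gyro, accelerometer, ultrasound, and optical-flow readings, but the predict/update loop is the same.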

Thankfully, the ROS community was quick to integrate the SDK into the ROS ecosystem. I have a fork of the driver on my personal GitHub with some of my own changes. You can get it by running:

$ git clone --branch 0.6.1 <repository-url>

This driver works for both the Bebop 1 and 2, and it's a great starting point for a lot of what I'm working on.

The first thing I did after I got the driver building was to integrate a joystick interface to control the drone and fly it around. You can find that package here:

Below is a short gif showing me controlling the drone (and prompting it to do a flip).



Demonstration of joystick control of the drone via the ROS driver.


To run this, you will need to have cloned the bebop_autonomy and bebop_joystick packages into a catkin workspace, built it, and connected to the Bebop's Wi-Fi network. Then, in the first of two terminals, run:

$ roslaunch bebop_driver bebop_node.launch

Then in the second terminal:

$ roslaunch bebop_joystick joy_teleop.launch

You will probably want to take a look at how my config file is set up and tweak it to match your controller.
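For reference, a config for a package like this typically just maps controller axes and buttons to parameters. Everything below is hypothetical (the parameter names and axis/button indices are stand-ins, not necessarily what bebop_joystick actually uses), so treat it as a shape to adapt rather than something to copy verbatim:

```yaml
# Hypothetical joystick mapping; indices vary by controller model.
axis_linear_x: 1     # left stick up/down    -> pitch (forward/back)
axis_linear_y: 0     # left stick left/right -> roll
axis_linear_z: 3     # right stick up/down   -> altitude
axis_angular_z: 2    # right stick left/right -> yaw
button_takeoff: 0
button_land: 1
button_flip: 2
scale_linear: 0.5    # keep the gains gentle for indoor flight
```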

Using a Drone with AR Tags

To extend my first post, Detecting and Tracking AR Tags: configuring a Bebop to track and follow AR tags is pretty easy. I already have the launch file configured to work with my URDF, which you can find here:

Once the bebop driver is running and the drone has launched, start the tracker by running:

$ roslaunch ar_tag_toolbox ar_track_bebop.launch



Demonstration of 3D Localization of the AR Tag in space with respect to the drone’s camera frame.


If all goes well, you should see something similar to the image above.
Note that the STL model of the drone moves according to how the gyro senses the drone twisting. You should also see the familiar square indicating the location and rotation of the AR tag with respect to the camera, along with arrows showing the tag's pose in the camera frame.
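Once the tag's pose is available in the camera frame, a follow behavior can be as simple as a proportional controller on that pose. Here's a sketch of the idea in Python; the frame convention (z out of the camera, x right, y down) and the gains are my assumptions, and the ROS plumbing (subscribing to the marker topic, publishing velocity commands) is omitted:

```python
def follow_cmd(tag_pos, target_dist=1.0, kp=0.4):
    """Proportional 'follow the tag' command from a tag position.

    tag_pos: (x, y, z) of the AR tag in the camera's optical frame,
    assumed z forward, x right, y down. Returns (forward, lateral,
    vertical) velocity components; gains and frames are illustrative.
    """
    x, y, z = tag_pos
    forward = kp * (z - target_dist)   # close the range to target_dist
    lateral = -kp * x                  # center the tag horizontally
    vertical = -kp * y                 # center the tag vertically
    return forward, lateral, vertical
```

For example, a tag dead ahead at 2 m with a 1 m target distance yields a gentle forward command and no lateral or vertical correction.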

If you don’t get URDFs, fret not, it’s a post I’m working on in parallel!

If you liked what you read, please share and subscribe! I want to share any and all knowledge I can with the world!