November 27, 2018
Flying high: Computer graphics researchers make drone navigation easier
WEST LAFAYETTE, Ind. — Drone operators struggling to fly the multi-propeller device and take pictures simultaneously could soon have a much simpler method of steering flight.
Bedrich Benes, a professor of computer graphics technology, and doctoral student Hao Kang collaborated with corporate researchers to develop a touch-screen method to navigate and take pictures with drones.
The method, called FlyCam, works by combining the drone and camera movements into a single control scheme.
“So the user doesn’t have to think about multiple controls for the drone and the camera,” Benes said. “He or she can think about the drone as a simple three-dimensional flying camera that is being controlled by simple gestures on a touch-screen device.”
The research was published in the October issue of IEEE Robotics and Automation Letters.
A video of the research is available online at https://youtu.be/SEvfRBMVTH8.
FlyCam uses one- and two-finger drags across a smartphone or tablet to control the drone as it accelerates or turns and takes images. The drone moves forward or backward along the camera’s axis with single or double taps to the screen.
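That gesture-to-command mapping can be sketched in code. This is a minimal, hypothetical illustration of the idea described above; the gesture and command names are assumptions for clarity, not the paper's actual API:

```python
def gesture_to_command(kind: str, fingers: int = 1) -> str:
    """Map a touch gesture to a combined drone-plus-gimbal command.

    One- and two-finger drags steer and turn the "flying camera";
    single and double taps move it forward or backward along the
    camera's axis. All names here are illustrative.
    """
    if kind == "drag":
        # One finger rotates the view; two fingers translate it.
        return "yaw_pitch" if fingers == 1 else "translate"
    if kind == "single_tap":
        return "move_forward"
    if kind == "double_tap":
        return "move_backward"
    raise ValueError(f"unknown gesture: {kind}")
```

The point of the design is that each gesture issues one command that drives the drone and the gimbal together, so the user never coordinates two controllers.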
Traditional, more complicated drone controls use dual joysticks for drone navigation as well as an additional joystick and gimbal – a pivoted support that allows rotation about an axis – to control the camera.
“We did a user study and most of the users performed better with FlyCam,” Kang said. “It is easier to use a single simple mobile device compared to a combination of cumbersome remote controls.”
As part of the research, Benes said they compared the performance of licensed drone pilots using a remote control with that of people picking up a drone for the first time using FlyCam.
“And the people who have picked up a drone for the first time were equal or better than those who are licensed to fly,” Benes said. “That is what actually impressed us the most.”
FlyCam was tested using an Android system. Fliers performed better when using a tablet, which allowed for larger movements and better control.
Benes said he and Kang will continue working on the research, which will include automated photos from the camera.
Writer: Brian Huchel, 765-494-2084, firstname.lastname@example.org
Source: Bedrich Benes, 765-496-2954, email@example.com
FlyCam: Multitouch Gesture Controlled Drone Gimbal Photography
Bedrich Benes, Purdue University; Hao Kang, Purdue University; Haoxing Li; Jianming Zhang; Xin Lu
We introduce FlyCam, a novel framework for gimbal drone camera photography. Our approach abstracts the camera and the drone into a single flying camera object so that the user does not need to think about the drone movement and camera control as two separate actions. The camera is controlled from a single mobile device with six simple touch gestures such as rotate, move forward, yaw, and pitch. The gestures are implemented as seamless commands that combine the gimbal motion with the drone movement. Moreover, we add a sigmoidal motion response that compensates for abrupt drone swinging when moving horizontally. The smooth and simple camera movement was evaluated in a user study in which we asked 20 human subjects to mimic a photograph taken from a certain location. The users used both the default dual-joystick control and our new touch commands. Our results show that the new interaction performed better in both intuitiveness and ease of navigation. The users spent less time on the task, and the System Usability Scale index of our FlyCam method was 75.13, higher than the traditional dual-joystick method, which scored 67.38. Moreover, the NASA task load index also showed that our method had a lower workload than the traditional method.
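The sigmoidal motion response mentioned in the abstract can be illustrated with a short sketch. The function below maps a normalized drag input to a drone velocity through a sigmoid, so small drags produce gentle motion and large drags saturate rather than commanding abrupt swings. The constants and function name are illustrative assumptions, not values from the paper:

```python
import math

def sigmoidal_response(drag: float, gain: float = 4.0, v_max: float = 2.0) -> float:
    """Map a normalized drag input in [-1, 1] to a velocity in m/s.

    Uses a zero-centered sigmoid: flat near zero (fine control) and
    saturating toward +/- v_max (no sudden swings). The gain and
    v_max values here are hypothetical, chosen only for illustration.
    """
    return v_max * (2.0 / (1.0 + math.exp(-gain * drag)) - 1.0)
```

Because the curve is antisymmetric about zero, pushing and pulling by the same amount produces equal and opposite velocities, which keeps the response predictable in both directions.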