3D hover with wide FOV optical flow

geof
Beginner
Posts: 17
Joined: Fri Jan 17, 2014 6:22 pm

3D hover with wide FOV optical flow

Post by geof »

Hello everyone,

I'm very happy that the folks behind Crazyflie decided to make this product as a development platform. I've done a lot of work in the past putting optical flow and vision on slightly larger helicopters (such as the eFlite mCX and mQX) and now want to tackle something smaller. The Crazyflie is perfect!

Usually when optical flow is put onto quads, just a single sensor is pointed down. This works fine if you also have a sonar sensor or stereo vision to get direct height. However, we find it better to aim the sensors sideways, e.g. horizontally: in addition to stabilizing drift you can also stabilize height and even yaw if you wish (though gyros are good enough to take care of that). Also, in most environments there is a lot more texture sideways than on the floor. Here is some of my past work: http://66.147.244.149/~centeyen/technol ... -in-place/

So my first goal is to put four optical flow sensors onto the Crazyflie and do a full 3D hover. (I've already done this on a larger mQX quad.) My first attempt will use four of our vision chips mounted on the arms of the quad. This will give us 360-degree optical flow around the Crazyflie, with small gaps between the lenses. For processing I'll probably use a Teensy 3.1. Altogether I think the vision system will weigh about 5 grams. That's pretty heavy, but we can reduce that to about a gram later. For now I'm just trying to get up and running.

Geof
csholmq
Beginner
Posts: 12
Joined: Fri Dec 27, 2013 1:20 pm

Re: 3D hover with wide FOV optical flow

Post by csholmq »

Very nice approach. How is the fusion between the different sensors done?
geof
Beginner
Posts: 17
Joined: Fri Jan 17, 2014 6:22 pm

Re: 3D hover with wide FOV optical flow

Post by geof »

Good question. First, the optical flow sensor array extracts global optical flow patterns and produces measurements of motion along corresponding cardinal directions. For example, the average of the vertical optical flow components of the four sensors produces a height-change (or height-rate) measurement. Weighted sums of the horizontal optical flow components can be used to compute similar forward-backward motion measurements (from the sideways sensors) and sideways motion measurements (from the forward and backward sensors). These are the values that get sent to the quadrotor's controller.
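
Concretely, the combination step looks something like the sketch below (plain C; the sensor indexing, axis conventions, and signs here are just illustrative assumptions, not our actual code):

```c
/* Illustrative combination of four outward-looking flow sensors into
 * three motion measurements. Sensor ordering, axis conventions, and
 * signs are assumptions for this sketch only. */

typedef struct {
    float ofx;  /* horizontal flow in the sensor's image, rad/s */
    float ofy;  /* vertical flow in the sensor's image, rad/s   */
} FlowVector;

enum { FRONT = 0, RIGHT, BACK, LEFT, NUM_SENSORS };

typedef struct {
    float height_rate;    /* common vertical flow -> climb/descent     */
    float forward_rate;   /* from the two sideways-looking sensors     */
    float sideways_rate;  /* from the forward/backward-looking sensors */
} MotionEstimate;

MotionEstimate fuse_flow(const FlowVector s[NUM_SENSORS])
{
    MotionEstimate m;

    /* Climbing makes the scene slide downward in every sensor, so the
     * average vertical component measures height change. Pitch and roll
     * rotations add opposite-signed vertical flow to opposing sensors
     * and largely cancel in the average. */
    m.height_rate = -0.25f * (s[FRONT].ofy + s[RIGHT].ofy +
                              s[BACK].ofy  + s[LEFT].ofy);

    /* Forward translation appears with opposite signs in the left and
     * right sensors, while yaw rotation adds the same component to
     * both, so the difference responds to forward motion and rejects
     * yaw. (Exact signs depend on how the sensors are mounted.) */
    m.forward_rate = 0.5f * (s[RIGHT].ofx - s[LEFT].ofx);

    /* Same idea for sideways drift, using the front and back sensors. */
    m.sideways_rate = 0.5f * (s[FRONT].ofx - s[BACK].ofx);

    return m;
}
```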

On the quadrotor itself, the three measurements are then used to generate synthetic human stick inputs. A PD loop on the height change modulates the thrust and keeps the quadrotor at the same height. Similar PD loops for the horizontal drift motions can be applied to the pitch and roll signals. It's not that difficult.
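
Building on the sketch above, the control side is roughly the following (the gains, signs, and output scaling are placeholders for illustration, not tuned values):

```c
/* Sketch of the PD loops driving the fused flow measurements to zero.
 * Gains, signs, and output scaling are placeholders for illustration
 * and would need tuning on real hardware. */

typedef struct {
    float kp;
    float kd;
    float prev_error;
} PD;

static float pd_update(PD *pd, float error, float dt)
{
    float d = (error - pd->prev_error) / dt;
    pd->prev_error = error;
    return pd->kp * error + pd->kd * d;
}

/* Called at the control rate. In pure hover the goal is simply "zero
 * visually observed motion", so each measurement is its own error.
 * The outputs act like small synthetic stick deflections added to the
 * nominal hover commands. */
void hover_control(const MotionEstimate *m, float dt, float hover_thrust,
                   float *thrust_cmd, float *pitch_cmd, float *roll_cmd)
{
    static PD pd_height = { 0.8f, 0.2f, 0.0f };  /* placeholder gains */
    static PD pd_fwd    = { 0.5f, 0.1f, 0.0f };
    static PD pd_side   = { 0.5f, 0.1f, 0.0f };

    /* Climbing shows up as positive height_rate: reduce thrust to cancel it. */
    *thrust_cmd = hover_thrust - pd_update(&pd_height, m->height_rate, dt);

    /* Forward drift -> pitch back against it; sideways drift -> roll against it. */
    *pitch_cmd = -pd_update(&pd_fwd,  m->forward_rate,  dt);
    *roll_cmd  = -pd_update(&pd_side, m->sideways_rate, dt);
}
```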

So all of the above accomplishes just hover. To put the human back in control, you can do things like have the sticks adjust set points that the controller then uses to control the quadrotor. If the human pushes the thrust stick up, the quadrotor is given a new height set point that will cause it to climb, and vice versa. This way the human can fly the quad around, then release the sticks and the quad will hold that position. Theoretically it could hold that position forever, if the visual texture has a strong enough signal. (We've gotten to 6 minutes on our hacked 'mCX and were limited only by the battery life.)
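
In code, the set-point idea amounts to something like this (again only a sketch; the stick scaling and names are made up):

```c
/* Sketch of the set-point idea: the pilot's stick slews a set point,
 * and the hover loop regulates the visually integrated position toward
 * it. Scaling and names are made up for illustration. */

typedef struct {
    float height;           /* height_rate integrated over time  */
    float height_setpoint;  /* moved by the pilot's thrust stick */
} HeightHold;

void height_hold_update(HeightHold *h, const MotionEstimate *m,
                        float stick, float dt)   /* stick in [-1, 1] */
{
    const float climb_per_stick = 0.5f;  /* arbitrary stick-to-rate scale */

    /* Dead-reckon height from the fused vertical flow measurement. */
    h->height += m->height_rate * dt;

    /* Pushing the stick moves the set point; releasing it freezes the
     * set point, so the quad holds whatever position it reached. */
    h->height_setpoint += stick * climb_per_stick * dt;

    /* The height PD loop from the previous sketch then acts on
     * (h->height_setpoint - h->height) instead of raw height_rate. */
}
```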

I'd upload a paper that summarizes the whole thing but it seems that PDF attachments are not accepted.
csholmq
Beginner
Posts: 12
Joined: Fri Dec 27, 2013 1:20 pm

Re: 3D hover with wide FOV optical flow

Post by csholmq »

I got most of that from your link and videos. I was wondering if you had some more info on
geof wrote:First, the optical flow sensor array extracts global optical flow patterns and produces measurements of motion along corresponding cardinal directions.
but I realize now that you actually produce this product.

Are the sensors available for purchase?
orcinus
Member
Posts: 36
Joined: Thu Jan 23, 2014 11:03 pm

Re: 3D hover with wide FOV optical flow

Post by orcinus »

How well does the sensor fusion algorithm handle spurious optical flow output from people and other objects moving past the drone?
geof
Beginner
Posts: 17
Joined: Fri Jan 17, 2014 6:22 pm

Re: 3D hover with wide FOV optical flow

Post by geof »

@csholmq: We've made OF sensors of that size many times in the past, but never sold them publicly. (Our lightest was 125 milligrams!) The market didn't seem to be there yet and they were a bit more difficult to make. But I think that is changing: if there is enough demand then yes, we will offer them as a product.

@orcinus: Good question! With the 8-sensor version on the 'mCX that you may have seen in the linked videos, we found that a single person moving by would have a little, but not much, effect on the helicopter's position. This is because a person would occupy no more than, say, 30 degrees out of the 360-degree field of view. So the person's motion gets averaged away by the lack of motion everywhere else. But there was some effect, especially if the person was close and wore clothing that was distinct from the background. For example, if you walked by it, it would mirror your motion just a tiny bit. However, if you cupped your hands around it then you could very easily make it go up and down, which makes for some fun interaction. I expect similar behavior from what we put onto the CF.
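
To put a rough number on it, here is a toy calculation showing why one moving sector barely biases the averaged flow (the numbers are made up):

```c
/* Toy back-of-the-envelope calculation: one moving "sector" out of a
 * 360-degree field of view contributes only a small fraction of the
 * averaged flow. All numbers here are made up for illustration. */
#include <stdio.h>

int main(void)
{
    const int sectors = 12;          /* 12 x 30 degrees = 360 degrees     */
    const float person_flow = 1.0f;  /* spurious flow from the one sector */

    float sum = person_flow;         /* static scene contributes ~zero    */
    float avg = sum / sectors;

    /* Prints about 0.083: the passer-by biases the estimate by ~8%. */
    printf("averaged flow bias = %.3f\n", avg);
    return 0;
}
```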
orcinus
Member
Posts: 36
Joined: Thu Jan 23, 2014 11:03 pm

Re: 3D hover with wide FOV optical flow

Post by orcinus »

Interesting :)

The reason I'm asking is that I've had birds "hijack" an AR drone by flying below it (it has a single, downward-facing cam); I was wondering how well a 360-degree view mitigates that.
geof
Beginner
Posts: 17
Joined: Fri Jan 17, 2014 6:22 pm

Re: 3D hover with wide FOV optical flow

Post by geof »

Wide FOV optical flow mitigates that pretty well. You don't see birds "hijacking" other birds, or insects hijacking other insects. :) On the other hand, if you ever look at an insect hovering amidst branches in a breeze, as the branches sway the insect will often sway along. It's pretty cool actually...
csholmq
Beginner
Posts: 12
Joined: Fri Dec 27, 2013 1:20 pm

Re: 3D hover with wide FOV optical flow

Post by csholmq »

geof wrote:@csholmq: We've made OF sensors of that size many times in the past, but never sold them publicly. (Our lightest was 125 milligrams!) The market didn't seem to be there yet and they were a bit more difficult to make. But I think that is changing: if there is enough demand then yes, we will offer them as a product.
So, this got me interested. I've done a couple of courses in basic Image Analysis so I'm tempted to have a go. However, getting my hands on lightweight optical sensors proved more difficult than I thought. Any tips?
orcinus
Member
Posts: 36
Joined: Thu Jan 23, 2014 11:03 pm

Re: 3D hover with wide FOV optical flow

Post by orcinus »

geof wrote:Wide FOV optical flow mitigates that pretty well. You don't see birds "hijacking" other birds, or insects hijacking other insects. :) On the other hand, if you ever look at an insect hovering amidst branches in a breeze, as the branches sway the insect will often sway along. It's pretty cool actually...
That's the thing: I've read about neural-net optical flow research with insects in artificial, simulated environments. I was wondering how something more similar to that might work :)

How well does it work with just two sensors?