3D hover with wide FOV optical flow
Re: 3D hover with wide FOV optical flow
@orcinus- Actually I work a lot with that crowd- scientists who study vision and flight control in insects. This multi-camera technique is really an implementation of global motion sensing neurons found in the blowfly and studied by Holger Krapp (Imperial College) and later formalized by Sean Humbert (University of Maryland). I also like to describe it as somewhat similar to those motion capture camera systems you see various people using, except that the system is shrunk down, turned inside-out, and mounted on the quadrotor.
It works somewhat with two sensors, depending on how you do it. With two very wide FOV sensors, you can get 3D stabilization. We've done it- one looking up and one looking down. But if the sensors are narrow then you are more limited- you can basically get limited 2D stabilization not 3D.
@csholmq- There really isn't a lot out there for ultra light sensors that interface well with microcontrollers, except for what we could provide. Please let me know your thoughts as I post pictures, and feel free to PM me.
Re: 3D hover with wide FOV optical flow
OK, I've started assembling the vision system. I'm going to use 4 WhiteOak image sensor chips. (These are new image sensor chips that we got back from the fab pretty recently but haven't publicly announced yet.) Each WhiteOak chip has a mode that allows operation with 3 wires or with 5 wires. I'm using the 5-wire mode since it is algorithmically simpler. Each WhiteOak has a 64x64 array, but I'm only using a 16x16 subset of that. The bare-die WhiteOak chips are mounted on little breakout boards (I think 8mm x 13mm or so) and a 2mm focal length lens is glued onto each chip. I then connected the four chips to a Teensy 3.1 using 36 gauge enamel wire.
Here is the vision system next to the Crazyflie. It's a rat's nest of wires, but adequate for a first prototype.
Here is a closeup of one of the WhiteOak chips + a lens on the breakout board.
Here is the vision system alone with the four chips taped down in a cross configuration. This will eventually go on the Crazyflie.
So right now the Teensy 3.1 is programmed to grab images from the WhiteOaks, compute optical flow on each of the four chips, and then output a 4DOF global "visual IMU" type measurement. These will correspond to the Crazyflie drifting horizontally (forward/back and left/right), changing height (up/down), and yaw rotation. I won't integrate yaw into the code, but it makes for a good diagnostic and calibration variable.
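The fusion of four per-sensor flows into one 4DOF "visual IMU" output can be sketched as a small least-squares problem, in the spirit of the wide-field matched filtering mentioned earlier in the thread. Everything below is an illustrative assumption, not the actual Teensy firmware: the geometry (four sensors facing outward at 90-degree azimuth intervals), the sign conventions, and all names are made up for the sketch.

```python
import numpy as np

# Hypothetical geometry: four sensors facing outward at azimuths 0, 90, 180, 270 deg.
# Each reports a 2D optical flow (u = image horizontal, v = image vertical).
# Unknown state: (vx/R, vy/R, vz/R, yaw_rate) -- translations scaled by scene distance R.
AZIMUTHS = np.deg2rad([0.0, 90.0, 180.0, 270.0])

def flow_model():
    """Build the 8x4 matrix mapping the 4DOF state to the stacked (u, v) flows."""
    rows = []
    for th in AZIMUTHS:
        # Image-horizontal axis of a sensor facing (cos th, sin th, 0)
        ux, uy = -np.sin(th), np.cos(th)
        # u-flow: scene shifts opposite to lateral drift along the image axis,
        # and all sensors see the same u-shift under a yaw rotation
        rows.append([-ux, -uy, 0.0, -1.0])
        # v-flow: scene shifts opposite to vertical drift (image vertical = world up)
        rows.append([0.0, 0.0, -1.0, 0.0])
    return np.array(rows)

def fuse(flows):
    """Least-squares 4DOF estimate from the 8 stacked flow components."""
    A = flow_model()
    state, *_ = np.linalg.lstsq(A, flows, rcond=None)
    return state  # (vx/R, vy/R, vz/R, yaw_rate)
```

With four sensors the system is overdetermined (8 measurements, 4 unknowns), which is what lets yaw serve as a diagnostic: it is observable separately from the lateral drift terms.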
The interface between the Teensy 3.1 and the CF will be via I2C.
A risk factor is that I will power this directly from the battery since the Teensy 3.1 has its own 3.3V regulator which I want to keep separate from the other chips. Hopefully the battery keeps the potential up enough for the 3.3V linear regulator to work well.
For my next post I'm going to mull over integration details.
Last edited by geof on Wed Jan 29, 2014 3:28 am, edited 3 times in total.
Re: 3D hover with wide FOV optical flow
geof wrote: It works somewhat with two sensors, depending on how you do it. With two very wide FOV sensors, you can get 3D stabilization. We've done it- one looking up and one looking down.
That's precisely what I had in mind.
Kinda like owl ears - they're offset along X and rotated around both Z and X (one "looking" slightly up, the other down).
Re: 3D hover with wide FOV optical flow
May I please pose a few questions for those intimately familiar with the CF firmware:
1) It seems that eulerRollDesired and eulerPitchDesired are the desired roll and pitch pose angles for the CF. Does "zero" mean pure horizontal, and are the units in degrees (not radians)?
2) It also seems that eulerYawDesired is a desired yaw rate. This is degrees per second, right? Is CCW (looking down from above) positive yaw?
I'm pretty sure about the above two, but just want to make sure before I start needlessly crashing the CF!
3) How do the raw integer variables that get sent up by the CFclient, over the CrazyRadio, map onto the variables eulerRollDesired, eulerPitchDesired, and eulerYawDesired? In other words, suppose I want eulerRollDesired to be 1.0 (for 1.0 degrees) and eulerPitchDesired to be -2.5 and for the yaw rate to be 5 degrees per second. What are the actual integer values that get sent up from CFclient?
In other words, suppose I were to use the "simple code example" at the bottom of this page (http://wiki.bitcraze.se/projects:crazyf ... tils:pylib) to send up the four stick signals (roll/pitch/yaw/thrust):
self.crazyflie.commander.send_setpoint(roll, pitch, yawrate, thrust)
What should variables "roll", "pitch", and "yawrate" be to encode respectively 1.0 degrees, -2.5 degrees, and 5 degrees per second? (I will be using my own controller for this.)
I understand how thrust is encoded, and how the final actuator signals are used to generate the motor signals. I just don't understand the scaling on the uplink.
Thanks!!
Re: 3D hover with wide FOV optical flow
I just wanted to say I'm really looking forward to this. Let's hope it works out well!
geof wrote: 1) It seems that eulerRollDesired and eulerPitchDesired are the desired roll and pitch pose angles for the CF. Does "zero" mean pure horizontal, and are the units in degrees (not radians)?
Yes.
geof wrote: 2) It also seems that eulerYawDesired is a desired yaw rate. This is degrees per second, right? Is CCW (looking down from above) positive yaw?
Yes. It is supposed to follow the aircraft principal axes, but we might have screwed it up if I recall correctly. This is something we intend to correct, but since it requires the cfclient and firmware to be in sync it is more complicated. We will try to fix this in the repository during the week. Until then I think you will have to use trial and error. I think pitch and roll are correct and yaw is inverted, CW (when looking down from above).
geof wrote: 3) How do the raw integer variables that get sent up by the CFclient, over the CrazyRadio, map onto the variables eulerRollDesired, eulerPitchDesired, and eulerYawDesired? In other words, suppose I want eulerRollDesired to be 1.0 (for 1.0 degrees) and eulerPitchDesired to be -2.5 and for the yaw rate to be 5 degrees per second. What are the actual integer values that get sent up from CFclient?
roll, pitch and yaw are floats and sent as-is, so 1.0 means 1.0 degrees (or deg/s). Thrust is a uint16.
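To make that mapping concrete, here is a small packing sketch. It only illustrates "floats sent as-is plus a uint16 thrust"; the exact CRTP commander packet in cflib (header bytes, any axis sign flips) may differ, so treat the format string as an assumption.

```python
import struct

# Illustration only: three little-endian 32-bit floats plus a uint16 thrust.
# The real cflib commander packet adds protocol framing and may differ in detail.
def pack_setpoint(roll, pitch, yawrate, thrust):
    return struct.pack('<fffH', roll, pitch, yawrate, thrust)

def unpack_setpoint(data):
    return struct.unpack('<fffH', data)

# The example values from the question above: no scaling, the floats go as-is.
payload = pack_setpoint(1.0, -2.5, 5.0, 30000)
roll, pitch, yawrate, thrust = unpack_setpoint(payload)
```

So to command 1.0 degrees roll, -2.5 degrees pitch, and 5 deg/s yaw rate, you pass exactly those numbers to send_setpoint; no integer conversion is needed on your side.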
Re: 3D hover with wide FOV optical flow
geof wrote: A risk factor is that I will power this directly from the battery since the Teensy 3.1 has its own 3.3V regulator which I want to keep separate from the other chips. Hopefully the battery keeps the potential up enough for the 3.3V linear regulator to work well.
It will probably be OK for the first 3-4 minutes of flight, but after that the battery voltage tends to drop below 3.4V and problems might occur. Is it possible to switch out the regulator for a 3.0V one or similar later?
Re: 3D hover with wide FOV optical flow
Thank you for your answers above, Tobias.
Flight times of 3-4 minutes are enough to prove the principle.
tobias wrote: It will probably be OK for the first 3-4 minutes of flight, but after that the battery voltage tends to drop below 3.4V and problems might occur. Is it possible to switch out the regulator for a 3.0V one or similar later?
On the Teensy 3.1 the regulator is actually located on the ARM chip. (Pretty cool- keeps the BOM count down!) It might be possible to change, but I don't know. Anyway, this is just an initial version. Eventually I'll make a more integrated version with either a 3.0V regulator or a step-up to, say, 5V which would get regulated back down. (Sounds silly, but it works.)
Last edited by geof on Wed Jan 29, 2014 4:34 pm, edited 1 time in total.
Re: 3D hover with wide FOV optical flow
tobias wrote: Yes. It is supposed to follow the aircraft principal axes, but we might have screwed it up if I recall correctly. This is something we intend to correct, but since it requires the cfclient and firmware to be in sync it is more complicated. We will try to fix this in the repository during the week. Until then I think you will have to use trial and error. I think pitch and roll are correct and yaw is inverted, CW (when looking down from above).
Thanks for the clarification. This won't affect us- for the time being I'm going to let the gyro control yaw. But for my method I will need to know whether the yaw rate is non-zero (e.g. abs(yawrate)>threshold) so that the algorithms can act accordingly.
Update: Uplink established
Just a brief update. For the control I'm using a simple control stick board I made for a prior project. It is basically a pair of joysticks and some buttons connected to an Arduino Mega (chosen only for its size).
The sticks will be the same layout as everyone is familiar with, but in my case the input is interpreted a bit differently. For example, when you push the throttle up, the helicopter will be commanded to climb, and then will use optical flow to climb to a new position. Let go of the stick to stop climbing. Lower the stick to descend.
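The stick interpretation described above amounts to a climb-rate command that gets integrated into a height setpoint for the optical-flow hover loop to track. A minimal sketch, with made-up names, gains, and deadband (none of this is the actual controller):

```python
# Hypothetical sketch: throttle stick deflection commands a climb rate, which is
# integrated into a height setpoint. Releasing the stick freezes the setpoint,
# so the vehicle holds its new position.
DEADBAND = 0.05      # ignore small stick offsets around center (assumed value)
CLIMB_GAIN = 0.5     # m/s of climb per unit of stick deflection (assumed value)

def update_height_setpoint(setpoint, stick, dt):
    """stick in [-1, 1], 0 = centered; returns the new height setpoint in meters."""
    if abs(stick) < DEADBAND:
        return setpoint                      # stick released: hold altitude
    return setpoint + CLIMB_GAIN * stick * dt
```

Called once per control tick, this gives exactly the behavior described: push up to climb, let go to stop, lower the stick to descend.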
There are also three buttons which I will use eventually. Right now just one button is used as a "kill motors" button. Another one could be a "hover now" button, used to go from free-flying to hover, and another could be a "take off" button. I'm encoding these buttons into the thrust command (plenty -o- extra bits!) so as to avoid changes to the Python library.
I have the control stick board talking to a Python script that then accesses the Crazyradio and sends data up to the Crazyflie. I was able to establish the link and get a response (motors turning).
So, I have this chain working:
Sticks => Arduino => PC/Python => CrazyRadio => Crazyflie => Unmodified CF firmware
And from before I have the four-sensor vision system working, and verified its I2C port.
All that remains is to modify the CF firmware to connect to the vision system and then do my control thing, and then physically mount the setup onto the Crazyflie.
I won't jinx this by making predictions on how long it will take.
Using I2C2
Looks like I hit a snag. I want to use the external connector to connect the CF's processor to my sensor suite using I2C. From the schematic, it appears this is I2C2 (whereas I2C1 is used for the internal IMU), and that 10k pull-up resistors are included. (I verified this with a multimeter.)
Inside your stabilizerInit() function, in stabilizer.c, right after you call controllerInit(), I put in a call to my own function that resets my optical flow controller and also attempts to initialize the I2C2 device with this call: i2cdevInit(I2C2);
The I2C2 bus appears not to function. Both the clock and data signals are held low.
Is there anything special I need to do in order to turn on I2C2, other than calling i2cdevInit(I2C2)?