[Video] Alternative Kinect Control (using depth images)

Firmware/software/electronics/mechanics
Veerachart
Member
Posts: 45
Joined: Mon Jan 13, 2014 4:12 am

Re: [Video] Alternative Kinect Control (using depth images)

Post by Veerachart »

Thanks for the quick reply.

This is the graph I got. It is a little different from Patrik's; roughly, /CrazyflieJoystickDriver is connected to /reconfigure_gui instead.
Image
omwdunkley wrote:- activating the hover mode. However you could probably remove this requirement, by always having the hover mode activated or adding a button to the GUI.
Is this the same as unchecking the "Disable Hover Mode" in the Input tab?
omwdunkley wrote:I guess the easiest (even if rather silly) solution would be to quickly make a node that blindly sends "fake" joystick messages at 100hz on the right channel. While you do that, you could even set the "automatic mode" button to set (1.0 I think)). Alternatively you could modify the joy_driver_pid.py file to do that for you.
So, the "Automatic Mode" button plays the role here. Is it published by /joy_node to the topic /joy in the file joy_driver_pid.py?
omwdunkley wrote:"/cf_xyz" is only the flie position (not rotation) as estimated by the tracker. The tracker cannot estimate the rotation. The rotation estimate comes from the flie itself (so you must be connected to it and receiving roll, pitch and yaw). The GUI program sends out a /cf_xyz->/cf0 transform with just the rotation. The static transform publisher in the guide then links /cf0->/cf_gt (ground truth), which is compared against /goal in the PID controller.
For the yaw angle, I connected to the flie and the HUD moved as I rotated my flie. The only frames I can see in rviz are the camera frames, the world frame, and the cf_xyz frame. /cf0 and /cf_gt did not appear.
fetrit
Beginner
Posts: 3
Joined: Sat Apr 19, 2014 1:30 pm

Re: [Video] Alternative Kinect Control (using depth images)

Post by fetrit »

Thanks for the reply,

Is there a way to get the acceleration in Cartesian coordinates in your code? I've tried to look at your code, and the closest I found was "self.acc", but I'm not sure if this is the right variable. I need it for the Kalman filter :D
omwdunkley
Expert
Posts: 162
Joined: Thu Jun 06, 2013 9:56 pm
Location: Munich

Re: [Video] Alternative Kinect Control (using depth images)

Post by omwdunkley »

fetrit wrote:Thanks for the reply,

Is there a way to get the acceleration in Cartesian coordinates in your code? I've tried to look at your code, and the closest I found was "self.acc", but I'm not sure if this is the right variable. I need it for the Kalman filter :D
I guess you could use the roll, pitch, yaw information to rotate the acceleration vector. I once did this on the flie; I can't remember where I put the code though. Note that yaw drifts unless you compensate for it (external tracking, magnetometer, etc.).
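To make that concrete, here is a minimal pure-Python sketch of rotating a body-frame acceleration vector into the world frame using roll/pitch/yaw. It assumes the common ZYX Euler convention; the convention the flie firmware actually uses should be checked against the code, and the yaw-drift caveat above still applies.

```python
import math

def rpy_to_matrix(roll, pitch, yaw):
    """World-from-body rotation matrix, ZYX convention (assumed):
    R = Rz(yaw) * Ry(pitch) * Rx(roll), angles in radians."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def acc_body_to_world(acc_body, roll, pitch, yaw):
    """Rotate a body-frame acceleration [ax, ay, az] into the world frame."""
    R = rpy_to_matrix(roll, pitch, yaw)
    return [sum(R[i][j] * acc_body[j] for j in range(3)) for i in range(3)]
```

For example, with zero roll/pitch and 90 degrees of yaw, an acceleration along the body x axis ends up along the world y axis. Gravity would still need to be subtracted from the result before feeding it to a Kalman filter.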
omwdunkley
Expert
Posts: 162
Joined: Thu Jun 06, 2013 9:56 pm
Location: Munich

Re: [Video] Alternative Kinect Control (using depth images)

Post by omwdunkley »

Is this the same as unchecking the "Disable Hover Mode" in the Input tab?
Nope, this is just a hack for the moment. Usually my "hover button" served two functions: half press = deadman, so you need to keep it half pressed while flying; if you drop the controller or want to cut power, you just let go. Pressing it all the way activated hover mode. For the Kinect functionality, I want the button pressed all the way to activate "kinect control mode" instead of hover mode. The checkbox just stops hover messages being sent to the flie to avoid conflicts (some better logic would be appropriate here).
So, the "Automatic Mode" button plays the role here. Is it published by /joy_node to the topic /joy in the file joy_driver_pid.py?
Yes, it is published by /joy_node on topic /joy to joy_driver_pid.py. I use the PlayStation controller, which maps the axes differently depending on whether you use USB or Bluetooth. In the code you can see how I use each axis. To check the deadman, I see if Button.L1 is pressed (defined at the top of the file; I think L1 is axis/button 10 in Bluetooth mode and 14 in USB mode). To check if the automatic mode button is pressed far enough, I see if the value of Axis.L1 < -0.75 (line 450). Buttons are pressure sensitive, with 0 = not pressed and -1 = fully pressed, so -0.75 = three quarters pressed. So if you wanted to trick the node into sending the thrust commands the PID controller calculates, you would need all the right TF frames to be available and to set Button.L1 to true and Axis.L1 to -1.0. Also make sure that hover mode is disabled in the GUI.
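As a rough illustration, the deadman and automatic-mode checks described above could look something like this in plain Python. The index values here are assumptions for illustration only; the real ones come from the Button/Axis definitions at the top of joy_driver_pid.py and depend on USB vs Bluetooth mode.

```python
# Hypothetical indices; the actual values are defined in joy_driver_pid.py
# and differ between Bluetooth (10) and USB (14) mode, per the post above.
BUTTON_L1 = 10   # boolean button entry in the Joy message's buttons list
AXIS_L1 = 12     # pressure-sensitive axis entry in the axes list

def deadman_pressed(buttons):
    """Deadman check: Button.L1 must be held (at least half pressed)."""
    return bool(buttons[BUTTON_L1])

def automatic_mode_active(axes, threshold=-0.75):
    """Pressure-sensitive axis: 0 = not pressed, -1 = fully pressed.
    Automatic (kinect control) mode engages past three-quarters pressed."""
    return axes[AXIS_L1] < threshold
```

A node faking the joystick at 100hz would then just publish a sensor_msgs/Joy message with buttons[BUTTON_L1] = 1 and axes[AXIS_L1] = -1.0 on /joy in a loop.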

Ahh, sorry. Just checked - the checkbox is redundant. If the PID controller is active and can find the TF frames, it disables the hover flag before it reaches the GUI. It might still be advisable to check the box, in case you would rather have the behaviour default to no hover mode if the PID controller fails (i.e. cannot find the transforms it needs).
omwdunkley wrote:"/cf_xyz" is only the flie position (not rotation) as estimated by the tracker. The tracker cannot estimate the rotation. The rotation estimate comes from the flie itself (so you must be connected to it and receiving roll, pitch and yaw). The GUI program sends out a /cf_xyz->/cf0 transform with just the rotation. The static transform publisher in the guide then links /cf0->/cf_gt (ground truth), which is compared against /goal in the PID controller.
For the yaw angle, I connected to the flie and the HUD moved as I rotate my flie. The only frames I can see in rviz is the camera frames, the world frame, and cf_xyz frame. /cf0 and /cf_gt did not appear.[/quote]
First, good that you can see the yaw in the HUD. What do you mean by what you see in rviz? Could you post a screenshot of rviz with all the nodes of the TF tree expanded (in rviz, in the display panel, under TF -> Tree)? This will show all the frames, even if they have no parents. /cf0 (or /cf) should automatically be sent out by the GUI if you are logging the parameters Roll, Pitch, Yaw, ideally at 100hz. /cf_gt will not appear unless /cf_xyz and /cf(0) exist. Let's focus on getting /cf0 working, then we can look at the rest.

By the way, the tf graph you showed me doesn't help so much ;) More useful would be the PDF generated by the command

Code: Select all

rosrun tf view_frames 
while everything is running. It listens for 5 seconds and graphically displays the tree.
Veerachart
Member
Posts: 45
Joined: Mon Jan 13, 2014 4:12 am

Re: [Video] Alternative Kinect Control (using depth images)

Post by Veerachart »

omwdunkley wrote: First, good that you can see the yaw in the HUD. What do you mean by what you see in rviz? Could you post a screenshot of rviz with all the nodes of the TF tree expanded (in rviz, in the display panel, under TF -> Tree)? This will show all the frames, even if they have no parents. /cf0 (or /cf) should automatically be sent out by the GUI if you are logging the parameters Roll, Pitch, Yaw, ideally at 100hz. /cf_gt will not appear unless /cf_xyz and /cf(0) exist. Let's focus on getting /cf0 working, then we can look at the rest.
Here it is:
Image
omwdunkley wrote: By the way, the tf graph you showed me doesn't help so much ;) More useful would be the PDF generated by the command

Code: Select all

rosrun tf view_frames 
while everything is running. It listens for 5 seconds and graphically displays the tree.
Actually that was from rqt_graph for all the nodes. Here's the frames.pdf. It looks like /cf0 is not connected to /cf_xyz :?
omwdunkley
Expert
Posts: 162
Joined: Thu Jun 06, 2013 9:56 pm
Location: Munich

Re: [Video] Alternative Kinect Control (using depth images)

Post by omwdunkley »

It looks like cf0 is not being published at all. It appears in your frames.pdf because it is the parent of the static transform publisher. Are you connected to the flie and logging roll, pitch and yaw at 100hz?
Veerachart
Member
Posts: 45
Joined: Mon Jan 13, 2014 4:12 am

Re: [Video] Alternative Kinect Control (using depth images)

Post by Veerachart »

omwdunkley wrote:It looks like cf0 is not being published at all. It appears in your frames.pdf because it is the parent of the static transform publisher. Are you connected to the flie and logging roll, pitch and yaw at 100hz?
Well, in the picture the flie had entered sleep mode and disconnected from the computer, but as far as I know I have never seen /cf0 among the frames. And I also believe that I set the frequency of /stabilizer logging to 100.

I entered the command according to this, with another command of

Code: Select all

rosrun crazyflieROS driver.py
Also, an expected error appeared when I ran

Code: Select all

roslaunch crazyflieROS pid.launch js:=0
as there was no PS3 controller connected, but I don't think that would cause the frame not to be published.

I'll try diving into the code later :P
omwdunkley
Expert
Posts: 162
Joined: Thu Jun 06, 2013 9:56 pm
Location: Munich

Re: [Video] Alternative Kinect Control (using depth images)

Post by omwdunkley »

Good luck :)

Check these lines of code:
Crazyflie log being forwarded to ros node
ros node sending log and tf stuff

A quick look at the code shows that it checks the condition setPubToRos == True. Make sure the checkbox "publish to ROS" is checked in the Settings tab.
Patrik
Beginner
Posts: 20
Joined: Wed Apr 09, 2014 2:43 pm

Re: [Video] Alternative Kinect Control (using depth images)

Post by Patrik »

Hi Oliver!

We've got the tracking working fine, but when we press the L1 button the flie just takes off like crazy. This is the graph I get from

Code: Select all

rosrun tf view_frames 
frames.pdf

Any ideas of what might be the problem?

Patrik
omwdunkley
Expert
Posts: 162
Joined: Thu Jun 06, 2013 9:56 pm
Location: Munich

Re: [Video] Alternative Kinect Control (using depth images)

Post by omwdunkley »

Patrik wrote:Hi Oliver!

We've got the tracking working fine, but when we press the L1 button the flie just takes off like crazy. This is the graph I get from

Code: Select all

rosrun tf view_frames 
frames.pdf

Any ideas of what might be the problem?

Patrik
Hi Patrik, glad tracking works now. What was the problem before? Did you have to fix anything?

Could be a million things. You're going to have to give me more information.
You will need to verify:
- hover mode is disabled (checkboxes in GUI)
- the orientation is correct (will need to set the rotation offset in the GUI)
- the tracking is working well (look at rviz output)
- the goal position makes sense (look at where the tracker thinks the flie is, and where the goal is.)

Your transforms are all there and connected; they look good. Of course, if your Kinect is upside down or something, you will need to modify the static transform publisher to something that makes sense. For now I have it so that the flie is 1.5 meters back and 1 meter high, pointing forward to the horizon (i.e. it is level). If it is not level, change the camera pitch in the publisher. I always just guessed what was level and what was not; it doesn't need to be 100% perfect.

For debugging, test under optimal conditions. Set the goal location to 2.5 meters in front of the Kinect, fly to this position and hover there, verify it is tracking, then hold and press L1. Make sure you don't accidentally move the goal location (DO NOT throttle while in kinect control mode; it will move the goal up and the flie will fly up). There is a setting to disable this in the PID dynamic reconfigure GUI. You can use this to set the goal too.

Sorry, stupid of me, but I might have forgotten to mention that the flie's cross section is quite small for the Kinect to track. I added a bit of paper (it should be in the video I posted) so it's bigger, and this helped.

Image