//Page revisions: 2013/04/24 21:22 (balintbe) – 2016/05/19 09:19 (current, external edit)//

====== Perception Tutorial ======
Tutorial describing the first steps needed to start processing data from the Kinect, in the form of Point Clouds.
+ | |||
+ | ==== 1. Prerequisite ==== | ||
+ | |||
+ | Follow the steps described in [[software: | ||
If you are using a PrimeSense device (Xbox Kinect, Asus Xtion), check whether the OpenNI Kinect driver is installed; if it is not, install it with:
  sudo apt-get install ros-fuerte-openni-kinect

Check out the code presented at the seminar into your ROS workspace from:\\
[[https://github.com/ai-seminar/perception-tutorials.git]]

A prerecorded bag file can be found in ../

==== 2. Viewing and recording data from the Kinect ====

== Bag files and Rviz ==
For a PrimeSense device: connect it to your PC and run:
  roslaunch openni_launch openni.launch
Run rviz:
  rosrun rviz rviz
Add a new PointCloud2 display type to rviz, and set its topic to the point cloud published by the driver.

Use //rosbag// to record data in a bag file, e.g.:
  rosbag record <point cloud topic> /tf
Note: /tf is needed if you want to view the recorded data using rviz.

Play back a bag file using:
  rosbag play filename.bag --loop
More detail, and a more elegant way of saving data from a Kinect to bag files, can be found on the openni_launch wiki page: [[http://www.ros.org/wiki/openni_launch/]]

In order to save Point Clouds to *.pcd files run:
  rosrun pcl_ros pointcloud_to_pcd input:=<point cloud topic>

== Image_view ==
View rgb image:
  rosrun image_view image_view image:=<rgb image topic>
View depth image:
  rosrun image_view image_view image:=<depth image topic>

== From a ROS node ==

To see how to subscribe to a point cloud topic from a ROS node, take a look at the example code in the checked-out tutorial repository.
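As a rough sketch of what such a node contains (this is //not// the repository's code: it assumes a Fuerte-era roscpp workspace, the topic name is the openni_launch default, and the node and callback names are invented for illustration):

```cpp
// Illustrative sketch only -- not the tutorial repository's code.
// Assumes a ROS (Fuerte-era) C++ environment; the topic name is the
// openni_launch default, and node/callback names are made up.
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>

void cloudCallback(const sensor_msgs::PointCloud2ConstPtr& cloud)
{
  // width * height gives the number of points in the cloud
  ROS_INFO("Got cloud with %u points", cloud->width * cloud->height);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "cloud_listener");
  ros::NodeHandle nh;
  ros::Subscriber sub =
      nh.subscribe("/camera/depth_registered/points", 1, cloudCallback);
  ros::spin();  // process callbacks until shutdown
  return 0;
}
```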
==== 3. Processing Point Clouds ====

The following are presented in the tutorial code:
  * using PCLVisualizer
  * removing NaNs
  * filtering based on axes
  * downsampling the Point Cloud
  * RANSAC plane fitting, and extracting indices
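PCL provides all of these operations ready-made (e.g. //pcl::removeNaNFromPointCloud//, //pcl::PassThrough//, //pcl::VoxelGrid//). The standalone C++ sketch below, with an illustrative //Point// struct instead of PCL types, shows the idea behind NaN removal, axis filtering, and grid downsampling without requiring a PCL install:

```cpp
// Standalone sketch of basic point-cloud filtering. This is NOT the
// tutorial's PCL code: the Point struct and function names here are
// illustrative stand-ins for the corresponding PCL filters.
#include <cmath>
#include <map>
#include <tuple>
#include <utility>
#include <vector>

struct Point { float x, y, z; };
using Cloud = std::vector<Point>;

// Drop points with any NaN coordinate (Kinect clouds contain many).
Cloud removeNaNs(const Cloud& in) {
  Cloud out;
  for (const Point& p : in)
    if (!std::isnan(p.x) && !std::isnan(p.y) && !std::isnan(p.z))
      out.push_back(p);
  return out;
}

// Pass-through filter: keep points whose z lies in [zmin, zmax].
Cloud passThroughZ(const Cloud& in, float zmin, float zmax) {
  Cloud out;
  for (const Point& p : in)
    if (p.z >= zmin && p.z <= zmax)
      out.push_back(p);
  return out;
}

// Grid downsampling: replace all points in each cubic cell of side
// `leaf` by their centroid -- the idea behind a voxel-grid filter.
Cloud downsample(const Cloud& in, float leaf) {
  std::map<std::tuple<int, int, int>, std::pair<Point, int>> cells;
  for (const Point& p : in) {
    auto key = std::make_tuple(static_cast<int>(std::floor(p.x / leaf)),
                               static_cast<int>(std::floor(p.y / leaf)),
                               static_cast<int>(std::floor(p.z / leaf)));
    auto& cell = cells[key];  // value-initialized to {{0,0,0}, 0}
    cell.first.x += p.x;
    cell.first.y += p.y;
    cell.first.z += p.z;
    cell.second += 1;
  }
  Cloud out;
  for (const auto& kv : cells) {
    const Point& sum = kv.second.first;
    int n = kv.second.second;
    out.push_back({sum.x / n, sum.y / n, sum.z / n});
  }
  return out;
}
```

RANSAC plane fitting and index extraction likely correspond to //pcl::SACSegmentation// and //pcl::ExtractIndices// in the actual tutorial code.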

For further questions contact the tutors.
teaching/se-kiba/perception-tutorial.1366838572.txt.gz · Last modified: 2016/05/19 09:18 (external edit)