Xsens Motion Capture Setup
NOTE: Site under construction
Required Software
- rviz
- knowrob
- knowrob_addons
- openni2
- Xsens MVN studio
Required Hardware
- Xsens laptop
- Xsens suit
- Kinect
Preparation
- recharge the batteries and the replacement batteries
- make sure to set up streaming for only one client
Running the software
- Start RVIZ
$ rosrun rviz rviz
- For calibration, define the TF between `map` and `mocap`
$ rosrun comp_mocap tf_dynamic_transform.py
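This is not the actual `tf_dynamic_transform.py`, just a rough sketch of what such a broadcaster does; the identity transform below is a placeholder and has to be replaced with your calibration values.

#!/usr/bin/env python
# Minimal sketch of a map -> mocap TF broadcaster (illustration only;
# the real transform values come from your calibration).
import rospy
import tf

if __name__ == '__main__':
    rospy.init_node('map_mocap_tf')
    br = tf.TransformBroadcaster()
    rate = rospy.Rate(50)  # publish at 50 Hz
    while not rospy.is_shutdown():
        br.sendTransform((0.0, 0.0, 0.0),       # x, y, z in meters (placeholder)
                         (0.0, 0.0, 0.0, 1.0),  # quaternion x, y, z, w (placeholder)
                         rospy.Time.now(),
                         'mocap',               # child frame
                         'map')                 # parent frame
        rate.sleep()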
- Receive mocap data on port 9763
- Modify the IP in the script to reflect the IP of your computer
$ rosrun comp_mocap xsens_tf_broadcaster.py
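For a quick check that MVN Studio is actually streaming to your machine, a bare UDP listener on port 9763 is enough. This assumes the streamer's default UDP mode; parsing the Xsens packets and turning them into TF frames is what `xsens_tf_broadcaster.py` does and is not shown here.

#!/usr/bin/env python
# Bind to the MVN network streamer port and report incoming datagrams.
import socket

LISTEN_IP = '0.0.0.0'   # or the IP you entered in MVN Studio's destination list
PORT = 9763

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind((LISTEN_IP, PORT))
print('listening on %s:%d' % (LISTEN_IP, PORT))
while True:
    data, addr = sock.recvfrom(4096)
    print('received %d bytes from %s' % (len(data), addr[0]))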
- Publish marker messages for the human skeleton with the TF root set to `mocap`
$ rosrun comp_mocap mocap_marker.py --skeleton xsens --root-frame mocap
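If `mocap_marker.py` is not at hand, the core of this step is publishing `visualization_msgs/Marker` messages whose header frame is `mocap`. A stripped-down sketch with placeholder joint positions (the real node fills these from the Xsens stream):

#!/usr/bin/env python
# Publish one SPHERE_LIST marker in the 'mocap' frame for rviz.
import rospy
from visualization_msgs.msg import Marker
from geometry_msgs.msg import Point

rospy.init_node('mocap_marker_sketch')
pub = rospy.Publisher('visualization_marker', Marker, queue_size=10)
rate = rospy.Rate(30)
while not rospy.is_shutdown():
    m = Marker()
    m.header.frame_id = 'mocap'   # root frame, as passed via --root-frame
    m.header.stamp = rospy.Time.now()
    m.ns = 'skeleton'
    m.id = 0
    m.type = Marker.SPHERE_LIST
    m.action = Marker.ADD
    m.scale.x = m.scale.y = m.scale.z = 0.05
    m.color.r = 1.0
    m.color.a = 1.0
    # Placeholder joint positions; the real node uses the streamed segments.
    m.points = [Point(0, 0, 1.0), Point(0, 0, 1.5)]
    pub.publish(m)
    rate.sleep()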
- Start Xsens MVN studio on the Xsens laptop
- Make sure to attach the USB stick with the license
- Go to Options/Preferences/Network streamer
- Add your computer's IP to the destination addresses
- Set up a Kinect for capturing RGB images
- Attach the Kinect to a tripod and to your computer via USB
- Start Publishing
$ roslaunch openni2_launch openni2.launch
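To verify that the driver is up before recording, you can check the RGB topic with `rostopic hz /camera/rgb/image_raw`, or run a small subscriber such as:

#!/usr/bin/env python
# Quick check that the Kinect driver is publishing: print the resolution of each frame.
import rospy
from sensor_msgs.msg import Image

def on_image(msg):
    rospy.loginfo('got %dx%d image, encoding %s', msg.width, msg.height, msg.encoding)

rospy.init_node('kinect_check')
rospy.Subscriber('/camera/rgb/image_raw', Image, on_image)
rospy.spin()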
- Calibrate the Kinect camera
- Go to camera/driver
- Check depth_registration if you need depth info
$ rosrun rqt_reconfigure rqt_reconfigure
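The same parameter can also be set without the GUI through a small `dynamic_reconfigure` client (assuming the driver node is at `/camera/driver`, as above):

#!/usr/bin/env python
# Enable depth registration on the openni2 driver programmatically.
import rospy
from dynamic_reconfigure.client import Client

rospy.init_node('enable_depth_registration')
client = Client('/camera/driver', timeout=10)
client.update_configuration({'depth_registration': True})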
- Spawn semantic map in rviz via knowrob
- Start knowrob_vis
$ roslaunch knowrob_vis knowrob_vis.launch
- In Firefox, go to http://127.0.0.1:1111/ and spawn the map
$ owl_parse('package://iai_semantic_maps/owl/room.owl').
$ register_ros_package(knowrob_objects).
$ owl_individual_of(A, knowrob:'SemanticEnvironmentMap'), !, add_object_with_children(A).
- Leave the knowrob server running
Dressing the suit
TODO
Calibration
TODO
Recording
rosbag record --duration=3 --output-name=$1 /camera/rgb/camera_info /camera/rgb/image_raw /camera/depth/camera_info /camera/depth/image /camera/depth_registered/image_raw /tf
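To sanity-check a recorded bag (which topics are present and how many messages were captured), the rosbag Python API can be used, for example:

#!/usr/bin/env python
# List topics and message counts of a bag file passed as the first argument.
import sys
import rosbag

bag = rosbag.Bag(sys.argv[1])
info = bag.get_type_and_topic_info()
for topic, data in info.topics.items():
    print('%s: %d msgs of type %s' % (topic, data.message_count, data.msg_type))
bag.close()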