gazebo:manipulation:basics
- Let's access the sensor data outside of ''
- Code up a Python class for subscribing to and displaying the Kinect image streams. When run, this should display the color image data and the depth data in two separate windows. The ''cv2'' function for displaying an image is ''imshow()'', which expects the data to be in a specific range (either floats from 0 to 1 or integers from 0 to 255). The depth data may have to be normalized for proper display (see the viewer sketch after this list).
- Check out this [[https://
- Modify the displayed output to threshold the range data based on a target range.
- If you are using an RGB-D sensor with registered range/color images, use the registered image to extract the point clouds of the segmented objects.
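
A minimal sketch of such a viewer node is below. The topic names (''/camera/rgb/image_color'' and ''/camera/depth/image_raw'') are assumptions for an openni/freenect-style driver; check ''rostopic list'' for the names on your system.

<code python>
#!/usr/bin/env python
# Minimal viewer sketch: subscribe to the Kinect color and depth streams and
# display them with OpenCV. Topic names are assumptions for an openni/freenect
# driver -- check `rostopic list` for the names on your setup.
import rospy
import cv2
import numpy as np
from sensor_msgs.msg import Image
from cv_bridge import CvBridge


class KinectViewer(object):
    def __init__(self):
        self.bridge = CvBridge()
        rospy.Subscriber('/camera/rgb/image_color', Image, self.color_cb)
        rospy.Subscriber('/camera/depth/image_raw', Image, self.depth_cb)

    def color_cb(self, msg):
        # Convert the ROS Image message to an OpenCV BGR array and show it.
        color = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        cv2.imshow('color', color)
        cv2.waitKey(1)

    def depth_cb(self, msg):
        # Depth may be 32FC1 (meters) or 16UC1 (millimeters) depending on the driver.
        depth = self.bridge.imgmsg_to_cv2(msg, desired_encoding='passthrough')
        depth = np.nan_to_num(depth.astype(np.float32))
        # Normalize to 0-255 so imshow() produces a visible grayscale image.
        vis = cv2.normalize(depth, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        cv2.imshow('depth', vis)
        # Example threshold: keep pixels closer than a target range
        # (1.0 here assumes meters -- adjust for your encoding and target).
        mask = ((depth > 0) & (depth < 1.0)).astype(np.uint8) * 255
        cv2.imshow('depth_thresholded', mask)
        cv2.waitKey(1)


if __name__ == '__main__':
    rospy.init_node('kinect_viewer')
    KinectViewer()
    rospy.spin()
</code>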
=== Module Set 3: Modifying for Project Use ===
- Identify an alternative robot, usually a simple fixed-base robotic arm, and replace the PR2 robot with your chosen arm. You can find ROS packages on GitHub for some commercial robot arms (e.g., Kinova), or use our customized robot arms from our [[https://
- The remaining steps are only for groups that need the ForageRRT planner and the Manipulation State Space. Otherwise, you can keep using the default MoveIt! and OMPL code.
- Install our custom code from the [[https://
- To use it on the Kinect via ROS, simply import TensorFlow in your Python node and modify the provided ''demo.py'' to load the pretrained model for your own purposes (a minimal loading sketch is given after this list).
- (Optional) If you would like to fine-tune on a specific object for grasping, this [[https://
- To figure out the transformation between the robot base and the camera (see the ''tf'' lookup sketch after this list), you can start with [[https://
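
For loading the pretrained network, a minimal sketch of restoring a TF1-style checkpoint is below. The checkpoint path is a placeholder, and the input/output tensor names depend on the network shipped with the demo code; adapt it to what ''demo.py'' actually does.

<code python>
# Minimal sketch of restoring a TF1-style pretrained checkpoint inside a node.
# The checkpoint path below is a placeholder; tensor names depend on the
# network provided with the demo code.
import tensorflow as tf

CKPT = '/path/to/pretrained/model.ckpt'              # placeholder path

saver = tf.train.import_meta_graph(CKPT + '.meta')   # rebuild the saved graph
sess = tf.Session()
saver.restore(sess, CKPT)                            # load the trained weights
graph = tf.get_default_graph()
# From here, fetch the network's input/output tensors by name (see demo.py)
# and run sess.run(...) on images coming from your Kinect subscriber.
</code>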
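
Once the camera-to-base transform is being published (e.g., after hand-eye calibration), you can query it from the ROS ''tf'' tree. The frame names ''base_link'' and ''camera_link'' below are assumptions; substitute your robot's actual frames.

<code python>
# Minimal sketch: look up the camera-to-base transform from the ROS tf tree.
# Frame names are assumptions -- replace them with your robot's frames.
import rospy
import tf

rospy.init_node('base_camera_tf_lookup')
listener = tf.TransformListener()
# Wait until the transform is available, then query the latest one.
listener.waitForTransform('base_link', 'camera_link',
                          rospy.Time(0), rospy.Duration(4.0))
(trans, rot) = listener.lookupTransform('base_link', 'camera_link', rospy.Time(0))
rospy.loginfo('translation: %s  rotation (quaternion): %s', str(trans), str(rot))
</code>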