ROS_Projects
========================================================================

Debugging Tools
========================================================================
- There is a dedicated tool for publishing cmd_vel: teleop_twist_keyboard, which turns keystrokes into Twist msgs. The terminal it runs in must stay in focus for the keys to register.

  ```
  rosrun teleop_twist_keyboard teleop_twist_keyboard.py
  ```
- Debugging tools: they tie into the debug, warn, fatal, etc. log msgs.
  - `rqt_console` and `rqt_logger_level`: always open them concurrently on two different terminals. The levels are ordered from mild to severe, and only msgs at or above the level set in rqt_logger_level will show (see the logging sketch below).

    ```
    rosrun rqt_console rqt_console
    ```
  - Use `ROS_INFO_STREAM` for debugging (like `ROS_INFO_STREAM("lol" << "hahah");`).
- Diagnosis tool: `roswtf`, which checks for environment variable problems, etc.
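
For concreteness, a minimal sketch (assuming a sourced ROS1/rospy environment) of a node that emits one message per log level; with it running, changing this node's level in rqt_logger_level controls which messages show up in rqt_console:

```python
#!/usr/bin/env python
"""Sketch: emit one message per rospy log level.

Run it, then raise/lower this node's level in rqt_logger_level and
watch which messages still appear in rqt_console.
"""
import rospy

if __name__ == '__main__':
    rospy.init_node('log_level_demo')
    rate = rospy.Rate(1)  # 1 Hz
    while not rospy.is_shutdown():
        rospy.logdebug('debug: hidden until you lower the level to Debug')
        rospy.loginfo('info: normal operation')
        rospy.logwarn('warn: something looks off')
        rospy.logerr('error: something failed')
        rospy.logfatal('fatal: most severe')
        rate.sleep()
```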

Cameras and Image Processing
========================================================================
- You need to install the camera drivers:

  ```
  sudo apt install ros-melodic-usb-cam
  sudo apt install ros-melodic-libuvc-camera
  ```
- To see the raw image:
  - Stream through /dev/video0:

    ```
    rosrun usb_cam usb_cam_node _pixel_format:=yuyv
    ```
  - To view it (the camera topic here is /usb_cam/image_raw):

    ```
    rosrun image_view image_view image:=/usb_cam/image_raw
    ```
    - In rviz: add an Image display (not Camera), then select the topic. A Camera display renders the image in a frame from /tf, so it needs a fixed frame and /tf.
    - Or `rqt_image_view`, then select the topic.
- TODO: what are these udev rules copied for?

  ```
  sudo cp 80-uvc-sys76.rules /etc/udev/rules.d/
  sudo udevadm control --reload
  sudo udevadm trigger
  ```
- `roslaunch me495_image_processing something`: you may need to remap the camera name. Also, you need the bridge node running to see the image processing node's output, so run the bridge node.
- `ros_numpy` can convert geometry_msgs.Point, Transform, etc. to numpy data structures.

  ```python
  from sensor_msgs.msg import Image
  import numpy as np
  import ros_numpy

  img_np = np.zeros((480, 640, 3), dtype=np.uint8)
  # Convert NumPy array to ROS Image message
  img_msg = ros_numpy.msgify(Image, img_np, encoding='rgb8')
  ```
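
The reverse direction is a one-liner too; a quick sketch (`numpify` returns an HxWx3 array for an 'rgb8' Image):

```python
import numpy as np
import ros_numpy
from sensor_msgs.msg import Image

img_msg = ros_numpy.msgify(Image, np.zeros((480, 640, 3), dtype=np.uint8),
                           encoding='rgb8')
img_back = ros_numpy.numpify(img_msg)   # back to an HxWx3 uint8 array
assert img_back.shape == (480, 640, 3)
```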
- image_pipeline: acquire the image, then process it, as one set of cooperating nodes. The same pipeline idea applies to point clouds and stereo vision.
- camera_calibration package (see its checkerboard tutorials). Purpose: finds the intrinsic parameters using a checkerboard and OpenCV; the parameters end up published on /camera_info.
- image_proc
  - Purpose: the camera lens distorts the image, and this node rectifies it.
  - Subscribes to the raw image and camera_info.
  - Publishes:
    - image_mono: a monochrome unrectified image
    - image_rect: a monochrome rectified image
    - image_color: a color unrectified image; these images are de-mosaiced (that is, it accounts for the RGB pixel pattern of the camera)
    - image_rect_color: a color rectified image
- depth_image_proc: processes depth images. In an RGBD image, each pixel has depth info: how far light has to travel before hitting something. Used to make point clouds.
- Supplementary tools
  - image_transport can compress images; its republish feature is pretty good (see the compressed-image sketch below).
  - web_video_server: helpful if your robot interacts with the web.
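
A minimal sketch of consuming a compressed stream directly in Python; the topic name is an assumption (image_transport publishes the compressed version of `<topic>` under `<topic>/compressed`):

```python
#!/usr/bin/env python
"""Sketch: decode sensor_msgs/CompressedImage with OpenCV."""
import cv2
import numpy as np
import rospy
from sensor_msgs.msg import CompressedImage

def callback(msg):
    arr = np.frombuffer(msg.data, dtype=np.uint8)   # raw jpeg/png bytes
    img = cv2.imdecode(arr, cv2.IMREAD_COLOR)       # decode to a BGR image
    rospy.loginfo('decoded frame shape: %s', str(img.shape))

if __name__ == '__main__':
    rospy.init_node('compressed_listener')
    rospy.Subscriber('/usb_cam/image_raw/compressed', CompressedImage, callback)
    rospy.spin()
```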
- OpenCV:
  - cv_bridge: converts between ROS Image msgs and OpenCV images, so you can run OpenCV in ROS (sketch below).
  - opencv_apps: ready-made nodes wrapping different OpenCV algorithms.
  - image_geometry: converts between pixel coordinates and other frames, using the camera model.
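
A minimal cv_bridge round-trip sketch (topic names here are placeholders for your setup):

```python
#!/usr/bin/env python
"""Sketch: ROS Image <-> OpenCV via cv_bridge."""
import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()

def callback(msg):
    # ROS Image -> OpenCV (BGR ndarray)
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # OpenCV -> ROS Image
    pub.publish(bridge.cv2_to_imgmsg(gray, encoding='mono8'))

if __name__ == '__main__':
    rospy.init_node('cv_bridge_demo')
    pub = rospy.Publisher('/image_mono_out', Image, queue_size=1)
    rospy.Subscriber('/usb_cam/image_raw', Image, callback)
    rospy.spin()
```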

Navigation
========================================================================
- Concepts
  - Give an estimated pose in the occupancy map (in Rviz, the "2D Pose Estimate" tool). The occupancy map is the map you see in Rviz.
  - The robot uses the laser scan to "see" its surroundings; this is the laser-scanned map.
  - You can use the teleop node to move the robot back and forth to see the surroundings, and the robot will refine its pose estimate.
  - Terminate the teleop node.
  - Run the navigation node: it will plan the path for you, and even if an obstacle suddenly appears, it will avoid it.
  - Deciphering a map: `/map` publishes nav_msgs/OccupancyGrid. 100 means occupied, 0 is free, and -1 is unknown (sketch below).
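
A small sketch of reading that encoding (the data array is row-major, per the nav_msgs/OccupancyGrid definition):

```python
#!/usr/bin/env python
"""Sketch: interpret nav_msgs/OccupancyGrid (100=occupied, 0=free, -1=unknown)."""
import numpy as np
import rospy
from nav_msgs.msg import OccupancyGrid

def map_callback(msg):
    # Row-major layout: grid[y, x]
    grid = np.array(msg.data, dtype=np.int8).reshape(
        msg.info.height, msg.info.width)
    rospy.loginfo('occupied: %d, free: %d, unknown: %d',
                  np.sum(grid == 100), np.sum(grid == 0), np.sum(grid == -1))

if __name__ == '__main__':
    rospy.init_node('map_reader')
    rospy.Subscriber('/map', OccupancyGrid, map_callback)
    rospy.spin()
```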
- Implementing a laser scan
  - The range should be set slightly over the sensor's max.
  - Laser scan simulation gotcha: the laser scan frequency went up to 40 Hz... and the robot in Rviz didn't move at all! A: because the previous process wasn't killed properly.
- actionlib API
  - You can set a timeout on waiting for the result (full client sketch below):

    ```python
    self.client.wait_for_result(timeout=rospy.Duration(5))
    ```
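
For context, a minimal move_base client sketch; the goal coordinates are placeholders, and it assumes the navigation stack is already up:

```python
#!/usr/bin/env python
"""Sketch: send a nav goal to move_base and wait with a timeout."""
import actionlib
import rospy
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

if __name__ == '__main__':
    rospy.init_node('nav_goal_client')
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 1.0        # placeholder goal
    goal.target_pose.pose.orientation.w = 1.0     # identity orientation

    client.send_goal(goal)
    # wait_for_result returns False if no result arrived within 5 s.
    if not client.wait_for_result(timeout=rospy.Duration(5)):
        rospy.logwarn('move_base did not finish within the timeout')
```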
- Turtlebot3 already ships with these, but the global costmap params still need to be loaded:

  ```xml
  <rosparam file="$(find turtlebot3_navigation)/param/global_costmap_params.yaml" command="load" />
  ```
- Custom global planner example:
  - The `navfn` planner is Dijkstra-based.
  - The reason you can see the global plan is that `/move_base/NavfnROS/plan` is being published.
  - You need to reverse the path when you see: "None of the 0 first of 0 (0) points of the global plan were in the local costmap and free"

Basic components to launch a robot
========================================================================
- We copy only the install folder to the Raspberry Pi, SO MAKE SURE YOU INSTALL ALL YOUR NODES PROPERLY!!!
- rosserial
  - A driver that directly allows an Arduino, STM32, XBee, or Teensy to publish!
  - rosserial_python is the client to use for rosserial, not the C++ one, because the C++ client is incomplete. (Also: what exactly does the OpenCR use rosserial_python for?)
  - Play a sound (Python sketch below):

    ```
    rosrun rosserial_python serial_node.py _port:=/dev/ttyACM0 _baud:=1000000
    rostopic pub /sound turtlebot3_msgs/Sound "value: 2"
    ```
  - You need rosserial's serial_node launched on the turtlebot3!! That way you can play sounds on it, etc., through the OpenCR.
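
The same beep from a node, as a minimal rospy sketch (assumes serial_node.py above is already running on the robot):

```python
#!/usr/bin/env python
"""Sketch: publish a beep to the turtlebot3 /sound topic."""
import rospy
from turtlebot3_msgs.msg import Sound

if __name__ == '__main__':
    rospy.init_node('beeper')
    pub = rospy.Publisher('/sound', Sound, queue_size=1)
    rospy.sleep(1.0)             # give the connection time to establish
    pub.publish(Sound(value=2))  # same beep as the rostopic pub above
```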
- Overall Guidelines
  - All data should be visualized!!
  - rosmaster: runs on our computer.
  - roslaunch: use the machine tag to start nodes on the remote machine.
  - Local hostnames will be resolvable, using ROS_MASTER_URI=http://RicoSUPERMACHINE:11311
    - What is hostname resolution? It's the conversion from an assigned hostname to an IP address.
  - No rviz, no rqt_plot on the turtlebot!!
-
install sudo apt install docker.io
, not docker!! (a whole separate file system, for arm systems)- then you use ./arm catkin_make in your WORKSPACE to create separate arm_install, arm_devel, arm_build,
- then you use the rsync command to copy arm_install to the turtlebot home.
Finally, do sudo reboot to reboot the computer to get rid of the ros processes.
- Connection
  - Connect to Matt's office TURTLENET; pwd: NU-TURTLE. User account password: navigation. Re-flashing the turtlebot will overwrite the previous code.
  - How to ssh into the robot:

    ```
    ssh -o 'HostKeyAlgorithms=ssh-rsa' 'student@turtlebot4'
    ```
  - If you are connected to the same TURTLENET, you can ping each other's computers by name ("RicoSUPERMACHINE", etc.)
  - You only need to copy the key once. Once you've done so, you can just do `ssh student@turtlebot4`.
- Where is roscore running?
  - `export ROS_MASTER_URI=http://RicoSUPERMACHINE:11311` (you can set it to `export ROS_MASTER_URI=http://turtlebotX:11311` as well, but then you need to specify "address" in the launch file to be localhost)
  - You can't catkin-compile code on the turtlebot because the Raspberry Pi is too slow.
  - If you don't export it, ROS_MASTER_URI defaults to http://localhost:11311, which will make the turtlebot talk to itself (a quick Python check below).
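
A quick way to sanity-check which master you are pointed at from Python (rosgraph ships with ROS1; a small sketch):

```python
#!/usr/bin/env python
"""Sketch: verify that the master ROS_MASTER_URI points at is reachable."""
import os
import rosgraph

uri = os.environ.get('ROS_MASTER_URI', 'http://localhost:11311')
print('ROS_MASTER_URI =', uri)
print('master online?', rosgraph.is_master_online())
```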
- Set the env variables on BOTH YOUR COMPUTER AND YOUR ROBOT!
  - OTHERWISE nodes like robot_state_publisher will have trouble finding where to print.
- roslaunch file
  - You can launch your launchfile either on the computer or on the robot, so you need to specify the machine:

    ```xml
    <!-- for launching the launchfile on the turtlebot -->
    <machine name="local_from_turtlebot" address="localhost" env-loader="/home/student/install/env.sh"/>
    <!-- for launching on your computer -->
    <machine name="remote_from_pc" address="turtlebot4" env-loader="/home/student/install/env.sh"/>
    ```
  - Define the machine tag and specify env-loader (an env bash file that sets all the environment variables):

    ```xml
    <machine name="foo" address="foo-address" env-loader="/opt/ros/fuerte/env.sh" user="someone"/>
    ```
  - Start your node like this:

    ```xml
    <node machine="foo" name="footalker" pkg="test_ros" type="talker.py" />
    ```
- CMakeLists - install all your config and launch files:

  ```cmake
  install(DIRECTORY launch/ DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION}/launch)
  ```
- Copy all your stuff over:

  ```
  rsync -av --delete . student@turtlebotX:/home/student/ws
  ```
  - rsync will only copy things that differ from the last time.
  - -a preserves all directories and permissions.
  - -v is verbosity.
  - --delete: delete everything in the destination that's not present in the source.
- `rsync -av --delete arm_install/ student@turtlebotX:/home/student/install`: copy arm_install on your computer to /home/student/install. You can use the same pattern with other directories.
- Cross compilation: compiling code on the Raspberry Pi is slow! We need to cross-compile from x86-64 to armhf.
  - In the base workspace (same level as build, install), do `./arm catkin_arm install`
- If you choose to run the launchfile on your host computer, you should run roscore first, then run your launchfile.
- Copying a file from one computer to another through ssh:

  ```
  rsync -av --delete coffee_bot/controls/ pi@10.0.1.78:Desktop/controls
  ```
  `pi@ip_address` already points at the home directory of the remote computer, so Desktop/controls is relative to home.

Point Cloud Library
========================================================================
- Install the point cloud lib dependencies:

  ```
  sudo apt-get install cmake libopenblas-dev liblapack-dev libarpack-dev libarpack2-dev libsuperlu-dev
  ```

URDF and Moveit
========================================================================
- URDF -> SDF (for pybullet, a sure thing).
- Moveit & URDF
  - URDF with a floating platform:
    - In URDF, the platform is connected to the base by prismatic joints, and this can be represented by joint states from PyBullet.
    - In Moveit, we take in the goal pose and solve it with IK, but then visualization might be a problem...
- Common trick for making a universal joint or a spherical (rpy) joint in rviz: make invisible dummy links:

  ```xml
  <link name="knee_${id}" />
  ```
- Moveit!
  - Choose the end-effector's parent link to be the end effector if the joint is too chunky for the robot to stay visible!

Encoders
========================================================================
- Test updates: poses computed from wheel_vel and wheel_pos didn't match. Reason: the odometer just reads the absolute wheel position for the pose. Solution: accumulate velocity.
- [Lesson] Use relative values for any potentially asynchronized situation - nodes starting at different times, etc. (sketch below).
- The wheel encoder doesn't necessarily give a zero initial value, so you have to initialize it.
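
A minimal sketch of the "use relative values" lesson; the class and its method names are hypothetical stand-ins for your driver's API:

```python
class WheelOdometer(object):
    """Sketch: accumulate relative encoder changes instead of trusting
    absolute readings (which may start at an arbitrary, non-zero value)."""

    def __init__(self, first_reading):
        # Record the initial reading as an offset; this is the
        # "initialize it" step from the note above.
        self.last = first_reading
        self.total = 0.0

    def update(self, reading):
        # Accumulate the *change* since the last reading, so a non-zero
        # start or a late-starting node doesn't corrupt the pose.
        delta = reading - self.last
        self.last = reading
        self.total += delta
        return delta
```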

Meeting with Matt
========================================================================
1. Write a few lines at a time.
2. You can program without understanding the whole algorithm.
3. git: make small commits so you can restore from your previous stages. Commit things incrementally to a different branch.
4. Docstrings help you organize your thoughts and connect nodes.
5. Naming: be consistent with the math; type out the math.
6. Break the math into separate steps; watch the minus signs.
7. Ask people for help, and think about why.
8. README: pay special attention to instructions; read them more carefully.
- Terms
  - Semantics: the meaning of a command, not its syntax. If you enter a command that is semantically wrong, it doesn't make sense even if the syntax is right.
- Connect to Baxter
  - `nmcli connection up Rethink`
  - Make sure you have an ethernet connection to Baxter:
    - Go to wired connection setup in Settings -> Network -> Wired settings.
    - Profile name: Rethink; your machine's address is 10.42.0.1, netmask 255.255.255.0.
    - Set it to manual, click add.
    - Untick automatic launch in its settings.
    - Connect to Baxter using ethernet.
    - Run `nmcli connection up Rethink`
- To have your node running against Baxter, you must set environment variables so your own nodes know where to find the robot:

  ```
  export ROS_MASTER_URI=http://10.42.0.2:11311
  export ROS_IP=10.42.0.1
  ```
- Write a bashrc snippet for this.
- Then you can run rqt_graph on this setup.
- Press the E-stop, then run the script to reboot.
  - 5-10 min to reboot.
- Manual mode (zero-g), Python equivalent below:

  ```
  rosrun baxter_tools enable_robot.py -e
  ```
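
The same enable step from Python, as a sketch using the Baxter SDK's baxter_interface (assuming the SDK is installed and the env vars above are set):

```python
#!/usr/bin/env python
"""Sketch: enable Baxter programmatically via the Baxter SDK."""
import rospy
import baxter_interface

if __name__ == '__main__':
    rospy.init_node('enable_baxter')
    rs = baxter_interface.RobotEnable()
    rs.enable()        # same effect as `enable_robot.py -e`
    print(rs.state())  # enabled/stopped/error flags
```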

Misc
========================================================================
- sambot for ROS2
- v4l2 driver
- slamtec: has a package; object detection with Tiny YOLO v4 (60 objects). OpenVINO: Intel's visual inference toolkit; Movidius chips.
- CARLA: a reinforcement learning environment. Coach is another one.