Color blob tracking with ROS

Date Sun 29 December 2013
 
Tags ros robotics

Here's a short tutorial on how to do colored blob tracking using ROS Hydro and the cmvision package. It builds on the package that I built in the line follower demo. Here's a video of the color tracking.


The first step is to get the cmvision library. The exact package name depends on the version of ROS you are using, but for Hydro it's:

sudo apt-get install ros-hydro-cmvision

Once you have that installed, you need to figure out what color you want to track. In my case I just used a brightly colored balloon. You get better results with something that stands out from the rest of the objects in the room. To find the color to track, run:

rosrun cmvision colorgui image:=/camera/rgb/image_color

You may need to replace the image topic with whichever topic your camera publishes on. After that the image GUI will open. Place the colored object you want to track in view of the camera and click on it in the GUI. You should see a box form around the object. If you don't get a box, try clicking the object again; keep clicking until the box covers the whole object. If you click too many times you may start picking up other objects in the room with similar colors. If that happens, just restart the GUI and start over. Once you have a good box around the object, move it to another spot in the camera view and repeat the process. The idea is to make the tracking work under the different lighting variations in the room.



Copy down the YUV values to use in your colors.txt file. If you want to track more than one object, restart the GUI and repeat the whole process to get another set of YUV values. After you finish, go to your cmvision directory and open a file called "colors.txt". This is the file that determines what gets tracked.

[colors]

(0, 255, 0)  0.000000 15 Green

[thresholds]

(47:87, 148:162, 93:113)

An example of my colors.txt file is above. Each line in the [colors] section sets the display color and line width of the box drawn around the object, along with a name tag. The matching line in the [thresholds] section is where you input the YUV ranges you wrote down earlier for that object. For multiple objects, just add a new color line and a new threshold line.
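For example, a hypothetical colors.txt tracking two objects might look like the following. The Pink entry and its YUV ranges here are made-up placeholders; use the values you recorded from colorgui:

```
[colors]
(0, 255, 0)  0.000000 15 Green
(255, 0, 255)  0.000000 15 Pink

[thresholds]
(47:87, 148:162, 93:113)
(60:100, 90:120, 160:200)
```

Each threshold line pairs with the color line in the same position, so keep the two sections in the same order.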

To run the blob tracker and check your work, run:

roslaunch cmvision cmvision.launch

You may need to edit the cmvision.launch file to subscribe to your image topic. The default image topic in the launch file was not set to the same topic as my camera (/camera/rgb/image_color).
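As a rough sketch, the relevant part of a cmvision launch file with the image topic remapped to my camera's topic could look like this. Your copy of cmvision.launch may name the node and parameters slightly differently, so treat this as a guide rather than a drop-in replacement:

```
<launch>
  <node name="cmvision" pkg="cmvision" type="cmvision" output="screen">
    <!-- point cmvision at your camera's image topic -->
    <remap from="image" to="/camera/rgb/image_color"/>
    <!-- path to the colors.txt file described above -->
    <param name="color_file" value="$(find cmvision)/colors.txt"/>
  </node>
</launch>
```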

Here is the code. See my comments inside for details on how it works. This is mostly an extension of my line follower code.

import roslib; roslib.load_manifest('color_tracking')
import rospy
from geometry_msgs.msg import Twist
from cmvision.msg import Blobs, Blob

#global
turn = 0.0 #turning rate
blob_position = 0 # x position for the blob

# callback function: check whether any blobs were found, then
# loop through them and average their x positions.  The camera
# will sometimes find several blobs in the same object, so
# averaging the x values smooths that out.  You could also just
# take the first blob if you are sure there will only be one.
#
# This doesn't distinguish multiple colors, but if you are
# tracking several objects you need to check each blob's name
# field for the color tag you put in your colors.txt file.
#
# after we have the x value we just make the robot turn to
# keep it in the center of the image.

def callback(data):
    global turn
    global blob_position

    if(len(data.blobs)):

        blob_position = 0
        for obj in data.blobs:
            blob_position = blob_position + obj.x
        blob_position = blob_position/len(data.blobs)

        rospy.loginfo("blob is at %s"%blob_position)
        # turn right if the blob is to the right of center
        if( blob_position > 350 ):
            turn = -0.5
        # turn left if the blob is to the left of center
        if( blob_position < 200 ):
            turn = 0.5
        # go straight if the blob is roughly centered
        if( blob_position > 200 and blob_position < 350):
            turn = 0.0
    else:
        turn = 0.0

def run():
    global turn
    global blob_position

    rospy.init_node("color_tracker")

    # publish twist messages to the base's velocity topic
    pub = rospy.Publisher('/mobile_base/commands/velocity', Twist)

    # subscribe to the blob messages from cmvision
    rospy.Subscriber('/blobs', Blobs, callback)

    twist = Twist()

    while not rospy.is_shutdown():

        # turn if the blob is off-center
        if ( turn != 0.0 ):
            rospy.loginfo("Turning %s"%turn)
            twist.linear.x = 0.0; twist.linear.y = 0; twist.linear.z = 0
            twist.angular.x = 0; twist.angular.y = 0; twist.angular.z = turn
            turn = 0.0
        # go straight otherwise
        else:
            rospy.loginfo("Straight %s"%turn)
            twist.linear.x = 0.0; twist.linear.y = 0; twist.linear.z = 0
            twist.angular.x = 0; twist.angular.y = 0; twist.angular.z = 0

        # send the message and delay
        pub.publish(twist)
        blob_position = 0
        rospy.sleep(0.1)

if __name__ == '__main__':
    try:
        run()
    except rospy.ROSInterruptException: pass

Once you have the tracker running, you can run the Python code above with:

python color_tracker.py

assuming that you named it color_tracker.py.
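The steering decision in the callback boils down to a small pure function, which makes the pixel thresholds easy to check without a robot. This is just an illustrative sketch using the same 200/350 bounds as the code above; the function name is my own:

```python
def steering(blob_xs):
    """Return an angular velocity given the x positions of the
    detected blobs, using the same thresholds as the callback."""
    if not blob_xs:
        return 0.0
    # average the x positions of all detected blobs
    x = sum(blob_xs) / float(len(blob_xs))
    if x > 350:    # blob right of center: turn right
        return -0.5
    if x < 200:    # blob left of center: turn left
        return 0.5
    return 0.0     # roughly centered: go straight

print(steering([400, 420]))  # blob on the right
print(steering([100]))       # blob on the left
print(steering([]))          # nothing detected
```

If you change the image resolution, adjust the two thresholds so the dead band stays around the center column of the image.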