Here's a short video of my turtlebot responding to voice commands using the Pocketsphinx ROS package.
Here's a short tutorial on how to do colored blob tracking using ROS Hydro and the cmvision package. It builds on the package I built in the line follower demo. Here's a video of the color tracking.
The first step is to install the cmvision package. The command depends on the version of ROS that you are using, but for Hydro it's:
sudo apt-get install ros-hydro-cmvision
Once you have that installed, you need to figure out what color you want to track. In my case I just used a brightly colored balloon; you'll get better results with something that stands out from the rest of the objects in the room. To find the color values to track, run:
rosrun cmvision colorgui image:=/camera/rgb/image_color
You may need to replace the image ...
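Once colorgui gives you the RGB and YUV values for your target, they go into a color definition file that cmvision loads at startup. A sketch of the format is below; the threshold numbers here are placeholders I made up for a red balloon, not values you should copy:

```
[Colors]
(255, 0, 0) 0.000000 10 RedBalloon

[Thresholds]
( 25:164, 80:120, 150:240 )
```

The [Colors] line gives the display color, a merge parameter, the expected blob count, and a name; the [Thresholds] line gives the Y, U, and V ranges that count as a match.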
from scratch. So now the turtlebot is running ROS Fuerte and I have the newest version of ROS, Hydro, in a VirtualBox VM on my workstation. I put together a simple line following program to test everything out. You can follow along to create your own or just grab the completed project from my github repo.
One new thing I ran into is that creating packages in Hydro is a little different than before. You still pass the same arguments, but now you create a new package using catkin_create_pkg. The package we will make uses std_msgs, rospy, and turtlebot_node. To create it, move into your working directory and run the following:
catkin_create_pkg line_follower std_msgs rospy turtlebot_node
Use roscreate-pkg if you are still on an older version of ROS. You should now have a line_follower directory in your workspace. Move into that directory and create a nodes folder. In the nodes folder we will create the actual ...
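The heart of a line follower node is the steering math: find where the line sits in the camera image and turn toward it. Here's a minimal sketch of that logic in Python. The function name and gains are my own inventions, not part of the repo; a real node would wrap this in a rospy subscriber/publisher pair.

```python
def steering_command(centroid_x, image_width, max_turn=1.0, gain=2.0):
    """Map the line centroid's horizontal offset to an angular velocity.

    Returns a value you could put in the angular.z field of a
    geometry_msgs/Twist message: positive turns left, negative right.
    """
    # Normalize the offset to [-1, 1]; 0 means the line is centered.
    offset = (centroid_x - image_width / 2.0) / (image_width / 2.0)
    # Simple proportional controller, clamped to the robot's turn limit.
    turn = -gain * offset
    return max(-max_turn, min(max_turn, turn))
```

In the full node you'd subscribe to the camera image, threshold it for the line's color, compute the centroid of the matching pixels, and publish a Twist built from this value on the turtlebot's velocity topic.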
I came across an unused iRobot Roomba at FUBAR Labs and thought this would be the perfect opportunity to build a robot using the Robot Operating System (ROS). ROS is basically software used to integrate all of a robot's sensors (encoders, depth camera, laser scanner, etc.) with the code that controls it. Each sensor runs as a node; for example, the Kinect sensor node publishes its depth data. The SLAM (simultaneous localization and mapping) node uses the Kinect data to determine the robot's location and publishes it to other nodes. Each node is separate from the others, so it's easy to change nodes or add new ones. ROS also has a ton of built-in libraries that work with different servos, laser scanners, and other sensors.
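The node-and-topic idea above can be sketched in a few lines of plain Python. This is only an analogy, not ROS code (real nodes use rospy or roscpp and run as separate processes talking over the ROS master), but it shows why publishers and subscribers stay decoupled: they only share a named topic, never a direct call.

```python
class Topic:
    """Toy stand-in for a ROS topic: publishers push messages,
    and every subscribed callback receives them."""

    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, msg):
        for callback in self.subscribers:
            callback(msg)

# A "Kinect node" publishes depth readings on this topic...
depth_topic = Topic()

# ...and a fake "SLAM node" consumes them without knowing who sent them.
positions = []
depth_topic.subscribe(lambda depth: positions.append(depth * 0.5))

depth_topic.publish(2.0)  # positions now holds the "estimated" location
```

Swapping the Kinect for a laser scanner would just mean a different node publishing on the topic; the SLAM side wouldn't change.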
The Roomba is essentially the same ...