Robot Qualification |
The Tsukuba Robotics Challenge is a trial for developing stable robots that can move freely through town streets (while doing no harm to people).
To complete the trial, the robot has to travel about 1.3 kilometers and detect 5 particular people.
My robot, Kenseiko-chan Mobile 2 |
My robot was a TurtleBot (a robot similar to a Roomba) equipped with a LIDAR (a wide-range laser scanner),
a Toughbook laptop (to cope with rain), an Xtion (similar to a Kinect), and GPS.
Debugging my robot |
Controlling my robot with the PS3 controller (creating a map) |
The day before I went to Tsukuba was the deadline for submitting the entry application, so I had to concentrate on development right up to then.
At that point I had only just finished setting up the development environment and assembling the robot.
When I arrived at Tsukuba, I started developing the robot's main program.
This was my first time using ROS (Robot Operating System), so I started by reading the ROS tutorials.
However, I couldn't work out how to program ROS topics and frame_ids from them, so I turned to the API documentation, and found that reading the source code was the fastest way to understand.
In the end, I was able to program the frame_ids and topics.
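As a minimal sketch of the idea (this is not my actual code; the node, topic, and frame names are placeholders), a publisher that stamps every message with a frame_id looks something like this:

```python
#!/usr/bin/env python
# Minimal ROS publisher sketch: every stamped message carries a header
# whose frame_id names the coordinate frame the data is expressed in.
import rospy
from geometry_msgs.msg import PoseStamped

rospy.init_node('frame_id_example')
pub = rospy.Publisher('example_pose', PoseStamped, queue_size=1)

rate = rospy.Rate(10)  # publish at 10 Hz
while not rospy.is_shutdown():
    msg = PoseStamped()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = 'base_link'  # frame this pose lives in
    msg.pose.orientation.w = 1.0       # identity orientation
    pub.publish(msg)
    rate.sleep()
```

The frame_id is what tf uses to relate sensor data, odometry, and the map to one another, which is why getting it right matters so much.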
Map tracking in the trial |
My approach builds a map with SLAM (map writing) from the LIDAR and then localizes on it with amcl. If I set the initial position and a goal position, my robot navigates its way there.
This time my own program didn't work well, so I simply used the ROS amcl package. The robot estimates its position by matching LIDAR scans against the map and by reading the motor encoder values (odometry).
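With the standard ROS navigation stack, one common way to supply the initial position and the goal (not necessarily exactly what I ran; the coordinates and covariances below are made-up placeholders) is to publish to amcl's initialpose topic and to move_base's simple goal topic:

```python
#!/usr/bin/env python
# Rough sketch: seed amcl with an initial pose guess, then send a goal.
# Topic names follow the standard ROS navigation stack; all numbers
# here are placeholders, not values from my robot.
import rospy
from geometry_msgs.msg import PoseStamped, PoseWithCovarianceStamped

rospy.init_node('send_initial_pose_and_goal')
init_pub = rospy.Publisher('initialpose', PoseWithCovarianceStamped, queue_size=1)
goal_pub = rospy.Publisher('move_base_simple/goal', PoseStamped, queue_size=1)
rospy.sleep(1.0)  # give the publishers time to connect

init = PoseWithCovarianceStamped()
init.header.frame_id = 'map'
init.header.stamp = rospy.Time.now()
init.pose.pose.orientation.w = 1.0  # start at the map origin, facing +x
init.pose.covariance[0] = 0.25      # x variance (guess)
init.pose.covariance[7] = 0.25      # y variance (guess)
init_pub.publish(init)

goal = PoseStamped()
goal.header.frame_id = 'map'
goal.header.stamp = rospy.Time.now()
goal.pose.position.x = 10.0         # placeholder goal 10 m ahead
goal.pose.orientation.w = 1.0
goal_pub.publish(goal)
```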
To do map matching I first needed a map, so I built one by driving the robot around manually with the PS3 controller.
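The teleoperation side is conceptually simple: read the joystick, publish velocity commands. A sketch of a PS3-style teleop node (the axis indices and speed scales are assumptions, not my actual mapping):

```python
#!/usr/bin/env python
# Teleop sketch: translate PS3 joystick axes into velocity commands
# for the base. Axis indices and scale factors are assumptions.
import rospy
from sensor_msgs.msg import Joy
from geometry_msgs.msg import Twist

rospy.init_node('ps3_teleop')
pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)

def joy_callback(joy):
    cmd = Twist()
    cmd.linear.x = 0.3 * joy.axes[1]   # left stick vertical -> forward speed
    cmd.angular.z = 1.0 * joy.axes[0]  # left stick horizontal -> turn rate
    pub.publish(cmd)

rospy.Subscriber('joy', Joy, joy_callback)
rospy.spin()
```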
My robot's distance in the trial was 20 m, although in practice runs it had managed about 70 m. The shorter run was caused by a change in the environment:
during practice there was nobody near the starting point, but in the trial there was a crowd there, so the surroundings no longer matched the map.
In the map-based navigation run, my robot was receiving data from the LIDAR, but it didn't avoid obstacles.
I found I had to change the amcl configuration file so that the LIDAR subscription pointed at the correct base frame. Now my robot can avoid obstacles.
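For reference, the frame-related part of an amcl launch configuration looks roughly like this; the values shown are the common defaults, not necessarily the exact ones from my robot:

```xml
<node pkg="amcl" type="amcl" name="amcl">
  <!-- frame ids amcl uses; these must match the robot's TF tree -->
  <param name="base_frame_id" value="base_link"/>
  <param name="odom_frame_id" value="odom"/>
  <param name="global_frame_id" value="map"/>
</node>
```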
I didn't use GPS data in map planning this time, because the GPS fixes were too inaccurate to correct the odometry.
My original program did waypoint tracking. To make it usable, I had to convert the waypoint coordinates from the map frame into the robot's local frame.
Now my robot can do waypoint tracking!
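The conversion itself can be done with the tf library; here is a sketch, assuming the robot's local frame is called base_link and using a placeholder waypoint:

```python
#!/usr/bin/env python
# Sketch: convert a waypoint from the map frame into the robot's
# local frame with tf. 'base_link' as the local frame is an assumption.
import rospy
import tf
from geometry_msgs.msg import PoseStamped

rospy.init_node('map_to_local')
listener = tf.TransformListener()

wp = PoseStamped()
wp.header.frame_id = 'map'
wp.header.stamp = rospy.Time(0)  # Time(0) = use the latest transform
wp.pose.position.x = 5.0         # placeholder waypoint in map coordinates
wp.pose.orientation.w = 1.0

listener.waitForTransform('base_link', 'map', rospy.Time(0), rospy.Duration(4.0))
local_wp = listener.transformPose('base_link', wp)
rospy.loginfo('waypoint in local frame: %s', local_wp.pose.position)
```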
Robot from the Tsukuba University Intelligent Robotics Lab |
In this challenge I was able to show that a TurtleBot can be used outdoors: it can drive through fallen leaves, though it stops if it hits a medium-sized rock.
So I can say that a TurtleBot can run outside.