Project Unicorn is a project for the AMOD (Autonomous Mobility on Demand) course at ETH Zürich that focuses on intersection navigation for duckiebots in Duckietown.
This document provides instructions on how to run the intersection navigation demo on a duckiebot, as well as a basic overview of the project (in the section below).
The interested reader can find the source code in the Project Unicorn Intersection Navigation repository.
Requires: Duckiebot in DB18 configuration
Requires: Completed camera calibration
Requires: Completed wheel calibration
Requires: Local computer with Docker installation
Intersection Navigation demo:
The Intersection Navigation demo performs one of the following three intersection maneuvers:
Left
Right
Straight
For each intersection type (3-way or 4-way) a feasible direction will be chosen randomly by the algorithm.
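As a rough illustration of this random choice, consider the following minimal sketch; the maneuver labels and the feasibility map are hypothetical placeholders, not taken from the project code:

```python
import random

# Hypothetical feasibility map: which maneuvers are possible per intersection
# type, as seen from the entering lane (labels are illustrative only).
FEASIBLE_MANEUVERS = {
    "4-way": ["left", "straight", "right"],
    "3-way-stem": ["left", "right"],      # entering the stem of a T
    "3-way-side": ["left", "straight"],   # entering from a side arm
}

def pick_maneuver(intersection_type):
    """Pick a random feasible maneuver for the given intersection type."""
    return random.choice(FEASIBLE_MANEUVERS[intersection_type])

print(pick_maneuver("4-way"))
```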
Intersection Navigation and Lane following demo:
In addition to the official demo ("pure" intersection navigation), a demo combining lane following and intersection navigation can be executed.
In this case, the duckiebot performs one intersection maneuver and automatically switches to lane following afterwards.
To avoid unnecessary computation while performing intersection navigation in our lane following image, the camera image is published on two separate topics: one for lane following and one for intersection navigation (see the sketch below).
Since a red line detection module was not operational during our project, the lane follower will not stop at the next intersection but will continue following the lane.
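The topic splitting can be illustrated with the following minimal rospy sketch; the node and topic names are hypothetical placeholders, not the ones used in the demo image:

```python
#!/usr/bin/env python
# Minimal sketch: republish the camera image on two separate topics so that
# lane following and intersection navigation each subscribe to their own
# stream. Node and topic names below are hypothetical.
import rospy
from sensor_msgs.msg import CompressedImage

rospy.init_node("camera_splitter")
pub_lf = rospy.Publisher("~image_lane_following/compressed",
                         CompressedImage, queue_size=1)
pub_in = rospy.Publisher("~image_intnav/compressed",
                         CompressedImage, queue_size=1)

def callback(msg):
    # Forward the same frame to both consumers.
    pub_lf.publish(msg)
    pub_in.publish(msg)

rospy.Subscriber("camera_node/image/compressed",
                 CompressedImage, callback, queue_size=1)
rospy.spin()
```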
Modified 2018-12-27 by MartaTintore
The expected behavior is shown in the following videos:
The following is assumed:
Standard 3- and 4-way intersections in size and orientation, built according to the Duckietown specifications.
Two AprilTags are placed facing each of the stop lines, according to Figure 19.3 and Figure 19.4.
Clone the duckietown-intnav repository on your PC:
laptop $ git clone --branch demo git@github.com:duckietown/duckietown-intnav.git
Requires: Completed intersection navigation calibration.
Accurate localization in the intersection area is crucial for successful navigation.
The algorithm is based, among other things, on reprojecting points from the camera frame to the world frame. Therefore, accurate localization requires an accurate camera calibration, especially with respect to scale. A minimal sketch of such a reprojection is given below.
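The following sketch shows how an (undistorted) pixel can be reprojected onto the ground plane using the homography obtained from the extrinsic camera calibration, as in the standard Duckietown ground projection. The numeric homography here is a placeholder; in practice it comes from the calibration file:

```python
import numpy as np

def pixel_to_ground(u, v, H):
    """Reproject an undistorted pixel (u, v) onto the ground plane.

    H: 3x3 homography mapping homogeneous pixel coordinates to
       homogeneous ground-plane coordinates (from the extrinsic
       calibration). Returns (x, y) in the ground frame.
    """
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]  # normalize homogeneous coordinates

# Placeholder homography for illustration only.
H = np.array([[1e-3, 0.0, -0.3],
              [0.0, 1e-3, -0.2],
              [0.0, 0.0, 1.0]])
print(pixel_to_ground(320, 240, H))
```

An error in the scale of H directly scales every reprojected point, which is why the calibration test below pays special attention to scale.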
Calibration test instructions:
Place the duckiebot in the intersection as in Figure 19.5: the duckiebot's origin (the center of the wheel axis) on the center of the bottom edge of the yellow tag adjacent to the red stop line, oriented towards the opposite side of the intersection.
Run the container on your duckiebot:
laptop $ cd duckietown-intnav/scripts
laptop $ bash deploy.bash [hostname]
Start the calibration test in the intnav container:
duckiebot $ roslaunch duckietown-intnav calibration_check.launch duckiebot:=[hostname]
When launched, press X to start the calibration procedure.
Test passed: no further action is needed.
Test failed: the camera has to be recalibrated, with special regard to the scale.
To obtain a good camera calibration in terms of scale, make sure the bar checking for "Scale" is filled above 70% (as in Figure 19.6); to achieve this, place the camera very close to the checkerboard during the data collection for the calibration process.
Check: The duckiebot has sufficient battery.
Check: The intersection is free of obstacles (including other duckiebots).
Place the duckiebot in front of any of the red lines of the desired Duckietown intersection.
Run the container on your duckiebot:
laptop $ cd duckietown-intnav/scripts
laptop $ bash deploy.bash [hostname]
Once inside the container, start the intnav demo:
duckiebot $ roslaunch duckietown-intnav main.launch duckiebot:=[hostname]
After the duckiebot stops, the demo is finished.
Run the container on your duckiebot:
laptop $ cd duckietown-intnav
laptop $ git checkout master
laptop $ cd scripts
laptop $ bash deploy_lane_follower.bash [hostname]
Wait until the lane follower is booted; it can take a few minutes. Place your duckiebot at an intersection.
Start intersection navigation. In another terminal:
laptop $ bash deploy.bash [hostname]
duckiebot $ roslaunch duckietown-intnav main.launch duckiebot:=[hostname]
The duckiebot will cross the intersection and switch to lane following automatically.
Note that the duckiebot is not expected to stop at the next red line (read this for more information).
Symptom: roslaunch is not working properly.
Resolution:
Stop the process with Ctrl + C.
Exit the container:
duckiebot $ exit
Restart the container from the terminal (go back to Step 2).
Symptom: The duckiebot is not moving.
Resolution: Make sure the duckiebot can see the AprilTags.
Symptom: The duckiebot is not navigating the intersection properly (not moving smoothly, or cutting corners).
Resolution: Make sure the wheel calibration is done correctly and that the AprilTags are placed according to Figure 19.3 and Figure 19.4.
Symptom: A 'roslaunch xml error' is displayed.
Resolution: Restart the container (try 2-3 times). If the error persists, re-flash your SD card.
Symptom: Docker fails to register a layer: 'no space left on device'.
Resolution: Remove unused images in Portainer.
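Alternatively, unused images can be removed with the standard Docker CLI (a generic Docker command, not specific to this demo); note that this deletes all images not used by a container:

duckiebot $ docker image prune -a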
Check our repository for the latest updates on error troubleshooting.
The following video shows how the Intersection Navigation demo can fail when the assumptions are not respected.
Navigate any standard Duckietown intersection from any starting point, without disrupting other Duckietown operations, with a success rate greater than 90%.
Intersections are an essential part of Duckietowns: without them, duckiebots can only drive autonomously on straight roads or loops. However, no method had yet been found that allows duckiebots to navigate intersections quickly and reliably. Since this is a core functionality of every autonomous vehicle that interacts with street-like infrastructure, the approach implemented in this project tackles this issue.
We must be able to drive duckiebots through any of the intersections. The implementation should ensure that the duckiebot does not cross any lines during intersection navigation, and that the robot can be driven to any exit point, for all initial conditions on distance and orientation at the entrance point, within a certain TBD bound.
Standard Implementation:
The intersection navigation currently used for demos is an extended version of the Duckietown lane follower. While the Duckietown lane follower is robust enough to let the duckiebot follow a lane without failing until its battery is empty, the extended version for intersection navigation does not work reliably.
2017 Navigators team implementation:
The Navigators team implemented a template-matching approach based on the paper Edge-Based Markerless 3D Tracking of Rigid Objects. Due to the computational cost of the approach, fast navigation was not achieved, and due to the slow control update cycles, the success rate of the standard implementation could not be improved.
As indicated in grey in Figure 19.8, the intersection navigation goes through four main steps:
1- Estimation of the initial pose: Starting at a red line of any intersection, the duckiebot estimates its initial pose. The duckiebot's position relative to the AprilTags is computed from the camera image (AprilTag 2: Efficient and Robust Fiducial Detection) and combined with the fixed, known poses of the AprilTags in the global frame to obtain the duckiebot's initial pose in the global frame.
2- Trajectory generation: Different paths for going left, right and straight are pre-computed and chosen depending on the desired intersection exit. Given the intersection type and the intersection command (which can be given as user input or chosen at random), and bearing in mind the duckiebot's dynamic constraints and the Duckietown intersection boundaries, the proper trajectory is generated.
3- Pure pursuit controller: Constantly updating its pose estimate, the duckiebot follows the path in a closed loop using the pure pursuit path-tracking algorithm in the controller (a minimal sketch follows this list). The implementation is based on the paper Automatic Steering Methods for Autonomous Automobile Path Tracking.
4- Interface: Detects when the duckiebot has reached the end of the intersection (the exit lane) and switches back to the Duckietown lane follower.
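For intuition, here is a minimal sketch of one pure pursuit control step for a differential drive robot. This is not the project's actual controller; the function signature and the parameter values (lookahead distance, speed, wheel baseline) are illustrative assumptions:

```python
import numpy as np

def pure_pursuit_step(pose, path, lookahead=0.15, v=0.2, baseline=0.1):
    """One control step of a pure pursuit tracker (sketch).

    pose: (x, y, theta) of the robot in the world frame
    path: (N, 2) array of path points in the world frame
    Returns (v_left, v_right) wheel speeds in m/s.
    """
    x, y, theta = pose
    # Pick the first path point at least `lookahead` metres away.
    dists = np.hypot(path[:, 0] - x, path[:, 1] - y)
    ahead = np.nonzero(dists >= lookahead)[0]
    target = path[ahead[0]] if ahead.size else path[-1]
    # Heading error towards the lookahead point.
    alpha = np.arctan2(target[1] - y, target[0] - x) - theta
    # Pure pursuit curvature kappa = 2 sin(alpha) / L gives the yaw rate.
    omega = v * 2.0 * np.sin(alpha) / lookahead
    # Differential drive: convert (v, omega) to wheel speeds.
    return v - 0.5 * baseline * omega, v + 0.5 * baseline * omega
```

Because the curvature is recomputed at every step from the latest pose estimate, the tracker naturally corrects for pose estimation updates coming from the Kalman filter.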
Besides the requirements stated in the requirements, setup notes and pre-flight checklist sections above, the following assumptions are made:
Hardware:
Differential drive robot (DB18).
Camera with a 160° field of view.
Software:
The duckiebot is placed in a lane (with a width between 10 and 16 cm), in front of the red line and approximately perpendicular to it.
Duckiebot drives successfully on lines: reliable lane following mode.
Initial intersection conditions:
Duckiebot detecting intersection.
Stopping in front of the red line: initial position within the lane and within ± 3 cm of the red line, and initial orientation within ± 10° of the yellow lines.
Entering intersection navigation mode.
Maximal control delay (~ 1 s, testable).
To monitor the performance of the algorithm, an rviz interface is available, as well as a display of the detected AprilTags:
laptop $ bash run_visualization.bash [hostname]
For the estimation of the initial pose, the following approaches, other than AprilTag detection, were tried but not adopted, for the reasons stated below:
Feature Detection + Template matching:
The camera image was re-projected to a birds-eye-view image using the homography. In this reprojected image, features were detected using the FAST-detector. To relate them to known points in the intersection, a color-based descriptor was tested that was then related to known descriptors of the given points. However, this approach turned out to be computationally heavy and did not allow for robust matching between detected and known points.
Corner Detection:
The camera image was converted to the HSV color space. Then, the Canny edge detector was used to detect lines. Different thresholds were used to distinguish yellow and red lines from white lines, since the latter seemed to be the easiest to detect. Since the world coordinates of the intersecting white lines are known, this would provide a way to localize in the map. However, this approach requires extensive threshold tuning. While it was possible to tune the thresholds for a single environment and camera scenario, no threshold setting was found that was robust across different lighting conditions (due to changing white balance, reflections on the lines, etc.). A minimal sketch of this pipeline is shown below.
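The following OpenCV sketch illustrates the kind of pipeline described above; the file name and all threshold values are placeholders, not the ones used in the project, and it is exactly these thresholds that proved hard to tune robustly:

```python
import cv2

# Minimal sketch of the rejected corner detection pipeline.
img = cv2.imread("intersection.png")  # hypothetical input image
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# Keep only low-saturation, bright pixels (candidate white lines).
# These bounds are illustrative placeholders.
white_mask = cv2.inRange(hsv, (0, 0, 150), (180, 60, 255))

# Detect edges on the masked image; the Canny thresholds also need tuning.
edges = cv2.Canny(white_mask, 50, 150)

cv2.imwrite("white_line_edges.png", edges)
```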
The code is split into two sections: the lib-intnav library modules (first table) and the ROS nodes (second table).
| Module | Description | Input | Output |
| --- | --- | --- | --- |
| camera_config | Undistortion algorithm | Camera image, camera calibration | Undistorted image |
| kalman | Extended Kalman filter based on the differential drive model | Path to follow, current pose, measurements of the previous pose | Estimated next pose |
| planner | Generates the corresponding path to follow | Intersection command | Path to follow |
| controller | Controls the duckiebot locally to follow the optimal trajectory | Path to follow, current pose | Wheel speed commands |
| imap | Visualizes the intersection | Intersection type | Intersection map |
| Node | Libraries (lib-intnav / external) | Input | Output |
| --- | --- | --- | --- |
| image_processing | image_calibration | Raw image | Undistorted image |
| april_activator | AprilTag detection | ping | |
| localization | kalman | Previous wheel speeds, detected tags | Pose estimate, trajectory |
| controller | controller, planner | Pose estimate | Optimal path, wheel speed commands |
| interface | | Duckietown configuration, pose | Direction to go, intersection type, switch |
| tf_april_static | tf | Duckietown configuration | AprilTag poses in the world frame |
| tf_cam_vehicle | tf | Duckiebot configuration | Camera-to-wheel-axis transformation |
| visualization_imap | imap | Direction to go, intersection type | Visualization (rviz) |
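To illustrate the motion model behind the kalman module, here is a minimal sketch of an EKF prediction step for a differential drive robot. The function signature and the baseline value are assumptions for illustration, not the project's actual API:

```python
import numpy as np

def ddrive_predict(state, v_left, v_right, dt, baseline=0.1):
    """EKF prediction step for a differential drive model (sketch).

    state: (x, y, theta) in the world frame.
    Returns the predicted state and the Jacobian F of the motion model,
    used to propagate the state covariance (P' = F P F^T + Q).
    """
    x, y, theta = state
    v = 0.5 * (v_left + v_right)           # forward speed
    omega = (v_right - v_left) / baseline  # yaw rate
    x_new = x + v * np.cos(theta) * dt
    y_new = y + v * np.sin(theta) * dt
    theta_new = theta + omega * dt
    F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                  [0.0, 1.0,  v * np.cos(theta) * dt],
                  [0.0, 0.0,  1.0]])
    return np.array([x_new, y_new, theta_new]), F
```

The correction step then fuses this prediction with the AprilTag pose measurements coming from the localization node.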
A successful intersection navigation entails:
Following the path in a reasonable way: not crossing any lines, smooth trajectory.
Achieving the desired intersection end point within a range of ± 15 cm.
Navigating the intersection within the maximum time limit: 10 s for straight and left curve, 8 s for right curve.
Navigation duration (until the duckiebot is able to continue with the lane follower):
Left = 8 s (3 s initialize + 5 s operation)
Right = 6 s (3 s initialize + 3 s operation)
Straight = 6 s (3 s initialize + 3 s operation)