
The Controllers: final report

Modified 2018-02-27 by tanij

The final result


The video is at: The Controllers Demo Video.

See the operation manual to reproduce these results.

Mission and Scope


(With control comes power)

Our mission was to make lane following more robust to violations of model assumptions and of the Duckietown geometric specification, and to provide control towards a different reference.



In Duckietown, Duckiebots cruise the streets while Duckies sit on the sidewalk waiting for a Duckiebot to pick them up. To ensure a baseline of safety for the Duckiebots and the Duckies, we have to make sure the Duckiebots can follow the lane (or a path at intersections and in parking lots) and stop in front of red lines. For instance, a Duckiebot driving in the right lane should never cross the centerline, to avoid collisions with an oncoming Duckiebot.

The overall goal of our project is to stay in the lane while driving and to stop in front of red lines. Due to the tight schedule, we focused on improving the existing code and benchmarking the tasks.

In order to drive the Duckiebot to a given point, the robot has to know where it is in the lane, calculate the error, and compute a control action to reach the target. To retrieve location and orientation information, a pose estimator is implemented. The estimator receives line segments from the image pipeline, with information about the line tape colour (white, yellow, red) (Figure 3.4) and whether the segment lies on the left or right edge of the tape. Using this information, we determine whether the Duckiebot is inside or outside the lane, how far it is from the middle of the lane, and at what angle it is oriented. The position relative to the middle of the lane and the orientation of the Duckiebot are passed on to the controller. In order to minimize the error, the controller calculates the desired velocity and heading of the Duckiebot from these inputs and the controller parameters.

The importance of our project within the Duckietown framework is obvious, as it contains the fundamental functionality of autonomous driving. Furthermore, many other projects relied on our project's functionality, such as obstacle avoidance, intersection navigation, or parking of a Duckiebot. We had to ensure that our part is robust and reliable.
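The segment-to-pose step described above can be sketched as follows. This is a minimal illustration of the idea, not the project's actual estimator: the lane width, the sign conventions, and the simple vote-averaging are all assumptions made here for clarity (the real lane filter also uses which tape edge a segment belongs to).

```python
import math

# Assumed constant, not taken from the report.
LANE_WIDTH = 0.23  # distance between yellow and white tape centers [m]

def vote_from_segment(color, p1, p2):
    """One (d, phi) vote from a single segment with endpoints p1, p2
    in the robot frame (x forward, y left); color is 'white' or 'yellow'."""
    phi = -math.atan2(p2[1] - p1[1], p2[0] - p1[0])  # heading deviation
    y_mid = 0.5 * (p1[1] + p2[1])                    # lateral offset of midpoint
    if color == 'white':                             # white tape on the right
        d = y_mid + LANE_WIDTH / 2.0
    else:                                            # yellow tape on the left
        d = y_mid - LANE_WIDTH / 2.0
    return d, phi

def estimate_pose(segments):
    """Average all segment votes into a single (d, phi) estimate."""
    votes = [vote_from_segment(c, p1, p2) for c, p1, p2 in segments]
    d = sum(v[0] for v in votes) / len(votes)
    phi = sum(v[1] for v in votes) / len(votes)
    return d, phi
```

With the robot centered and parallel to the lane, both a yellow and a white segment vote for (d, phi) close to (0, 0).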

Figure: Image with line segments, $d_{err}$ and $\phi_{err}$ displayed.

Existing solution


Figure: Pose of the Duckiebot in a curve element.

From last year's project, baseline implementations of a pose estimator and a controller were provided to us for further improvement. The prior pose estimator was designed to deliver the pose for a Duckiebot on straight lanes only. If the Duckiebot was in or before a curve and in the middle of the lane, the estimated pose showed an offset d (see the definition of d in the figure). The existing controller worked reasonably on straight lines. However, due to the inputs from the pose estimator, the Duckiebot overshot in curves and crossed the left or right line during or after the curve.

The video is at: Old vs. new controller.



In the previous implementation, lane following was not guaranteed on curved lane segments: the Duckiebot often left the lane while driving through or after a curve. Even though it sometimes returned correctly to the right lane and continued following it, robust lane following was not provided. On straight lanes, the Duckiebot frequently drove with a large static offset from the center of the lane. The previously implemented pose estimator and controller thus left room for improvement.

Furthermore, the previous lane controller had been benchmarked neither for robustness nor for performance, so we defined various tests to benchmark both the previous controller and our updated solution. Throughout the project, we continuously tested our code with the entire lane following pipeline, as a best practice, and compared our implemented solution to the existing one to record the improvement.

Our scope was, first of all, to enable controlled autonomous driving of the Duckiebot on straight and curved lane segments that comply with the geometry defined in the Duckietown Appearance Specifications. Further, we wanted to enhance robustness to arbitrary lane widths and curvatures, to ensure autonomous driving of the Duckiebot in an individual Duckietown setup.

We also tackled the detection of, and stopping at, red (stop) lines. With the previous implementation, the Duckiebot stopped at rather random points in front of the red line. We wanted to improve the implementation to ensure a stop in the middle of the lane, within a predefined range, and at a straight angle to the red line.

As the Duckietown framework is a complex system involving various functionalities such as obstacle avoidance and intersection navigation, our lane following pipeline provides the basic function for these functionalities, and it has to interact with the modules of other teams. Hence, it was also our duty to design an interface which can receive and apply information from other modules. For example, our controller can take a reference d from obstacle avoidance, intersection crossing, and parking. For intersection navigation and parking, our controller additionally needs the pose estimate and a curvature from the navigators and the parking team, respectively.
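The module interface sketched above can be summarized as a small class. All names here are hypothetical, chosen only to illustrate the shape of the interface; the actual implementation exchanges this information over ROS topics.

```python
class LaneControllerInterface:
    """Illustrative sketch of the interface other modules use."""

    def __init__(self):
        self.d_ref = 0.0       # lateral reference, e.g. from obstacle avoidance
        self.curvature = 0.0   # feedforward curvature from navigation/parking

    def set_reference(self, d_ref):
        """Called by other modules (e.g. obstacle avoidance) to shift the
        lateral target away from the lane center."""
        self.d_ref = d_ref

    def set_curvature(self, curvature):
        """Called by intersection navigation or parking to supply the
        curvature of the path to follow."""
        self.curvature = curvature
```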

Out of scope was:

  • Pose estimation and curvature on intersections (plus navigation / coordination)
  • Model of the Duckiebot and uncertainty quantification of parameters (system identification)
  • Obstacle avoidance involving going to the left lane
  • Extraction and classification of edges from images (anti-instagram)
  • Any hardware design
  • Controller for custom maneuvers (e.g. parking, special intersection control)
  • Robustness to non-existing lines

Preliminaries (optional)


Definition of the problem


Our final objective is to keep the Duckiebots at a given distance d from the center of the lane, on straight and curved roads, under bounded variations of the city geometric specifications.

At its core, the project took the line segments, which give information about line colour and segment positions, estimated the Duckiebot's pose from them, and returned a command for the motors to steer the robot to the center of the lane. After roughly analysing the existing solution, we divided the workload into two topics, pose estimation and controller, so we could work on the problems in parallel within the short time available.
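The controller side of this split can be sketched as a simple proportional law on the two pose errors. The gains, nominal speed, and sign conventions below are assumptions for illustration, not the tuned values from the project.

```python
# Illustrative gains, not the project's tuned parameters.
K_D = -3.5        # gain on lateral error d
K_PHI = -1.0      # gain on heading error phi
V_NOMINAL = 0.2   # constant forward speed [m/s]

def lane_controller(d_est, phi_est, d_ref=0.0):
    """Return (v, omega) from the estimated pose (d_est, phi_est).
    d_ref is an optional lateral reference, e.g. set by obstacle avoidance."""
    omega = K_D * (d_est - d_ref) + K_PHI * phi_est
    return V_NOMINAL, omega
```

When the Duckiebot sits exactly on the reference, the angular command is zero; a positive lateral error produces a corrective turn back towards the reference.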

In our Preliminary Design Document and Intermediate Report, we listed all variables and their definitions, as well as all system interfaces with other groups and the assumptions we made. Due to time limitations and the different priorities of other teams, some integrations with other teams are not yet activated, but they are already prepared in our code (partly commented out).

Contribution / Added functionality


Formal performance evaluation / Results


We evaluated the improvement in performance with the help of several tests; the evaluation procedures are defined in our Intermediate Report. The main benchmark metric was the average deviation from the tracking reference during a run (distance to the middle of the lane), together with its standard deviation. We also benchmarked the deviation of the heading angle, but since the bot is mainly controlled according to the deviation of the tracking distance, that remained the main metric guiding our development. Benchmarking in general was done by letting the Duckiebot run a specific experiment and recording a rosbag. We wrote a distinct offline benchmarking application that analyzes the rosbag containing the recorded values and creates plots of the tracking distance and heading angle over the run.
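The core of such an offline evaluation reduces to simple statistics over the recorded signals. The sketch below stands in for the rosbag-based application: plain lists of samples replace the recorded tracking distance d and heading angle phi, and the function names are illustrative.

```python
import statistics

def benchmark_run(d_samples, phi_samples):
    """Compute the benchmark metrics described above from one recorded run."""
    return {
        'd_mean': statistics.mean(d_samples),      # average deviation from reference
        'd_std': statistics.pstdev(d_samples),     # spread of that deviation
        'phi_mean': statistics.mean(phi_samples),  # average heading deviation
    }
```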

Furthermore, we assessed the performance of the Duckiebots in the following dimensions:

  • Estimator:
    • Static lane pose estimation benchmark
    • Static curve pose estimation benchmark
    • Image resolution benchmark
    • Segment interpolation benchmark
    • Curvature estimation benchmark
  • Controller:
    • Stop at red line benchmark
    • Controller benchmark
    • Non-conforming curve benchmark

Future avenues of development


As there is always more to do, and the performance of both the controller and the estimator can still be enhanced, we list in this section some suggestions for next steps.
