
Data augmentation

Modified 2020-01-17 by Liam Paull

This is the description of the data augmentation demo.

Requires: The joystick demo has been successfully launched.

Requires: You understand the workflow for running the exercises as notebooks here.

Results: Access to an augmented set of data points (i.e. line segments).

This is an advanced demo. Make sure you have the experience and the hardware to fully capture its potential. The demo is also experimental, meaning that the controller might not be able to navigate safely around Duckietown.

Picture of expected results

Modified 2019-12-27 by Anthony Courchesne


A video of the filtered segments before using this demo can be found here.

A video of these data points augmented using this demo can be found here.

Duckietown setup notes


For the Duckiebot, basic setup is assumed. Not enough lighting on corner tiles might lead to a turn not being detected and the Duckiebot going straight. The Duckietown should comply with the appearance specifications presented in the Duckietown specs.

Duckiebot setup notes


One Duckiebot in setup DB-18.

Pre-flight checklist


Check: Duckiebot is properly calibrated.

Check: The lane following demo works properly, and it is possible to visualize the line segments in RViz.

Demo instructions


This demo is especially useful for visualizing and augmenting the filtered line segments. We recommend using it with your own controller to enhance the currently available node_filter, which filters the line segments to keep only the inliers. The demo comes with a pure-pursuit controller to test the filtered segments.
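The actual inlier filtering lives in the node_filter; as a rough illustration of the idea (hypothetical code and thresholds, not the demo's implementation), a segment can be kept or discarded based on its distance to the current lane estimate:

```python
# Hypothetical sketch of inlier filtering: keep only segments whose
# midpoint lies close to the estimated lane center. The real node_filter
# uses the lane filter's belief; names and thresholds here are invented.

def filter_inliers(segments, lane_center_y=0.0, max_dist=0.15):
    """Keep segments whose midpoint lateral offset (meters) is within
    max_dist of the estimated lane center."""
    inliers = []
    for (x1, y1), (x2, y2) in segments:
        mid_y = (y1 + y2) / 2.0
        if abs(mid_y - lane_center_y) <= max_dist:
            inliers.append(((x1, y1), (x2, y2)))
    return inliers

segments = [((0.2, 0.05), (0.3, 0.06)),   # near the lane center -> inlier
            ((0.2, 0.50), (0.3, 0.55))]   # far off -> outlier
print(filter_inliers(segments))
```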

Step 1: Launch your containers and start a terminal in the container

Step 2: Launch the car interface

container $ ./launch_car_interface.sh

Step 3: Launch the demo

container $ roslaunch pure_pursuit pure_pursuit.launch

Step 4: Visualize the augmented segments: Using the noVNC window, open RViz and add markers from the topics under duckiebot_visualizer/. In particular, the following markers are interesting to display:

  • segment_list_markers: the segments as initially detected
  • filtered_segment_list_markers: the inliers after being analyzed by the node_filter
  • filtered_white_points_markers: the white points after being filtered and augmented by the point tracker
  • filtered_yellow_points_markers: the yellow points after being filtered and augmented by the point tracker
  • follow_point_markers: the point being tracked by the pure_pursuit controller; the robot always drives towards this point.
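The follow point is what the pure-pursuit controller steers towards. A minimal sketch of the underlying geometry (illustrative code, not the demo's actual node): the commanded angular velocity is the curvature of the circular arc passing through the follow point, scaled by the forward speed.

```python
# Pure-pursuit geometry sketch (hypothetical helper, not the demo's code):
# given a follow point (x, y) in the robot frame (x forward, y left) and a
# forward speed v, the angular velocity is omega = 2 * v * y / d^2,
# where d is the distance to the follow point.

def pure_pursuit_command(x, y, v=0.2):
    d2 = x * x + y * y
    if d2 == 0.0:
        return v, 0.0  # degenerate case: follow point on the robot
    omega = 2.0 * v * y / d2
    return v, omega

# Point straight ahead -> drive straight.
print(pure_pursuit_command(0.5, 0.0))   # (0.2, 0.0)
# Point to the left -> turn left (positive omega).
print(pure_pursuit_command(0.5, 0.1))
```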

Troubleshooting


The demo is not found

Build the workspace:

container $ catkin build --workspace catkin_ws/

container $ source catkin_ws/devel/setup.bash

Project Description


Short-term SLAM and rear-panel detection


This experimental demo contains code that aims to solve the problem that the lane filter node does not provide enough data to build a robust controller. We often do not have enough measurements of the white or yellow line, which makes it difficult to compute a good command. To solve this problem, we implement a buffer that keeps track of previously seen objects (in this demo, the yellow and white lines) and updates it at each timestep using the linear speed and angular velocity commands sent to the robot.
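The buffer update described above can be sketched as follows (a simplified, hypothetical version, not the demo's implementation): points stored in the robot frame are transformed by the inverse of the commanded motion over the timestep, so they stay consistent with the robot's new pose.

```python
import math

# Sketch of the point-buffer motion update (hypothetical code): each
# buffered point (x, y), expressed in the robot frame (x forward, y left),
# is moved by the inverse of the commanded motion (v, omega) over dt,
# using a small-angle approximation for the forward displacement.

def update_buffer(points, v, omega, dt):
    """points: list of (x, y) tuples in the robot frame."""
    dtheta = omega * dt
    dx = v * dt  # forward displacement (small-angle approximation)
    cos_t, sin_t = math.cos(-dtheta), math.sin(-dtheta)
    updated = []
    for x, y in points:
        # Translate into the new robot frame, then rotate by -dtheta.
        xs, ys = x - dx, y
        updated.append((cos_t * xs - sin_t * ys,
                        sin_t * xs + cos_t * ys))
    return updated

# Driving straight for 1 s at 0.1 m/s brings a point 0.1 m closer.
print(update_buffer([(0.5, 0.0)], v=0.1, omega=0.0, dt=1.0))
```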

The demo can be launched with the launch file “pure_pursuit.launch”. It is possible to switch between the normal version and the augmented line detection version by uncommenting the corresponding line:

Visualization


To debug the behavior of the algorithm, it is useful to visualize the new points generated by the short-term SLAM. To do so, we reused the package “duckiebot_visualizer” and added code to publish markers for the augmented segment lists. We also added code to display the follow point of the pure_pursuit algorithm, which helped us fine-tune the controller. A package containing the modified duckiebot_visualizer (named duckiebot_visualizer_local) is included in this repo, and the launch file points to it (it also disables the visualizer from the master launch).

Vehicle detection


To achieve the LFV challenge, vehicles need to be able to detect other vehicles on the track. This is handled by the vehicle_detection_node, which subscribes to the compressed image topic and publishes information about the detection. This node specifically looks for a circle grid, which is displayed on the back panel of a Duckiebot. At the time this project was done, the simulator did not include a rear panel on the Duckiebot model, so we added one to the .obj file (which was pushed to the official gym-dt repo). Moreover, the simulator did not have a calibration for the fisheye, so we were not able to get good results on the circle grid detection. To mitigate the fisheye effect, we reduced the size of the grid that we were trying to detect. Hence, the software is set to detect a line of 3x1 circles (the only size we were able to detect under the fisheye effect). This should no longer be a problem, since the simulator now provides proper undistortion matrices, and it is not a problem on the real robot either. It is therefore recommended to tune the grid detection parameters to the size of the grid (or the maximum size that can be detected correctly).

If a Duckiebot is detected, we set the finite state machine (FSM node) to a stopping state. For our purpose, we reused an existing state that is mapped to a stop command, ARRIVE_AT_STOP_LINE, and we set the state directly using the setFSM service offered by the FSM node. We included a local copy of the FSM node since the proper way of changing state would be to add new transitions, which is not yet implemented (but stopping works properly using the service).
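The mechanism above can be illustrated with a toy model (this is not the ROS FSM node or its service; names other than the state itself are invented): the state is set directly, standing in for the setFSM service call, rather than through a proper transition.

```python
# Toy stand-in for the FSM stopping mechanism described above (the real
# node is a ROS service; this only mirrors the direct state-setting idea).

class ToyFSM:
    STATES = {"LANE_FOLLOWING", "ARRIVE_AT_STOP_LINE"}

    def __init__(self):
        self.state = "LANE_FOLLOWING"

    def set_state(self, state):
        # Stands in for the setFSM service: set the state directly,
        # bypassing transitions (as the demo does).
        if state not in self.STATES:
            raise ValueError("unknown state: %s" % state)
        self.state = state

fsm = ToyFSM()
fsm.set_state("ARRIVE_AT_STOP_LINE")  # vehicle detected -> stop
print(fsm.state)
```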

Results


Simulation:

The short-term SLAM helps the pure pursuit better handle turns when the robot is facing the open field (before turning) and does not see any points. However, the buffer sometimes generates outlier points which can confuse the pure_pursuit. Since the refresh rate is high enough, even if we have a false follow point, the end result is not bad. The point buffer is definitely an improvement over the classical lane_filter, though with more time we could tune it to get rid of outliers and improve the overall robustness. One idea would be to add weights on points, based on how old they are or on our confidence level.
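One way the suggested weighting could look (purely illustrative, not implemented in the demo; function names and the decay constant are assumptions): give each buffered point a weight that decays exponentially with its age, so old, possibly stale points count less in the follow-point computation.

```python
import math

# Illustrative age-based weighting for buffered points (not the demo's
# code): weight = exp(-age / tau), so fresh points dominate stale ones.

def age_weight(age, tau=0.5):
    """Weight in (0, 1]; tau (seconds, assumed) controls the decay rate."""
    return math.exp(-age / tau)

def weighted_follow_point(points_with_age, tau=0.5):
    """points_with_age: list of ((x, y), age_in_seconds) tuples."""
    wsum = xsum = ysum = 0.0
    for (x, y), age in points_with_age:
        w = age_weight(age, tau)
        wsum += w
        xsum += w * x
        ysum += w * y
    return (xsum / wsum, ysum / wsum)

# A fresh point dominates an older one in the weighted average.
print(weighted_follow_point([((0.4, 0.0), 0.0), ((0.8, 0.2), 1.0)]))
```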

Real robot:

The vehicle detection works properly, but the overall controller seems to have problems on turns. It acts as if it were not detecting any lines on some specific right turns.