
Lane control with supervised learning

Modified 2018-06-24 by Andrea Censi

This is the description of the lane following demo.

Wheel calibration completed (see: wheel calibration).

Camera calibration completed (see: camera calibration).

Joystick demo successfully launched (see: joystick demo).

One Movidius Neural Compute Stick. The ETH team only has two sticks; if you want to try the demo, contact Yang Shaohui or Wang Tianlu.

Movidius Neural Compute Stick API installed on the Duckiebot (see: install ncsdk).

You do not need the whole NCSDK to run this demo: a full install requires Caffe and TensorFlow on the Duckiebot, which causes trouble and takes a lot of time. Installing the API alone is enough.
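As a quick sanity check that the API-only install worked, the following sketch counts the attached sticks. It assumes the NCSDK v1 Python bindings (module name `mvnc` with an `EnumerateDevices()` call); adjust the names if your SDK version differs.

```python
# Sketch of a sanity check for the API-only install. Assumes the
# NCSDK v1 Python bindings (module `mvnc`); not the demo's own code.
def ncs_device_count():
    """Return the number of attached Neural Compute Sticks,
    or None if the mvnc API is not installed."""
    try:
        from mvnc import mvncapi as mvnc
    except ImportError:
        return None  # API not installed on this machine
    return len(mvnc.EnumerateDevices())

if __name__ == "__main__":
    n = ncs_device_count()
    if n is None:
        print("mvnc API not installed")
    else:
        print("found %d Neural Compute Stick(s)" % n)
```

If this prints `mvnc API not installed`, repeat the API install before launching the demo.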

Video of expected results


Videos of the expected results have been recorded (see: recorded video).

Duckietown setup notes


A Duckietown with white and yellow lane markings and no obstacles on the lane. In fact, any kind of map can be handled by the pretrained CNN; you can even build a map that is totally different from all the map formats we currently have.

Duckiebot setup notes


Make sure the camera faces straight ahead. Tighten the screws if necessary.

Make sure the bot sees the white lane marking on its right and the yellow lane marking on its left.

Make sure the Neural Compute Stick keeps some distance from the USB port of the Raspberry Pi, because both devices heat up. An example of the Duckiebot's appearance is shown in Figure 15.2.

The Duckiebot appearance

Pre-flight checklist


Check: the joystick is turned on.

Check: the Duckiebot battery is sufficiently charged.

Demo instructions


Step 1: On the Duckiebot, run git pull first, then check out the branch “devel-super-learning-jan15”. Rebuild the src folder with catkin_make.

Step 2: On the Duckiebot, in the /DUCKIETOWN_ROOT/ directory, run:

duckiebot $ make demo-imitation-learning

Wait a while until everything has launched. Press R1 to start autonomous lane following computed by the Movidius Neural Compute Stick; press L1 to switch back to joystick control. The predicted heading directions are printed to the screen.
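To give some intuition for what those heading predictions drive, here is a hypothetical sketch (not the demo's actual code) of how a predicted heading angle could be turned into differential-drive wheel speeds; the function name, gain, and baseline values are assumptions for illustration.

```python
# Hypothetical illustration only: map a CNN-predicted heading angle
# (radians, positive = steer left) to left/right wheel speeds for a
# differential-drive robot. The gain and baseline values are made up.
def heading_to_wheel_speeds(heading, v=0.3, gain=2.0, baseline=0.1):
    omega = gain * heading                # proportional steering rate
    v_left = v - 0.5 * omega * baseline   # left wheel slows when turning left
    v_right = v + 0.5 * omega * baseline  # right wheel speeds up
    return v_left, v_right

# Straight ahead: both wheels run at the nominal speed.
print(heading_to_wheel_speeds(0.0))  # (0.3, 0.3)
```

A positive predicted heading makes the right wheel spin faster than the left, turning the bot toward the lane center.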

Step 3: On the laptop, source the environment, ROS master, and vehicle name files so that you can remotely watch the performance of the car. Run

laptop $ rqt_graph

You will find that there are far fewer ROS nodes and topics compared with the traditional lane following demo. An example ROS node graph is shown in Figure 15.4.

The ROS graph

Troubleshooting


Contact Yang Shaohui (ETHZ) via Slack or email (shyang@ethz.ch) if any trouble occurs.

