diff --git a/.nojekyll b/.nojekyll new file mode 100644 index 00000000..e69de29b diff --git a/404.html b/404.html new file mode 100644 index 00000000..523385ca --- /dev/null +++ b/404.html @@ -0,0 +1,189 @@ + + + + + + + + Donkey Car + + + + + + + + + + + + +
+ + +
+ +
+
+
    +
  • +
  • +
  • +
+
+
+
+
+ + +

404

+ +

Page not found

+ + +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + + +
+ + + + + + + + diff --git a/CNAME b/CNAME new file mode 100644 index 00000000..4c311cc8 --- /dev/null +++ b/CNAME @@ -0,0 +1 @@ +docs.donkeycar.com diff --git a/assets/ArduinoWiring.png b/assets/ArduinoWiring.png new file mode 100644 index 00000000..1b9ab117 Binary files /dev/null and b/assets/ArduinoWiring.png differ diff --git a/assets/Arduino_actuator_blk_dgm.jpg b/assets/Arduino_actuator_blk_dgm.jpg new file mode 100644 index 00000000..c8f5152e Binary files /dev/null and b/assets/Arduino_actuator_blk_dgm.jpg differ diff --git a/assets/Arduino_firmata_sketch.jpg b/assets/Arduino_firmata_sketch.jpg new file mode 100644 index 00000000..a098a2da Binary files /dev/null and b/assets/Arduino_firmata_sketch.jpg differ diff --git a/assets/ESC_RX.png b/assets/ESC_RX.png new file mode 100644 index 00000000..bcf0a18d Binary files /dev/null and b/assets/ESC_RX.png differ diff --git a/assets/HW_Video.png b/assets/HW_Video.png new file mode 100644 index 00000000..1ab13f8d Binary files /dev/null and b/assets/HW_Video.png differ diff --git a/assets/Jetbot_Assembled.png b/assets/Jetbot_Assembled.png new file mode 100644 index 00000000..7bd0fc3a Binary files /dev/null and b/assets/Jetbot_Assembled.png differ diff --git a/assets/Jetson_Adapter.jpg b/assets/Jetson_Adapter.jpg new file mode 100644 index 00000000..dc302b82 Binary files /dev/null and b/assets/Jetson_Adapter.jpg differ diff --git a/assets/RC_receiver.jpg b/assets/RC_receiver.jpg new file mode 100644 index 00000000..9e01d8ea Binary files /dev/null and b/assets/RC_receiver.jpg differ diff --git a/assets/Servo_Wiring.png b/assets/Servo_Wiring.png new file mode 100644 index 00000000..953772b7 Binary files /dev/null and b/assets/Servo_Wiring.png differ diff --git a/assets/Sombrero_assembled.jpg b/assets/Sombrero_assembled.jpg new file mode 100644 index 00000000..687be7fc Binary files /dev/null and b/assets/Sombrero_assembled.jpg differ diff --git a/assets/build_hardware/2a.png b/assets/build_hardware/2a.png new file mode 
100644 index 00000000..364e1c6d Binary files /dev/null and b/assets/build_hardware/2a.png differ diff --git a/assets/build_hardware/2b.png b/assets/build_hardware/2b.png new file mode 100644 index 00000000..d7847b99 Binary files /dev/null and b/assets/build_hardware/2b.png differ diff --git a/assets/build_hardware/3a.png b/assets/build_hardware/3a.png new file mode 100644 index 00000000..1dabc4b0 Binary files /dev/null and b/assets/build_hardware/3a.png differ diff --git a/assets/build_hardware/3b.png b/assets/build_hardware/3b.png new file mode 100644 index 00000000..86560a06 Binary files /dev/null and b/assets/build_hardware/3b.png differ diff --git a/assets/build_hardware/4a.png b/assets/build_hardware/4a.png new file mode 100644 index 00000000..3efc6900 Binary files /dev/null and b/assets/build_hardware/4a.png differ diff --git a/assets/build_hardware/4b.png b/assets/build_hardware/4b.png new file mode 100644 index 00000000..9f6ae3bf Binary files /dev/null and b/assets/build_hardware/4b.png differ diff --git a/assets/build_hardware/5ab.png b/assets/build_hardware/5ab.png new file mode 100644 index 00000000..e9483778 Binary files /dev/null and b/assets/build_hardware/5ab.png differ diff --git a/assets/build_hardware/6a.png b/assets/build_hardware/6a.png new file mode 100644 index 00000000..be16cdba Binary files /dev/null and b/assets/build_hardware/6a.png differ diff --git a/assets/build_hardware/6b.png b/assets/build_hardware/6b.png new file mode 100644 index 00000000..6aadf4e3 Binary files /dev/null and b/assets/build_hardware/6b.png differ diff --git a/assets/build_hardware/7a.png b/assets/build_hardware/7a.png new file mode 100644 index 00000000..84d42289 Binary files /dev/null and b/assets/build_hardware/7a.png differ diff --git a/assets/build_hardware/7b.png b/assets/build_hardware/7b.png new file mode 100644 index 00000000..af482393 Binary files /dev/null and b/assets/build_hardware/7b.png differ diff --git a/assets/build_hardware/Crunch.png 
b/assets/build_hardware/Crunch.png new file mode 100644 index 00000000..f5be07b9 Binary files /dev/null and b/assets/build_hardware/Crunch.png differ diff --git a/assets/build_hardware/Desert_Monster.png b/assets/build_hardware/Desert_Monster.png new file mode 100644 index 00000000..2a9b24a4 Binary files /dev/null and b/assets/build_hardware/Desert_Monster.png differ diff --git a/assets/build_hardware/Desert_Monster_adapter.png b/assets/build_hardware/Desert_Monster_adapter.png new file mode 100644 index 00000000..b1563744 Binary files /dev/null and b/assets/build_hardware/Desert_Monster_adapter.png differ diff --git a/assets/build_hardware/DonkeycarWiring_bb.png b/assets/build_hardware/DonkeycarWiring_bb.png new file mode 100644 index 00000000..e9ed24d5 Binary files /dev/null and b/assets/build_hardware/DonkeycarWiring_bb.png differ diff --git a/assets/build_hardware/Remove--bad.jpg b/assets/build_hardware/Remove--bad.jpg new file mode 100644 index 00000000..c47de83c Binary files /dev/null and b/assets/build_hardware/Remove--bad.jpg differ diff --git a/assets/build_hardware/Remove--good.jpg b/assets/build_hardware/Remove--good.jpg new file mode 100644 index 00000000..4799519e Binary files /dev/null and b/assets/build_hardware/Remove--good.jpg differ diff --git a/assets/build_hardware/TT01.png b/assets/build_hardware/TT01.png new file mode 100644 index 00000000..169b2675 Binary files /dev/null and b/assets/build_hardware/TT01.png differ diff --git a/assets/build_hardware/assemble_camera.jpg b/assets/build_hardware/assemble_camera.jpg new file mode 100644 index 00000000..4f41cc80 Binary files /dev/null and b/assets/build_hardware/assemble_camera.jpg differ diff --git a/assets/build_hardware/donkey.png b/assets/build_hardware/donkey.png new file mode 100644 index 00000000..cf295b09 Binary files /dev/null and b/assets/build_hardware/donkey.png differ diff --git a/assets/build_hardware/donkey2.png b/assets/build_hardware/donkey2.png new file mode 100644 index 
00000000..08a1bd9e Binary files /dev/null and b/assets/build_hardware/donkey2.png differ diff --git a/assets/build_hardware/traxxas.png b/assets/build_hardware/traxxas.png new file mode 100644 index 00000000..5356fba4 Binary files /dev/null and b/assets/build_hardware/traxxas.png differ diff --git a/assets/calibration_graph.png b/assets/calibration_graph.png new file mode 100644 index 00000000..98ab490e Binary files /dev/null and b/assets/calibration_graph.png differ diff --git a/assets/cuthere.jpg b/assets/cuthere.jpg new file mode 100644 index 00000000..3f5c297a Binary files /dev/null and b/assets/cuthere.jpg differ diff --git a/assets/cv_track.png b/assets/cv_track.png new file mode 100644 index 00000000..7ea16934 Binary files /dev/null and b/assets/cv_track.png differ diff --git a/assets/cv_track_telemetry.png b/assets/cv_track_telemetry.png new file mode 100644 index 00000000..e5352939 Binary files /dev/null and b/assets/cv_track_telemetry.png differ diff --git a/assets/drive_UI.png b/assets/drive_UI.png new file mode 100644 index 00000000..cdef573f Binary files /dev/null and b/assets/drive_UI.png differ diff --git a/assets/driveshaft.jpg b/assets/driveshaft.jpg new file mode 100644 index 00000000..8d06efaf Binary files /dev/null and b/assets/driveshaft.jpg differ diff --git a/assets/encoder1.jpg b/assets/encoder1.jpg new file mode 100644 index 00000000..0893e92d Binary files /dev/null and b/assets/encoder1.jpg differ diff --git a/assets/encoder2.jpg b/assets/encoder2.jpg new file mode 100644 index 00000000..f0f3ad34 Binary files /dev/null and b/assets/encoder2.jpg differ diff --git a/assets/encoder_inplace.jpg b/assets/encoder_inplace.jpg new file mode 100644 index 00000000..03f81dec Binary files /dev/null and b/assets/encoder_inplace.jpg differ diff --git a/assets/encoder_wiring.jpg b/assets/encoder_wiring.jpg new file mode 100644 index 00000000..66a709b7 Binary files /dev/null and b/assets/encoder_wiring.jpg differ diff --git a/assets/fine_calibration.gif 
b/assets/fine_calibration.gif new file mode 100644 index 00000000..e75eaf37 Binary files /dev/null and b/assets/fine_calibration.gif differ diff --git a/assets/hsv_picker_mask.png b/assets/hsv_picker_mask.png new file mode 100644 index 00000000..57babef1 Binary files /dev/null and b/assets/hsv_picker_mask.png differ diff --git a/assets/hsv_picker_no_mask.png b/assets/hsv_picker_no_mask.png new file mode 100644 index 00000000..70258a1c Binary files /dev/null and b/assets/hsv_picker_no_mask.png differ diff --git a/assets/imager.png b/assets/imager.png new file mode 100644 index 00000000..fb166c5a Binary files /dev/null and b/assets/imager.png differ diff --git a/assets/lidar.jpg b/assets/lidar.jpg new file mode 100644 index 00000000..a39d2f5a Binary files /dev/null and b/assets/lidar.jpg differ diff --git a/assets/lidar_angle.png b/assets/lidar_angle.png new file mode 100644 index 00000000..1625bfa8 Binary files /dev/null and b/assets/lidar_angle.png differ diff --git a/assets/logos/apple_logo.jpg b/assets/logos/apple_logo.jpg new file mode 100644 index 00000000..7ddb52f4 Binary files /dev/null and b/assets/logos/apple_logo.jpg differ diff --git a/assets/logos/linux_logo.png b/assets/logos/linux_logo.png new file mode 100644 index 00000000..09f47f7e Binary files /dev/null and b/assets/logos/linux_logo.png differ diff --git a/assets/logos/nvidia_logo.png b/assets/logos/nvidia_logo.png new file mode 100644 index 00000000..65981ef5 Binary files /dev/null and b/assets/logos/nvidia_logo.png differ diff --git a/assets/logos/rpi_logo.png b/assets/logos/rpi_logo.png new file mode 100644 index 00000000..4497c6f6 Binary files /dev/null and b/assets/logos/rpi_logo.png differ diff --git a/assets/logos/windows_logo.png b/assets/logos/windows_logo.png new file mode 100644 index 00000000..28b21316 Binary files /dev/null and b/assets/logos/windows_logo.png differ diff --git a/assets/mobile_app/advanced-configuration.png b/assets/mobile_app/advanced-configuration.png new file mode 
100644 index 00000000..b8ffcada Binary files /dev/null and b/assets/mobile_app/advanced-configuration.png differ diff --git a/assets/mobile_app/autopilot.gif b/assets/mobile_app/autopilot.gif new file mode 100644 index 00000000..1ece8271 Binary files /dev/null and b/assets/mobile_app/autopilot.gif differ diff --git a/assets/mobile_app/calibration.png b/assets/mobile_app/calibration.png new file mode 100644 index 00000000..79af7f5f Binary files /dev/null and b/assets/mobile_app/calibration.png differ diff --git a/assets/mobile_app/cover.png b/assets/mobile_app/cover.png new file mode 100644 index 00000000..88672db4 Binary files /dev/null and b/assets/mobile_app/cover.png differ diff --git a/assets/mobile_app/data.png b/assets/mobile_app/data.png new file mode 100644 index 00000000..db3b8b7b Binary files /dev/null and b/assets/mobile_app/data.png differ diff --git a/assets/mobile_app/drive-summary.png b/assets/mobile_app/drive-summary.png new file mode 100644 index 00000000..ce36cad8 Binary files /dev/null and b/assets/mobile_app/drive-summary.png differ diff --git a/assets/mobile_app/drive-ui.gif b/assets/mobile_app/drive-ui.gif new file mode 100644 index 00000000..a2413b74 Binary files /dev/null and b/assets/mobile_app/drive-ui.gif differ diff --git a/assets/mobile_app/search-vehicle.png b/assets/mobile_app/search-vehicle.png new file mode 100644 index 00000000..c893f7bd Binary files /dev/null and b/assets/mobile_app/search-vehicle.png differ diff --git a/assets/mobile_app/train.png b/assets/mobile_app/train.png new file mode 100644 index 00000000..3371b2ab Binary files /dev/null and b/assets/mobile_app/train.png differ diff --git a/assets/parts/Servomotor_Timing_Diagram.svg b/assets/parts/Servomotor_Timing_Diagram.svg new file mode 100644 index 00000000..8d97f37c --- /dev/null +++ b/assets/parts/Servomotor_Timing_Diagram.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/assets/parts/rchat.png b/assets/parts/rchat.png new file mode 100644 index 
00000000..cd68bf43 Binary files /dev/null and b/assets/parts/rchat.png differ diff --git a/assets/parts/stop_sign_detection/demo.mp4 b/assets/parts/stop_sign_detection/demo.mp4 new file mode 100644 index 00000000..62663eee Binary files /dev/null and b/assets/parts/stop_sign_detection/demo.mp4 differ diff --git a/assets/parts/voice_control/alexa_overview.png b/assets/parts/voice_control/alexa_overview.png new file mode 100644 index 00000000..81fbbdfa Binary files /dev/null and b/assets/parts/voice_control/alexa_overview.png differ diff --git a/assets/path_8_rotate.png b/assets/path_8_rotate.png new file mode 100644 index 00000000..2f01e029 Binary files /dev/null and b/assets/path_8_rotate.png differ diff --git a/assets/rc.jpg b/assets/rc.jpg new file mode 100644 index 00000000..8145511f Binary files /dev/null and b/assets/rc.jpg differ diff --git a/assets/rc.png b/assets/rc.png new file mode 100644 index 00000000..c87aa7ee Binary files /dev/null and b/assets/rc.png differ diff --git a/assets/rc_hat.jpg b/assets/rc_hat.jpg new file mode 100644 index 00000000..49da9341 Binary files /dev/null and b/assets/rc_hat.jpg differ diff --git a/assets/rc_wiring.jpg b/assets/rc_wiring.jpg new file mode 100644 index 00000000..cf05841c Binary files /dev/null and b/assets/rc_wiring.jpg differ diff --git a/assets/rpi_imager.png b/assets/rpi_imager.png new file mode 100644 index 00000000..237c8cf7 Binary files /dev/null and b/assets/rpi_imager.png differ diff --git a/assets/screw_assy.jpg b/assets/screw_assy.jpg new file mode 100644 index 00000000..7f804059 Binary files /dev/null and b/assets/screw_assy.jpg differ diff --git a/assets/sim_screen_shot.png b/assets/sim_screen_shot.png new file mode 100644 index 00000000..3e3bcc5a Binary files /dev/null and b/assets/sim_screen_shot.png differ diff --git a/assets/sm-tree-donkey.gif b/assets/sm-tree-donkey.gif new file mode 100644 index 00000000..6d0dcae6 Binary files /dev/null and b/assets/sm-tree-donkey.gif differ diff --git 
a/assets/ui-car-connector-1.png b/assets/ui-car-connector-1.png new file mode 100644 index 00000000..5d33a61c Binary files /dev/null and b/assets/ui-car-connector-1.png differ diff --git a/assets/ui-pilot-arena.png b/assets/ui-pilot-arena.png new file mode 100644 index 00000000..f8323eaf Binary files /dev/null and b/assets/ui-pilot-arena.png differ diff --git a/assets/ui-trainer.png b/assets/ui-trainer.png new file mode 100644 index 00000000..96864f5f Binary files /dev/null and b/assets/ui-trainer.png differ diff --git a/assets/ui-tub-manager-2.png b/assets/ui-tub-manager-2.png new file mode 100644 index 00000000..1bfbda19 Binary files /dev/null and b/assets/ui-tub-manager-2.png differ diff --git a/assets/ui-tub-manager.png b/assets/ui-tub-manager.png new file mode 100644 index 00000000..09bf4cb0 Binary files /dev/null and b/assets/ui-tub-manager.png differ diff --git a/assets/virtual_race_league.jpg b/assets/virtual_race_league.jpg new file mode 100644 index 00000000..e966ae70 Binary files /dev/null and b/assets/virtual_race_league.jpg differ diff --git a/assets/web_controller.png b/assets/web_controller.png new file mode 100644 index 00000000..89bd69cb Binary files /dev/null and b/assets/web_controller.png differ diff --git a/cars/roll_your_own/index.html b/cars/roll_your_own/index.html new file mode 100644 index 00000000..87c9aa28 --- /dev/null +++ b/cars/roll_your_own/index.html @@ -0,0 +1,394 @@ + + + + + + + + Roll Your Own - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Roll Your Own Car

+

Crunch

+

The Quick and Dirty

+
    +
  • Your car needs to be easy to control from a Raspberry Pi
  • +
  • Your car must not be too large, or it will be too heavy, dangerous, and expensive
  • +
  • Your car must not be too small, because it needs to carry a certain minimum amount of equipment
  • +
  • Your car needs to meet minimum performance standards in power and control for the self-driving model to be able to drive it well
  • +
  • Your car needs to be smooth to control even at low speeds
  • +
+

This generally means:

+
    +
  • Your car needs to have a speed controller for the motor (ESC) that takes a standard RC 3-pin control signal (RC PWM style)
  • +
  • Your car needs to have a steering servo that takes a standard RC 3-pin control signal (RC PWM style)
  • +
  • Your car needs to have a radio receiver that contains standard 100-mil (2.54 mm) pin headers for each of the ESC and the steering servo.
  • +
  • Your car needs to be between 1/18th scale (smallest end) and 1/8th scale (largest end) if you want to race in the DIYRobocars race.
  • +
  • Your car needs to either use a brushed motor, or a sensored brushless motor. Sensorless brushless motors are too rough at low speeds. If you buy a car with a brushless motor included it is invariably a sensorless brushless motor and will need to be replaced along with the ESC.
  • +
+

Other options are perhaps possible, see the end of this document.

+

Many car builders end up looking at "integrated" RC hobby cars, because they are typically cheaper. However, the reason these are cheaper is that they integrate much of the electronics and mechanics into a single package, which means we can't intercept the appropriate signals to control the car with a Raspberry Pi. In fact, the expected signals may not even exist at all in an integrated car.

+

Here is an example of an integrated RX and ESC - typically these should be avoided: +RX ESC example

+

You also need to know some things about electronics, such as the difference +between power rails and control signals, what the duration of a microsecond is, +and how Volts, Amperes, Watts, Hours, Ohms, and other measurement units relate.

+

Chassis build

+

There are lots of designs out there besides the Donkeycar, but two stand out and are worth mentioning specifically.

+

Chilicorn rail

+

This is a flexible mounting system developed by Markku.ai.

+ +

OpenSCAD Files

+

Doug LaRue, a long-time community member, has extensive designs for making your own chassis in OpenSCAD. If you want to roll your own but are not comfortable with CAD, this is a good place to start.

+ +

Servo Specifics

+

An RC servo is used for controlling the steering wheels of the car. This servo +typically expects around 4.8V to 6V input on the power wire (varies by car) and +a PWM control signal on the signal wire. Typically, the three wires are colored +black-red-white, or brown-red-yellow, where the dark wire (black/brown) is ground, +and the center wire (red) is power, and the light wire (white/yellow) is control.

+

The control signal is RC-style PWM, where one pulse is sent 60 times a second, and the width of this pulse controls how far left or right the servo turns. When this pulse is 1500 microseconds, the servo is centered; when the pulse is 1000 microseconds, the servo is turned all the way left (or right), and when the pulse is 2000 microseconds, the servo is turned all the way in the other direction. This is NOT the same kind of PWM that you would use to control the duty cycle of a motor, or the brightness of an LED.

+
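The pulse-width arithmetic above can be sketched in a few lines. This assumes a PCA9685-style 12-bit PWM timer running at 60 Hz; the function name is illustrative, not part of the Donkey codebase.

```python
# Illustrative sketch (not Donkey's actual driver code): converting an RC
# servo pulse width in microseconds to on-time counts of a PCA9685-style
# 12-bit PWM timer running at 60 Hz.
def pulse_us_to_ticks(pulse_us, freq_hz=60, resolution=4096):
    """Map a pulse width (1000-2000 us) onto 12-bit timer ticks."""
    period_us = 1_000_000 / freq_hz      # ~16667 us frame at 60 Hz
    return round(pulse_us * resolution / period_us)

print(pulse_us_to_ticks(1000))  # full deflection one way   -> 246 ticks
print(pulse_us_to_ticks(1500))  # centered                  -> 369 ticks
print(pulse_us_to_ticks(2000))  # full deflection other way -> 492 ticks
```

This is why calibration tools talk about both microseconds and raw PWM values: they are the same quantity expressed on different scales.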

The power for the servo typically comes from the motor ESC, which has a BEC +(Battery Eliminator Circuit) built in.

+

ESC Specifics

+

The role of the ESC is to take an RC PWM control signal (a pulse between 1000 and 2000 microseconds) and use it to control the power to the motor, so the motor spins with different amounts of power in forward or reverse. Again, 1500 microseconds typically means "center," which for the motor means "dead stop."

+
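That mapping is linear around the 1500-microsecond dead stop, and can be sketched as follows (the helper name is invented for illustration):

```python
# Hypothetical helper (name assumed): mapping a normalized throttle in
# [-1, 1] onto the 1000-2000 us pulse range the ESC expects, with
# 1500 us meaning "dead stop".
def throttle_to_pulse_us(throttle):
    throttle = max(-1.0, min(1.0, throttle))  # clamp out-of-range inputs
    return 1500 + throttle * 500              # -1 -> 1000 us, +1 -> 2000 us

print(throttle_to_pulse_us(0.0))   # 1500.0 (stop)
print(throttle_to_pulse_us(0.5))   # 1750.0 (half forward)
print(throttle_to_pulse_us(-1.0))  # 1000.0 (full reverse)
```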

The battery typically connects straight to the ESC using thicker wiring than the +simple control signals, because the motor draws many more amps than the control. +The ESC then connects on to the motor with equally thick power wiring. The +standard Donkey motor and ESC probably have a peak current of about 12A; a +1/8th scale RC car with powerful brushless motor can have a peak draw up to +200A!

+

Additionally, the ESC typically contains a linear or switching voltage converter +that outputs the power needed to control the steering servo; this is typically +somewhere in the 4.8V to 6V range. Most BECs built into ESCs will not deliver +more than about 1A of current, so it is not typically possible to power both the +steering servo and the Raspberry Pi from the BEC.

+

Receiver Specifics

+

If you buy a "kit car" that is listed as "needs a receiver," then you don't need +to buy a receiver. The Raspberry Pi plus the PCA9685 board take the role of the +receiver, outputting control signals to the car. Buying a "kit car" that comes +with steering servo, motor, and ESC, but not with radio, is actually a great way +to make sure that the car you build will have the right signalling, because any +RC car with separate receiver will be designed for the appropriate PWM signals.

+

If your car comes with a receiver, make sure it has the appropriate three-pin +headers next to each other for steering servo and for ESC control. Some receivers +may have additional three-pin headers for additional channels, which may be empty +or may control fancy attachments like horns, lights, and so forth.

+

There is a modification to the Donkey car which uses the RC radio to drive the +car when collecting training data; this will give better control of the car than +you typically get with a PlayStation controller, or cell phone. However, it also +requires replacing the PCA9685 board with an external micro-controller, and +changing the software of the Donkey to use it.

+

Finally, some receivers can output, in addition to the PWM control signals, a serial data packet that contains the control signals. An example of such a receiver is the FS-i6B, which has 6 output channels for PWM signals, but can output 10 channels of data at 115,200 bps as serial data, which you can read with an external micro-controller, or perhaps even with the Raspberry Pi (this requires re-configuration of the Pi boot loader and custom modifications to the Donkey software).

+
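As a sketch of what reading such serial data involves, here is a parser for an iBus-style frame, the format FlySky receivers such as the FS-i6B are commonly described as emitting. The exact layout (32 bytes, header 0x20 0x40, 14 little-endian 16-bit channels, checksum of 0xFFFF minus the sum of the first 30 bytes) is an assumption here, so verify it against your receiver before relying on it.

```python
import struct

# Hedged sketch of parsing one iBus-style serial frame. Assumed layout:
# 32 bytes total -- header 0x20 0x40, then 14 little-endian 16-bit channel
# values, then a 16-bit checksum = 0xFFFF minus the sum of the first 30 bytes.
def parse_ibus_frame(frame):
    """Return the 14 channel values (in microseconds), or None if invalid."""
    if len(frame) != 32 or frame[0] != 0x20 or frame[1] != 0x40:
        return None
    expected = 0xFFFF - (sum(frame[:30]) & 0xFFFF)
    if expected != struct.unpack_from("<H", frame, 30)[0]:
        return None
    return list(struct.unpack_from("<14H", frame, 2))

# Build a synthetic frame (all channels centered at 1500 us) to demonstrate.
body = bytes([0x20, 0x40]) + struct.pack("<14H", *([1500] * 14))
frame = body + struct.pack("<H", 0xFFFF - (sum(body) & 0xFFFF))
print(parse_ibus_frame(frame))  # 14 channels, all 1500
```

On real hardware you would read these bytes from the Pi's UART (e.g. with pyserial) and resynchronize on the header bytes when a frame fails validation.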

Batteries

+

The Donkey comes with a Nickel Metal Hydride (NiMH) battery, which is just enough to run its motor for a short time (5-10 minutes) before needing a recharge. The specifications on this battery are 6 cells, 1100 mAh. Because NiMH cells range from 0.9V to 1.35V with a "nominal" voltage of 1.2V, you can expect to see voltages in the 5.4V to 8.1V range.

+

NiMH batteries have medium energy capacity per weight and volume. Thus, you can improve the runtime and performance of the Magnet car by upgrading to a Lithium Polymer (LiPo) battery. Typically, you will get a 2-cell battery (2S); Lithium cells range from 3.2V to 4.2V per cell, so you will see voltages in the 6.4V to 8.4V range. Additionally, Lithium Polymer batteries generally have higher current capacity (the number of Amps the battery can deliver at one point while driving) as well as energy storage (the number of Amp-hours the battery stores when fully charged), so the car may also run longer.

+
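The voltage ranges quoted above are just the per-cell limits multiplied by the cell count:

```python
# The pack-voltage arithmetic from the text: per-cell limits times cell count.
def pack_voltage_range(cells, v_min_cell, v_max_cell):
    return cells * v_min_cell, cells * v_max_cell

nimh = pack_voltage_range(6, 0.9, 1.35)  # 6-cell NiMH -> roughly (5.4, 8.1)
lipo = pack_voltage_range(2, 3.2, 4.2)   # 2S LiPo     -> roughly (6.4, 8.4)
```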

Note that the amount of charge a battery can hold (how long it runs) is measured in Ampere-hours (Ah) or milli-Ampere-hours (mAh), whereas the amount of current a battery can instantaneously deliver while driving is measured simply in Amperes. To make things more confusing, Amperes are often expressed as multiples of the capacity divided by one hour; this ratio is called "C." Thus, a LiPo rated for 10C and 2000 mAh can deliver 20 Amperes of current while driving, and a NiMH rated for 5C and 1100 mAh can deliver 5.5 Amperes. Batteries will typically deliver more than the C rating for very short amounts of time, but will heat up or build up internal resistance, so that is not something you can rely on for normal operation.

+
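The C-rating arithmetic spelled out: continuous current in Amperes is the C rating times the capacity in Ampere-hours.

```python
# Continuous current (A) = C rating * capacity (Ah), i.e. mAh / 1000.
def continuous_amps(c_rating, capacity_mah):
    return c_rating * capacity_mah / 1000.0

print(continuous_amps(10, 2000))  # 10C, 2000 mAh LiPo -> 20.0 A
print(continuous_amps(5, 1100))   # 5C, 1100 mAh NiMH  -> 5.5 A
```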

For your custom car, be aware of the voltages needed for the ESC and motor of the +car, and make sure to get a battery that matches in voltage. Smaller RC cars will +come with NiMH for affordability, or 2S LiPo for power. Larger RC cars will use 3S +(11.1V) or 4S (14.8V) or even 6S (22.2V) Lithium batteries, and thus need to have +ESC and motor combinations to match.

+

Finally, be sure to get a charger that matches your battery. If you have a LiPo battery, get a good Lithium battery charger with a balancing plug that matches your battery. Never discharge a Lithium battery below 3.2V per cell; if you let it run dead, it will not want to be charged up to normal voltage again, and trying to do so may very well overheat the battery and light it on fire! See YouTube videos of burning Teslas for what that can look like. Seriously, houses have burned down because people tried to save $10 by re-charging a Lithium battery that they forgot to disconnect and it ran down too much. It's not worth it. Instead, get a battery alarm that you plug into the battery balance connector; it beeps when the battery has discharged so much that you should disconnect and recharge it.

+

Physical Constraints

+

Adding the additional battery and electronics for self-driving to a toy car will +add more load than the car was initially designed for. For a large, 1/8th scale +car, this may not be much of a problem. For a small car, 1/18th scale or below, the +additional weight and top-heaviness will cause the car to not react well to the +steering output, which may cause the self-driving model to be less able to control +the car.

+

If you use a car that's not the standard Magnet, at a minimum, you will have to +figure out how to mount all the hardware securely. Just piling things on and hoping +wiring will keep it in place will not work for things that actually drive and turn. +Finding good mounting points, and making your own "base plate" with measurements +from the car you have, is likely to be necessary. You can build this base plate +using 3D printing, laser cutting, CNC milling, or even just drilling into a thin +piece of plywood, but getting a good fit to your chassis is important, so don't +rush it or try to cut corners.

+

Doug LaRue also built a configurator in Thingiverse that enables people to easily make custom 3D printed plates.

+

Other Options

+

Yes, you can make a self-driving car out of your 1/5th scale Nitro Dragster. You +will just have to learn even more about the different bits and pieces of the +solution, and figure out all the necessary integration yourself. The control +signals for a Nitro car are the same, so this might not even be hard. However, the +indoors arenas used for Donkey Racing Meetups do not allow fuel-burning cars, only +electric.

+

Yes, you can make a self-driving car out of a cheap two-wheel chassis that uses an L298 H-bridge with direct PWM control to "tank steer" two wheels. However, you will have to adapt the Donkey software to output the right steering controls, and you will additionally have to figure out how to wire up the H-bridge to the Pi in a way that makes sense to you; the PWM signals output by the PCA9685 board are the RC control kind, NOT the motor control kind! Also, most affordable two-wheel-drive robot chassis are not big enough, strong enough, or mechanically consistent enough to make good Donkey Car candidates.

+ +
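To make the "adapt the software" point concrete, here is a sketch (function name invented, not part of the Donkey codebase) of the kind of mixing such a build would need: turning a steering/throttle pair into two signed motor duty cycles for an H-bridge.

```python
# Illustrative only: converts a steering/throttle pair in [-1, 1] into
# left/right signed duty cycles for "tank steering" with an H-bridge
# (motor-control PWM, not RC-style pulses).
def tank_mix(steering, throttle):
    left = throttle + steering
    right = throttle - steering
    scale = max(1.0, abs(left), abs(right))  # renormalize so |duty| <= 1
    return left / scale, right / scale

print(tank_mix(0.0, 0.5))  # straight ahead -> (0.5, 0.5)
print(tank_mix(0.5, 0.5))  # turning: one wheel faster -> (1.0, 0.0)
```

Which wheel speeds up for a given steering sign is a convention you would pick to match your wiring.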
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + « Previous + + + Next » + + +
+ + + + + + + + diff --git a/cars/supported_cars/index.html b/cars/supported_cars/index.html new file mode 100644 index 00000000..cbdbd6ef --- /dev/null +++ b/cars/supported_cars/index.html @@ -0,0 +1,377 @@ + + + + + + + + Supported Cars - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Supported cars

+

Magnet and HSP 94186

+

The Magnet chassis was the first standard Donkey build. However, in many cases it may not be available. +donkey

+

Try searching for both the Magnet and HSP 94186 on eBay, Banggood, AliExpress, etc.

+

The HSP 94186 is the same as the Magnet and will work. If you speak Mandarin, it is always available on Taobao.

+

Taobao item offer

+

Exceed Desert Monster, Short Course Truck, and Blaze

+

The Desert Monster, SCT, and Blaze are made by the same manufacturer as the Magnet and have the same motor and ESC. The chassis is slightly different, so it requires an adapter and some extra hardware to work with the standard Donkey platform. With the adapters, the camera placement will be identical to the Magnet, and the cars should be able to share models.

+

It is worth noting that the Desert Monster and SCT also have some nice characteristics, including narrower, more road-friendly tires, and the Blaze has a slightly narrower stance which makes it less likely to hit things.

+

Desert Monster +To purchase one of these cars, use the following links:

+ +

To assemble one of these you will need some additional parts beyond the standard build; these can be purchased as a kit from the Donkey Store: Purchase: Donkey Store

+ + + + + + + + + + + + + + + + + + + + +
Part Description | Link | Approximate Cost
3D printed Adapters | Files: thingiverse.com/thing:2260575 | $10
Chassis Clips | Amazon | $5
+

To assemble, first remove the plastic cover and roll cage, then unscrew the posts that hold up the cover and replace them with the adapters.

+

Visual instructions to follow.

+

LaTrax Prerunner

+

The LaTrax Prerunner is a supported car and follows the same build instructions as the Desert Monster. However, the adapters get screwed in as shown in the photo below.

+

donkey

+

Donkey Pro

+

To build a Donkey Pro, the following parts are needed:

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Part Description | Link | Approximate Cost
Donkey Pro Plastics and base | Thingiverse or Donkeystore | $50
(8) M2.5 standoffs | |
(8) M2.5 Nylock nuts | |
(8) M2.5x6mm socket head cap screws | |
(4) M3x10 plastic self-threading screws | |
+

To attach the Raspberry Pi to the chassis, this assembly picture should clarify how it fits together.

+

assy

+

Tamiya TT-01 (Advanced Build)

+

donkey

+

The TT-01 is a new build that is a higher-end version of the Donkey. This is an advanced build and requires existing RC skills, or the desire to learn them, along with some tolerance for trial and error. For first-time builders we recommend the Magnet. That said, it has some pros and cons that people should be aware of, presented below.

+

Pros:

+
    +
  • Better kinematics and traction on smooth surfaces - basically this means it will corner better
  • +
  • Larger build area for adding other sensors.
  • +
  • Globally available with several clones.
  • +
+

Cons:

+
    +
  • Assembly required! You will need to supply your own ESC, battery, servo, pinion gear, and motor.
  • +
  • Needs to run on a smooth surface like a driveway or parking lot.
  • +
  • Larger size requires a larger 3D printer to print the chassis; otherwise, purchase it from the Donkeystore.
  • +
  • More expensive
  • +
+

In addition to the standard donkey parts, Raspberry Pi etc, you will need to buy the following components.

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Part Description | Link | Approximate Cost
TT-01 Clone Chassis | Amazon (other TT01s may be used) | $130
ESC | Hobbyking | $10.60
Brushed Motor | Hobbyking | $5
Steering Servo | Hobbyking | $5
Battery | Hobbyking or similar 2S 5000 mAh battery | $21
Pinion Gear | Amazon | $7
TT01 Plastics | Thingiverse or Donkeystore | $50
+

Note: purchasing from Hobbyking is tricky. They ship from multiple warehouses, and it can be expensive and time-consuming if an item ships from overseas. You may need to buy an alternate component if one of the items above is not available in your local warehouse.


If You Want to Roll Your Own


It's totally possible to diverge from the main Donkey build, and still have a car that +drives well and is fun to work with. We've seen a large variety of cars in the various +Donkey competitions around the world.


However, when you want to diverge, there are several things you need to know, or you will not be successful. There are many cost and quality trade-offs, and the lower-cost options often simply won't work. We've already worked hard to find the cheapest available options that do work, so you should not expect to save money by choosing alternatives. Rolling your own is more about learning, experimentation, and going to new and uncharted places.


To find out more about what you need, see Roll Your Own.
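Whatever chassis you choose, a custom build usually comes down to calibrating the actuators in your car's `myconfig.py`. The sketch below uses parameter names from the standard Donkey Car config template; exact names can vary between software versions, and the PWM values shown are illustrative placeholders, not calibrated settings — run `donkey calibrate` to find your own.

```python
# myconfig.py overrides for a custom chassis (illustrative values only;
# calibrate your own car -- these numbers will NOT match your hardware).

STEERING_CHANNEL = 1         # PCA9685 channel driving the steering servo
STEERING_LEFT_PWM = 460      # pulse for full-left steering
STEERING_RIGHT_PWM = 290     # pulse for full-right steering

THROTTLE_CHANNEL = 0         # PCA9685 channel driving the ESC
THROTTLE_FORWARD_PWM = 500   # pulse for maximum forward throttle
THROTTLE_STOPPED_PWM = 370   # pulse for neutral (ESC arming point)
THROTTLE_REVERSE_PWM = 220   # pulse for maximum reverse
```

A non-standard servo or ESC mostly just shifts these endpoint values, which is why so many different chassis can work with the same Donkey software.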

t:""}.fa-yelp:before{content:""}.fa-newspaper-o:before{content:""}.fa-wifi:before{content:""}.fa-calculator:before{content:""}.fa-paypal:before{content:""}.fa-google-wallet:before{content:""}.fa-cc-visa:before{content:""}.fa-cc-mastercard:before{content:""}.fa-cc-discover:before{content:""}.fa-cc-amex:before{content:""}.fa-cc-paypal:before{content:""}.fa-cc-stripe:before{content:""}.fa-bell-slash:before{content:""}.fa-bell-slash-o:before{content:""}.fa-trash:before{content:""}.fa-copyright:before{content:""}.fa-at:before{content:""}.fa-eyedropper:before{content:""}.fa-paint-brush:before{content:""}.fa-birthday-cake:before{content:""}.fa-area-chart:before{content:""}.fa-pie-chart:before{content:""}.fa-line-chart:before{content:""}.fa-lastfm:before{content:""}.fa-lastfm-square:before{content:""}.fa-toggle-off:before{content:""}.fa-toggle-on:before{content:""}.fa-bicycle:before{content:""}.fa-bus:before{content:""}.fa-ioxhost:before{content:""}.fa-angellist:before{content:""}.fa-cc:before{content:""}.fa-ils:before,.fa-shekel:before,.fa-sheqel:before{content:""}.fa-meanpath:before{content:""}.fa-buysellads:before{content:""}.fa-connectdevelop:before{content:""}.fa-dashcube:before{content:""}.fa-forumbee:before{content:""}.fa-leanpub:before{content:""}.fa-sellsy:before{content:""}.fa-shirtsinbulk:before{content:""}.fa-simplybuilt:before{content:""}.fa-skyatlas:before{content:""}.fa-cart-plus:before{content:""}.fa-cart-arrow-down:before{content:""}.fa-diamond:before{content:""}.fa-ship:before{content:""}.fa-user-secret:before{content:""}.fa-motorcycle:before{content:""}.fa-street-view:before{content:""}.fa-heartbeat:before{content:""}.fa-venus:before{content:""}.fa-mars:before{content:""}.fa-mercury:before{content:""}.fa-intersex:before,.fa-transgender:before{content:""}.fa-transgender-alt:before{content:""}.fa-venus-double:before{content:""}.fa-mars-double:before{content:""}.fa-venus-mars:before{content:""}.fa-m
ars-stroke:before{content:""}.fa-mars-stroke-v:before{content:""}.fa-mars-stroke-h:before{content:""}.fa-neuter:before{content:""}.fa-genderless:before{content:""}.fa-facebook-official:before{content:""}.fa-pinterest-p:before{content:""}.fa-whatsapp:before{content:""}.fa-server:before{content:""}.fa-user-plus:before{content:""}.fa-user-times:before{content:""}.fa-bed:before,.fa-hotel:before{content:""}.fa-viacoin:before{content:""}.fa-train:before{content:""}.fa-subway:before{content:""}.fa-medium:before{content:""}.fa-y-combinator:before,.fa-yc:before{content:""}.fa-optin-monster:before{content:""}.fa-opencart:before{content:""}.fa-expeditedssl:before{content:""}.fa-battery-4:before,.fa-battery-full:before,.fa-battery:before{content:""}.fa-battery-3:before,.fa-battery-three-quarters:before{content:""}.fa-battery-2:before,.fa-battery-half:before{content:""}.fa-battery-1:before,.fa-battery-quarter:before{content:""}.fa-battery-0:before,.fa-battery-empty:before{content:""}.fa-mouse-pointer:before{content:""}.fa-i-cursor:before{content:""}.fa-object-group:before{content:""}.fa-object-ungroup:before{content:""}.fa-sticky-note:before{content:""}.fa-sticky-note-o:before{content:""}.fa-cc-jcb:before{content:""}.fa-cc-diners-club:before{content:""}.fa-clone:before{content:""}.fa-balance-scale:before{content:""}.fa-hourglass-o:before{content:""}.fa-hourglass-1:before,.fa-hourglass-start:before{content:""}.fa-hourglass-2:before,.fa-hourglass-half:before{content:""}.fa-hourglass-3:before,.fa-hourglass-end:before{content:""}.fa-hourglass:before{content:""}.fa-hand-grab-o:before,.fa-hand-rock-o:before{content:""}.fa-hand-paper-o:before,.fa-hand-stop-o:before{content:""}.fa-hand-scissors-o:before{content:""}.fa-hand-lizard-o:before{content:""}.fa-hand-spock-o:before{content:""}.fa-hand-pointer-o:before{content:""}.fa-hand-peace-o:before{content:""}.fa-trademark:before{content:""}.fa-registered:before{content:""}.fa-creative-commons
:before{content:""}.fa-gg:before{content:""}.fa-gg-circle:before{content:""}.fa-tripadvisor:before{content:""}.fa-odnoklassniki:before{content:""}.fa-odnoklassniki-square:before{content:""}.fa-get-pocket:before{content:""}.fa-wikipedia-w:before{content:""}.fa-safari:before{content:""}.fa-chrome:before{content:""}.fa-firefox:before{content:""}.fa-opera:before{content:""}.fa-internet-explorer:before{content:""}.fa-television:before,.fa-tv:before{content:""}.fa-contao:before{content:""}.fa-500px:before{content:""}.fa-amazon:before{content:""}.fa-calendar-plus-o:before{content:""}.fa-calendar-minus-o:before{content:""}.fa-calendar-times-o:before{content:""}.fa-calendar-check-o:before{content:""}.fa-industry:before{content:""}.fa-map-pin:before{content:""}.fa-map-signs:before{content:""}.fa-map-o:before{content:""}.fa-map:before{content:""}.fa-commenting:before{content:""}.fa-commenting-o:before{content:""}.fa-houzz:before{content:""}.fa-vimeo:before{content:""}.fa-black-tie:before{content:""}.fa-fonticons:before{content:""}.fa-reddit-alien:before{content:""}.fa-edge:before{content:""}.fa-credit-card-alt:before{content:""}.fa-codiepie:before{content:""}.fa-modx:before{content:""}.fa-fort-awesome:before{content:""}.fa-usb:before{content:""}.fa-product-hunt:before{content:""}.fa-mixcloud:before{content:""}.fa-scribd:before{content:""}.fa-pause-circle:before{content:""}.fa-pause-circle-o:before{content:""}.fa-stop-circle:before{content:""}.fa-stop-circle-o:before{content:""}.fa-shopping-bag:before{content:""}.fa-shopping-basket:before{content:""}.fa-hashtag:before{content:""}.fa-bluetooth:before{content:""}.fa-bluetooth-b:before{content:""}.fa-percent:before{content:""}.fa-gitlab:before,.icon-gitlab:before{content:""}.fa-wpbeginner:before{content:""}.fa-wpforms:before{content:""}.fa-envira:before{content:""}.fa-universal-access:before{content:""}.fa-wheelchair-alt:before{content:""}.fa-question-circle-o:before{conten
t:""}.fa-blind:before{content:""}.fa-audio-description:before{content:""}.fa-volume-control-phone:before{content:""}.fa-braille:before{content:""}.fa-assistive-listening-systems:before{content:""}.fa-american-sign-language-interpreting:before,.fa-asl-interpreting:before{content:""}.fa-deaf:before,.fa-deafness:before,.fa-hard-of-hearing:before{content:""}.fa-glide:before{content:""}.fa-glide-g:before{content:""}.fa-sign-language:before,.fa-signing:before{content:""}.fa-low-vision:before{content:""}.fa-viadeo:before{content:""}.fa-viadeo-square:before{content:""}.fa-snapchat:before{content:""}.fa-snapchat-ghost:before{content:""}.fa-snapchat-square:before{content:""}.fa-pied-piper:before{content:""}.fa-first-order:before{content:""}.fa-yoast:before{content:""}.fa-themeisle:before{content:""}.fa-google-plus-circle:before,.fa-google-plus-official:before{content:""}.fa-fa:before,.fa-font-awesome:before{content:""}.fa-handshake-o:before{content:""}.fa-envelope-open:before{content:""}.fa-envelope-open-o:before{content:""}.fa-linode:before{content:""}.fa-address-book:before{content:""}.fa-address-book-o:before{content:""}.fa-address-card:before,.fa-vcard:before{content:""}.fa-address-card-o:before,.fa-vcard-o:before{content:""}.fa-user-circle:before{content:""}.fa-user-circle-o:before{content:""}.fa-user-o:before{content:""}.fa-id-badge:before{content:""}.fa-drivers-license:before,.fa-id-card:before{content:""}.fa-drivers-license-o:before,.fa-id-card-o:before{content:""}.fa-quora:before{content:""}.fa-free-code-camp:before{content:""}.fa-telegram:before{content:""}.fa-thermometer-4:before,.fa-thermometer-full:before,.fa-thermometer:before{content:""}.fa-thermometer-3:before,.fa-thermometer-three-quarters:before{content:""}.fa-thermometer-2:before,.fa-thermometer-half:before{content:""}.fa-thermometer-1:before,.fa-thermometer-quarter:before{content:""}.fa-thermometer-0:before,.fa-thermometer-empty:before{content:""}.fa-shower:befo
re{content:""}.fa-bath:before,.fa-bathtub:before,.fa-s15:before{content:""}.fa-podcast:before{content:""}.fa-window-maximize:before{content:""}.fa-window-minimize:before{content:""}.fa-window-restore:before{content:""}.fa-times-rectangle:before,.fa-window-close:before{content:""}.fa-times-rectangle-o:before,.fa-window-close-o:before{content:""}.fa-bandcamp:before{content:""}.fa-grav:before{content:""}.fa-etsy:before{content:""}.fa-imdb:before{content:""}.fa-ravelry:before{content:""}.fa-eercast:before{content:""}.fa-microchip:before{content:""}.fa-snowflake-o:before{content:""}.fa-superpowers:before{content:""}.fa-wpexplorer:before{content:""}.fa-meetup:before{content:""}.sr-only{position:absolute;width:1px;height:1px;padding:0;margin:-1px;overflow:hidden;clip:rect(0,0,0,0);border:0}.sr-only-focusable:active,.sr-only-focusable:focus{position:static;width:auto;height:auto;margin:0;overflow:visible;clip:auto}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-dropdown .caret,.wy-inline-validate.wy-inline-validate-danger .wy-input-context,.wy-inline-validate.wy-inline-validate-info .wy-input-context,.wy-inline-validate.wy-inline-validate-success .wy-input-context,.wy-inline-validate.wy-inline-validate-warning .wy-input-context,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{font-family:inherit}.fa:before,.icon:before,.rst-content .admonition-title:before,.rst-content .code-block-caption 
.headerlink:before,.rst-content .eqno .headerlink:before,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li button.toctree-expand:before{font-family:FontAwesome;display:inline-block;font-style:normal;font-weight:400;line-height:1;text-decoration:inherit}.rst-content .code-block-caption a .headerlink,.rst-content .eqno a .headerlink,.rst-content a .admonition-title,.rst-content code.download a span:first-child,.rst-content dl dt a .headerlink,.rst-content h1 a .headerlink,.rst-content h2 a .headerlink,.rst-content h3 a .headerlink,.rst-content h4 a .headerlink,.rst-content h5 a .headerlink,.rst-content h6 a .headerlink,.rst-content p.caption a .headerlink,.rst-content p a .headerlink,.rst-content table>caption a .headerlink,.rst-content tt.download a span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li a button.toctree-expand,a .fa,a .icon,a .rst-content .admonition-title,a .rst-content .code-block-caption .headerlink,a .rst-content .eqno .headerlink,a .rst-content code.download span:first-child,a .rst-content dl dt .headerlink,a 
.rst-content h1 .headerlink,a .rst-content h2 .headerlink,a .rst-content h3 .headerlink,a .rst-content h4 .headerlink,a .rst-content h5 .headerlink,a .rst-content h6 .headerlink,a .rst-content p.caption .headerlink,a .rst-content p .headerlink,a .rst-content table>caption .headerlink,a .rst-content tt.download span:first-child,a .wy-menu-vertical li button.toctree-expand{display:inline-block;text-decoration:inherit}.btn .fa,.btn .icon,.btn .rst-content .admonition-title,.btn .rst-content .code-block-caption .headerlink,.btn .rst-content .eqno .headerlink,.btn .rst-content code.download span:first-child,.btn .rst-content dl dt .headerlink,.btn .rst-content h1 .headerlink,.btn .rst-content h2 .headerlink,.btn .rst-content h3 .headerlink,.btn .rst-content h4 .headerlink,.btn .rst-content h5 .headerlink,.btn .rst-content h6 .headerlink,.btn .rst-content p .headerlink,.btn .rst-content table>caption .headerlink,.btn .rst-content tt.download span:first-child,.btn .wy-menu-vertical li.current>a button.toctree-expand,.btn .wy-menu-vertical li.on a button.toctree-expand,.btn .wy-menu-vertical li button.toctree-expand,.nav .fa,.nav .icon,.nav .rst-content .admonition-title,.nav .rst-content .code-block-caption .headerlink,.nav .rst-content .eqno .headerlink,.nav .rst-content code.download span:first-child,.nav .rst-content dl dt .headerlink,.nav .rst-content h1 .headerlink,.nav .rst-content h2 .headerlink,.nav .rst-content h3 .headerlink,.nav .rst-content h4 .headerlink,.nav .rst-content h5 .headerlink,.nav .rst-content h6 .headerlink,.nav .rst-content p .headerlink,.nav .rst-content table>caption .headerlink,.nav .rst-content tt.download span:first-child,.nav .wy-menu-vertical li.current>a button.toctree-expand,.nav .wy-menu-vertical li.on a button.toctree-expand,.nav .wy-menu-vertical li button.toctree-expand,.rst-content .btn .admonition-title,.rst-content .code-block-caption .btn .headerlink,.rst-content .code-block-caption .nav .headerlink,.rst-content .eqno .btn 
.headerlink,.rst-content .eqno .nav .headerlink,.rst-content .nav .admonition-title,.rst-content code.download .btn span:first-child,.rst-content code.download .nav span:first-child,.rst-content dl dt .btn .headerlink,.rst-content dl dt .nav .headerlink,.rst-content h1 .btn .headerlink,.rst-content h1 .nav .headerlink,.rst-content h2 .btn .headerlink,.rst-content h2 .nav .headerlink,.rst-content h3 .btn .headerlink,.rst-content h3 .nav .headerlink,.rst-content h4 .btn .headerlink,.rst-content h4 .nav .headerlink,.rst-content h5 .btn .headerlink,.rst-content h5 .nav .headerlink,.rst-content h6 .btn .headerlink,.rst-content h6 .nav .headerlink,.rst-content p .btn .headerlink,.rst-content p .nav .headerlink,.rst-content table>caption .btn .headerlink,.rst-content table>caption .nav .headerlink,.rst-content tt.download .btn span:first-child,.rst-content tt.download .nav span:first-child,.wy-menu-vertical li .btn button.toctree-expand,.wy-menu-vertical li.current>a .btn button.toctree-expand,.wy-menu-vertical li.current>a .nav button.toctree-expand,.wy-menu-vertical li .nav button.toctree-expand,.wy-menu-vertical li.on a .btn button.toctree-expand,.wy-menu-vertical li.on a .nav button.toctree-expand{display:inline}.btn .fa-large.icon,.btn .fa.fa-large,.btn .rst-content .code-block-caption .fa-large.headerlink,.btn .rst-content .eqno .fa-large.headerlink,.btn .rst-content .fa-large.admonition-title,.btn .rst-content code.download span.fa-large:first-child,.btn .rst-content dl dt .fa-large.headerlink,.btn .rst-content h1 .fa-large.headerlink,.btn .rst-content h2 .fa-large.headerlink,.btn .rst-content h3 .fa-large.headerlink,.btn .rst-content h4 .fa-large.headerlink,.btn .rst-content h5 .fa-large.headerlink,.btn .rst-content h6 .fa-large.headerlink,.btn .rst-content p .fa-large.headerlink,.btn .rst-content table>caption .fa-large.headerlink,.btn .rst-content tt.download span.fa-large:first-child,.btn .wy-menu-vertical li button.fa-large.toctree-expand,.nav 
.fa-large.icon,.nav .fa.fa-large,.nav .rst-content .code-block-caption .fa-large.headerlink,.nav .rst-content .eqno .fa-large.headerlink,.nav .rst-content .fa-large.admonition-title,.nav .rst-content code.download span.fa-large:first-child,.nav .rst-content dl dt .fa-large.headerlink,.nav .rst-content h1 .fa-large.headerlink,.nav .rst-content h2 .fa-large.headerlink,.nav .rst-content h3 .fa-large.headerlink,.nav .rst-content h4 .fa-large.headerlink,.nav .rst-content h5 .fa-large.headerlink,.nav .rst-content h6 .fa-large.headerlink,.nav .rst-content p .fa-large.headerlink,.nav .rst-content table>caption .fa-large.headerlink,.nav .rst-content tt.download span.fa-large:first-child,.nav .wy-menu-vertical li button.fa-large.toctree-expand,.rst-content .btn .fa-large.admonition-title,.rst-content .code-block-caption .btn .fa-large.headerlink,.rst-content .code-block-caption .nav .fa-large.headerlink,.rst-content .eqno .btn .fa-large.headerlink,.rst-content .eqno .nav .fa-large.headerlink,.rst-content .nav .fa-large.admonition-title,.rst-content code.download .btn span.fa-large:first-child,.rst-content code.download .nav span.fa-large:first-child,.rst-content dl dt .btn .fa-large.headerlink,.rst-content dl dt .nav .fa-large.headerlink,.rst-content h1 .btn .fa-large.headerlink,.rst-content h1 .nav .fa-large.headerlink,.rst-content h2 .btn .fa-large.headerlink,.rst-content h2 .nav .fa-large.headerlink,.rst-content h3 .btn .fa-large.headerlink,.rst-content h3 .nav .fa-large.headerlink,.rst-content h4 .btn .fa-large.headerlink,.rst-content h4 .nav .fa-large.headerlink,.rst-content h5 .btn .fa-large.headerlink,.rst-content h5 .nav .fa-large.headerlink,.rst-content h6 .btn .fa-large.headerlink,.rst-content h6 .nav .fa-large.headerlink,.rst-content p .btn .fa-large.headerlink,.rst-content p .nav .fa-large.headerlink,.rst-content table>caption .btn .fa-large.headerlink,.rst-content table>caption .nav .fa-large.headerlink,.rst-content tt.download .btn 
span.fa-large:first-child,.rst-content tt.download .nav span.fa-large:first-child,.wy-menu-vertical li .btn button.fa-large.toctree-expand,.wy-menu-vertical li .nav button.fa-large.toctree-expand{line-height:.9em}.btn .fa-spin.icon,.btn .fa.fa-spin,.btn .rst-content .code-block-caption .fa-spin.headerlink,.btn .rst-content .eqno .fa-spin.headerlink,.btn .rst-content .fa-spin.admonition-title,.btn .rst-content code.download span.fa-spin:first-child,.btn .rst-content dl dt .fa-spin.headerlink,.btn .rst-content h1 .fa-spin.headerlink,.btn .rst-content h2 .fa-spin.headerlink,.btn .rst-content h3 .fa-spin.headerlink,.btn .rst-content h4 .fa-spin.headerlink,.btn .rst-content h5 .fa-spin.headerlink,.btn .rst-content h6 .fa-spin.headerlink,.btn .rst-content p .fa-spin.headerlink,.btn .rst-content table>caption .fa-spin.headerlink,.btn .rst-content tt.download span.fa-spin:first-child,.btn .wy-menu-vertical li button.fa-spin.toctree-expand,.nav .fa-spin.icon,.nav .fa.fa-spin,.nav .rst-content .code-block-caption .fa-spin.headerlink,.nav .rst-content .eqno .fa-spin.headerlink,.nav .rst-content .fa-spin.admonition-title,.nav .rst-content code.download span.fa-spin:first-child,.nav .rst-content dl dt .fa-spin.headerlink,.nav .rst-content h1 .fa-spin.headerlink,.nav .rst-content h2 .fa-spin.headerlink,.nav .rst-content h3 .fa-spin.headerlink,.nav .rst-content h4 .fa-spin.headerlink,.nav .rst-content h5 .fa-spin.headerlink,.nav .rst-content h6 .fa-spin.headerlink,.nav .rst-content p .fa-spin.headerlink,.nav .rst-content table>caption .fa-spin.headerlink,.nav .rst-content tt.download span.fa-spin:first-child,.nav .wy-menu-vertical li button.fa-spin.toctree-expand,.rst-content .btn .fa-spin.admonition-title,.rst-content .code-block-caption .btn .fa-spin.headerlink,.rst-content .code-block-caption .nav .fa-spin.headerlink,.rst-content .eqno .btn .fa-spin.headerlink,.rst-content .eqno .nav .fa-spin.headerlink,.rst-content .nav .fa-spin.admonition-title,.rst-content code.download 
.btn span.fa-spin:first-child,.rst-content code.download .nav span.fa-spin:first-child,.rst-content dl dt .btn .fa-spin.headerlink,.rst-content dl dt .nav .fa-spin.headerlink,.rst-content h1 .btn .fa-spin.headerlink,.rst-content h1 .nav .fa-spin.headerlink,.rst-content h2 .btn .fa-spin.headerlink,.rst-content h2 .nav .fa-spin.headerlink,.rst-content h3 .btn .fa-spin.headerlink,.rst-content h3 .nav .fa-spin.headerlink,.rst-content h4 .btn .fa-spin.headerlink,.rst-content h4 .nav .fa-spin.headerlink,.rst-content h5 .btn .fa-spin.headerlink,.rst-content h5 .nav .fa-spin.headerlink,.rst-content h6 .btn .fa-spin.headerlink,.rst-content h6 .nav .fa-spin.headerlink,.rst-content p .btn .fa-spin.headerlink,.rst-content p .nav .fa-spin.headerlink,.rst-content table>caption .btn .fa-spin.headerlink,.rst-content table>caption .nav .fa-spin.headerlink,.rst-content tt.download .btn span.fa-spin:first-child,.rst-content tt.download .nav span.fa-spin:first-child,.wy-menu-vertical li .btn button.fa-spin.toctree-expand,.wy-menu-vertical li .nav button.fa-spin.toctree-expand{display:inline-block}.btn.fa:before,.btn.icon:before,.rst-content .btn.admonition-title:before,.rst-content .code-block-caption .btn.headerlink:before,.rst-content .eqno .btn.headerlink:before,.rst-content code.download span.btn:first-child:before,.rst-content dl dt .btn.headerlink:before,.rst-content h1 .btn.headerlink:before,.rst-content h2 .btn.headerlink:before,.rst-content h3 .btn.headerlink:before,.rst-content h4 .btn.headerlink:before,.rst-content h5 .btn.headerlink:before,.rst-content h6 .btn.headerlink:before,.rst-content p .btn.headerlink:before,.rst-content table>caption .btn.headerlink:before,.rst-content tt.download span.btn:first-child:before,.wy-menu-vertical li button.btn.toctree-expand:before{opacity:.5;-webkit-transition:opacity .05s ease-in;-moz-transition:opacity .05s ease-in;transition:opacity .05s ease-in}.btn.fa:hover:before,.btn.icon:hover:before,.rst-content 
.btn.admonition-title:hover:before,.rst-content .code-block-caption .btn.headerlink:hover:before,.rst-content .eqno .btn.headerlink:hover:before,.rst-content code.download span.btn:first-child:hover:before,.rst-content dl dt .btn.headerlink:hover:before,.rst-content h1 .btn.headerlink:hover:before,.rst-content h2 .btn.headerlink:hover:before,.rst-content h3 .btn.headerlink:hover:before,.rst-content h4 .btn.headerlink:hover:before,.rst-content h5 .btn.headerlink:hover:before,.rst-content h6 .btn.headerlink:hover:before,.rst-content p .btn.headerlink:hover:before,.rst-content table>caption .btn.headerlink:hover:before,.rst-content tt.download span.btn:first-child:hover:before,.wy-menu-vertical li button.btn.toctree-expand:hover:before{opacity:1}.btn-mini .fa:before,.btn-mini .icon:before,.btn-mini .rst-content .admonition-title:before,.btn-mini .rst-content .code-block-caption .headerlink:before,.btn-mini .rst-content .eqno .headerlink:before,.btn-mini .rst-content code.download span:first-child:before,.btn-mini .rst-content dl dt .headerlink:before,.btn-mini .rst-content h1 .headerlink:before,.btn-mini .rst-content h2 .headerlink:before,.btn-mini .rst-content h3 .headerlink:before,.btn-mini .rst-content h4 .headerlink:before,.btn-mini .rst-content h5 .headerlink:before,.btn-mini .rst-content h6 .headerlink:before,.btn-mini .rst-content p .headerlink:before,.btn-mini .rst-content table>caption .headerlink:before,.btn-mini .rst-content tt.download span:first-child:before,.btn-mini .wy-menu-vertical li button.toctree-expand:before,.rst-content .btn-mini .admonition-title:before,.rst-content .code-block-caption .btn-mini .headerlink:before,.rst-content .eqno .btn-mini .headerlink:before,.rst-content code.download .btn-mini span:first-child:before,.rst-content dl dt .btn-mini .headerlink:before,.rst-content h1 .btn-mini .headerlink:before,.rst-content h2 .btn-mini .headerlink:before,.rst-content h3 .btn-mini .headerlink:before,.rst-content h4 .btn-mini 
.headerlink:before,.rst-content h5 .btn-mini .headerlink:before,.rst-content h6 .btn-mini .headerlink:before,.rst-content p .btn-mini .headerlink:before,.rst-content table>caption .btn-mini .headerlink:before,.rst-content tt.download .btn-mini span:first-child:before,.wy-menu-vertical li .btn-mini button.toctree-expand:before{font-size:14px;vertical-align:-15%}.rst-content .admonition,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.wy-alert{padding:12px;line-height:24px;margin-bottom:24px;background:#e7f2fa}.rst-content .admonition-title,.wy-alert-title{font-weight:700;display:block;color:#fff;background:#6ab0de;padding:6px 12px;margin:-12px -12px 12px}.rst-content .danger,.rst-content .error,.rst-content .wy-alert-danger.admonition,.rst-content .wy-alert-danger.admonition-todo,.rst-content .wy-alert-danger.attention,.rst-content .wy-alert-danger.caution,.rst-content .wy-alert-danger.hint,.rst-content .wy-alert-danger.important,.rst-content .wy-alert-danger.note,.rst-content .wy-alert-danger.seealso,.rst-content .wy-alert-danger.tip,.rst-content .wy-alert-danger.warning,.wy-alert.wy-alert-danger{background:#fdf3f2}.rst-content .danger .admonition-title,.rst-content .danger .wy-alert-title,.rst-content .error .admonition-title,.rst-content .error .wy-alert-title,.rst-content .wy-alert-danger.admonition-todo .admonition-title,.rst-content .wy-alert-danger.admonition-todo .wy-alert-title,.rst-content .wy-alert-danger.admonition .admonition-title,.rst-content .wy-alert-danger.admonition .wy-alert-title,.rst-content .wy-alert-danger.attention .admonition-title,.rst-content .wy-alert-danger.attention .wy-alert-title,.rst-content .wy-alert-danger.caution .admonition-title,.rst-content .wy-alert-danger.caution .wy-alert-title,.rst-content .wy-alert-danger.hint 
.admonition-title,.rst-content .wy-alert-danger.hint .wy-alert-title,.rst-content .wy-alert-danger.important .admonition-title,.rst-content .wy-alert-danger.important .wy-alert-title,.rst-content .wy-alert-danger.note .admonition-title,.rst-content .wy-alert-danger.note .wy-alert-title,.rst-content .wy-alert-danger.seealso .admonition-title,.rst-content .wy-alert-danger.seealso .wy-alert-title,.rst-content .wy-alert-danger.tip .admonition-title,.rst-content .wy-alert-danger.tip .wy-alert-title,.rst-content .wy-alert-danger.warning .admonition-title,.rst-content .wy-alert-danger.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-danger .admonition-title,.wy-alert.wy-alert-danger .rst-content .admonition-title,.wy-alert.wy-alert-danger .wy-alert-title{background:#f29f97}.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .warning,.rst-content .wy-alert-warning.admonition,.rst-content .wy-alert-warning.danger,.rst-content .wy-alert-warning.error,.rst-content .wy-alert-warning.hint,.rst-content .wy-alert-warning.important,.rst-content .wy-alert-warning.note,.rst-content .wy-alert-warning.seealso,.rst-content .wy-alert-warning.tip,.wy-alert.wy-alert-warning{background:#ffedcc}.rst-content .admonition-todo .admonition-title,.rst-content .admonition-todo .wy-alert-title,.rst-content .attention .admonition-title,.rst-content .attention .wy-alert-title,.rst-content .caution .admonition-title,.rst-content .caution .wy-alert-title,.rst-content .warning .admonition-title,.rst-content .warning .wy-alert-title,.rst-content .wy-alert-warning.admonition .admonition-title,.rst-content .wy-alert-warning.admonition .wy-alert-title,.rst-content .wy-alert-warning.danger .admonition-title,.rst-content .wy-alert-warning.danger .wy-alert-title,.rst-content .wy-alert-warning.error .admonition-title,.rst-content .wy-alert-warning.error .wy-alert-title,.rst-content .wy-alert-warning.hint .admonition-title,.rst-content .wy-alert-warning.hint 
.wy-alert-title,.rst-content .wy-alert-warning.important .admonition-title,.rst-content .wy-alert-warning.important .wy-alert-title,.rst-content .wy-alert-warning.note .admonition-title,.rst-content .wy-alert-warning.note .wy-alert-title,.rst-content .wy-alert-warning.seealso .admonition-title,.rst-content .wy-alert-warning.seealso .wy-alert-title,.rst-content .wy-alert-warning.tip .admonition-title,.rst-content .wy-alert-warning.tip .wy-alert-title,.rst-content .wy-alert.wy-alert-warning .admonition-title,.wy-alert.wy-alert-warning .rst-content .admonition-title,.wy-alert.wy-alert-warning .wy-alert-title{background:#f0b37e}.rst-content .note,.rst-content .seealso,.rst-content .wy-alert-info.admonition,.rst-content .wy-alert-info.admonition-todo,.rst-content .wy-alert-info.attention,.rst-content .wy-alert-info.caution,.rst-content .wy-alert-info.danger,.rst-content .wy-alert-info.error,.rst-content .wy-alert-info.hint,.rst-content .wy-alert-info.important,.rst-content .wy-alert-info.tip,.rst-content .wy-alert-info.warning,.wy-alert.wy-alert-info{background:#e7f2fa}.rst-content .note .admonition-title,.rst-content .note .wy-alert-title,.rst-content .seealso .admonition-title,.rst-content .seealso .wy-alert-title,.rst-content .wy-alert-info.admonition-todo .admonition-title,.rst-content .wy-alert-info.admonition-todo .wy-alert-title,.rst-content .wy-alert-info.admonition .admonition-title,.rst-content .wy-alert-info.admonition .wy-alert-title,.rst-content .wy-alert-info.attention .admonition-title,.rst-content .wy-alert-info.attention .wy-alert-title,.rst-content .wy-alert-info.caution .admonition-title,.rst-content .wy-alert-info.caution .wy-alert-title,.rst-content .wy-alert-info.danger .admonition-title,.rst-content .wy-alert-info.danger .wy-alert-title,.rst-content .wy-alert-info.error .admonition-title,.rst-content .wy-alert-info.error .wy-alert-title,.rst-content .wy-alert-info.hint .admonition-title,.rst-content .wy-alert-info.hint .wy-alert-title,.rst-content 
.wy-alert-info.important .admonition-title,.rst-content .wy-alert-info.important .wy-alert-title,.rst-content .wy-alert-info.tip .admonition-title,.rst-content .wy-alert-info.tip .wy-alert-title,.rst-content .wy-alert-info.warning .admonition-title,.rst-content .wy-alert-info.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-info .admonition-title,.wy-alert.wy-alert-info .rst-content .admonition-title,.wy-alert.wy-alert-info .wy-alert-title{background:#6ab0de}.rst-content .hint,.rst-content .important,.rst-content .tip,.rst-content .wy-alert-success.admonition,.rst-content .wy-alert-success.admonition-todo,.rst-content .wy-alert-success.attention,.rst-content .wy-alert-success.caution,.rst-content .wy-alert-success.danger,.rst-content .wy-alert-success.error,.rst-content .wy-alert-success.note,.rst-content .wy-alert-success.seealso,.rst-content .wy-alert-success.warning,.wy-alert.wy-alert-success{background:#dbfaf4}.rst-content .hint .admonition-title,.rst-content .hint .wy-alert-title,.rst-content .important .admonition-title,.rst-content .important .wy-alert-title,.rst-content .tip .admonition-title,.rst-content .tip .wy-alert-title,.rst-content .wy-alert-success.admonition-todo .admonition-title,.rst-content .wy-alert-success.admonition-todo .wy-alert-title,.rst-content .wy-alert-success.admonition .admonition-title,.rst-content .wy-alert-success.admonition .wy-alert-title,.rst-content .wy-alert-success.attention .admonition-title,.rst-content .wy-alert-success.attention .wy-alert-title,.rst-content .wy-alert-success.caution .admonition-title,.rst-content .wy-alert-success.caution .wy-alert-title,.rst-content .wy-alert-success.danger .admonition-title,.rst-content .wy-alert-success.danger .wy-alert-title,.rst-content .wy-alert-success.error .admonition-title,.rst-content .wy-alert-success.error .wy-alert-title,.rst-content .wy-alert-success.note .admonition-title,.rst-content .wy-alert-success.note .wy-alert-title,.rst-content .wy-alert-success.seealso 
.admonition-title,.rst-content .wy-alert-success.seealso .wy-alert-title,.rst-content .wy-alert-success.warning .admonition-title,.rst-content .wy-alert-success.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-success .admonition-title,.wy-alert.wy-alert-success .rst-content .admonition-title,.wy-alert.wy-alert-success .wy-alert-title{background:#1abc9c}.rst-content .wy-alert-neutral.admonition,.rst-content .wy-alert-neutral.admonition-todo,.rst-content .wy-alert-neutral.attention,.rst-content .wy-alert-neutral.caution,.rst-content .wy-alert-neutral.danger,.rst-content .wy-alert-neutral.error,.rst-content .wy-alert-neutral.hint,.rst-content .wy-alert-neutral.important,.rst-content .wy-alert-neutral.note,.rst-content .wy-alert-neutral.seealso,.rst-content .wy-alert-neutral.tip,.rst-content .wy-alert-neutral.warning,.wy-alert.wy-alert-neutral{background:#f3f6f6}.rst-content .wy-alert-neutral.admonition-todo .admonition-title,.rst-content .wy-alert-neutral.admonition-todo .wy-alert-title,.rst-content .wy-alert-neutral.admonition .admonition-title,.rst-content .wy-alert-neutral.admonition .wy-alert-title,.rst-content .wy-alert-neutral.attention .admonition-title,.rst-content .wy-alert-neutral.attention .wy-alert-title,.rst-content .wy-alert-neutral.caution .admonition-title,.rst-content .wy-alert-neutral.caution .wy-alert-title,.rst-content .wy-alert-neutral.danger .admonition-title,.rst-content .wy-alert-neutral.danger .wy-alert-title,.rst-content .wy-alert-neutral.error .admonition-title,.rst-content .wy-alert-neutral.error .wy-alert-title,.rst-content .wy-alert-neutral.hint .admonition-title,.rst-content .wy-alert-neutral.hint .wy-alert-title,.rst-content .wy-alert-neutral.important .admonition-title,.rst-content .wy-alert-neutral.important .wy-alert-title,.rst-content .wy-alert-neutral.note .admonition-title,.rst-content .wy-alert-neutral.note .wy-alert-title,.rst-content .wy-alert-neutral.seealso .admonition-title,.rst-content .wy-alert-neutral.seealso 
.wy-alert-title,.rst-content .wy-alert-neutral.tip .admonition-title,.rst-content .wy-alert-neutral.tip .wy-alert-title,.rst-content .wy-alert-neutral.warning .admonition-title,.rst-content .wy-alert-neutral.warning .wy-alert-title,.rst-content .wy-alert.wy-alert-neutral .admonition-title,.wy-alert.wy-alert-neutral .rst-content .admonition-title,.wy-alert.wy-alert-neutral .wy-alert-title{color:#404040;background:#e1e4e5}.rst-content .wy-alert-neutral.admonition-todo a,.rst-content .wy-alert-neutral.admonition a,.rst-content .wy-alert-neutral.attention a,.rst-content .wy-alert-neutral.caution a,.rst-content .wy-alert-neutral.danger a,.rst-content .wy-alert-neutral.error a,.rst-content .wy-alert-neutral.hint a,.rst-content .wy-alert-neutral.important a,.rst-content .wy-alert-neutral.note a,.rst-content .wy-alert-neutral.seealso a,.rst-content .wy-alert-neutral.tip a,.rst-content .wy-alert-neutral.warning a,.wy-alert.wy-alert-neutral a{color:#2980b9}.rst-content .admonition-todo p:last-child,.rst-content .admonition p:last-child,.rst-content .attention p:last-child,.rst-content .caution p:last-child,.rst-content .danger p:last-child,.rst-content .error p:last-child,.rst-content .hint p:last-child,.rst-content .important p:last-child,.rst-content .note p:last-child,.rst-content .seealso p:last-child,.rst-content .tip p:last-child,.rst-content .warning p:last-child,.wy-alert p:last-child{margin-bottom:0}.wy-tray-container{position:fixed;bottom:0;left:0;z-index:600}.wy-tray-container li{display:block;width:300px;background:transparent;color:#fff;text-align:center;box-shadow:0 5px 5px 0 rgba(0,0,0,.1);padding:0 24px;min-width:20%;opacity:0;height:0;line-height:56px;overflow:hidden;-webkit-transition:all .3s ease-in;-moz-transition:all .3s ease-in;transition:all .3s ease-in}.wy-tray-container li.wy-tray-item-success{background:#27ae60}.wy-tray-container li.wy-tray-item-info{background:#2980b9}.wy-tray-container li.wy-tray-item-warning{background:#e67e22}.wy-tray-container 
li.wy-tray-item-danger{background:#e74c3c}.wy-tray-container li.on{opacity:1;height:56px}@media screen and (max-width:768px){.wy-tray-container{bottom:auto;top:0;width:100%}.wy-tray-container li{width:100%}}button{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle;cursor:pointer;line-height:normal;-webkit-appearance:button;*overflow:visible}button::-moz-focus-inner,input::-moz-focus-inner{border:0;padding:0}button[disabled]{cursor:default}.btn{display:inline-block;border-radius:2px;line-height:normal;white-space:nowrap;text-align:center;cursor:pointer;font-size:100%;padding:6px 12px 8px;color:#fff;border:1px solid rgba(0,0,0,.1);background-color:#27ae60;text-decoration:none;font-weight:400;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;box-shadow:inset 0 1px 2px -1px hsla(0,0%,100%,.5),inset 0 -2px 0 0 rgba(0,0,0,.1);outline-none:false;vertical-align:middle;*display:inline;zoom:1;-webkit-user-drag:none;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;user-select:none;-webkit-transition:all .1s linear;-moz-transition:all .1s linear;transition:all .1s linear}.btn-hover{background:#2e8ece;color:#fff}.btn:hover{background:#2cc36b;color:#fff}.btn:focus{background:#2cc36b;outline:0}.btn:active{box-shadow:inset 0 -1px 0 0 rgba(0,0,0,.05),inset 0 2px 0 0 rgba(0,0,0,.1);padding:8px 12px 6px}.btn:visited{color:#fff}.btn-disabled,.btn-disabled:active,.btn-disabled:focus,.btn-disabled:hover,.btn:disabled{background-image:none;filter:progid:DXImageTransform.Microsoft.gradient(enabled = 
false);filter:alpha(opacity=40);opacity:.4;cursor:not-allowed;box-shadow:none}.btn::-moz-focus-inner{padding:0;border:0}.btn-small{font-size:80%}.btn-info{background-color:#2980b9!important}.btn-info:hover{background-color:#2e8ece!important}.btn-neutral{background-color:#f3f6f6!important;color:#404040!important}.btn-neutral:hover{background-color:#e5ebeb!important;color:#404040}.btn-neutral:visited{color:#404040!important}.btn-success{background-color:#27ae60!important}.btn-success:hover{background-color:#295!important}.btn-danger{background-color:#e74c3c!important}.btn-danger:hover{background-color:#ea6153!important}.btn-warning{background-color:#e67e22!important}.btn-warning:hover{background-color:#e98b39!important}.btn-invert{background-color:#222}.btn-invert:hover{background-color:#2f2f2f!important}.btn-link{background-color:transparent!important;color:#2980b9;box-shadow:none;border-color:transparent!important}.btn-link:active,.btn-link:hover{background-color:transparent!important;color:#409ad5!important;box-shadow:none}.btn-link:visited{color:#9b59b6}.wy-btn-group .btn,.wy-control .btn{vertical-align:middle}.wy-btn-group{margin-bottom:24px;*zoom:1}.wy-btn-group:after,.wy-btn-group:before{display:table;content:""}.wy-btn-group:after{clear:both}.wy-dropdown{position:relative;display:inline-block}.wy-dropdown-active .wy-dropdown-menu{display:block}.wy-dropdown-menu{position:absolute;left:0;display:none;float:left;top:100%;min-width:100%;background:#fcfcfc;z-index:100;border:1px solid #cfd7dd;box-shadow:0 2px 2px 0 rgba(0,0,0,.1);padding:12px}.wy-dropdown-menu>dd>a{display:block;clear:both;color:#404040;white-space:nowrap;font-size:90%;padding:0 12px;cursor:pointer}.wy-dropdown-menu>dd>a:hover{background:#2980b9;color:#fff}.wy-dropdown-menu>dd.divider{border-top:1px solid #cfd7dd;margin:6px 0}.wy-dropdown-menu>dd.search{padding-bottom:12px}.wy-dropdown-menu>dd.search 
input[type=search]{width:100%}.wy-dropdown-menu>dd.call-to-action{background:#e3e3e3;text-transform:uppercase;font-weight:500;font-size:80%}.wy-dropdown-menu>dd.call-to-action:hover{background:#e3e3e3}.wy-dropdown-menu>dd.call-to-action .btn{color:#fff}.wy-dropdown.wy-dropdown-up .wy-dropdown-menu{bottom:100%;top:auto;left:auto;right:0}.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu{background:#fcfcfc;margin-top:2px}.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu a{padding:6px 12px}.wy-dropdown.wy-dropdown-bubble .wy-dropdown-menu a:hover{background:#2980b9;color:#fff}.wy-dropdown.wy-dropdown-left .wy-dropdown-menu{right:0;left:auto;text-align:right}.wy-dropdown-arrow:before{content:" ";border-bottom:5px solid #f5f5f5;border-left:5px solid transparent;border-right:5px solid transparent;position:absolute;display:block;top:-4px;left:50%;margin-left:-3px}.wy-dropdown-arrow.wy-dropdown-arrow-left:before{left:11px}.wy-form-stacked select{display:block}.wy-form-aligned .wy-help-inline,.wy-form-aligned input,.wy-form-aligned label,.wy-form-aligned select,.wy-form-aligned textarea{display:inline-block;*display:inline;*zoom:1;vertical-align:middle}.wy-form-aligned .wy-control-group>label{display:inline-block;vertical-align:middle;width:10em;margin:6px 12px 0 0;float:left}.wy-form-aligned .wy-control{float:left}.wy-form-aligned .wy-control label{display:block}.wy-form-aligned .wy-control select{margin-top:6px}fieldset{margin:0}fieldset,legend{border:0;padding:0}legend{width:100%;white-space:normal;margin-bottom:24px;font-size:150%;*margin-left:-7px}label,legend{display:block}label{margin:0 0 
.3125em;color:#333;font-size:90%}input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}.wy-control-group{margin-bottom:24px;max-width:1200px;margin-left:auto;margin-right:auto;*zoom:1}.wy-control-group:after,.wy-control-group:before{display:table;content:""}.wy-control-group:after{clear:both}.wy-control-group.wy-control-group-required>label:after{content:" *";color:#e74c3c}.wy-control-group .wy-form-full,.wy-control-group .wy-form-halves,.wy-control-group .wy-form-thirds{padding-bottom:12px}.wy-control-group .wy-form-full input[type=color],.wy-control-group .wy-form-full input[type=date],.wy-control-group .wy-form-full input[type=datetime-local],.wy-control-group .wy-form-full input[type=datetime],.wy-control-group .wy-form-full input[type=email],.wy-control-group .wy-form-full input[type=month],.wy-control-group .wy-form-full input[type=number],.wy-control-group .wy-form-full input[type=password],.wy-control-group .wy-form-full input[type=search],.wy-control-group .wy-form-full input[type=tel],.wy-control-group .wy-form-full input[type=text],.wy-control-group .wy-form-full input[type=time],.wy-control-group .wy-form-full input[type=url],.wy-control-group .wy-form-full input[type=week],.wy-control-group .wy-form-full select,.wy-control-group .wy-form-halves input[type=color],.wy-control-group .wy-form-halves input[type=date],.wy-control-group .wy-form-halves input[type=datetime-local],.wy-control-group .wy-form-halves input[type=datetime],.wy-control-group .wy-form-halves input[type=email],.wy-control-group .wy-form-halves input[type=month],.wy-control-group .wy-form-halves input[type=number],.wy-control-group .wy-form-halves input[type=password],.wy-control-group .wy-form-halves input[type=search],.wy-control-group .wy-form-halves input[type=tel],.wy-control-group .wy-form-halves input[type=text],.wy-control-group .wy-form-halves input[type=time],.wy-control-group .wy-form-halves input[type=url],.wy-control-group 
.wy-form-halves input[type=week],.wy-control-group .wy-form-halves select,.wy-control-group .wy-form-thirds input[type=color],.wy-control-group .wy-form-thirds input[type=date],.wy-control-group .wy-form-thirds input[type=datetime-local],.wy-control-group .wy-form-thirds input[type=datetime],.wy-control-group .wy-form-thirds input[type=email],.wy-control-group .wy-form-thirds input[type=month],.wy-control-group .wy-form-thirds input[type=number],.wy-control-group .wy-form-thirds input[type=password],.wy-control-group .wy-form-thirds input[type=search],.wy-control-group .wy-form-thirds input[type=tel],.wy-control-group .wy-form-thirds input[type=text],.wy-control-group .wy-form-thirds input[type=time],.wy-control-group .wy-form-thirds input[type=url],.wy-control-group .wy-form-thirds input[type=week],.wy-control-group .wy-form-thirds select{width:100%}.wy-control-group .wy-form-full{float:left;display:block;width:100%;margin-right:0}.wy-control-group .wy-form-full:last-child{margin-right:0}.wy-control-group .wy-form-halves{float:left;display:block;margin-right:2.35765%;width:48.82117%}.wy-control-group .wy-form-halves:last-child,.wy-control-group .wy-form-halves:nth-of-type(2n){margin-right:0}.wy-control-group .wy-form-halves:nth-of-type(odd){clear:left}.wy-control-group .wy-form-thirds{float:left;display:block;margin-right:2.35765%;width:31.76157%}.wy-control-group .wy-form-thirds:last-child,.wy-control-group .wy-form-thirds:nth-of-type(3n){margin-right:0}.wy-control-group .wy-form-thirds:nth-of-type(3n+1){clear:left}.wy-control-group.wy-control-group-no-input .wy-control,.wy-control-no-input{margin:6px 0 0;font-size:90%}.wy-control-no-input{display:inline-block}.wy-control-group.fluid-input input[type=color],.wy-control-group.fluid-input input[type=date],.wy-control-group.fluid-input input[type=datetime-local],.wy-control-group.fluid-input input[type=datetime],.wy-control-group.fluid-input input[type=email],.wy-control-group.fluid-input 
input[type=month],.wy-control-group.fluid-input input[type=number],.wy-control-group.fluid-input input[type=password],.wy-control-group.fluid-input input[type=search],.wy-control-group.fluid-input input[type=tel],.wy-control-group.fluid-input input[type=text],.wy-control-group.fluid-input input[type=time],.wy-control-group.fluid-input input[type=url],.wy-control-group.fluid-input input[type=week]{width:100%}.wy-form-message-inline{padding-left:.3em;color:#666;font-size:90%}.wy-form-message{display:block;color:#999;font-size:70%;margin-top:.3125em;font-style:italic}.wy-form-message p{font-size:inherit;font-style:italic;margin-bottom:6px}.wy-form-message p:last-child{margin-bottom:0}input{line-height:normal}input[type=button],input[type=reset],input[type=submit]{-webkit-appearance:button;cursor:pointer;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;*overflow:visible}input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week]{-webkit-appearance:none;padding:6px;display:inline-block;border:1px solid #ccc;font-size:80%;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;box-shadow:inset 0 1px 3px #ddd;border-radius:0;-webkit-transition:border .3s linear;-moz-transition:border .3s linear;transition:border .3s linear}input[type=datetime-local]{padding:.34375em 
.625em}input[disabled]{cursor:default}input[type=checkbox],input[type=radio]{padding:0;margin-right:.3125em;*height:13px;*width:13px}input[type=checkbox],input[type=radio],input[type=search]{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}input[type=search]::-webkit-search-cancel-button,input[type=search]::-webkit-search-decoration{-webkit-appearance:none}input[type=color]:focus,input[type=date]:focus,input[type=datetime-local]:focus,input[type=datetime]:focus,input[type=email]:focus,input[type=month]:focus,input[type=number]:focus,input[type=password]:focus,input[type=search]:focus,input[type=tel]:focus,input[type=text]:focus,input[type=time]:focus,input[type=url]:focus,input[type=week]:focus{outline:0;outline:thin dotted\9;border-color:#333}input.no-focus:focus{border-color:#ccc!important}input[type=checkbox]:focus,input[type=file]:focus,input[type=radio]:focus{outline:thin dotted #333;outline:1px auto #129fea}input[type=color][disabled],input[type=date][disabled],input[type=datetime-local][disabled],input[type=datetime][disabled],input[type=email][disabled],input[type=month][disabled],input[type=number][disabled],input[type=password][disabled],input[type=search][disabled],input[type=tel][disabled],input[type=text][disabled],input[type=time][disabled],input[type=url][disabled],input[type=week][disabled]{cursor:not-allowed;background-color:#fafafa}input:focus:invalid,select:focus:invalid,textarea:focus:invalid{color:#e74c3c;border:1px solid #e74c3c}input:focus:invalid:focus,select:focus:invalid:focus,textarea:focus:invalid:focus{border-color:#e74c3c}input[type=checkbox]:focus:invalid:focus,input[type=file]:focus:invalid:focus,input[type=radio]:focus:invalid:focus{outline-color:#e74c3c}input.wy-input-large{padding:12px;font-size:100%}textarea{overflow:auto;vertical-align:top;width:100%;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif}select,textarea{padding:.5em .625em;display:inline-block;border:1px solid 
#ccc;font-size:80%;box-shadow:inset 0 1px 3px #ddd;-webkit-transition:border .3s linear;-moz-transition:border .3s linear;transition:border .3s linear}select{border:1px solid #ccc;background-color:#fff}select[multiple]{height:auto}select:focus,textarea:focus{outline:0}input[readonly],select[disabled],select[readonly],textarea[disabled],textarea[readonly]{cursor:not-allowed;background-color:#fafafa}input[type=checkbox][disabled],input[type=radio][disabled]{cursor:not-allowed}.wy-checkbox,.wy-radio{margin:6px 0;color:#404040;display:block}.wy-checkbox input,.wy-radio input{vertical-align:baseline}.wy-form-message-inline{display:inline-block;*display:inline;*zoom:1;vertical-align:middle}.wy-input-prefix,.wy-input-suffix{white-space:nowrap;padding:6px}.wy-input-prefix .wy-input-context,.wy-input-suffix .wy-input-context{line-height:27px;padding:0 8px;display:inline-block;font-size:80%;background-color:#f3f6f6;border:1px solid #ccc;color:#999}.wy-input-suffix .wy-input-context{border-left:0}.wy-input-prefix .wy-input-context{border-right:0}.wy-switch{position:relative;display:block;height:24px;margin-top:12px;cursor:pointer}.wy-switch:before{left:0;top:0;width:36px;height:12px;background:#ccc}.wy-switch:after,.wy-switch:before{position:absolute;content:"";display:block;border-radius:4px;-webkit-transition:all .2s ease-in-out;-moz-transition:all .2s ease-in-out;transition:all .2s ease-in-out}.wy-switch:after{width:18px;height:18px;background:#999;left:-3px;top:-3px}.wy-switch span{position:absolute;left:48px;display:block;font-size:12px;color:#ccc;line-height:1}.wy-switch.active:before{background:#1e8449}.wy-switch.active:after{left:24px;background:#27ae60}.wy-switch.disabled{cursor:not-allowed;opacity:.8}.wy-control-group.wy-control-group-error .wy-form-message,.wy-control-group.wy-control-group-error>label{color:#e74c3c}.wy-control-group.wy-control-group-error input[type=color],.wy-control-group.wy-control-group-error 
input[type=date],.wy-control-group.wy-control-group-error input[type=datetime-local],.wy-control-group.wy-control-group-error input[type=datetime],.wy-control-group.wy-control-group-error input[type=email],.wy-control-group.wy-control-group-error input[type=month],.wy-control-group.wy-control-group-error input[type=number],.wy-control-group.wy-control-group-error input[type=password],.wy-control-group.wy-control-group-error input[type=search],.wy-control-group.wy-control-group-error input[type=tel],.wy-control-group.wy-control-group-error input[type=text],.wy-control-group.wy-control-group-error input[type=time],.wy-control-group.wy-control-group-error input[type=url],.wy-control-group.wy-control-group-error input[type=week],.wy-control-group.wy-control-group-error textarea{border:1px solid #e74c3c}.wy-inline-validate{white-space:nowrap}.wy-inline-validate .wy-input-context{padding:.5em .625em;display:inline-block;font-size:80%}.wy-inline-validate.wy-inline-validate-success .wy-input-context{color:#27ae60}.wy-inline-validate.wy-inline-validate-danger .wy-input-context{color:#e74c3c}.wy-inline-validate.wy-inline-validate-warning .wy-input-context{color:#e67e22}.wy-inline-validate.wy-inline-validate-info .wy-input-context{color:#2980b9}.rotate-90{-webkit-transform:rotate(90deg);-moz-transform:rotate(90deg);-ms-transform:rotate(90deg);-o-transform:rotate(90deg);transform:rotate(90deg)}.rotate-180{-webkit-transform:rotate(180deg);-moz-transform:rotate(180deg);-ms-transform:rotate(180deg);-o-transform:rotate(180deg);transform:rotate(180deg)}.rotate-270{-webkit-transform:rotate(270deg);-moz-transform:rotate(270deg);-ms-transform:rotate(270deg);-o-transform:rotate(270deg);transform:rotate(270deg)}.mirror{-webkit-transform:scaleX(-1);-moz-transform:scaleX(-1);-ms-transform:scaleX(-1);-o-transform:scaleX(-1);transform:scaleX(-1)}.mirror.rotate-90{-webkit-transform:scaleX(-1) rotate(90deg);-moz-transform:scaleX(-1) rotate(90deg);-ms-transform:scaleX(-1) 
rotate(90deg);-o-transform:scaleX(-1) rotate(90deg);transform:scaleX(-1) rotate(90deg)}.mirror.rotate-180{-webkit-transform:scaleX(-1) rotate(180deg);-moz-transform:scaleX(-1) rotate(180deg);-ms-transform:scaleX(-1) rotate(180deg);-o-transform:scaleX(-1) rotate(180deg);transform:scaleX(-1) rotate(180deg)}.mirror.rotate-270{-webkit-transform:scaleX(-1) rotate(270deg);-moz-transform:scaleX(-1) rotate(270deg);-ms-transform:scaleX(-1) rotate(270deg);-o-transform:scaleX(-1) rotate(270deg);transform:scaleX(-1) rotate(270deg)}@media only screen and (max-width:480px){.wy-form button[type=submit]{margin:.7em 0 0}.wy-form input[type=color],.wy-form input[type=date],.wy-form input[type=datetime-local],.wy-form input[type=datetime],.wy-form input[type=email],.wy-form input[type=month],.wy-form input[type=number],.wy-form input[type=password],.wy-form input[type=search],.wy-form input[type=tel],.wy-form input[type=text],.wy-form input[type=time],.wy-form input[type=url],.wy-form input[type=week],.wy-form label{margin-bottom:.3em;display:block}.wy-form input[type=color],.wy-form input[type=date],.wy-form input[type=datetime-local],.wy-form input[type=datetime],.wy-form input[type=email],.wy-form input[type=month],.wy-form input[type=number],.wy-form input[type=password],.wy-form input[type=search],.wy-form input[type=tel],.wy-form input[type=time],.wy-form input[type=url],.wy-form input[type=week]{margin-bottom:0}.wy-form-aligned .wy-control-group label{margin-bottom:.3em;text-align:left;display:block;width:100%}.wy-form-aligned .wy-control{margin:1.5em 0 0}.wy-form-message,.wy-form-message-inline,.wy-form .wy-help-inline{display:block;font-size:80%;padding:6px 0}}@media screen and (max-width:768px){.tablet-hide{display:none}}@media screen and (max-width:480px){.mobile-hide{display:none}}.float-left{float:left}.float-right{float:right}.full-width{width:100%}.rst-content table.docutils,.rst-content 
table.field-list,.wy-table{border-collapse:collapse;border-spacing:0;empty-cells:show;margin-bottom:24px}.rst-content table.docutils caption,.rst-content table.field-list caption,.wy-table caption{color:#000;font:italic 85%/1 arial,sans-serif;padding:1em 0;text-align:center}.rst-content table.docutils td,.rst-content table.docutils th,.rst-content table.field-list td,.rst-content table.field-list th,.wy-table td,.wy-table th{font-size:90%;margin:0;overflow:visible;padding:8px 16px}.rst-content table.docutils td:first-child,.rst-content table.docutils th:first-child,.rst-content table.field-list td:first-child,.rst-content table.field-list th:first-child,.wy-table td:first-child,.wy-table th:first-child{border-left-width:0}.rst-content table.docutils thead,.rst-content table.field-list thead,.wy-table thead{color:#000;text-align:left;vertical-align:bottom;white-space:nowrap}.rst-content table.docutils thead th,.rst-content table.field-list thead th,.wy-table thead th{font-weight:700;border-bottom:2px solid #e1e4e5}.rst-content table.docutils td,.rst-content table.field-list td,.wy-table td{background-color:transparent;vertical-align:middle}.rst-content table.docutils td p,.rst-content table.field-list td p,.wy-table td p{line-height:18px}.rst-content table.docutils td p:last-child,.rst-content table.field-list td p:last-child,.wy-table td p:last-child{margin-bottom:0}.rst-content table.docutils .wy-table-cell-min,.rst-content table.field-list .wy-table-cell-min,.wy-table .wy-table-cell-min{width:1%;padding-right:0}.rst-content table.docutils .wy-table-cell-min input[type=checkbox],.rst-content table.field-list .wy-table-cell-min input[type=checkbox],.wy-table .wy-table-cell-min input[type=checkbox]{margin:0}.wy-table-secondary{color:grey;font-size:90%}.wy-table-tertiary{color:grey;font-size:80%}.rst-content table.docutils:not(.field-list) tr:nth-child(2n-1) td,.wy-table-backed,.wy-table-odd td,.wy-table-striped tr:nth-child(2n-1) 
td{background-color:#f3f6f6}.rst-content table.docutils,.wy-table-bordered-all{border:1px solid #e1e4e5}.rst-content table.docutils td,.wy-table-bordered-all td{border-bottom:1px solid #e1e4e5;border-left:1px solid #e1e4e5}.rst-content table.docutils tbody>tr:last-child td,.wy-table-bordered-all tbody>tr:last-child td{border-bottom-width:0}.wy-table-bordered{border:1px solid #e1e4e5}.wy-table-bordered-rows td{border-bottom:1px solid #e1e4e5}.wy-table-bordered-rows tbody>tr:last-child td{border-bottom-width:0}.wy-table-horizontal td,.wy-table-horizontal th{border-width:0 0 1px;border-bottom:1px solid #e1e4e5}.wy-table-horizontal tbody>tr:last-child td{border-bottom-width:0}.wy-table-responsive{margin-bottom:24px;max-width:100%;overflow:auto}.wy-table-responsive table{margin-bottom:0!important}.wy-table-responsive table td,.wy-table-responsive table th{white-space:nowrap}a{color:#2980b9;text-decoration:none;cursor:pointer}a:hover{color:#3091d1}a:visited{color:#9b59b6}html{height:100%}body,html{overflow-x:hidden}body{font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;font-weight:400;color:#404040;min-height:100%;background:#edf0f2}.wy-text-left{text-align:left}.wy-text-center{text-align:center}.wy-text-right{text-align:right}.wy-text-large{font-size:120%}.wy-text-normal{font-size:100%}.wy-text-small,small{font-size:80%}.wy-text-strike{text-decoration:line-through}.wy-text-warning{color:#e67e22!important}a.wy-text-warning:hover{color:#eb9950!important}.wy-text-info{color:#2980b9!important}a.wy-text-info:hover{color:#409ad5!important}.wy-text-success{color:#27ae60!important}a.wy-text-success:hover{color:#36d278!important}.wy-text-danger{color:#e74c3c!important}a.wy-text-danger:hover{color:#ed7669!important}.wy-text-neutral{color:#404040!important}a.wy-text-neutral:hover{color:#595959!important}.rst-content .toctree-wrapper>p.caption,h1,h2,h3,h4,h5,h6,legend{margin-top:0;font-weight:700;font-family:Roboto 
Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif}p{line-height:24px;font-size:16px;margin:0 0 24px}h1{font-size:175%}.rst-content .toctree-wrapper>p.caption,h2{font-size:150%}h3{font-size:125%}h4{font-size:115%}h5{font-size:110%}h6{font-size:100%}hr{display:block;height:1px;border:0;border-top:1px solid #e1e4e5;margin:24px 0;padding:0}.rst-content code,.rst-content tt,code{white-space:nowrap;max-width:100%;background:#fff;border:1px solid #e1e4e5;font-size:75%;padding:0 5px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#e74c3c;overflow-x:auto}.rst-content tt.code-large,code.code-large{font-size:90%}.rst-content .section ul,.rst-content .toctree-wrapper ul,.rst-content section ul,.wy-plain-list-disc,article ul{list-style:disc;line-height:24px;margin-bottom:24px}.rst-content .section ul li,.rst-content .toctree-wrapper ul li,.rst-content section ul li,.wy-plain-list-disc li,article ul li{list-style:disc;margin-left:24px}.rst-content .section ul li p:last-child,.rst-content .section ul li ul,.rst-content .toctree-wrapper ul li p:last-child,.rst-content .toctree-wrapper ul li ul,.rst-content section ul li p:last-child,.rst-content section ul li ul,.wy-plain-list-disc li p:last-child,.wy-plain-list-disc li ul,article ul li p:last-child,article ul li ul{margin-bottom:0}.rst-content .section ul li li,.rst-content .toctree-wrapper ul li li,.rst-content section ul li li,.wy-plain-list-disc li li,article ul li li{list-style:circle}.rst-content .section ul li li li,.rst-content .toctree-wrapper ul li li li,.rst-content section ul li li li,.wy-plain-list-disc li li li,article ul li li li{list-style:square}.rst-content .section ul li ol li,.rst-content .toctree-wrapper ul li ol li,.rst-content section ul li ol li,.wy-plain-list-disc li ol li,article ul li ol li{list-style:decimal}.rst-content .section ol,.rst-content .section ol.arabic,.rst-content .toctree-wrapper ol,.rst-content .toctree-wrapper ol.arabic,.rst-content 
section ol,.rst-content section ol.arabic,.wy-plain-list-decimal,article ol{list-style:decimal;line-height:24px;margin-bottom:24px}.rst-content .section ol.arabic li,.rst-content .section ol li,.rst-content .toctree-wrapper ol.arabic li,.rst-content .toctree-wrapper ol li,.rst-content section ol.arabic li,.rst-content section ol li,.wy-plain-list-decimal li,article ol li{list-style:decimal;margin-left:24px}.rst-content .section ol.arabic li ul,.rst-content .section ol li p:last-child,.rst-content .section ol li ul,.rst-content .toctree-wrapper ol.arabic li ul,.rst-content .toctree-wrapper ol li p:last-child,.rst-content .toctree-wrapper ol li ul,.rst-content section ol.arabic li ul,.rst-content section ol li p:last-child,.rst-content section ol li ul,.wy-plain-list-decimal li p:last-child,.wy-plain-list-decimal li ul,article ol li p:last-child,article ol li ul{margin-bottom:0}.rst-content .section ol.arabic li ul li,.rst-content .section ol li ul li,.rst-content .toctree-wrapper ol.arabic li ul li,.rst-content .toctree-wrapper ol li ul li,.rst-content section ol.arabic li ul li,.rst-content section ol li ul li,.wy-plain-list-decimal li ul li,article ol li ul li{list-style:disc}.wy-breadcrumbs{*zoom:1}.wy-breadcrumbs:after,.wy-breadcrumbs:before{display:table;content:""}.wy-breadcrumbs:after{clear:both}.wy-breadcrumbs>li{display:inline-block;padding-top:5px}.wy-breadcrumbs>li.wy-breadcrumbs-aside{float:right}.rst-content .wy-breadcrumbs>li code,.rst-content .wy-breadcrumbs>li tt,.wy-breadcrumbs>li .rst-content tt,.wy-breadcrumbs>li code{all:inherit;color:inherit}.breadcrumb-item:before{content:"/";color:#bbb;font-size:13px;padding:0 6px 0 3px}.wy-breadcrumbs-extra{margin-bottom:0;color:#b3b3b3;font-size:80%;display:inline-block}@media screen and (max-width:480px){.wy-breadcrumbs-extra,.wy-breadcrumbs li.wy-breadcrumbs-aside{display:none}}@media print{.wy-breadcrumbs 
li.wy-breadcrumbs-aside{display:none}}html{font-size:16px}.wy-affix{position:fixed;top:1.618em}.wy-menu a:hover{text-decoration:none}.wy-menu-horiz{*zoom:1}.wy-menu-horiz:after,.wy-menu-horiz:before{display:table;content:""}.wy-menu-horiz:after{clear:both}.wy-menu-horiz li,.wy-menu-horiz ul{display:inline-block}.wy-menu-horiz li:hover{background:hsla(0,0%,100%,.1)}.wy-menu-horiz li.divide-left{border-left:1px solid #404040}.wy-menu-horiz li.divide-right{border-right:1px solid #404040}.wy-menu-horiz a{height:32px;display:inline-block;line-height:32px;padding:0 16px}.wy-menu-vertical{width:300px}.wy-menu-vertical header,.wy-menu-vertical p.caption{color:#55a5d9;height:32px;line-height:32px;padding:0 1.618em;margin:12px 0 0;display:block;font-weight:700;text-transform:uppercase;font-size:85%;white-space:nowrap}.wy-menu-vertical ul{margin-bottom:0}.wy-menu-vertical li.divide-top{border-top:1px solid #404040}.wy-menu-vertical li.divide-bottom{border-bottom:1px solid #404040}.wy-menu-vertical li.current{background:#e3e3e3}.wy-menu-vertical li.current a{color:grey;border-right:1px solid #c9c9c9;padding:.4045em 2.427em}.wy-menu-vertical li.current a:hover{background:#d6d6d6}.rst-content .wy-menu-vertical li tt,.wy-menu-vertical li .rst-content tt,.wy-menu-vertical li code{border:none;background:inherit;color:inherit;padding-left:0;padding-right:0}.wy-menu-vertical li button.toctree-expand{display:block;float:left;margin-left:-1.2em;line-height:18px;color:#4d4d4d;border:none;background:none;padding:0}.wy-menu-vertical li.current>a,.wy-menu-vertical li.on a{color:#404040;font-weight:700;position:relative;background:#fcfcfc;border:none;padding:.4045em 1.618em}.wy-menu-vertical li.current>a:hover,.wy-menu-vertical li.on a:hover{background:#fcfcfc}.wy-menu-vertical li.current>a:hover button.toctree-expand,.wy-menu-vertical li.on a:hover button.toctree-expand{color:grey}.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a 
button.toctree-expand{display:block;line-height:18px;color:#333}.wy-menu-vertical li.toctree-l1.current>a{border-bottom:1px solid #c9c9c9;border-top:1px solid #c9c9c9}.wy-menu-vertical .toctree-l1.current .toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .toctree-l11>ul{display:none}.wy-menu-vertical .toctree-l1.current .current.toctree-l2>ul,.wy-menu-vertical .toctree-l2.current .current.toctree-l3>ul,.wy-menu-vertical .toctree-l3.current .current.toctree-l4>ul,.wy-menu-vertical .toctree-l4.current .current.toctree-l5>ul,.wy-menu-vertical .toctree-l5.current .current.toctree-l6>ul,.wy-menu-vertical .toctree-l6.current .current.toctree-l7>ul,.wy-menu-vertical .toctree-l7.current .current.toctree-l8>ul,.wy-menu-vertical .toctree-l8.current .current.toctree-l9>ul,.wy-menu-vertical .toctree-l9.current .current.toctree-l10>ul,.wy-menu-vertical .toctree-l10.current .current.toctree-l11>ul{display:block}.wy-menu-vertical li.toctree-l3,.wy-menu-vertical li.toctree-l4{font-size:.9em}.wy-menu-vertical li.toctree-l2 a,.wy-menu-vertical li.toctree-l3 a,.wy-menu-vertical li.toctree-l4 a,.wy-menu-vertical li.toctree-l5 a,.wy-menu-vertical li.toctree-l6 a,.wy-menu-vertical li.toctree-l7 a,.wy-menu-vertical li.toctree-l8 a,.wy-menu-vertical li.toctree-l9 a,.wy-menu-vertical li.toctree-l10 a{color:#404040}.wy-menu-vertical li.toctree-l2 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l3 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l4 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l5 a:hover 
button.toctree-expand,.wy-menu-vertical li.toctree-l6 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l7 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l8 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l9 a:hover button.toctree-expand,.wy-menu-vertical li.toctree-l10 a:hover button.toctree-expand{color:grey}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a,.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a,.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a,.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a,.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a,.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a,.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a,.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{display:block}.wy-menu-vertical li.toctree-l2.current>a{padding:.4045em 2.427em}.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{padding:.4045em 1.618em .4045em 4.045em}.wy-menu-vertical li.toctree-l3.current>a{padding:.4045em 4.045em}.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{padding:.4045em 1.618em .4045em 5.663em}.wy-menu-vertical li.toctree-l4.current>a{padding:.4045em 5.663em}.wy-menu-vertical li.toctree-l4.current li.toctree-l5>a{padding:.4045em 1.618em .4045em 7.281em}.wy-menu-vertical li.toctree-l5.current>a{padding:.4045em 7.281em}.wy-menu-vertical li.toctree-l5.current li.toctree-l6>a{padding:.4045em 1.618em .4045em 8.899em}.wy-menu-vertical li.toctree-l6.current>a{padding:.4045em 8.899em}.wy-menu-vertical li.toctree-l6.current li.toctree-l7>a{padding:.4045em 1.618em .4045em 10.517em}.wy-menu-vertical li.toctree-l7.current>a{padding:.4045em 10.517em}.wy-menu-vertical li.toctree-l7.current li.toctree-l8>a{padding:.4045em 1.618em .4045em 12.135em}.wy-menu-vertical li.toctree-l8.current>a{padding:.4045em 12.135em}.wy-menu-vertical li.toctree-l8.current li.toctree-l9>a{padding:.4045em 1.618em .4045em 
13.753em}.wy-menu-vertical li.toctree-l9.current>a{padding:.4045em 13.753em}.wy-menu-vertical li.toctree-l9.current li.toctree-l10>a{padding:.4045em 1.618em .4045em 15.371em}.wy-menu-vertical li.toctree-l10.current>a{padding:.4045em 15.371em}.wy-menu-vertical li.toctree-l10.current li.toctree-l11>a{padding:.4045em 1.618em .4045em 16.989em}.wy-menu-vertical li.toctree-l2.current>a,.wy-menu-vertical li.toctree-l2.current li.toctree-l3>a{background:#c9c9c9}.wy-menu-vertical li.toctree-l2 button.toctree-expand{color:#a3a3a3}.wy-menu-vertical li.toctree-l3.current>a,.wy-menu-vertical li.toctree-l3.current li.toctree-l4>a{background:#bdbdbd}.wy-menu-vertical li.toctree-l3 button.toctree-expand{color:#969696}.wy-menu-vertical li.current ul{display:block}.wy-menu-vertical li ul{margin-bottom:0;display:none}.wy-menu-vertical li ul li a{margin-bottom:0;color:#d9d9d9;font-weight:400}.wy-menu-vertical a{line-height:18px;padding:.4045em 1.618em;display:block;position:relative;font-size:90%;color:#d9d9d9}.wy-menu-vertical a:hover{background-color:#4e4a4a;cursor:pointer}.wy-menu-vertical a:hover button.toctree-expand{color:#d9d9d9}.wy-menu-vertical a:active{background-color:#2980b9;cursor:pointer;color:#fff}.wy-menu-vertical a:active button.toctree-expand{color:#fff}.wy-side-nav-search{display:block;width:300px;padding:.809em;margin-bottom:.809em;z-index:200;background-color:#2980b9;text-align:center;color:#fcfcfc}.wy-side-nav-search input[type=text]{width:100%;border-radius:50px;padding:6px 12px;border-color:#2472a4}.wy-side-nav-search img{display:block;margin:auto auto .809em;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-side-nav-search .wy-dropdown>a,.wy-side-nav-search>a{color:#fcfcfc;font-size:100%;font-weight:700;display:inline-block;padding:4px 6px;margin-bottom:.809em;max-width:100%}.wy-side-nav-search .wy-dropdown>a:hover,.wy-side-nav-search>a:hover{background:hsla(0,0%,100%,.1)}.wy-side-nav-search .wy-dropdown>a 
img.logo,.wy-side-nav-search>a img.logo{display:block;margin:0 auto;height:auto;width:auto;border-radius:0;max-width:100%;background:transparent}.wy-side-nav-search .wy-dropdown>a.icon img.logo,.wy-side-nav-search>a.icon img.logo{margin-top:.85em}.wy-side-nav-search>div.version{margin-top:-.4045em;margin-bottom:.809em;font-weight:400;color:hsla(0,0%,100%,.3)}.wy-nav .wy-menu-vertical header{color:#2980b9}.wy-nav .wy-menu-vertical a{color:#b3b3b3}.wy-nav .wy-menu-vertical a:hover{background-color:#2980b9;color:#fff}[data-menu-wrap]{-webkit-transition:all .2s ease-in;-moz-transition:all .2s ease-in;transition:all .2s ease-in;position:absolute;opacity:1;width:100%;opacity:0}[data-menu-wrap].move-center{left:0;right:auto;opacity:1}[data-menu-wrap].move-left{right:auto;left:-100%;opacity:0}[data-menu-wrap].move-right{right:-100%;left:auto;opacity:0}.wy-body-for-nav{background:#fcfcfc}.wy-grid-for-nav{position:absolute;width:100%;height:100%}.wy-nav-side{position:fixed;top:0;bottom:0;left:0;padding-bottom:2em;width:300px;overflow-x:hidden;overflow-y:hidden;min-height:100%;color:#9b9b9b;background:#343131;z-index:200}.wy-side-scroll{width:320px;position:relative;overflow-x:hidden;overflow-y:scroll;height:100%}.wy-nav-top{display:none;background:#2980b9;color:#fff;padding:.4045em .809em;position:relative;line-height:50px;text-align:center;font-size:100%;*zoom:1}.wy-nav-top:after,.wy-nav-top:before{display:table;content:""}.wy-nav-top:after{clear:both}.wy-nav-top a{color:#fff;font-weight:700}.wy-nav-top img{margin-right:12px;height:45px;width:45px;background-color:#2980b9;padding:5px;border-radius:100%}.wy-nav-top i{font-size:30px;float:left;cursor:pointer;padding-top:inherit}.wy-nav-content-wrap{margin-left:300px;background:#fcfcfc;min-height:100%}.wy-nav-content{padding:1.618em 
3.236em;height:100%;max-width:800px;margin:auto}.wy-body-mask{position:fixed;width:100%;height:100%;background:rgba(0,0,0,.2);display:none;z-index:499}.wy-body-mask.on{display:block}footer{color:grey}footer p{margin-bottom:12px}.rst-content footer span.commit tt,footer span.commit .rst-content tt,footer span.commit code{padding:0;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:1em;background:none;border:none;color:grey}.rst-footer-buttons{*zoom:1}.rst-footer-buttons:after,.rst-footer-buttons:before{width:100%;display:table;content:""}.rst-footer-buttons:after{clear:both}.rst-breadcrumbs-buttons{margin-top:12px;*zoom:1}.rst-breadcrumbs-buttons:after,.rst-breadcrumbs-buttons:before{display:table;content:""}.rst-breadcrumbs-buttons:after{clear:both}#search-results .search li{margin-bottom:24px;border-bottom:1px solid #e1e4e5;padding-bottom:24px}#search-results .search li:first-child{border-top:1px solid #e1e4e5;padding-top:24px}#search-results .search li a{font-size:120%;margin-bottom:12px;display:inline-block}#search-results .context{color:grey;font-size:90%}.genindextable li>ul{margin-left:24px}@media screen and (max-width:768px){.wy-body-for-nav{background:#fcfcfc}.wy-nav-top{display:block}.wy-nav-side{left:-300px}.wy-nav-side.shift{width:85%;left:0}.wy-menu.wy-menu-vertical,.wy-side-nav-search,.wy-side-scroll{width:auto}.wy-nav-content-wrap{margin-left:0}.wy-nav-content-wrap .wy-nav-content{padding:1.618em}.wy-nav-content-wrap.shift{position:fixed;min-width:100%;left:85%;top:0;height:100%;overflow:hidden}}@media screen and (min-width:1100px){.wy-nav-content-wrap{background:rgba(0,0,0,.05)}.wy-nav-content{margin:0;background:#fcfcfc}}@media print{.rst-versions,.wy-nav-side,footer{display:none}.wy-nav-content-wrap{margin-left:0}}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions 
a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60;*zoom:1}.rst-versions .rst-current-version:after,.rst-versions .rst-current-version:before{display:table;content:""}.rst-versions .rst-current-version:after{clear:both}.rst-content .code-block-caption .rst-versions .rst-current-version .headerlink,.rst-content .eqno .rst-versions .rst-current-version .headerlink,.rst-content .rst-versions .rst-current-version .admonition-title,.rst-content code.download .rst-versions .rst-current-version span:first-child,.rst-content dl dt .rst-versions .rst-current-version .headerlink,.rst-content h1 .rst-versions .rst-current-version .headerlink,.rst-content h2 .rst-versions .rst-current-version .headerlink,.rst-content h3 .rst-versions .rst-current-version .headerlink,.rst-content h4 .rst-versions .rst-current-version .headerlink,.rst-content h5 .rst-versions .rst-current-version .headerlink,.rst-content h6 .rst-versions .rst-current-version .headerlink,.rst-content p .rst-versions .rst-current-version .headerlink,.rst-content table>caption .rst-versions .rst-current-version .headerlink,.rst-content tt.download .rst-versions .rst-current-version span:first-child,.rst-versions .rst-current-version .fa,.rst-versions .rst-current-version .icon,.rst-versions .rst-current-version .rst-content .admonition-title,.rst-versions .rst-current-version .rst-content .code-block-caption .headerlink,.rst-versions .rst-current-version .rst-content .eqno .headerlink,.rst-versions .rst-current-version .rst-content code.download span:first-child,.rst-versions .rst-current-version .rst-content dl dt .headerlink,.rst-versions .rst-current-version .rst-content h1 .headerlink,.rst-versions .rst-current-version .rst-content h2 .headerlink,.rst-versions .rst-current-version .rst-content h3 .headerlink,.rst-versions 
.rst-current-version .rst-content h4 .headerlink,.rst-versions .rst-current-version .rst-content h5 .headerlink,.rst-versions .rst-current-version .rst-content h6 .headerlink,.rst-versions .rst-current-version .rst-content p .headerlink,.rst-versions .rst-current-version .rst-content table>caption .headerlink,.rst-versions .rst-current-version .rst-content tt.download span:first-child,.rst-versions .rst-current-version .wy-menu-vertical li button.toctree-expand,.wy-menu-vertical li .rst-versions .rst-current-version button.toctree-expand{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and (max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}}.rst-content 
.toctree-wrapper>p.caption,.rst-content h1,.rst-content h2,.rst-content h3,.rst-content h4,.rst-content h5,.rst-content h6{margin-bottom:24px}.rst-content img{max-width:100%;height:auto}.rst-content div.figure,.rst-content figure{margin-bottom:24px}.rst-content div.figure .caption-text,.rst-content figure .caption-text{font-style:italic}.rst-content div.figure p:last-child.caption,.rst-content figure p:last-child.caption{margin-bottom:0}.rst-content div.figure.align-center,.rst-content figure.align-center{text-align:center}.rst-content .section>a>img,.rst-content .section>img,.rst-content section>a>img,.rst-content section>img{margin-bottom:24px}.rst-content abbr[title]{text-decoration:none}.rst-content.style-external-links a.reference.external:after{font-family:FontAwesome;content:"\f08e";color:#b3b3b3;vertical-align:super;font-size:60%;margin:0 .2em}.rst-content blockquote{margin-left:24px;line-height:24px;margin-bottom:24px}.rst-content pre.literal-block{white-space:pre;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;display:block;overflow:auto}.rst-content div[class^=highlight],.rst-content pre.literal-block{border:1px solid #e1e4e5;overflow-x:auto;margin:1px 0 24px}.rst-content div[class^=highlight] div[class^=highlight],.rst-content pre.literal-block div[class^=highlight]{padding:0;border:none;margin:0}.rst-content div[class^=highlight] td.code{width:100%}.rst-content .linenodiv pre{border-right:1px solid #e6e9ea;margin:0;padding:12px;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;user-select:none;pointer-events:none}.rst-content div[class^=highlight] pre{white-space:pre;margin:0;padding:12px;display:block;overflow:auto}.rst-content div[class^=highlight] pre .hll{display:block;margin:0 -12px;padding:0 12px}.rst-content .linenodiv pre,.rst-content div[class^=highlight] pre,.rst-content 
pre.literal-block{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;font-size:12px;line-height:1.4}.rst-content div.highlight .gp,.rst-content div.highlight span.linenos{user-select:none;pointer-events:none}.rst-content div.highlight span.linenos{display:inline-block;padding-left:0;padding-right:12px;margin-right:12px;border-right:1px solid #e6e9ea}.rst-content .code-block-caption{font-style:italic;font-size:85%;line-height:1;padding:1em 0;text-align:center}@media print{.rst-content .codeblock,.rst-content div[class^=highlight],.rst-content div[class^=highlight] pre{white-space:pre-wrap}}.rst-content .admonition,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning{clear:both}.rst-content .admonition-todo .last,.rst-content .admonition-todo>:last-child,.rst-content .admonition .last,.rst-content .admonition>:last-child,.rst-content .attention .last,.rst-content .attention>:last-child,.rst-content .caution .last,.rst-content .caution>:last-child,.rst-content .danger .last,.rst-content .danger>:last-child,.rst-content .error .last,.rst-content .error>:last-child,.rst-content .hint .last,.rst-content .hint>:last-child,.rst-content .important .last,.rst-content .important>:last-child,.rst-content .note .last,.rst-content .note>:last-child,.rst-content .seealso .last,.rst-content .seealso>:last-child,.rst-content .tip .last,.rst-content .tip>:last-child,.rst-content .warning .last,.rst-content .warning>:last-child{margin-bottom:0}.rst-content .admonition-title:before{margin-right:4px}.rst-content .admonition table{border-color:rgba(0,0,0,.1)}.rst-content .admonition table td,.rst-content .admonition table th{background:transparent!important;border-color:rgba(0,0,0,.1)!important}.rst-content .section ol.loweralpha,.rst-content .section 
ol.loweralpha>li,.rst-content .toctree-wrapper ol.loweralpha,.rst-content .toctree-wrapper ol.loweralpha>li,.rst-content section ol.loweralpha,.rst-content section ol.loweralpha>li{list-style:lower-alpha}.rst-content .section ol.upperalpha,.rst-content .section ol.upperalpha>li,.rst-content .toctree-wrapper ol.upperalpha,.rst-content .toctree-wrapper ol.upperalpha>li,.rst-content section ol.upperalpha,.rst-content section ol.upperalpha>li{list-style:upper-alpha}.rst-content .section ol li>*,.rst-content .section ul li>*,.rst-content .toctree-wrapper ol li>*,.rst-content .toctree-wrapper ul li>*,.rst-content section ol li>*,.rst-content section ul li>*{margin-top:12px;margin-bottom:12px}.rst-content .section ol li>:first-child,.rst-content .section ul li>:first-child,.rst-content .toctree-wrapper ol li>:first-child,.rst-content .toctree-wrapper ul li>:first-child,.rst-content section ol li>:first-child,.rst-content section ul li>:first-child{margin-top:0}.rst-content .section ol li>p,.rst-content .section ol li>p:last-child,.rst-content .section ul li>p,.rst-content .section ul li>p:last-child,.rst-content .toctree-wrapper ol li>p,.rst-content .toctree-wrapper ol li>p:last-child,.rst-content .toctree-wrapper ul li>p,.rst-content .toctree-wrapper ul li>p:last-child,.rst-content section ol li>p,.rst-content section ol li>p:last-child,.rst-content section ul li>p,.rst-content section ul li>p:last-child{margin-bottom:12px}.rst-content .section ol li>p:only-child,.rst-content .section ol li>p:only-child:last-child,.rst-content .section ul li>p:only-child,.rst-content .section ul li>p:only-child:last-child,.rst-content .toctree-wrapper ol li>p:only-child,.rst-content .toctree-wrapper ol li>p:only-child:last-child,.rst-content .toctree-wrapper ul li>p:only-child,.rst-content .toctree-wrapper ul li>p:only-child:last-child,.rst-content section ol li>p:only-child,.rst-content section ol li>p:only-child:last-child,.rst-content section ul li>p:only-child,.rst-content section ul 
li>p:only-child:last-child{margin-bottom:0}.rst-content .section ol li>ol,.rst-content .section ol li>ul,.rst-content .section ul li>ol,.rst-content .section ul li>ul,.rst-content .toctree-wrapper ol li>ol,.rst-content .toctree-wrapper ol li>ul,.rst-content .toctree-wrapper ul li>ol,.rst-content .toctree-wrapper ul li>ul,.rst-content section ol li>ol,.rst-content section ol li>ul,.rst-content section ul li>ol,.rst-content section ul li>ul{margin-bottom:12px}.rst-content .section ol.simple li>*,.rst-content .section ol.simple li ol,.rst-content .section ol.simple li ul,.rst-content .section ul.simple li>*,.rst-content .section ul.simple li ol,.rst-content .section ul.simple li ul,.rst-content .toctree-wrapper ol.simple li>*,.rst-content .toctree-wrapper ol.simple li ol,.rst-content .toctree-wrapper ol.simple li ul,.rst-content .toctree-wrapper ul.simple li>*,.rst-content .toctree-wrapper ul.simple li ol,.rst-content .toctree-wrapper ul.simple li ul,.rst-content section ol.simple li>*,.rst-content section ol.simple li ol,.rst-content section ol.simple li ul,.rst-content section ul.simple li>*,.rst-content section ul.simple li ol,.rst-content section ul.simple li ul{margin-top:0;margin-bottom:0}.rst-content .line-block{margin-left:0;margin-bottom:24px;line-height:24px}.rst-content .line-block .line-block{margin-left:24px;margin-bottom:0}.rst-content .topic-title{font-weight:700;margin-bottom:12px}.rst-content .toc-backref{color:#404040}.rst-content .align-right{float:right;margin:0 0 24px 24px}.rst-content .align-left{float:left;margin:0 24px 24px 0}.rst-content .align-center{margin:auto}.rst-content .align-center:not(table){display:block}.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 
.headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink{opacity:0;font-size:14px;font-family:FontAwesome;margin-left:.5em}.rst-content .code-block-caption .headerlink:focus,.rst-content .code-block-caption:hover .headerlink,.rst-content .eqno .headerlink:focus,.rst-content .eqno:hover .headerlink,.rst-content .toctree-wrapper>p.caption .headerlink:focus,.rst-content .toctree-wrapper>p.caption:hover .headerlink,.rst-content dl dt .headerlink:focus,.rst-content dl dt:hover .headerlink,.rst-content h1 .headerlink:focus,.rst-content h1:hover .headerlink,.rst-content h2 .headerlink:focus,.rst-content h2:hover .headerlink,.rst-content h3 .headerlink:focus,.rst-content h3:hover .headerlink,.rst-content h4 .headerlink:focus,.rst-content h4:hover .headerlink,.rst-content h5 .headerlink:focus,.rst-content h5:hover .headerlink,.rst-content h6 .headerlink:focus,.rst-content h6:hover .headerlink,.rst-content p.caption .headerlink:focus,.rst-content p.caption:hover .headerlink,.rst-content p .headerlink:focus,.rst-content p:hover .headerlink,.rst-content table>caption .headerlink:focus,.rst-content table>caption:hover .headerlink{opacity:1}.rst-content p a{overflow-wrap:anywhere}.rst-content .wy-table td p,.rst-content .wy-table td ul,.rst-content .wy-table th p,.rst-content .wy-table th ul,.rst-content table.docutils td p,.rst-content table.docutils td ul,.rst-content table.docutils th p,.rst-content table.docutils th ul,.rst-content table.field-list td p,.rst-content table.field-list td ul,.rst-content table.field-list th p,.rst-content table.field-list th ul{font-size:inherit}.rst-content .btn:focus{outline:2px solid}.rst-content table>caption .headerlink:after{font-size:12px}.rst-content .centered{text-align:center}.rst-content .sidebar{float:right;width:40%;display:block;margin:0 0 24px 24px;padding:24px;background:#f3f6f6;border:1px solid #e1e4e5}.rst-content .sidebar dl,.rst-content .sidebar p,.rst-content .sidebar 
ul{font-size:90%}.rst-content .sidebar .last,.rst-content .sidebar>:last-child{margin-bottom:0}.rst-content .sidebar .sidebar-title{display:block;font-family:Roboto Slab,ff-tisa-web-pro,Georgia,Arial,sans-serif;font-weight:700;background:#e1e4e5;padding:6px 12px;margin:-24px -24px 24px;font-size:100%}.rst-content .highlighted{background:#f1c40f;box-shadow:0 0 0 2px #f1c40f;display:inline;font-weight:700}.rst-content .citation-reference,.rst-content .footnote-reference{vertical-align:baseline;position:relative;top:-.4em;line-height:0;font-size:90%}.rst-content .citation-reference>span.fn-bracket,.rst-content .footnote-reference>span.fn-bracket{display:none}.rst-content .hlist{width:100%}.rst-content dl dt span.classifier:before{content:" : "}.rst-content dl dt span.classifier-delimiter{display:none!important}html.writer-html4 .rst-content table.docutils.citation,html.writer-html4 .rst-content table.docutils.footnote{background:none;border:none}html.writer-html4 .rst-content table.docutils.citation td,html.writer-html4 .rst-content table.docutils.citation tr,html.writer-html4 .rst-content table.docutils.footnote td,html.writer-html4 .rst-content table.docutils.footnote tr{border:none;background-color:transparent!important;white-space:normal}html.writer-html4 .rst-content table.docutils.citation td.label,html.writer-html4 .rst-content table.docutils.footnote td.label{padding-left:0;padding-right:0;vertical-align:top}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{display:grid;grid-template-columns:auto minmax(80%,95%)}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{display:inline-grid;grid-template-columns:max-content auto}html.writer-html5 .rst-content aside.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content div.citation{display:grid;grid-template-columns:auto 
auto minmax(.65rem,auto) minmax(40%,95%)}html.writer-html5 .rst-content aside.citation>span.label,html.writer-html5 .rst-content aside.footnote>span.label,html.writer-html5 .rst-content div.citation>span.label{grid-column-start:1;grid-column-end:2}html.writer-html5 .rst-content aside.citation>span.backrefs,html.writer-html5 .rst-content aside.footnote>span.backrefs,html.writer-html5 .rst-content div.citation>span.backrefs{grid-column-start:2;grid-column-end:3;grid-row-start:1;grid-row-end:3}html.writer-html5 .rst-content aside.citation>p,html.writer-html5 .rst-content aside.footnote>p,html.writer-html5 .rst-content div.citation>p{grid-column-start:4;grid-column-end:5}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.field-list,html.writer-html5 .rst-content dl.footnote{margin-bottom:24px}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dt{padding-left:1rem}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.field-list>dd,html.writer-html5 .rst-content dl.field-list>dt,html.writer-html5 .rst-content dl.footnote>dd,html.writer-html5 .rst-content dl.footnote>dt{margin-bottom:0}html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{font-size:.9rem}html.writer-html5 .rst-content dl.citation>dt,html.writer-html5 .rst-content dl.footnote>dt{margin:0 .5rem .5rem 0;line-height:1.2rem;word-break:break-all;font-weight:400}html.writer-html5 .rst-content dl.citation>dt>span.brackets:before,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:before{content:"["}html.writer-html5 .rst-content dl.citation>dt>span.brackets:after,html.writer-html5 .rst-content dl.footnote>dt>span.brackets:after{content:"]"}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref,html.writer-html5 .rst-content 
dl.footnote>dt>span.fn-backref{text-align:left;font-style:italic;margin-left:.65rem;word-break:break-word;word-spacing:-.1rem;max-width:5rem}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref>a,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref>a{word-break:keep-all}html.writer-html5 .rst-content dl.citation>dt>span.fn-backref>a:not(:first-child):before,html.writer-html5 .rst-content dl.footnote>dt>span.fn-backref>a:not(:first-child):before{content:" "}html.writer-html5 .rst-content dl.citation>dd,html.writer-html5 .rst-content dl.footnote>dd{margin:0 0 .5rem;line-height:1.2rem}html.writer-html5 .rst-content dl.citation>dd p,html.writer-html5 .rst-content dl.footnote>dd p{font-size:.9rem}html.writer-html5 .rst-content aside.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content div.citation{padding-left:1rem;padding-right:1rem;font-size:.9rem;line-height:1.2rem}html.writer-html5 .rst-content aside.citation p,html.writer-html5 .rst-content aside.footnote p,html.writer-html5 .rst-content div.citation p{font-size:.9rem;line-height:1.2rem;margin-bottom:12px}html.writer-html5 .rst-content aside.citation span.backrefs,html.writer-html5 .rst-content aside.footnote span.backrefs,html.writer-html5 .rst-content div.citation span.backrefs{text-align:left;font-style:italic;margin-left:.65rem;word-break:break-word;word-spacing:-.1rem;max-width:5rem}html.writer-html5 .rst-content aside.citation span.backrefs>a,html.writer-html5 .rst-content aside.footnote span.backrefs>a,html.writer-html5 .rst-content div.citation span.backrefs>a{word-break:keep-all}html.writer-html5 .rst-content aside.citation span.backrefs>a:not(:first-child):before,html.writer-html5 .rst-content aside.footnote span.backrefs>a:not(:first-child):before,html.writer-html5 .rst-content div.citation span.backrefs>a:not(:first-child):before{content:" "}html.writer-html5 .rst-content aside.citation span.label,html.writer-html5 .rst-content aside.footnote 
span.label,html.writer-html5 .rst-content div.citation span.label{line-height:1.2rem}html.writer-html5 .rst-content aside.citation-list,html.writer-html5 .rst-content aside.footnote-list,html.writer-html5 .rst-content div.citation-list{margin-bottom:24px}html.writer-html5 .rst-content dl.option-list kbd{font-size:.9rem}.rst-content table.docutils.footnote,html.writer-html4 .rst-content table.docutils.citation,html.writer-html5 .rst-content aside.footnote,html.writer-html5 .rst-content aside.footnote-list aside.footnote,html.writer-html5 .rst-content div.citation-list>div.citation,html.writer-html5 .rst-content dl.citation,html.writer-html5 .rst-content dl.footnote{color:grey}.rst-content table.docutils.footnote code,.rst-content table.docutils.footnote tt,html.writer-html4 .rst-content table.docutils.citation code,html.writer-html4 .rst-content table.docutils.citation tt,html.writer-html5 .rst-content aside.footnote-list aside.footnote code,html.writer-html5 .rst-content aside.footnote-list aside.footnote tt,html.writer-html5 .rst-content aside.footnote code,html.writer-html5 .rst-content aside.footnote tt,html.writer-html5 .rst-content div.citation-list>div.citation code,html.writer-html5 .rst-content div.citation-list>div.citation tt,html.writer-html5 .rst-content dl.citation code,html.writer-html5 .rst-content dl.citation tt,html.writer-html5 .rst-content dl.footnote code,html.writer-html5 .rst-content dl.footnote tt{color:#555}.rst-content .wy-table-responsive.citation,.rst-content .wy-table-responsive.footnote{margin-bottom:0}.rst-content .wy-table-responsive.citation+:not(.citation),.rst-content .wy-table-responsive.footnote+:not(.footnote){margin-top:24px}.rst-content .wy-table-responsive.citation:last-child,.rst-content .wy-table-responsive.footnote:last-child{margin-bottom:24px}.rst-content table.docutils th{border-color:#e1e4e5}html.writer-html5 .rst-content table.docutils th{border:1px solid #e1e4e5}html.writer-html5 .rst-content table.docutils 
td>p,html.writer-html5 .rst-content table.docutils th>p{line-height:1rem;margin-bottom:0;font-size:.9rem}.rst-content table.docutils td .last,.rst-content table.docutils td .last>:last-child{margin-bottom:0}.rst-content table.field-list,.rst-content table.field-list td{border:none}.rst-content table.field-list td p{line-height:inherit}.rst-content table.field-list td>strong{display:inline-block}.rst-content table.field-list .field-name{padding-right:10px;text-align:left;white-space:nowrap}.rst-content table.field-list .field-body{text-align:left}.rst-content code,.rst-content tt{color:#000;font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;padding:2px 5px}.rst-content code big,.rst-content code em,.rst-content tt big,.rst-content tt em{font-size:100%!important;line-height:normal}.rst-content code.literal,.rst-content tt.literal{color:#e74c3c;white-space:normal}.rst-content code.xref,.rst-content tt.xref,a .rst-content code,a .rst-content tt{font-weight:700;color:#404040;overflow-wrap:normal}.rst-content kbd,.rst-content pre,.rst-content samp{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace}.rst-content a code,.rst-content a tt{color:#2980b9}.rst-content dl{margin-bottom:24px}.rst-content dl dt{font-weight:700;margin-bottom:12px}.rst-content dl ol,.rst-content dl p,.rst-content dl table,.rst-content dl ul{margin-bottom:12px}.rst-content dl dd{margin:0 0 12px 24px;line-height:24px}.rst-content dl dd>ol:last-child,.rst-content dl dd>p:last-child,.rst-content dl dd>table:last-child,.rst-content dl dd>ul:last-child{margin-bottom:0}html.writer-html4 .rst-content dl:not(.docutils),html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple){margin-bottom:24px}html.writer-html4 .rst-content dl:not(.docutils)>dt,html.writer-html5 .rst-content 
dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{display:table;margin:6px 0;font-size:90%;line-height:normal;background:#e7f2fa;color:#2980b9;border-top:3px solid #6ab0de;padding:6px;position:relative}html.writer-html4 .rst-content dl:not(.docutils)>dt:before,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:before{color:#6ab0de}html.writer-html4 .rst-content dl:not(.docutils)>dt .headerlink,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt{margin-bottom:6px;border:none;border-left:3px solid #ccc;background:#f0f0f0;color:#555}html.writer-html4 .rst-content dl:not(.docutils) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils)>dt:first-child,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:first-child{margin-top:0}html.writer-html4 .rst-content dl:not(.docutils) code.descclassname,html.writer-html4 .rst-content dl:not(.docutils) 
code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descclassname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{background-color:transparent;border:none;padding:0;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .optional,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .optional{display:inline-block;padding:0 4px;color:#000;font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .property,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .property{display:inline-block;padding-right:8px;max-width:100%}html.writer-html4 .rst-content dl:not(.docutils) .k,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .k{font-style:italic}html.writer-html4 
.rst-content dl:not(.docutils) .descclassname,html.writer-html4 .rst-content dl:not(.docutils) .descname,html.writer-html4 .rst-content dl:not(.docutils) .sig-name,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .sig-name{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#000}.rst-content .viewcode-back,.rst-content .viewcode-link{display:inline-block;color:#27ae60;font-size:80%;padding-left:24px}.rst-content .viewcode-back{display:block;float:right}.rst-content p.rubric{margin-bottom:12px;font-weight:700}.rst-content code.download,.rst-content tt.download{background:inherit;padding:inherit;font-weight:400;font-family:inherit;font-size:inherit;color:inherit;border:inherit;white-space:inherit}.rst-content code.download span:first-child,.rst-content tt.download span:first-child{-webkit-font-smoothing:subpixel-antialiased}.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{margin-right:4px}.rst-content .guilabel{border:1px solid #7fbbe3;background:#e7f2fa;font-size:80%;font-weight:700;border-radius:4px;padding:2.4px 6px;margin:auto 2px}.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>.kbd,.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>kbd{color:inherit;font-size:80%;background-color:#fff;border:1px solid #a6a6a6;border-radius:4px;box-shadow:0 2px grey;padding:2.4px 6px;margin:auto 0}.rst-content .versionmodified{font-style:italic}@media screen and (max-width:480px){.rst-content 
.sidebar{width:100%}}span[id*=MathJax-Span]{color:#404040}.math{text-align:center}@font-face{font-family:Lato;src:url(fonts/lato-normal.woff2?bd03a2cc277bbbc338d464e679fe9942) format("woff2"),url(fonts/lato-normal.woff?27bd77b9162d388cb8d4c4217c7c5e2a) format("woff");font-weight:400;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold.woff2?cccb897485813c7c256901dbca54ecf2) format("woff2"),url(fonts/lato-bold.woff?d878b6c29b10beca227e9eef4246111b) format("woff");font-weight:700;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold-italic.woff2?0b6bb6725576b072c5d0b02ecdd1900d) format("woff2"),url(fonts/lato-bold-italic.woff?9c7e4e9eb485b4a121c760e61bc3707c) format("woff");font-weight:700;font-style:italic;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-normal-italic.woff2?4eb103b4d12be57cb1d040ed5e162e9d) format("woff2"),url(fonts/lato-normal-italic.woff?f28f2d6482446544ef1ea1ccc6dd5892) format("woff");font-weight:400;font-style:italic;font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:400;src:url(fonts/Roboto-Slab-Regular.woff2?7abf5b8d04d26a2cafea937019bca958) format("woff2"),url(fonts/Roboto-Slab-Regular.woff?c1be9284088d487c5e3ff0a10a92e58c) format("woff");font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:700;src:url(fonts/Roboto-Slab-Bold.woff2?9984f4a9bda09be08e83f2506954adbe) format("woff2"),url(fonts/Roboto-Slab-Bold.woff?bed5564a116b05148e3b3bea6fb1162a) format("woff");font-display:block} diff --git a/css/theme_extra.css b/css/theme_extra.css new file mode 100644 index 00000000..9f4b063c --- /dev/null +++ b/css/theme_extra.css @@ -0,0 +1,191 @@ +/* + * Wrap inline code samples otherwise they shoot of the side and + * can't be read at all. 
+ * + * https://github.com/mkdocs/mkdocs/issues/313 + * https://github.com/mkdocs/mkdocs/issues/233 + * https://github.com/mkdocs/mkdocs/issues/834 + */ +.rst-content code { + white-space: pre-wrap; + word-wrap: break-word; + padding: 2px 5px; +} + +/** + * Make code blocks display as blocks and give them the appropriate + * font size and padding. + * + * https://github.com/mkdocs/mkdocs/issues/855 + * https://github.com/mkdocs/mkdocs/issues/834 + * https://github.com/mkdocs/mkdocs/issues/233 + */ +.rst-content pre code { + white-space: pre; + word-wrap: normal; + display: block; + padding: 12px; + font-size: 12px; +} + +/** + * Fix code colors + * + * https://github.com/mkdocs/mkdocs/issues/2027 + */ +.rst-content code { + color: #E74C3C; +} + +.rst-content pre code { + color: #000; + background: #f8f8f8; +} + +/* + * Fix link colors when the link text is inline code. + * + * https://github.com/mkdocs/mkdocs/issues/718 + */ +a code { + color: #2980B9; +} +a:hover code { + color: #3091d1; +} +a:visited code { + color: #9B59B6; +} + +/* + * The CSS classes from highlight.js seem to clash with the + * ReadTheDocs theme causing some code to be incorrectly made + * bold and italic. + * + * https://github.com/mkdocs/mkdocs/issues/411 + */ +pre .cs, pre .c { + font-weight: inherit; + font-style: inherit; +} + +/* + * Fix some issues with the theme and non-highlighted code + * samples. Without and highlighting styles attached the + * formatting is broken. 
+ * + * https://github.com/mkdocs/mkdocs/issues/319 + */ +.rst-content .no-highlight { + display: block; + padding: 0.5em; + color: #333; +} + + +/* + * Additions specific to the search functionality provided by MkDocs + */ + +.search-results { + margin-top: 23px; +} + +.search-results article { + border-top: 1px solid #E1E4E5; + padding-top: 24px; +} + +.search-results article:first-child { + border-top: none; +} + +form .search-query { + width: 100%; + border-radius: 50px; + padding: 6px 12px; /* csslint allow: box-model */ + border-color: #D1D4D5; +} + +/* + * Improve inline code blocks within admonitions. + * + * https://github.com/mkdocs/mkdocs/issues/656 + */ + .rst-content .admonition code { + color: #404040; + border: 1px solid #c7c9cb; + border: 1px solid rgba(0, 0, 0, 0.2); + background: #f8fbfd; + background: rgba(255, 255, 255, 0.7); +} + +/* + * Account for wide tables which go off the side. + * Override borders to avoid weirdness on narrow tables. + * + * https://github.com/mkdocs/mkdocs/issues/834 + * https://github.com/mkdocs/mkdocs/pull/1034 + */ +.rst-content .section .docutils { + width: 100%; + overflow: auto; + display: block; + border: none; +} + +td, th { + border: 1px solid #e1e4e5 !important; /* csslint allow: important */ + border-collapse: collapse; +} + +/* + * Without the following amendments, the navigation in the theme will be + * slightly cut off. This is due to the fact that the .wy-nav-side has a + * padding-bottom of 2em, which must not necessarily align with the font-size of + * 90 % on the .rst-current-version container, combined with the padding of 12px + * above and below. These amendments fix this in two steps: First, make sure the + * .rst-current-version container has a fixed height of 40px, achieved using + * line-height, and then applying a padding-bottom of 40px to this container. In + * a second step, the items within that container are re-aligned using flexbox. 
+ * + * https://github.com/mkdocs/mkdocs/issues/2012 + */ + .wy-nav-side { + padding-bottom: 40px; +} + +/* + * The second step of above amendment: Here we make sure the items are aligned + * correctly within the .rst-current-version container. Using flexbox, we + * achieve it in such a way that it will look like the following: + * + * [No repo_name] + * Next >> // On the first page + * << Previous Next >> // On all subsequent pages + * + * [With repo_name] + * Next >> // On the first page + * << Previous Next >> // On all subsequent pages + * + * https://github.com/mkdocs/mkdocs/issues/2012 + */ +.rst-versions .rst-current-version { + padding: 0 12px; + display: flex; + font-size: initial; + justify-content: space-between; + align-items: center; + line-height: 40px; +} + +/* + * Please note that this amendment also involves removing certain inline-styles + * from the file ./mkdocs/themes/readthedocs/versions.html. + * + * https://github.com/mkdocs/mkdocs/issues/2012 + */ +.rst-current-version span { + flex: 1; + text-align: center; +} diff --git a/dev_guide/contribute/index.html b/dev_guide/contribute/index.html new file mode 100644 index 00000000..9b12444e --- /dev/null +++ b/dev_guide/contribute/index.html @@ -0,0 +1,248 @@ + + + + + + + + Contribute - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Contribute to Donkey

+

Donkey is an open source project to help accelerate the development of +self-driving cars.

+

There is a very good explanation of the DonkeyCar software architecture and theory here

+

Guiding Development Principles

+
    +
  • Modularity: A self driving system is composed of standalone, +independently configurable components that can be combined to make a car.
  • +
  • Minimalism: Each component should be kept short (<100 lines of code). +Each piece of code should be transparent upon first reading. No black magic: +it slows the pace of innovation.
  • +
  • Extensibility: New components should be simple to create by following a +template.
  • +
  • Python: Keep it simple.
  • +
+

These guidelines are borrowed almost verbatim from Keras, + because they are excellent

+

Add a part

+

Are you a hardware specialist who can write a donkey part wrapper for a +GPS unit, or a data scientist who can write a recurrent neural net autopilot? +If so, please write a part so other people driving donkeys can use it. How do parts work? Check out this overview

+

Fix or report a bug

+

If you find a problem with the code and you know how to fix it then please +clone the repo, make your fix, and submit your pull request.

+

Reply to issues

+

Helping to triage and close issues is a good way to contribute.

+

If You Need Inspiration

+

Search the code or docs for TODO to find places where you might be able +to contribute a better solution.

+

Improve the documentation

+

You can fix grammar or improve clarity by clicking the Edit on GitHub +link in the top right corner. Here's a guide to how to create and edit docs.

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + « Previous + + + Next » + + +
+ + + + + + + + diff --git a/dev_guide/docs/index.html b/dev_guide/docs/index.html new file mode 100644 index 00000000..80f8a420 --- /dev/null +++ b/dev_guide/docs/index.html @@ -0,0 +1,221 @@ + + + + + + + + Contributing to the documentation - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Contributing to the documentation

+

Thank you for contributing to the Donkeycar project. The documentation is critical to the success of our users, so we appreciate your contributions. Accuracy and completeness are essential. Many users are beginners, so please write your contributions with this in mind; don't assume that 'they should already know that'.

+

We use the mkdocs package to create the html for the https://docs.donkeycar.com site. The files in the repo are in markdown format; mkdocs compiles those to html so they can be displayed in a browser. You make your changes in your own fork of the donkeydocs repo and open a pull request so it can be merged into the main donkeydocs repo by one of the maintainers. Once the PR is merged then the changes will automatically be compiled and pushed to the https://docs.donkeycar.com site.

+
    +
  1. Fork the donkeydocs repo in your own github account.
  2. +
  3. Clone your fork to your computer so you have a local copy that can be changed.
  4. +
  5. Create a new branch in the clone of your fork; the branch will be used to make the changes/additions. If this is related to an issue in the main repo then start the name of the branch with the issue number.
  6. +
  7. +

    Make the changes/additions and check them in your fork. We use a package called mkdocs to compile the markdown files that you edit/create into html. See the mkdocs documentation for the particulars of the markdown format that it uses. If you install mkdocs you can use it to generate a live preview so you can see the changes as you save them.

    +
      +
    • open a console (git bash console on windows) and cd into the root of your cloned donkeydocs project folder.
    • +
    • create a python virtual environment and activate it.
    • +
    • install the mkdocs package into the activated virtual environment.
    • +
    • run the mkdocs server. This will provide you with a url that you can open in your browser to see the rendered docs.
      +python3 -m venv env +source env/bin/activate +pip3 install mkdocs +mkdocs serve
    • +
  • On subsequent edit sessions, reactivate the virtual environment and start the mkdocs server (you don't need to recreate the environment, only reactivate it).
    • +
    +
  8. +
  9. +

    Once you are done making changes/additions in your branch, commit the changes and push them to your forked repo. If you find that you need to make more changes then just rinse and repeat: make changes, commit them, push them.

    +
  10. +
  11. Once you are sure your updates are correct and pushed to your forked repo, open a pull request. Because you created a fork of the main donkeydocs repo, the pull request can target the main repo. GitHub has a nice feature: after you push to your forked repo, if you go to the repo on github.com you will see a green Compare & Pull Request button; click that to create your pull request.
  12. +
  13. That will open a pull request in the main donkeydocs repo. It's a good idea to alert the maintainers channel on Discord that you have opened a PR so someone will review it. You can expect a couple of rounds of comments during the PR process. If you get a change request, make the changes related to those comments and push them; the new changes will be reflected in the pull request.
  14. +
+

This process is documented in more detail here https://docs.github.com/en/get-started/quickstart/contributing-to-projects

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + + +
+ + + + + + + + diff --git a/dev_guide/model/index.html b/dev_guide/model/index.html new file mode 100644 index 00000000..02c638e6 --- /dev/null +++ b/dev_guide/model/index.html @@ -0,0 +1,498 @@ + + + + + + + + Create your own model - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

How to build your own model

+
+

Note: This requires version >= 4.1.X

+
+ +

Overview

+

You might want to write your own model:

+
    +
  • If you find the models that ship with donkey not sufficient, and you want to + experiment with your own model infrastructure
  • +
  • If you want to add more input data to the model because your car has more + sensors
  • +
+

Constructor

+

Models are located in donkeycar/parts/keras.py. Your own model needs to +inherit from KerasPilot and initialize your model:

+
class KerasSensors(KerasPilot):
+    def __init__(self, input_shape=(120, 160, 3), num_sensors=2):
+        super().__init__()
+        self.num_sensors = num_sensors
+        self.model = self.create_model(input_shape)
+
+

Here, you implement the keras model +in the member function create_model(). The model needs to have labelled input +and output tensors. These are required for the training to work.

+

Training interface

+

For your model to work, the following functions are required:

+
def compile(self):
+    self.model.compile(optimizer=self.optimizer, metrics=['accuracy'],
+                       loss={'angle_out': 'categorical_crossentropy',
+                             'throttle_out': 'categorical_crossentropy'},
+                       loss_weights={'angle_out': 0.5, 'throttle_out': 0.5})
+
+

The compile function tells keras how to define the loss function for training. +We are using the KerasCategorical model as an example. The loss function here +makes explicit usage of the output tensors of the +model (angle_out, throttle_out).

+
def x_transform(self, record: TubRecord):
+    img_arr = record.image(cached=True)
+    return img_arr
+
+

In this function you define how to extract the input data from your +recorded data. This data is usually called X in the ML framework. We have +shown the implementation in the base class, which works for all models that have +only the image as input.

+

The function returns a single data item if the model has only one input. You +need to return a tuple if your model uses more input data.

+

Note: If your model has more inputs, the tuple needs to have the image in +the first position.

+
def y_transform(self, record: TubRecord):
+    angle: float = record.underlying['user/angle']
+    throttle: float = record.underlying['user/throttle']
+    return angle, throttle
+
+

In this function you specify how to extract the y values (i.e. target +values) from your recorded data.

+
def x_translate(self, x: XY) -> Dict[str, Union[float, np.ndarray]]:
+    return {'img_in': x}
+
+

Here we require a translation of how the X value that you extracted above will +be fed into tf.data. Note that tf.data expects a dictionary if the model has +more than one input variable, so we have chosen to use dictionaries also in the +one-argument case for consistency. Above we have shown the implementation in the +base class, which works for all models that have only the image as input. You +don't have to override either x_transform or x_translate if your +model only uses the image as input data.

+

Note: the keys of the dictionary must match the name of the input +layers in the model.

+
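The translate pattern above is just a tuple-to-dict mapping and can be sketched without any framework. This is a hedged, standalone illustration, not donkeycar's actual implementation; the layer names img_in and sensor_in follow the examples in this guide:

```python
import numpy as np

def x_translate(x):
    # tf.data wants a dict keyed by input-layer names when a model has
    # several inputs; for consistency we return a dict even for one input.
    if isinstance(x, tuple):
        img_arr, sensor_arr = x
        return {'img_in': img_arr, 'sensor_in': sensor_arr}
    return {'img_in': x}

single = x_translate(np.zeros((120, 160, 3)))
multi = x_translate((np.zeros((120, 160, 3)), np.zeros(2)))
assert list(single) == ['img_in']
assert sorted(multi) == ['img_in', 'sensor_in']
```

The same shape of function, with the output-layer names as keys, is what y_translate provides.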
def y_translate(self, y: XY) -> Dict[str, Union[float, np.ndarray]]:
+    if isinstance(y, tuple):
+        angle, throttle = y
+        return {'angle_out': angle, 'throttle_out': throttle}
+    else:
+        raise TypeError('Expected tuple')
+
+

Similar to the above, this provides the translation of the y data into the +dictionary required for tf.data. This example shows the implementation of +KerasLinear.

+

Note: the keys of the dictionary must match the name of the output +layers in the model.

+
def output_shapes(self):
+    # need to cut off None from [None, 120, 160, 3] tensor shape
+    img_shape = self.get_input_shape()[1:]
+    shapes = ({'img_in': tf.TensorShape(img_shape)},
+              {'angle_out': tf.TensorShape([15]),
+               'throttle_out': tf.TensorShape([20])})
+    return shapes
+
+

This function returns a tuple of two dictionaries that tells tensorflow which +shapes are used in the model. We have shown the example of the +KerasCategorical model here.

+

Note 1: As above, the keys of the two dictionaries must match the name +of the input and output layers in the model.

+

Note 2: Where the model returns scalar numbers, the corresponding +shape has to be tf.TensorShape([]).

+

Parts interface

+

In the car application the model is called through the run() function. That +function is already provided in the base class, where normalisation of the +input image happens centrally. Derived classes instead have to implement +inference(), which works on the normalised data. If you have additional data +that needs to be normalised too, you might want to override run() as well.

+
def inference(self, img_arr, other_arr):
+    img_arr = img_arr.reshape((1,) + img_arr.shape)
+    outputs = self.model.predict(img_arr)
+    steering = outputs[0]
+    throttle = outputs[1]
+    return steering[0][0], throttle[0][0]
+
+

Here we are showing the implementation of the linear model. Please note that +the input tensor shape always contains the batch dimension in the first +position, hence the shape of the input image is adjusted from +(120, 160, 3) -> (1, 120, 160, 3).

+

Note: If you are passing another array in the other_arr variable, you will +have to do a similar re-shaping.

+
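The batch-dimension re-shaping can be verified with plain numpy; the frame size (120, 160, 3) and a two-element sensor vector are the shapes used throughout this guide:

```python
import numpy as np

# A single camera frame is (120, 160, 3); predict() expects a leading
# batch dimension, i.e. (1, 120, 160, 3). The same trick works for any
# extra input array, e.g. a distance-sensor vector of shape (2,).
img_arr = np.zeros((120, 160, 3))
sensor_arr = np.zeros(2)

img_batch = img_arr.reshape((1,) + img_arr.shape)
sensor_batch = sensor_arr.reshape((1,) + sensor_arr.shape)

assert img_batch.shape == (1, 120, 160, 3)
assert sensor_batch.shape == (1, 2)
```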

Example

+

Let's build a new donkey model which is based on the standard linear model +but has the following changes with respect to input data and network design:

+
    +
  1. +

    The model takes an additional vector of input data that represents a set + of values from distance sensors which are attached to the front of the car.

    +
  2. +
  3. +

    The model adds a couple more feed-forward layers to combine the CNN + layers of the vision system with the distance sensor data.

    +
  4. +
+

Building the model using keras

+

So here is the example model:

+
class KerasSensors(KerasPilot):
+    def __init__(self, input_shape=(120, 160, 3), num_sensors=2):
+        super().__init__()
+        self.num_sensors = num_sensors
+        self.model = self.create_model(input_shape)
+
+    def create_model(self, input_shape):
+        drop = 0.2
+        img_in = Input(shape=input_shape, name='img_in')
+        x = core_cnn_layers(img_in, drop)
+        x = Dense(100, activation='relu', name='dense_1')(x)
+        x = Dropout(drop)(x)
+        x = Dense(50, activation='relu', name='dense_2')(x)
+        x = Dropout(drop)(x)
+        # up to here, this is the standard linear model, now we add the
+        # sensor data to it
+        sensor_in = Input(shape=(self.num_sensors, ), name='sensor_in')
+        y = sensor_in
+        z = concatenate([x, y])
+        # here we add two more dense layers
+        z = Dense(50, activation='relu', name='dense_3')(z)
+        z = Dropout(drop)(z)
+        z = Dense(50, activation='relu', name='dense_4')(z)
+        z = Dropout(drop)(z)
+        # two outputs for angle and throttle
+        outputs = [
+            Dense(1, activation='linear', name='n_outputs' + str(i))(z)
+            for i in range(2)]
+
+        # the model needs to specify the additional input here
+        model = Model(inputs=[img_in, sensor_in], outputs=outputs)
+        return model
+
+    def compile(self):
+        self.model.compile(optimizer=self.optimizer, loss='mse')
+
+    def inference(self, img_arr, other_arr):
+        img_arr = img_arr.reshape((1,) + img_arr.shape)
+        sens_arr = other_arr.reshape((1,) + other_arr.shape)
+        outputs = self.model.predict([img_arr, sens_arr])
+        steering = outputs[0]
+        throttle = outputs[1]
+        return steering[0][0], throttle[0][0]
+
+    def x_transform(self, record: TubRecord) -> XY:
+        img_arr = super().x_transform(record)
+        # for simplicity we assume the sensor data here is normalised
+        sensor_arr = np.array(record.underlying['sensor'])
+        # we need to return the image data first
+        return img_arr, sensor_arr
+
+    def x_translate(self, x: XY) -> Dict[str, Union[float, np.ndarray]]:
+        assert isinstance(x, tuple), 'Requires tuple as input'
+        # the keys are the names of the input layers of the model
+        return {'img_in': x[0], 'sensor_in': x[1]}
+
+    def y_transform(self, record: TubRecord):
+        angle: float = record.underlying['user/angle']
+        throttle: float = record.underlying['user/throttle']
+        return angle, throttle
+
+    def y_translate(self, y: XY) -> Dict[str, Union[float, np.ndarray]]:
+        if isinstance(y, tuple):
+            angle, throttle = y
+            # the keys are the names of the output layers of the model
+            return {'n_outputs0': angle, 'n_outputs1': throttle}
+        else:
+            raise TypeError('Expected tuple')
+
+    def output_shapes(self):
+        # need to cut off None from [None, 120, 160, 3] tensor shape
+        img_shape = self.get_input_shape()[1:]
+        # the keys need to match the models input/output layers
+        shapes = ({'img_in': tf.TensorShape(img_shape),
+                   'sensor_in': tf.TensorShape([self.num_sensors])},
+                  {'n_outputs0': tf.TensorShape([]),
+                   'n_outputs1': tf.TensorShape([])})
+        return shapes
+
+

We could have inherited from KerasLinear which already provides the +implementation of y_transform(), y_translate(), compile(). However, to +make it explicit for the general case we have implemented all functions here. +The model requires the sensor data to be an array in the TubRecord with key +"sensor".

+
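To see what the transform functions above pull out of a record, here is a framework-free sketch using a hypothetical raw record dictionary with the keys this guide assumes ('user/angle', 'user/throttle', 'sensor'):

```python
import numpy as np

# Hypothetical record contents, mirroring the tub format in this guide.
record = {
    'user/angle': 0.1,
    'user/throttle': 0.5,
    'sensor': [0.25, 0.75],
}

# x_transform-style extraction: the sensor list becomes the extra input.
sensor_arr = np.array(record['sensor'])
assert sensor_arr.shape == (2,)

# y_transform / y_translate-style extraction: targets keyed by the
# model's output-layer names.
y = {'n_outputs0': record['user/angle'],
     'n_outputs1': record['user/throttle']}
assert y == {'n_outputs0': 0.1, 'n_outputs1': 0.5}
```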

Creating a tub

+

Because we don't have a tub with sensor data, let's create one with fake +sensor entries:

+
import os
+import tarfile
+import numpy as np
+from donkeycar.parts.tub_v2 import Tub
+from donkeycar.pipeline.types import TubRecord
+from donkeycar.config import load_config
+
+
+if __name__ == '__main__':
+    # put your path to your car app
+    my_car = os.path.expanduser('~/mycar')
+    cfg = load_config(os.path.join(my_car, 'config.py'))
+    # put your path to donkey project
+    tar = tarfile.open(os.path.expanduser(
+        '~/Python/donkeycar/donkeycar/tests/tub/tub.tar.gz'))
+    tub_parent = os.path.join(my_car, 'data2/')
+    tar.extractall(tub_parent)
+    tub_path = os.path.join(tub_parent, 'tub')
+    tub1 = Tub(tub_path)
+    tub2 = Tub(os.path.join(my_car, 'data2/tub_sensor'),
+               inputs=['cam/image_array', 'user/angle', 'user/throttle',
+                       'sensor'],
+               types=['image_array', 'float', 'float', 'list'])
+
+    for record in tub1:
+        t_record = TubRecord(config=cfg,
+                             base_path=tub1.base_path,
+                             underlying=record)
+        img_arr = t_record.image(cached=False)
+        record['sensor'] = list(np.random.uniform(size=2))
+        record['cam/image_array'] = img_arr
+        tub2.write_record(record)
+
+

Making the model available

+

We don't have a dynamic factory yet, so we need to add the new model into the +function get_model_by_type() in the module donkeycar/utils.py:

+
...
+elif model_type == 'sensor':
+    kl = KerasSensors(input_shape=input_shape)
+...
+
+

Go train

+

In your car app folder, the following should now work: +donkey train --tub data2/tub_sensor --model models/pilot.h5 --type sensor +Because of the random values in the data the model will not converge quickly; +the goal here is simply to get it working in the framework.

+

Support and discussions

+

Please join the Discord Donkey Car group for +support and discussions.

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + « Previous + + + Next » + + +
+ + + + + + + + diff --git a/dev_guide/tests/index.html b/dev_guide/tests/index.html new file mode 100644 index 00000000..f6f6de06 --- /dev/null +++ b/dev_guide/tests/index.html @@ -0,0 +1,217 @@ + + + + + + + + Tests - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Tests

+

There is a limited test suite to ensure that your changes to the code +don't break something unintended.

+

Run all the tests

+

Run pytest from the donkeycar project directory.

+

Code Organization

+

The test code is in tests folders in the same folder as the code. This is to +help keep the test code linked to the code itself. If you change the code, +change the tests. :)

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + « Previous + + + Next » + + +
+ + + + + + + + diff --git a/extra.css b/extra.css new file mode 100644 index 00000000..4833e2f4 --- /dev/null +++ b/extra.css @@ -0,0 +1 @@ +blockquote {background: khaki; padding: 0 10px;} diff --git a/guide/build_hardware/index.html b/guide/build_hardware/index.html new file mode 100644 index 00000000..af03d1e0 --- /dev/null +++ b/guide/build_hardware/index.html @@ -0,0 +1,571 @@ + + + + + + + + Build a car. - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

How to Build a Donkey®

+

 

+ +

Overview

+

The latest version of the software installation instructions are maintained in the software instructions section. Be sure to follow those instructions after you've built your car.

+

Choosing a Car

+

There are two main options for cars. One is the WL Toys brand and the second is the Exceed brand.

+

NOTE: only the WL Toys 144010 and HSP-94186 are readily available right now

+

The WL Toys 144010 is probably the easiest car to get right now. It has a brushless motor, which makes it fast, but takes some getting used to for beginners. There are brushed-motor versions of this car, the 144011 and 144001, but both require the user to replace both the steering servo and the ESC. Only do this if you are familiar with RC or enjoy tinkering. Here is a short video explaining how to assemble the car. You can find the adapters on Thingiverse, or if you would like to buy them you can do so on the donkey car store

+

The alternate car, which often has slightly less availability, is the HSP 94186 and the "Exceed" brand cars. There are 5 supported cars; all are very similar and should be considered equivalent. Note that some of these are often out of stock, so go through the links to find one that is in stock. If they are out of stock on Amazon, you can find the cars at the Exceed Website. The HSP-94186 is identical to the Exceed Magnet 1/16 Truck; it can be found on AliExpress but takes about a month to get to the US, and there are local options that charge a premium.

+ +

These cars are electrically identical but have different tires, mounting and other details. It is worth noting that the Desert Monster, Short Course Truck and Blaze all require adapters which can be easily printed or purchased from the donkey store. These are the standard build cars because they are mostly plug and play, have a brushed motor which makes training easier, handle rough driving surfaces well, and are inexpensive.

+

Here is a video overview of the different cars (Excluding the WL Toys car) and how to assemble them.

+

For advanced users there are 2 more cars supported under the "Donkey Pro" name. These are 1/10 scale cars which means that they are bigger, perform a little better and are slightly more expensive. They can be found here:

+
    +
  • HobbyKing Mission-D found here
  • +
  • Tamiya TT-01 or a clone (commonly used knockoff found here) - found worldwide but usually has to be built as a kit. The other cars are ready to be donkified; this one, however, is harder to assemble.
  • +
+

Here is a video that goes over the different models. The Donkey Pro models are not yet very well documented, just a word of warning.

+

For more detail and other options, follow the link to: supported cars

+

donkey

+

Roll Your Own Car

+

Alternatively, if you know RC or need something the standard Donkey does not support, you can roll your own. Here is a quick reference to help you along the way: Roll Your Own

+

Video Overview of Hardware Assembly

+

This video covers how to assemble a standard Donkey Car; it also covers the Sombrero, the Raspberry Pi and the NVIDIA Jetson Nano.

+

Hardware assembly video

+

Parts Needed

+

The following instructions are for the Raspberry Pi; the NVIDIA Jetson Nano instructions can be found below in the Optional Upgrades section.

+

Option 1: Buying through an official Donkey Store

+

There are two official stores:

+

If you are in the US, you can use the Donkey store. The intention of the Donkey Store is to make it easier and less expensive to build the Donkey Car. The Donkey Store is run by the original founders of Donkey Car and profits are used to fund development of the donkey cars. Also, it is worth noting that the design of the parts from the Donkey Store is slightly improved over the standard build, as it uses better parts that are only available in large quantities or are harder to get. The Donkey Store builds are open source like all others.

+

If you are in Asia, the DIYRobocars community in Hong Kong also sells car kits at Robocar Store. They are long term Donkey community members and use proceeds to support the R&D efforts of this project. It is worth noting they can also sell to Europe and the US but it is likely less cost effective.

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Part DescriptionLinkApproximate Cost
WL Toys 144010, Exceed Magnet, Desert Monster, Blaze, or Short Course TruckSee links above$100-130
USB Battery with microUSB cable (any battery capable of 2A 5V output is sufficient)Anker 10,000 mAh$39
Raspberry Pi 3b+Pi 3b+$42
MicroSD Card (many will work, we strongly recommend this one)64GB https://amzn.to/2XP7UAa$11.99
Donkey Partial KitKIT$82 to $125
+

Option 2: Bottoms Up Build

+

If you want to buy the parts yourself, want to customize your donkey, or live outside of the US, you may want to choose the bottoms-up build. Keep in mind you will have to print the Donkey Car parts, which can be found here

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Part DescriptionLinkApproximate Cost
Magnet Car or alternativesee cars above under 'choosing a car'$92
M2x6 screws (8)Amazon or Donkey Store$4.89 *
M3x10 screws (3)Amazon or Donkey Store$7.89 *
USB Battery with microUSB cable (any battery capable of 2A 5V output is sufficient)Anker 10,000 mAh$39
Raspberry Pi 3b+Pi 3B+$38
MicroSD Card (many will work, I like this one because it boots quickly)64GB$18.99
Wide Angle Raspberry Pi CameraAmazon or Donkey Store$25
Female to Female Jumper WireAmazon or Donkey Car Store$7 *
(Optional if you don't want to use RPi GPIO pins to control the car's servo and throttle directly) Servo Driver PCA 9685Amazon or Donkey Car Store$12 **
3D Printed roll cage and top plate.Purchase: Donkey Store Files: thingiverse.com/thing:2260575$50
+

* If it is hard to find these components, there is some wiggle room. Instead of an M2 you can use an M2.2, M2.3 or #4 SAE screw. Instead of an M3, a #6 SAE screw can be used. Machine screws can be used in a pinch.

+

** This component can be purchased from Ali Express for ~$2-4 if you can wait the 30-60 days for shipping.

+

Optional Upgrades

+
    +
  • NVIDIA Jetson Nano Hardware Options The NVIDIA Jetson Nano is fully supported by Donkey Car. To assemble the Donkey Car you will need a few parts including the WiFi card, antennas and camera. In addition you will need this Adapter. If you want to print it yourself, it is on the Thingiverse page for the project. Due to the higher power consumption, you should consider the 10Ahr 3A USB battery pack listed below and a good cable rated for 3A.
  • +
+

adapter

+

Plug in the servo driver the same way as on the Raspberry Pi; just keep in mind that the Jetson pinout is reversed and that the Sombrero is not supported.

+

Jetson Servo

+

Finally, this is the assembled Donkey.

+

Jetson Assembled

+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Part DescriptionLinkApproximate Cost
Nvidia Jetson NanoAmazon$99
Jetson Nano AdapterDonkey Store$7
Camera ModuleDonkey Store$27
WiFi CardAmazon$18
AntennasDonkey Store$7
+

For other part options, feel free to look at the JetBot documentation here.

+
    +
  • Sombrero Hat NOTE: the Sombrero is out of stock at all stores - we are looking at other options or will place another order. The Sombrero hat replaces the servo driver and the USB battery and can be purchased at the Donkeycar store here; video instructions can be found here. Using the Sombrero hat requires a LiPo battery (see below). Documentation is in GitHub.
  • +
+

sombrero

+
    +
  • LiPo Battery and Accessories: LiPo batteries have significantly better energy density and have a better dropoff curve. See below (courtesy of Traxxas).
  • +
+

donkey

+ + + + + + + + + + + + + + + + + + + + + + + + + +
Part DescriptionLinkApproximate Cost
LiPo Batteryhobbyking.com/en_us/turnigy-1800mah-2s-20c-lipo-pack.html or amazon.com/gp/product/B0072AERBE/$8.94 to $~17
Lipo Charger (takes 1hr to charge the above battery)charger$13
Lipo Battery Case (to prevent damage if they explode)lipo safe$8
+

Hardware

+

If you purchased parts from the Donkey Car Store, skip to step 3.

+

Step 1: Print Parts

+

If you do not have a 3D printer, you can order parts from the Donkey Store, Shapeways or 3DHubs. Remember that you need to print the adapters unless you have a "Magnet".

+

I printed parts in black PLA, with .3mm layer height with a .5mm nozzle and no supports. The top roll bar is designed to be printed upside down.

+

Step 2: Clean up parts

+

Almost all 3D Printed parts will need clean up. Re-drill holes, and clean up excess plastic.

+

donkey

+

In particular, clean up the slots in the side of the roll bar, as shown in the picture below:

+

donkey

+

Step 3: Assemble Top plate and Roll Cage

+

If you have an Exceed Short Course Truck, Blaze or Desert Monster watch this video

+

This is a relatively simple assembly step. Just use the 3mm self-tapping screws to screw the plate to the roll cage.

+

When attaching the roll cage to the top plate, ensure that the nubs on the top plate face the roll-cage. This will ensure the equipment you mount to the top plate fits easily.

+

Step 4: Connect Servo Shield to Raspberry Pi

+

The PCA9685 Servo controller can control up to 16 PWM devices like servos, motor controllers, LEDs or almost anything that uses a PWM signal. It is connected to the RaspberryPi (or Jetson Nano) 40 pin GPIO bus via the I2C pins.

+
    +
  • GPIO I2C bus 1
      +
    • SDA is board pin 03
    • +
    • SCL is board pin 05
    • +
    +
  • +
  • Wiring
      +
    • SDA and SCL may be through a shared bus rather than a direct connection between nano and PCA9685 if other devices are using the I2C bus (like an OLED display)
    • +
    • 3.3v VCC power may be provided by a 3.3v pin on the GPIO bus (typically board pin 01).
    • +
    • 5v VIN should NOT be provided by the GPIO bus because motors/servos may draw too much power. Most Electronic Speed Controllers actually provide the necessary power via the 3 pin cables that get plugged into the PCA9685, so it is generally not necessary to provide power directly to VIN.
    • +
    • All GND must be common ground. On the GPIO it is usually easiest to use GPIO board pin 09 for ground. Once again the 3 pin cables from the ESC carry ground and the PCA9685 connects this to the GPIO via the GND pin.
    • +
    +
  • +
+
---
+    GPIO   ... PCA9685  ... 5v ... ESC ... Servo
+    3v3-01 <---> VCC
+    pin-03 <---> SDA
+    pin-05 <---> SCL
+    GND-09 <---> GND 
+                 VIN  <---> 5v   optional, see above
+                 GND  <---> GND
+                 CH-0 <---------> ESC
+                 CH-1 <------------------> Servo
+---
+
+
    +
  • checking connections
      +
    • The PCA9685 should appear on I2C bus 1 at address 0x40
    • +
    • ssh into the Donkeycar and use i2cdetect to read bus 1. A device should exist at address 0x40
    • +
    +
  • +
+
---
+    $ i2cdetect -y -r 1
+         0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
+    00:          -- -- -- -- -- -- -- -- -- -- -- -- -- 
+    10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- 
+    20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- 
+    30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- 
+    40: 40 -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- 
+    50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- 
+    60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- 
+    70: UU -- -- -- -- -- -- --                         
+---
+
+

You could do this after attaching the Raspberry Pi to the bottom plate; I just think it is easier to see the parts when they are laying on the workbench. Connect the parts as you see below:

+

donkey

+

For reference, below is the Raspberry Pi pinout. You will notice we connect to 3.3v, the two I2C pins (SDA and SCL), and ground:

+

donkey

+

Step 5: Attach Raspberry Pi to 3D Printed bottom plate

+

Before you start, now is a good time to insert the already flashed SD card and bench test the electronics. Once that is done, attaching the Raspberry Pi and Servo is as simple as running screws through the board into the screw bosses on the top plate. The M2.5x12mm screws should be the perfect length to go through the board, the plastic and still have room for a washer. The “cap” part of the screw should be facing up and the nut should be on the bottom of the top plate. The ethernet and USB ports should face forward. This is important as it gives you access to the SD card and makes the camera ribbon cable line up properly.

+

Attach the USB battery to the underside of the printed bottom plate using cable ties or velcro.

+

donkey

+

Step 6: Attach Camera

+

Slip the camera into the slot, cable end first. However, be careful not to push on the camera lens and instead press the board. +donkey

+

If you need to remove the camera, the temptation is to push on the lens; instead, push on the connector as shown in these pictures.
+donkey donkey

+

Before using the car, remove the plastic film or lens cover from the camera lens.

+

donkey

+

It is easy to put the camera cable in the wrong way, so look at these photos and make sure the cable is put in properly. There are loads of tutorials on YouTube if you are not used to this.

+

donkey

+

Step 7: Put it all together

+

*** Note: if you have a Desert Monster chassis, see Step 7b below *** The final steps are straightforward. First, attach the roll bar assembly to the car. This is done using the same pins that came with the vehicle.

+

donkey

+

Second, run the servo cables up to the car. The throttle cable runs to channel 0 on the servo controller and the steering cable to channel 1.

+

donkey

+

Now you are done with the hardware!!

+

Step 7b: Attach Adapters (Desert Monster only)

+

The Desert Monster does not have the same setup for holding the body on the car and needs the two adapters mentioned above. To attach the adapters, first remove the existing adapter from the chassis and screw on the custom adapter with the same screws, as shown in this photo:

+

adapter

+

Once this is done, go back to Step 7.

+

Software

+

Congrats! Now, to get your car moving, see the software instructions section.

+

donkey

+
+

We are a participant in the Amazon Services LLC Associates Program, an affiliate advertising program designed to provide a means for us to earn fees by linking to Amazon.com and affiliated sites.

+
+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + « Previous + + + Next » + + +
+ + + + + + + + diff --git a/guide/calibrate/index.html b/guide/calibrate/index.html new file mode 100644 index 00000000..c3ac7e74 --- /dev/null +++ b/guide/calibrate/index.html @@ -0,0 +1,315 @@ + + + + + + + + Calibrate steering and throttle. - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Calibrate your Car

+

The point of calibrating your car is to make it drive consistently. If you have a steering servo then donkey needs to know the PWM values associated with full left and full right turns. If you have an ESC, then donkey needs to know the PWM values for full forward throttle, stopped, and full reverse throttle. You figure out those values in the calibration process, then save them to your myconfig.py file so they can be used when the car is driving.

+

Some kinds of drivetrains do not need to be calibrated. If you are using any drivetrain that uses an L298N motor controller or similar (rather than an ESC), then no calibration is necessary; those drivetrains do not use servo-style PWM pulses; they use a duty cycle that does not need to be calibrated. Most of the differential drivetrains (those whose name begins with DC_TWO_WHEEL) are of that type. If your drivetrain uses an L298N motor controller or similar for throttle, but uses a servo for steering, then you will need to calibrate steering, but not throttle.

+

There is a more complete discussion of drivetrains in Actuators

+

How to adjust your car's settings

+
+

You will need to ssh into your Pi to do the calibration.

+
+

All of the car's default settings are in the config.py. You can override the default settings by editing the myconfig.py script in your car directory. This was generated when you ran the donkey createcar --path ~/mycar command. You can edit this file on your car by running:

+
nano ~/mycar/myconfig.py
+
+

Steering Calibration

+
+

Make sure your car is off the ground to prevent a runaway situation.

+
+
    +
  1. Turn on your car.
  2. +
  3. Find the servo's 3-pin cable and make sure it is connected properly to the PWM output pins.
  4. +
  5. Run donkey calibrate ... and provide it with arguments to specify which pin will produce the PWM
      +
    • When calibrating a drivetrain that uses pin specifiers, like PWM_STEERING_THROTTLE, then use --pwm-pin argument to specify the target pin, like RPI_GPIO.BOARD.33 or PCA9685.1:40.13. If you are using the Donkey Hat then you would use donkey calibrate --pwm-pin=PIGPIO.BCM.13 to calibrate steering. See Pins for a more complete discussion of pins and pin specifiers.
    • +
    • When using a legacy PCA9685 drivetrain, like I2C_SERVO, then specify the PCA9685 channel (the index of the 3-pin connector that the cable is connected to) and the I2C bus the PCA9685 is connected to; donkey calibrate --channel <your_steering_channel> --bus=<your_i2c_bus>
    • +
    +
  6. +
  7. First find the value that turns the tires all the way to the left extreme. When calibrating steering you want to choose the value that just turns the wheels to the maximum; the wheels should turn all the way but the servo should NOT make a whining noise. Try the value 360 and you should see the wheels on your car move slightly. If not try 400 or 300. Next enter values +/- 10 from your starting value to find the PWM setting that makes your car turn all the way left, again making sure the motor is not making a whining sound. Remember this value.
  8. +
  9. Next find the value that turns the tires all the way to the right extreme. Enter values +/- 10 from your starting value to find the PWM setting that makes your car turn all the way right, again making sure the motor is not making a whining sound. Remember this value.
  10. +
+

Edit the myconfig.py script on your car and enter these values as STEERING_LEFT_PWM and STEERING_RIGHT_PWM respectively.

+
    +
  • STEERING_LEFT_PWM = PWM for full left turn
  • +
  • STEERING_RIGHT_PWM = PWM value for full right turn
  • +
+

Throttle Calibration

+
    +
  1. Find the ESC's 3-pin cable and make sure it is connected properly to the PWM output pins.
  2. +
  3. Run donkey calibrate ... and provide it with arguments to specify which pin will produce the PWM
      +
    • When calibrating a drivetrain that uses pin specifiers, like PWM_STEERING_THROTTLE, then use --pwm-pin argument to specify the target pin, like RPI_GPIO.BOARD.33 or PCA9685.1:40.13. If you are using the Donkey Hat then you would use donkey calibrate --pwm-pin=PIGPIO.BCM.18 to calibrate throttle. See Pins for a more complete discussion of pins and pin specifiers.
    • +
    • When using a legacy PCA9685 drivetrain, like I2C_SERVO, then specify the PCA9685 channel (the index of the 3-pin connector that the cable is connected to) and the I2C bus the PCA9685 is connected to; donkey calibrate --channel <your_throttle_channel> --bus=<your_i2c_bus>
    • +
    +
  4. +
  5. Enter 370 when prompted for a PWM value.
  6. +
  7. You should hear your ESC beep indicating that it's calibrated.
  8. +
  9. Enter 400 and you should see your car's wheels start to go forward. If not, it's likely that this is reverse; try entering 330 instead.
  10. +
  11. Keep trying different values until you've found a reasonable max speed and remember this PWM value.
  12. +
+

Reverse on RC cars is a little tricky because the ESC must receive a reverse pulse, zero pulse, reverse pulse to start to go backwards. To calibrate a reverse PWM setting...

+
    +
  1. Using the same technique as above, set the PWM setting to your zero-throttle value.
  2. +
  3. Enter the reverse value, then the zero throttle value, then the reverse value again.
  4. +
  5. Enter values +/- 10 of the reverse value to find a reasonable reverse speed. Remember this reverse PWM value.
  6. +
+

Now open your myconfig.py script and enter the PWM values for your car into the throttle_controller part:

+
    +
  • THROTTLE_FORWARD_PWM = PWM value for full throttle forward
  • +
  • THROTTLE_STOPPED_PWM = PWM value for zero throttle
  • +
  • THROTTLE_REVERSE_PWM = PWM value at full reverse throttle
  • +
+
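Putting the steering and throttle results together, the relevant section of myconfig.py ends up looking something like this (the numbers below are examples only; substitute the values you found during calibration):

```python
# myconfig.py excerpt -- example values, use your own calibration results
STEERING_LEFT_PWM = 460      # PWM value for full left turn
STEERING_RIGHT_PWM = 290     # PWM value for full right turn

THROTTLE_FORWARD_PWM = 400   # PWM value for full forward throttle
THROTTLE_STOPPED_PWM = 370   # PWM value for zero throttle
THROTTLE_REVERSE_PWM = 330   # PWM value for full reverse throttle
```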

Fine tuning your calibration

+

fine calibration

+

Now that you have your car roughly calibrated you can try driving it to verify that it drives as expected. Here's how to fine tune your car's calibration.

+

First and most importantly, make sure your car goes perfectly straight when no steering input is applied.

+
    +
  1. Start your car by running python manage.py drive.
  2. +
  3. Go to <your_cars_hostname.local>:8887 in a browser.
  4. +
  5. Press the i key on your keyboard a couple of times to get the car to move forward. This is best done with your car on a very flat floor with some kind of grid, so you can gauge whether it is going straight. Be careful not to confuse driving off at an angle with driving along an arc. Driving at an angle may simply mean you pointed the car at an angle when starting it; driving a curved arc indicates the car is steering.
  6. +
  7. If your car tends to turn left without steering applied then update the STEERING_LEFT_PWM in your myconfig.py file so it is closer to neutral. For example, if STEERING_LEFT_PWM is 460 and STEERING_RIGHT_PWM is 290, then reduce STEERING_LEFT_PWM a little, maybe 458.
  8. +
  9. If your car tends to steer right with no steering applied, then update STEERING_RIGHT_PWM in your myconfig.py file so it is closer to neutral. For example, if STEERING_LEFT_PWM is 460 and STEERING_RIGHT_PWM is 290, then increase STEERING_RIGHT_PWM a little, maybe 292.
  10. +
  11. Repeat this process a couple of times until you have your car driving straight.
  12. +
+

Next, try to make it so that a full left turn and a full right turn are the same turn angle (they make the same diameter circle when driven all the way around).

+
+

Note: optional

+
+
    +
  1. Start your car by running python manage.py drive.
  2. +
  3. Go to <your_cars_hostname.local>:8887 in a browser.
  4. +
  5. Press j until the car's steering is all the way right.
  6. +
  7. Press i a couple of times to get the car to go forward.
  8. +
  9. Measure the diameter of the turn and record it on a spreadsheet.
  10. +
  11. Repeat this measurement for different steering values for turning each direction.
  12. +
  13. Chart these so you can see if your car turns the same in each direction.
  14. +
+

Corrections:

+
    +
  • If your car turns the same amount at an 80% turn and a 100% turn, change the PWM setting for that turn direction to be the PWM value at 80%.
  • +
  • If your car is biased to turn one direction, change the PWM values of your turns in the opposite direction of the bias.
  • +
+

After you've fine tuned your car the steering chart should look something like this.

+

calibration graph

+

You may need to iterate making sure the car is driving straight and that the left and right turns are the same to get those both to work. Prioritize making sure the car drives straight.

+

Next let's get driving!

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + « Previous + + + Next » + + +
+ + + + + + + + diff --git a/guide/computer_vision/computer_vision/index.html b/guide/computer_vision/computer_vision/index.html new file mode 100644 index 00000000..3a8247e6 --- /dev/null +++ b/guide/computer_vision/computer_vision/index.html @@ -0,0 +1,685 @@ + + + + + + + + Computer Vision Autopilot - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Computer Vision Autopilot

+

The computer vision autopilot, like the deep learning autopilot, interprets camera images in order to determine steering and throttle values. However, rather than deep learning models, the computer vision autopilot utilizes traditional computer vision algorithms, such as Canny edge detection, to interpret images of the track. The computer vision autopilot is specifically designed to make it easy to write your own algorithm and use it in place of the built-in algorithm.

+

The Computer Vision Autopilot

+

The built-in algorithm is a line following algorithm; it expects the track to have a center line, preferably solid, that it can detect. The expected color of the line can be tuned with configuration; by default it expects a yellow line. The algorithm calculates the distance of the line from the center of the image, then a PID controller uses that value to calculate a steering value. If the car is to the left of the line then it will turn right. If the car is to the right of the line then it will turn left. The chosen steering angle is proportional to the distance from the line. The chosen throttle is inversely proportional to the steering angle so that the car will go faster on a straight path and slow down for turns. More details on the algorithm and the configuration parameters are discussed below.

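The proportional relationships described above can be sketched as follows (function names, gains and limits here are illustrative, not donkeycar's actual API; the real autopilot uses a full PID controller):

```python
# Sketch of the steering/throttle relationship described above.
# Gains and limits are illustrative assumptions, not donkeycar defaults.
def steering_from_offset(line_x, target_x, gain=0.01):
    # Steering is proportional to the line's distance from the target:
    # line left of target -> steer left (negative), right -> steer right.
    error = line_x - target_x
    return max(-1.0, min(1.0, gain * error))

def throttle_from_steering(steering, t_min=0.2, t_max=0.5):
    # Throttle is inversely related to steering magnitude:
    # full speed when driving straight, slower in turns.
    return t_max - abs(steering) * (t_max - t_min)
```

With these numbers, a line 100 pixels right of the target saturates steering at full right and drops throttle to its minimum.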
+

But what if your track does not have a center line; what if it just has left and right lane boundary lines? What if it is a sidewalk? What if you simply want to implement your own algorithm? The computer vision template is designed to make that pretty easy. You can write your own part in Python to use as the autopilot and simply change the configuration in your myconfig.py to point to it. Your part can utilize the computer vision parts in cv.py, or you can call OpenCV's Python API directly. We present a simplified example below.

+
+
+

IMPORTANT: The computer vision template requires that OpenCV is installed. OpenCV is pre-installed on the Jetson Nano, but it must be explicitly installed on the Raspberry Pi. See Raspberry Pi installation Step 9 and Step 11.

+
+
+

Create a computer vision Application

+

You can create a computer vision application similarly to how we create a deep learning application; we just tell it to use the cv_control template instead of the default template. First, make sure your donkeycar Python environment is activated, then use the createcar command to create your application folder.

+
donkey createcar --template=cv_control --path=~/mycar
+
+

When updating to a new version of donkeycar, you will want to refresh your application folder. You can do this with the same command, but add --overwrite so that it does not erase your myconfig.py file.

+
donkey createcar --template=cv_control --path=~/mycar --overwrite
+
+

The Line Follower

+

The built-in algorithm can follow a line using the camera. By default it is tuned for a yellow line, but the color that it tracks can be configured, and many other aspects of the algorithm can be tuned. Below is a description of the algorithm and how it uses the configuration values. The values themselves are listed and described afterwards.

+
    +
  1. If TARGET_PIXEL is None, then use steps 1 to 5 to estimate the target (the expected) position of the line.
  2. +
  3. Copy the image rows at SCAN_Y and SCAN_HEIGHT pixels height. So the result is a block of pixels as wide as the image and SCAN_HEIGHT high.
  4. +
  5. Convert the pixels from RGB (red-green-blue) color space to HSV (hue-saturation-value) color space.
  6. +
  7. The algorithm then identifies all the pixels in the block that have an HSV color between COLOR_THRESHOLD_LOW and COLOR_THRESHOLD_HIGH.
  8. +
  9. Once the pixels with the target color are isolated, a histogram is built that counts the matching pixels from left to right for each slice 1 pixel wide by SCAN_HEIGHT pixels high.
  10. +
  11. The x value (horizontal offset) of the slice with the most yellow pixels is chosen. This is where the algorithm thinks the yellow line is.
  12. +
  13. The difference between this x-value and the TARGET_PIXEL value is used by the PID algorithm to calculate a new steering value. If the value is to the left of TARGET_PIXEL by more than TARGET_THRESHOLD pixels then the car steers right; if the value is to the right of TARGET_PIXEL by more than TARGET_THRESHOLD pixels then the car steers left. If the value is within TARGET_THRESHOLD pixels of TARGET_PIXEL then steering is not changed.
  14. +
  15. The steering value is used to decide if the car should speed up or slow down. If steering is not changed then the throttle is increased by THROTTLE_STEP, but not over THROTTLE_MAX. If steering is changed then throttle is decreased by THROTTLE_STEP, but not below THROTTLE_MIN.
  16. +
+
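The mask-and-histogram core of the steps above can be sketched with numpy (names here are illustrative; donkeycar's LineFollower implements this with OpenCV calls):

```python
import numpy as np

# Simplified sketch of the color-mask + histogram steps described above.
# hsv_block is a SCAN_HEIGHT x width x 3 array already converted to HSV
# (the real part does the conversion with OpenCV; names are illustrative).
def find_line_x(hsv_block, color_low, color_high):
    low = np.array(color_low)
    high = np.array(color_high)
    # mask is True where a pixel's H, S and V all fall inside the range
    mask = np.all((hsv_block >= low) & (hsv_block <= high), axis=-1)
    # histogram: count of in-range pixels in each 1-pixel-wide column
    counts = mask.sum(axis=0)
    if counts.max() == 0:
        return None  # no line detected in the scan band
    return int(np.argmax(counts))  # column with the most line pixels

# synthetic scan band: 4 rows x 10 columns, yellow-ish line at x = 6
block = np.zeros((4, 10, 3), dtype=np.uint8)
block[:, 6] = (30, 200, 200)  # OpenCV-style HSV "yellow"
line_x = find_line_x(block, (20, 100, 100), (40, 255, 255))  # -> 6
```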
+
+

This pyimagesearch article and accompanying video describe the various color spaces available and OpenCV and their characteristics.

+
+
+

The complete source code is provided and discussed in the LineFollower class section near the end of this page.

+

Camera Setup

+

The image at the top of the page shows the camera set up approximately how it would be using the standard Donkeycar cage. It is angled to see to the horizon so that it can see turns from far away. This is good when going very fast because you can see far ahead. However, if the detected line is very thin then it could have artifacting (noise) that could lead to false positives that push the vehicle off the line. If you are not going fast and you want to be as accurate as possible, then pointing the camera down at the line is a good idea. So if your camera can be adjusted, you can make trade-offs between accuracy (point it down) and speed (point it to the horizon).

+

Choosing Parameters for the LineFollower

+

The computer vision template is a little different than the deep learning and path follow templates; there is no data recording. After setting your configuration parameters you just put your car on the track that has the line that you want to follow and then change from user mode to one of the auto-pilot modes; full-auto or auto-steering. The complete set of configuration parameters can be found in the LineFollower Configuration section below; we will discuss the most important configuration in more detail in this section.

+

SCAN_Y and SCAN_HEIGHT

+

The rectangular area that will be scanned for the line, called the detection area, is determined with the SCAN_Y and SCAN_HEIGHT.

+

When in autopilot mode, the LineFollower shows the detection area as a horizontal black bar. Pixels that fall within the color threshold range (see the next section) are drawn as white pixels. Ideally, only the pixels in the line that are in the detection bar will show as white; any white pixels that are NOT part of the line that you want to follow are considered false positives. If the false positives are relatively disperse then they should not interfere with detecting the line. However, if there are big areas of white false positives then they might trick the algorithm. See the next section for how to adjust the color threshold range to minimize false positives.

+

The image below shows the detection area and the detected line.

+

The Detection Area

+

COLOR_THRESHOLD_LOW, COLOR_THRESHOLD_HIGH

+

The color threshold values represent the range of colors used to detect the line; they should be chosen to include the colors of the line in the area where it passes through the detection bar, and ideally they should not include any other colors. The color threshold values are in HSV color space (Hue, Saturation, Value) format, not RGB format. RGB color space is how a computer shows colors; HSV color space is closer to how humans perceive color. For our purposes the 'hue' part is the 'pure' color without regard for shadows or lighting. This makes it easier to find a color because it is one number rather than a combination of 3 numbers.

+
+
+

There are many online converters between RGB and HSV. This one, peko-step, was used when creating this documentation; I like that tool because it will allow the Saturation and Value to be output in the range of 0 to 255, which is what we need. IMPORTANT: The online tools use the standard way of representing HSV, which is a Hue value of 0 to 359 degrees, Saturation of 0 to 100%, and Value of 0 to 100%. OpenCV, which our code is based on, uses a Hue value of 0 to 179, Saturation of 0 to 255 and Value of 0 to 255; so be aware that you may need to convert from the tool's values to the OpenCV values when changing these configurations.

+
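That conversion can be sketched as follows (the helper name is my own, not part of donkeycar or the online tool):

```python
def tool_hsv_to_opencv(h_deg, s_pct, v_pct):
    # Convert from the common online-tool HSV representation
    # (H: 0-359 degrees, S: 0-100%, V: 0-100%)
    # to OpenCV's ranges (H: 0-179, S: 0-255, V: 0-255).
    return (h_deg // 2, round(s_pct * 255 / 100), round(v_pct * 255 / 100))

# e.g. a pure yellow of H=60, S=100%, V=100% becomes (30, 255, 255) in OpenCV
```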
+
+

When choosing the threshold colors it is important to take into account what the camera will see, including the lighting conditions. Donkeycar includes a script to make this easy to do. The hsv_picker.sh script allows you to view the live camera image or, alternatively, a static image. So if you are running a desktop image on your car (not a server or headless image) then you can run the script and view the camera image. If you do not have a desktop on the car, then you can run the car, open the web view in a browser on your host laptop, and take a screenshot to save it; then use that static image with the hsv_picker.sh script on your laptop. In either case, arrange the car on the course so it can see the line as it would when driving in autopilot, so you are getting a realistic view.


You can run the hsv_picker.sh script on a screenshot image; with the donkey python environment activated, run the script from the root of your donkeycar repo folder:

python scripts/hsv_picker.sh --file=<path-to-image>

To view the camera stream, again with the donkey python environment activated, run the script from the root of your donkeycar repo folder:

python scripts/hsv_picker.sh

If you have more than one camera and the script is not showing the correct one, you can choose the camera index and/or set the image size:

python scripts/hsv_picker.sh --camera=2  --width=320 --height=240

A screenshot in the hsv_picker.sh script


The image above shows the hsv_picker.sh script with a web ui screenshot loaded. The blue line in the center of the image is the line that we want to follow. The horizontal black bar in the camera image is the detection bar; it is defined by SCAN_Y and SCAN_HEIGHT and is the area where the mask is applied to try to isolate the pixels of the line. When pixels are detected they will be drawn in white in the detection area.


The bottom of the screen has 6 trackbars to select the 3 parts of the low HSV value and the 3 parts of the high HSV value that are used to create a mask to pull out the pixels of the line. You can move those trackbars manually to try to find the best values for the detection range. As you change the trackbars the resulting mask is applied to the image and you will see pixels in the detection bar turn white where they fall within the mask. The Hue value is typically the most important. You can reset the trackbars and clear the mask at any time by pressing the Escape key.


Using the trackbars works, but there is an easier way: you can select a rectangular area by clicking, dragging and releasing; the pixels in that area will be scanned for their low and high HSV values and the trackbars will be updated accordingly. So the easiest way to find the mask for the line is to select a rectangular area on the line itself. You can then fine-tune the selected mask using the trackbars.
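
The idea behind the click-drag selection can be sketched in a few lines (an illustration, not the actual hsv_picker.sh code): collect the HSV pixels inside the selected rectangle and take the per-channel minimum and maximum as the mask bounds.

```python
def hsv_range(pixels):
    """Given a list of (h, s, v) pixels from the selected rectangle,
    return the (low, high) per-channel bounds for the mask."""
    low = tuple(min(p[i] for p in pixels) for i in range(3))
    high = tuple(max(p[i] for p in pixels) for i in range(3))
    return low, high

# HSV pixels sampled from a selection on a blue line (OpenCV ranges)
selection = [(105, 180, 200), (110, 200, 220), (108, 190, 210)]
print(hsv_range(selection))  # ((105, 180, 200), (110, 200, 220))
```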


The image below shows the mask that was created by selecting a rectangular area within the blue line.


A masked screenshot in the hsv_picker.sh script


Features of hsv_picker.sh:

  • Change the low and high mask values using the trackbars at the bottom of the screen.
  • Set the low and high mask values by selecting a rectangular area on the image using click-drag-release.
  • Press the Escape key to clear the mask.
  • Press the 'p' key to print the current mask values to the console.
  • Press the 'q' key to print the final mask values and quit.

TARGET_PIXEL


The TARGET_PIXEL value is the expected horizontal position of the line to follow in the image. The line follow algorithm adjusts steering to try to keep the line at that position in the image. More specifically, the difference between TARGET_PIXEL and where the algorithm actually detects the line in the image is fed to a PID controller to adjust steering (see The PID Controller below).


If you are the only car on the course, then you probably want the car to drive directly on the line. In this case setting TARGET_PIXEL to the horizontal center of the image, (IMAGE_W / 2), means the autopilot assumes the line to follow should be directly in the middle of the image, so the car will try to stay in the middle. If your car actually starts to the left or right of that line, it will quickly move to the line and stay on it.


However, if you are on a course where two cars drive at the same time (two lanes separated by a line), then you probably want your car to stay in its lane. In that case set TARGET_PIXEL to None, which causes the car to detect the location of the line at startup. The autopilot will then assume the line should stay at that position in the image, and it will try to keep the car in its lane to make that true.


If you are really motivated then you might try implementing a lane-changing algorithm that would dynamically change the target pixel value in order to move from one lane to another.


LineFollower Configuration


The complete set of configuration values and their defaults can be found in donkeycar/templates/cfg_cv_control.py and is copied here for convenience.

# configure which part is used as the autopilot - change to use your own autopilot
CV_CONTROLLER_MODULE = "donkeycar.parts.line_follower"
CV_CONTROLLER_CLASS = "LineFollower"
CV_CONTROLLER_INPUTS = ['cam/image_array']
CV_CONTROLLER_OUTPUTS = ['pilot/steering', 'pilot/throttle', 'cv/image_array']
CV_CONTROLLER_CONDITION = "run_pilot"

# LineFollower - line color and detection area
SCAN_Y = 120          # num pixels from the top to start horiz scan
SCAN_HEIGHT = 20      # num pixels high to grab from horiz scan
COLOR_THRESHOLD_LOW  = (0, 50, 50)    # HSV dark yellow (opencv HSV hue value is 0..179, saturation and value are both 0..255)
COLOR_THRESHOLD_HIGH = (50, 255, 255) # HSV light yellow (opencv HSV hue value is 0..179, saturation and value are both 0..255)

# LineFollower - target (expected) line position and detection thresholds
TARGET_PIXEL = None   # If not None, then this is the expected horizontal position in pixels of the yellow line.
                      # If None, then detect the position of the yellow line at startup;
                      # so this assumes you have positioned the car prior to starting.
TARGET_THRESHOLD = 10 # number of pixels from TARGET_PIXEL that vehicle must be pointing
                      # before a steering change will be made; this prevents algorithm
                      # from being too twitchy when it is on or near the line.
CONFIDENCE_THRESHOLD = (1 / IMAGE_W) / 3  # The fraction of total sampled pixels that must be yellow in the sample slice.
                                          # The sample slice will have SCAN_HEIGHT pixels and the total number
                                          # of sampled pixels is IMAGE_W x SCAN_HEIGHT, so if you want to make sure
                                          # that all the pixels in the sample slice are yellow, then the confidence
                                          # threshold should be SCAN_HEIGHT / (IMAGE_W x SCAN_HEIGHT) or (1 / IMAGE_W).
                                          # If you keep getting `No line detected` logs in the console then you
                                          # may want to lower the threshold.

# LineFollower - throttle step controller; increase throttle on straights, decrease on turns
THROTTLE_MAX = 0.3    # maximum throttle value the controller will produce
THROTTLE_MIN = 0.15   # minimum throttle value the controller will produce
THROTTLE_INITIAL = THROTTLE_MIN  # initial throttle value
THROTTLE_STEP = 0.05  # how much to change throttle when off the line

# These three PID constants are crucial to the way the car drives. If you are tuning them
# start by setting the others zero and focus on first Kp, then Kd, and then Ki.
PID_P = -0.01         # proportional mult for PID path follower
PID_I = 0.000         # integral mult for PID path follower
PID_D = -0.0001       # differential mult for PID path follower

OVERLAY_IMAGE = True  # True to draw computer vision overlay on camera image in web ui
                      # NOTE: this does not affect what is saved to the data

The PID Controller


It is very common to use a Proportional Integral Derivative (PID) controller to control throttle and/or steering in a wheeled robot. For example, the Path Follow autopilot uses a PID algorithm to modify steering based on how far away from the desired path the robot is. In the Computer Vision template, the built-in Line Follower algorithm uses a PID in a similar way; the line follow algorithm outputs a value that is proportional to how far the car is from the center line and whose sign indicates which side of the line it is on. The PID controller uses the magnitude and sign of the distance from the center line to calculate a steering value that will move the car towards the center line.
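
As a rough sketch of how the steering side works, simplified to the proportional term only (the real part uses the simple_pid library with the PID_P/PID_I/PID_D gains from the configuration; the helper name here is hypothetical):

```python
KP = -0.01  # proportional gain, matching the PID_P default

def steering_from_line(target_pixel, line_x):
    """Proportional steering: the error is how far the detected line is
    from the target position. The negative gain steers back toward the
    line; the output is clamped to the steering range [-1, 1]."""
    error = target_pixel - line_x
    steering = KP * error
    return max(-1.0, min(1.0, steering))

# image 160 px wide, target in the center; line detected 40 px to the right
print(steering_from_line(80, 120))  # 0.4 -> steer toward the line
```

Adding the integral and derivative terms (as simple_pid does) smooths out steady-state offset and damps oscillation around the line.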


The path_follow autopilot also uses a PID controller. There is a good description of how to tune a controller for driving at Determining PID Coefficients.


Writing a Computer Vision Autopilot


You can use the CV_CONTROLLER_* configuration values to point to a python file and class that implements your own computer vision autopilot part. Your autopilot class must conform to the donkeycar part standard. You can also determine the names of the input values, output values and run_condition. The default configuration values point to the included LineFollower part. At a minimum, a computer vision autopilot part takes the camera image as an input and outputs the autopilot's throttle and steering values.


Let's create a simple custom computer vision part. It won't be much of an autopilot because it will just output a constant throttle and steering value, along with an image that shows a frame counter.


A computer vision autopilot is a donkeycar part, so at a minimum it must be a Python class with a run(self) method. An autopilot needs a little more than that, as we will see, but here is a minimal structure:

import cv2
import numpy as np
from simple_pid import PID
import logging

logger = logging.getLogger(__name__)


class MockCvPilot:
    def __init__(self, pid, cfg):
        # initialize instance properties
        pass

    def run(self, img):
        # use img to determine a steering and throttle value
        return 0, 0, None  # steering, throttle, image

The constructor, __init__(self, pid, cfg), takes a PID controller instance and the vehicle configuration properties. It is very common for autopilots to use a PID controller, so the framework provides one. Your autopilot may have values that you want to adjust to tune the algorithm; you should put those values in the myconfig.py configuration file, then retrieve them in the constructor. In our MockCvPilot we want to know if the user wants to see the telemetry image or just the camera image. We do the same thing in the built-in LineFollower autopilot part, so we can just re-use that configuration value, OVERLAY_IMAGE, in our autopilot. We can add that in our constructor:

    def __init__(self, pid, cfg):
        self.pid_st = pid
        self.overlay_image = cfg.OVERLAY_IMAGE
        self.steering = 0
        self.throttle = 0
        self.counter = 0

The run(self, img) method is called each time through the loop. This is where you will interpret the image that is passed in and determine the steering and throttle values that the car should use. The Computer Vision template also allows for showing an image in the web ui that is different in autopilot mode; typically you would add telemetry information to the camera image passed to run(), such as the new steering and throttle values, and perhaps other alterations to the image so the user can better understand how the algorithm is working. For instance, if your algorithm did edge detection using the Canny algorithm, then you might want to show the processed image with the edges. So the minimal autopilot part returns a tuple of (steering, throttle, image).


To keep things simple, the MockCvPilot won't actually predict steering and throttle; it will just return zero for each. However it will maintain a counter and display it in the telemetry image. We can see that in the run() method.

    def run(self, cam_img):
        if cam_img is None:
            return 0, 0, None

        self.counter += 1

        # show some diagnostics
        if self.overlay_image:
            # draw onto a COPY of the image so we don't alter the original
            cam_img = self.overlay_display(np.copy(cam_img))

        return self.steering, self.throttle, cam_img

There are a few things to note here:

  • The run() method is protected against an empty camera image - this can happen, especially during startup. In that case we stop the car.
  • We only produce a telemetry image if the configuration value, OVERLAY_IMAGE, that we copied into self.overlay_image is True. If it is not True then we just pass through the original camera image.
  • Note that we make a copy of the original camera image so that we do not alter the original. This is called a defensive copy; we don't know what other parts in the vehicle need to do with the original image, so we don't want to alter it.

We put the logic that draws the telemetry image into its own method to keep both it and the run() method clean and cohesive. Also, because the run() method has made a defensive copy of the original image, the display method can do anything it wants to the image, even overwrite it completely. In our case we just draw some text on it to show the steering, throttle and counter values. We know the steering and throttle values will be zero in our mock autopilot, but it is instructive to show how you might display them. Here we show them as text, but you might prefer to show them as bars, like we do in the web ui, or some other visualization. This is the display method we use in our mock autopilot:

    def overlay_display(self, img):
        display_str = []
        display_str.append(f"STEERING:{self.steering:.1f}")
        display_str.append(f"THROTTLE:{self.throttle:.2f}")
        display_str.append(f"COUNTER:{self.counter}")

        lineheight = 25
        y = lineheight
        x = lineheight
        for s in display_str:
            cv2.putText(img, s, color=(0, 0, 0), org=(x, y), fontFace=cv2.FONT_HERSHEY_SIMPLEX, fontScale=1, thickness=3)
            cv2.putText(img, s, color=(0, 255, 0), org=(x, y), fontFace=cv2.FONT_HERSHEY_SIMPLEX, fontScale=1, thickness=1)
            y += lineheight

        return img

There are a couple of things to note:

  • We organize the text as an array of strings; that makes it easy to process each line when drawing the text. You might even want a separate method to create this list of text strings and pass it into the display method if that simplifies the display method or makes it more versatile (it can do both).
  • We draw the text twice, once with a thick black stroke and then again with a thinner green stroke. This creates green text with a black outline, which makes it easier to read on an unpredictable background.


Here is the complete custom computer vision autopilot part:

import cv2
import numpy as np
from simple_pid import PID
import logging

logger = logging.getLogger(__name__)


class MockCvPilot:
    '''
    OpenCV based MOCK controller; just draws a counter and
    returns 0 for throttle and steering.

    :param pid: a PID controller that can be used to estimate steering and/or throttle
    :param cfg: the vehicle configuration properties
    '''
    def __init__(self, pid, cfg):
        self.pid_st = pid
        self.overlay_image = cfg.OVERLAY_IMAGE
        self.steering = 0
        self.throttle = 0
        self.counter = 0

    def run(self, cam_img):
        '''
        main runloop of the CV controller.

        :param cam_img: the camera image, an RGB numpy array
        :return: tuple of steering, throttle, and the telemetry image.

        If overlay_image is True, then the output image
        includes an overlay that shows how the
        algorithm is working; otherwise the image
        is just passed-through untouched.
        '''
        if cam_img is None:
            return 0, 0, None

        self.counter += 1

        # show some diagnostics
        if self.overlay_image:
            # draw onto a COPY of the image so we don't alter the original
            cam_img = self.overlay_display(np.copy(cam_img))

        return self.steering, self.throttle, cam_img

    def overlay_display(self, img):
        '''
        draw on top of the given image.
        show some values we are using for control

        :param img: the image to draw on as a numpy array
        :return: the image with overlay drawn
        '''
        # some text to show on the overlay
        display_str = []
        display_str.append(f"STEERING:{self.steering:.1f}")
        display_str.append(f"THROTTLE:{self.throttle:.2f}")
        display_str.append(f"COUNTER:{self.counter}")

        lineheight = 25
        y = lineheight
        x = lineheight
        for s in display_str:
            # green text with black outline so it shows up on any background
            cv2.putText(img, s, color=(0, 0, 0), org=(x, y), fontFace=cv2.FONT_HERSHEY_SIMPLEX, fontScale=1, thickness=3)
            cv2.putText(img, s, color=(0, 255, 0), org=(x, y), fontFace=cv2.FONT_HERSHEY_SIMPLEX, fontScale=1, thickness=1)
            y += lineheight

        return img

To use the custom part, we must modify the myconfig.py file in the mycar folder to locate the python file and the class within it and to specify the inputs, outputs and run_condition that should be used when adding the part to the vehicle loop:

# configure which part is used as the autopilot - change to use your own autopilot
CV_CONTROLLER_MODULE = "my_cv_pilot"
CV_CONTROLLER_CLASS = "MockCvPilot"
CV_CONTROLLER_INPUTS = ['cam/image_array']
CV_CONTROLLER_OUTPUTS = ['pilot/steering', 'pilot/throttle', 'cv/image_array']
CV_CONTROLLER_CONDITION = "run_pilot"

CV_CONTROLLER_MODULE is the package path to the my_cv_pilot.py file. It is generally convenient to have this in the mycar folder, which is what we have done here. However, if you are developing this in your own repository, then on a Mac or Linux machine you can create a symbolic link to the file, or to the folder in which the file or files are.


CV_CONTROLLER_CLASS is the name of the part's class in the python file to which CV_CONTROLLER_MODULE points. In our case this is MockCvPilot.


CV_CONTROLLER_INPUTS is an array of the named inputs to the part that are passed when the part is added to the vehicle loop. For a computer vision autopilot the image is the minimum required, but you can pass any named values in the vehicle's memory. These correspond one-to-one to the arguments of the autopilot's run() method (ignoring the self argument). Our mock example expects only an image, run(self, cam_img), so we declare only an image in the inputs, ['cam/image_array'].


CV_CONTROLLER_OUTPUTS is an array of the named outputs of the part that are passed when the part is added to the vehicle loop. These correspond to the return values from the autopilot's run() method. This is an autopilot, so we return a steering value and a throttle value. We also produce a new image with telemetry information drawn on it. So our mock autopilot returns self.steering, self.throttle, cam_img, which corresponds to the declared output values, ['pilot/steering', 'pilot/throttle', 'cv/image_array'].


CV_CONTROLLER_CONDITION is the named value that decides if the autopilot part will run or not. If you always want it to run, then pass None; otherwise this should be the name of a boolean value: when it is True the part's run() method will be called; when it is False the run() method is not called. The templates maintain such a boolean value named "run_pilot", so we use that.
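
Under the hood the template resolves the part class from these strings dynamically, roughly like this (a simplified sketch of what the template does, demonstrated here with a standard-library class rather than an actual car part):

```python
import importlib

def load_part_class(module_name, class_name):
    """Resolve a class from its module path and class name, the way the
    CV_CONTROLLER_MODULE / CV_CONTROLLER_CLASS settings are used."""
    module = importlib.import_module(module_name)
    return getattr(module, class_name)

# demonstrated with a stdlib class; the template would pass
# cfg.CV_CONTROLLER_MODULE and cfg.CV_CONTROLLER_CLASS instead
cls = load_part_class("collections", "Counter")
print(cls("aab"))  # Counter({'a': 2, 'b': 1})
```

The resolved class is then instantiated and added to the vehicle loop with the configured inputs, outputs and run_condition.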


LineFollower Class


Now that you understand the structure of an autopilot part, it is worth reviewing the pseudocode in The Line Follower section above and comparing it to the actual implementation. The python file is located at https://github.com/autorope/donkeycar/blob/main/donkeycar/parts/line_follower.py and is copied below. In particular:

  • get_i_color() uses SCAN_Y and SCAN_HEIGHT to copy a section of the camera image, convert it to HSV and apply the mask created by the low and high HSV mask values. Then it finds the x (horizontal) index in the area that has the highest count of matching pixels; that is where the autopilot thinks the line is.
    def get_i_color(self, cam_img):
        # take a horizontal slice of the image
        iSlice = self.scan_y
        scan_line = cam_img[iSlice : iSlice + self.scan_height, :, :]

        # convert to HSV color space
        img_hsv = cv2.cvtColor(scan_line, cv2.COLOR_RGB2HSV)

        # make a mask of the colors in our range we are looking for
        mask = cv2.inRange(img_hsv, self.color_thr_low, self.color_thr_hi)

        # which index of the range has the highest amount of yellow?
        hist = np.sum(mask, axis=0)
        max_yellow = np.argmax(hist)

        return max_yellow, hist[max_yellow], mask
  • Note how the target value is initialized by reading the image if it is not already set in the configuration:
        max_yellow, confidence, mask = self.get_i_color(cam_img)
        if self.target_pixel is None:
            self.target_pixel = max_yellow
  • Note how the PID controller's set point (the target value) is initialized in the run() method with the TARGET_PIXEL value:
        if self.pid_st.setpoint != self.target_pixel:
            # this is the target of our steering PID controller
            self.pid_st.setpoint = self.target_pixel
  • If we got a good reading from the image, then we use it to predict a new steering value based on the horizontal distance of the detected line from the target_pixel:
        if confidence >= self.confidence_threshold:
            # invoke the controller with the current yellow line position
            # get the new steering value as it chases the ideal target_value
            self.steering = self.pid_st(max_yellow)
  • Slow down if we are turning:
            if abs(max_yellow - self.target_pixel) > self.target_threshold:
                # we will be turning, so slow down
                if self.throttle > self.throttle_min:
                    self.throttle -= self.delta_th
  • Or speed up if we are going straight:
            else:
                # we are going straight, so speed up
                if self.throttle < self.throttle_max:
                    self.throttle += self.delta_th

Here is the complete source to the LineFollower part.

import cv2
import numpy as np
from simple_pid import PID
import logging

logger = logging.getLogger(__name__)


class LineFollower:
    '''
    OpenCV based controller
    This controller takes a horizontal slice of the image at a set Y coordinate.
    Then it converts to HSV and does a color threshold to find the yellow pixels.
    It does a histogram to find the pixel of maximum yellow. Then it uses that pixel
    to guide a PID controller which seeks to maintain the max yellow at the same point
    in the image.
    '''
    def __init__(self, pid, cfg):
        self.overlay_image = cfg.OVERLAY_IMAGE
        self.scan_y = cfg.SCAN_Y   # num pixels from the top to start horiz scan
        self.scan_height = cfg.SCAN_HEIGHT  # num pixels high to grab from horiz scan
        self.color_thr_low = np.asarray(cfg.COLOR_THRESHOLD_LOW)  # hsv dark yellow
        self.color_thr_hi = np.asarray(cfg.COLOR_THRESHOLD_HIGH)  # hsv light yellow
        self.target_pixel = cfg.TARGET_PIXEL  # of the N slots above, which is the ideal relationship target
        self.target_threshold = cfg.TARGET_THRESHOLD # minimum distance from target_pixel before a steering change is made.
        self.confidence_threshold = cfg.CONFIDENCE_THRESHOLD  # percentage of yellow pixels that must be in target_pixel slice
        self.steering = 0.0 # from -1 to 1
        self.throttle = cfg.THROTTLE_INITIAL # from -1 to 1
        self.delta_th = cfg.THROTTLE_STEP  # how much to change throttle when off
        self.throttle_max = cfg.THROTTLE_MAX
        self.throttle_min = cfg.THROTTLE_MIN

        self.pid_st = pid

    def get_i_color(self, cam_img):
        '''
        get the horizontal index of the color at the given slice of the image
        input: cam_image, an RGB numpy array
        output: index of max color, value of cumulative color at that index, and mask of pixels in range
        '''
        # take a horizontal slice of the image
        iSlice = self.scan_y
        scan_line = cam_img[iSlice : iSlice + self.scan_height, :, :]

        # convert to HSV color space
        img_hsv = cv2.cvtColor(scan_line, cv2.COLOR_RGB2HSV)

        # make a mask of the colors in our range we are looking for
        mask = cv2.inRange(img_hsv, self.color_thr_low, self.color_thr_hi)

        # which index of the range has the highest amount of yellow?
        hist = np.sum(mask, axis=0)
        max_yellow = np.argmax(hist)

        return max_yellow, hist[max_yellow], mask

    def run(self, cam_img):
        '''
        main runloop of the CV controller
        input: cam_image, an RGB numpy array
        output: steering, throttle, and the image.
        If overlay_image is True, then the output image
        includes an overlay that shows how the
        algorithm is working; otherwise the image
        is just passed-through untouched.
        '''
        if cam_img is None:
            return 0, 0, None

        max_yellow, confidence, mask = self.get_i_color(cam_img)

        if self.target_pixel is None:
            # Use the first run of get_i_color to set our relationship with the yellow line.
            # You could optionally init the target_pixel with the desired value.
            self.target_pixel = max_yellow
            logger.info(f"Automatically chosen line position = {self.target_pixel}")

        if self.pid_st.setpoint != self.target_pixel:
            # this is the target of our steering PID controller
            self.pid_st.setpoint = self.target_pixel

        if confidence >= self.confidence_threshold:
            # invoke the controller with the current yellow line position
            # get the new steering value as it chases the ideal target_value
            self.steering = self.pid_st(max_yellow)

            # slow down linearly when away from ideal, and speed up when close
            if abs(max_yellow - self.target_pixel) > self.target_threshold:
                # we will be turning, so slow down
                if self.throttle > self.throttle_min:
                    self.throttle -= self.delta_th
                if self.throttle < self.throttle_min:
                    self.throttle = self.throttle_min
            else:
                # we are going straight, so speed up
                if self.throttle < self.throttle_max:
                    self.throttle += self.delta_th
                if self.throttle > self.throttle_max:
                    self.throttle = self.throttle_max
        else:
            logger.info(f"No line detected: confidence {confidence} < {self.confidence_threshold}")

        # show some diagnostics
        if self.overlay_image:
            cam_img = self.overlay_display(cam_img, mask, max_yellow, confidence)

        return self.steering, self.throttle, cam_img

    def overlay_display(self, cam_img, mask, max_yellow, confidence):
        '''
        composite mask on top of the original image.
        show some values we are using for control
        '''
        mask_exp = np.stack((mask, ) * 3, axis=-1)
        iSlice = self.scan_y
        img = np.copy(cam_img)
        img[iSlice : iSlice + self.scan_height, :, :] = mask_exp

        display_str = []
        display_str.append("STEERING:{:.1f}".format(self.steering))
        display_str.append("THROTTLE:{:.2f}".format(self.throttle))
        display_str.append("I YELLOW:{:d}".format(max_yellow))
        display_str.append("CONF:{:.2f}".format(confidence))

        y = 10
        x = 10

        for s in display_str:
            cv2.putText(img, s, color=(0, 0, 0), org=(x, y), fontFace=cv2.FONT_HERSHEY_SIMPLEX, fontScale=0.4)
            y += 10

        return img

Create your car application


If you are not already, please ssh into your vehicle.


Create Donkeycar from Template


Create a set of files to control your Donkey with this command:

donkey createcar --path ~/mycar

That creates a car using the default deep learning template. You can also create a car that uses the gps path follow template:

donkey createcar --template=path_follow --path ~/mycar

You can also create a car that uses the computer vision template:

donkey createcar --template=cv_control --path ~/mycar

mycar is not a special name; you can name your car folder anything you want.


See also more information on createcar.


Configure Options


Look at myconfig.py in your newly created directory, ~/mycar

cd ~/mycar
nano myconfig.py

Each line has a comment mark. The commented text shows the default value. When you want to override a default, uncomment the line by removing the # and any spaces before the first character of the option.


example:

# STEERING_LEFT_PWM = 460

becomes:

STEERING_LEFT_PWM = 500

when edited. You will adjust these later in the calibrate section.


Configure I2C PCA9685


If you are using a PCA9685 servo driver board, make sure you can see it on I2C.


Jetson Nano:

sudo usermod -aG i2c $USER
sudo reboot

After a reboot, then try:

sudo i2cdetect -r -y 1

Raspberry Pi:

sudo apt-get install -y i2c-tools
sudo i2cdetect -y 1

This should show you a grid of addresses like:

     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
00:          -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: 40 -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
70: 70 -- -- -- -- -- -- --

In this case, the 40 shows up as the address of our PCA9685 board. If it does not show up, then check your wiring to the board. On a Pi, ensure I2C is enabled in the interfacing options menu of sudo raspi-config (note: it will suggest a reboot).
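
If you prefer to check from Python, a similar probe can be sketched with the smbus2 package (pip install smbus2; this helper is illustrative, not part of donkeycar). The scan logic takes the bus as a parameter so it can also be exercised without hardware:

```python
def scan_i2c(bus, addresses=range(0x03, 0x78)):
    """Return the list of I2C addresses that acknowledge a read on the bus.
    `bus` is anything with a read_byte(addr) method, e.g. smbus2.SMBus(1)."""
    found = []
    for addr in addresses:
        try:
            bus.read_byte(addr)
            found.append(addr)
        except OSError:
            # no device acknowledged at this address
            pass
    return found

# on the car you would use the real bus:
#   from smbus2 import SMBus
#   print([hex(a) for a in scan_i2c(SMBus(1))])  # a PCA9685 shows as '0x40'
```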


If you have assigned a non-standard address to your board, then adjust the address in myconfig.py under the variable PCA9685_I2C_ADDR. If your board is on another bus, you can specify that with PCA9685_I2C_BUSNUM.


Jetson Nano: set PCA9685_I2C_BUSNUM = 1 in your myconfig.py. On the Pi the bus is auto-detected by the Adafruit library, but not on the Jetson Nano.


Sombrero Setup


Set HAVE_SOMBRERO = True in your myconfig.py if you have a sombrero board.


Robo HAT MM1 Setup


Set HAVE_ROBOHAT = True in your myconfig.py if you have a Robo HAT MM1 board. Also set the following variables according to your setup. Most people will use the values below; however, if you are using a Jetson Nano, set MM1_SERIAL_PORT = '/dev/ttyTHS1'.

#ROBOHAT MM1
HAVE_ROBOHAT = True            # set to true when using the Robo HAT MM1 from Robotics Masters.  This will change to RC Control.
MM1_STEERING_MID = 1500         # Adjust this value if your car cannot run in a straight line
MM1_MAX_FORWARD = 2000          # Max throttle to go forward. The bigger the faster
MM1_STOPPED_PWM = 1500
MM1_MAX_REVERSE = 1000          # Max throttle to go reverse. The smaller the faster
MM1_SHOW_STEERING_VALUE = False
# Serial port
# -- Default Pi: '/dev/ttyS0'
# -- Jetson Nano: '/dev/ttyTHS1'
# -- Google coral: '/dev/ttymxc0'
# -- Windows: 'COM3', Arduino: '/dev/ttyACM0'
# -- MacOS/Linux: please use 'ls /dev/tty.*' to find the correct serial port for mm1
#  eg. '/dev/tty.usbmodemXXXXXX' and replace the port accordingly
MM1_SERIAL_PORT = '/dev/ttyS0'  # Serial Port for reading and sending MM1 data (raspberry pi default)

# adjust controller type as Robohat MM1
CONTROLLER_TYPE = 'MM1'
# adjust drive train for web interface
DRIVE_TRAIN_TYPE = 'MM1'

The Robo HAT MM1 uses an RC controller and a CircuitPython script to drive the car during training. You must put the CircuitPython script onto the Robo HAT MM1 with your computer before you can continue.

+
    +
  1. Download the CircuitPython Donkey Car Driver for Robo HAT MM1 to your computer from here
  2. +
  3. Connect the MicroUSB connector on the Robo HAT MM1 to your computer's USB port.
  4. +
  5. A CIRCUITPY device should appear on the computer as a USB Storage Device
  6. +
  7. Copy the file downloaded in Step 1 to the CIRCUITPY USB Storage Device. The file should be named code.py. It should be at the top level of the drive, not in any folder.
  8. +
  9. Download the Adafruit logging library python file, adafruit_logging.py, here
  10. +
  11. Copy the adafruit_logging.py file into the CIRCUITPY "lib" folder
  12. +
  13. Unplug the USB cable from the Robo HAT MM1 and place it on top of the Raspberry Pi, as you would any HAT.
  14. +
+

You may need to enable the hardware serial port on your Raspberry Pi. On your Raspberry Pi...

+
    +
  1. Run the command sudo raspi-config
  2. +
  3. Navigate to the 5 - Interfacing Options section.
  4. +
  5. Navigate to the P6 - Serial section.
  6. +
  7. When asked: Would you like a login shell to be accessible over serial? NO
  8. +
  9. When asked: Would you like the serial port hardware to be enabled? YES
  10. +
  11. Close raspi-config
  12. +
  13. Restart
  14. +
+

If you would like additional hardware or software support with Robo HAT MM1, there are a few guides published on Hackster.io. They are listed below.

+

Raspberry Pi + Robo HAT MM1

+

Jetson Nano + Robo HAT MM1

+

Simulator + Robo HAT MM1

+

Joystick setup

+

If you plan to use a joystick, take a side track over to here.

+

Camera Setup

+

If you are using the default deep learning template or the computer vision template then you will need a camera. By default, myconfig.py assumes a Raspberry Pi camera. You can change this by editing the CAMERA_TYPE value in the myconfig.py file in your ~/mycar folder.

+

If you are using the gps path follow template then you do not need, and may not want, a camera. In this case you can change the camera type to mock; CAMERA_TYPE = "MOCK".

+

See Cameras for details on the various cameras and configuration.

+
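The camera choice boils down to a single setting in myconfig.py; a sketch (the PICAM value is the Raspberry Pi camera default mentioned above):

```python
# myconfig.py -- camera selection (illustrative)
CAMERA_TYPE = "PICAM"   # default: Raspberry Pi camera
# CAMERA_TYPE = "MOCK"  # no camera, e.g. for the gps path follow template
```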

Upgrade Donkey Car Software

+

Make all config changes in myconfig.py and they will be preserved through an update. When changes you would like are released, pull the latest code and then run:

+
cd projects/donkeycar
+git pull
+donkey createcar --path ~/mycar --overwrite
+
+

If you created a car with the gps path follow template then remember to include the --template argument:

+
donkey createcar --template=path_follow --path ~/mycar --overwrite
+
+

Your ~/mycar/manage.py, ~/mycar/config.py and other files will change with this operation, but myconfig.py will not be touched. Your data and models dirs will not be touched.

+
+

Next

+

Calibrate your car

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + + + + + diff --git a/guide/dataset_pretrained_models/index.html b/guide/dataset_pretrained_models/index.html new file mode 100644 index 00000000..d8df01fb --- /dev/null +++ b/guide/dataset_pretrained_models/index.html @@ -0,0 +1,15 @@ + + + + + + Redirecting... + + + + + + +Redirecting... + + diff --git a/guide/deep_learning/dataset_pretrained_models/index.html b/guide/deep_learning/dataset_pretrained_models/index.html new file mode 100644 index 00000000..77aa72b2 --- /dev/null +++ b/guide/deep_learning/dataset_pretrained_models/index.html @@ -0,0 +1,208 @@ + + + + + + + + Dataset and pre-trained models - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Dataset and pre-trained models

+

The purpose of providing a sample dataset and pre-trained models here is to help beginners get started more easily. You can download the dataset and use it for training together with your own dataset.

+

Dataset

+

Dataset in real world

+
    +
  1. Sample dataset #1 by Tawn - 2.0GB , Link
  2. +
+

Dataset in simulator

+
    +
  • Not available yet
  • +
+

Contributing dataset

+

If you have a dataset you want to contribute, please contact us on Discord #dataset channel, or raise a PR on donkey_datasets. Thank you.

+

Pre-trained models

+

Now available on donkey_datasets. We plan to grow the repository of pre-trained models.

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + + + + + diff --git a/guide/deep_learning/mobile_app/index.html b/guide/deep_learning/mobile_app/index.html new file mode 100644 index 00000000..8d02f697 --- /dev/null +++ b/guide/deep_learning/mobile_app/index.html @@ -0,0 +1,383 @@ + + + + + + + + Mobile app - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Robocar Controller

+

Robocar Controller is a mobile app designed to provide a “commandless” user experience to get started with the Donkey Car.

+

cover

+

Features

+
    +
  • Commandless experience - No SSH or text editor
  • +
  • Built-in Hotspot
  • +
  • Search vehicle on the network
  • +
  • Real-time Calibration
  • +
  • Virtual joystick
  • +
  • Visualize the data
  • +
  • Drive Summary
  • +
  • Free GPU training
  • +
  • Autopilot
  • +
  • Advanced configuration
  • +
  • Battery level
  • +
+

Requirements

+
    +
  • A Donkey Car with Pi 4B (Jetson Nano is not yet supported)
  • +
  • A Mobile phone with iOS or Android
  • +
+

Quickstart Guide

+

Please refer to the quick start guide here.

+
+

If you do not want to use the prebuilt image then you can install the server component onto your Donkey Car manually. See Optional Manual Installation below.

+
+

Features Details

+

Built-in Hotspot

+

The car becomes a hotspot when there is no known Wifi network to connect to. After connecting your phone to this hotspot, you can use the app to configure the car to join the Wifi network you want.

+

Search vehicle on the network

+

Once your car connects to the same network as your phone, the app will scan the whole network to discover it. The app will also show you the IP address of the car in case you want to connect to it via SSH.

+

Search Vehicle

+

Real-time Calibration

+

It is quite annoying when the car goes too fast or does not run in a straight line. The calibration UI helps you find the right settings so that your car is properly calibrated. With the enhanced calibration function, changes take effect in real time and you can observe their effect immediately.

+

Real-time calibration

+

Virtual Joystick

+

The virtual joystick offers a quick way to test drive the car if you don't have a physical gamepad controller. It also streams the video captured from the camera in real time. You can just look at the screen and start driving.

+

Drive UI

+

Drive Summary

+

The app presents a drive summary with a histogram, the size of the data, and the number of images you have collected. The histogram is generated automatically by calling the tubhist function in the Donkey Car software.

+

Drive summary

+

Visualize the data

+

The app shows all the data (tubs) and the metadata you have collected on the Pi. The metadata includes the number of images, the size of the tub, the resolution, the histogram and the location. The app uses the donkey makemovie command to generate a video so you can review what the data looks like.

+

Data

+

Free GPU Training

+

Free GPU training is available to users of the app. You can train a model by selecting the data (tubs) you wish to train on. The data will be uploaded to our server to start the training process. Once training is complete, the app shows you the training loss and accuracy graph. At the same time, the app downloads the model to your car so you can test it right away.

+

Note: We keep the data and models for a period of time. After that, we delete them from our storage.

+

Train

+

More on Free GPU Training

+

We use an AWS g4dn.xlarge instance to train the model. It features an NVIDIA T4 GPU and up to 16GB of GPU memory. Increase the batch size to 256 or more to fully utilize the powerful GPU.

+

Limitation

+

N.B.: To protect our equipment from abuse, the following rules apply to the training service.

+
    +
  • Each training run is limited to a maximum of 15 minutes; the job times out if it lasts longer.
  • +
  • Each device can train 5 times per 24 hours.
  • +
  • Max data size is 100MB per training
  • +
+

Autopilot

+

The app lists all models on the Pi, whether they were generated by the training function or simply copied to the Pi. You can start autopilot mode using a UI similar to the Drive UI.

+

Autopilot

+

Advanced configuration

+

The Donkey Car software comes with a vast number of configuration options you can experiment with. We have included some of the popular options you may want to change.

+
    +
  • Camera size
  • +
  • Training configuration
  • +
  • Drive train settings
  • +
+

Advanced configuration

+

Battery level

+

If you are using the MM1, the app shows you the current battery level as a percentage. We have also added an OS tweak so that if the battery level falls below 7V, the system shuts down automatically.

+

Upcoming features

+
    +
  • Salient visualization
  • +
  • Auto throttle compensation based on battery level
  • +
  • Transfer learning
  • +
+

Report a problem

+

If you encounter a problem, please file an issue on this GitHub project.

+

Optional Manual Installation

+

If you cannot or do not want to use the prebuilt SD image for your Donkey Car, you can install the server component onto your Donkey Car manually. The Donkey Car console is management software for the Donkey Car that provides a REST-based API to support the Donkey Car mobile app.

+
+

Note This software currently supports RaspberryPi 4B only.

+
+

1. Complete the Setup for RaspberryPi

+

2. Clone the Donkey Car Console project

+
git clone https://github.com/robocarstore/donkeycar-console
+sudo mv donkeycar-console /opt
+cd /opt/donkeycar-console
+
+

3. Install dependencies

+
pip install -r requirements/production.txt
+
+

4. Run the init script to set up the database

+
python manage.py migrate
+
+

5. Test that the server is running properly

+
python manage.py runserver 0.0.0.0:8000
+
+

Go to http://your_pi_ip:8000/vehicle/status. If it returns something without error, it works.

+

6. Install the server as a service

+
sudo ln -s gunicorn.service /etc/systemd/system/gunicorn.service
+
+

7. Install the mobile app on your phone

+ +

Make sure your phone is connected to the same network as your Pi (if it won't connect, try turning off your cell data). Fire up the mobile app and you can search your car using the mobile app.

+

FAQ

+
    +
  • Why is the app called Robocar Controller instead of Donkeycar Controller?
  • +
+

We would love to call the app Donkeycar Controller but Apple does not allow us to do so. We are working with Adam to submit proof to Apple that we can use the Donkeycar trademark in our app. In the meantime, we will be using the name Robocar Controller.

+

Commercial Usage

+

This app is developed by Robocar Store. If you plan to use this app to make money, please follow the Donkey Car guideline and send an email to Robocar Store.

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + + + + + diff --git a/guide/deep_learning/simulator/index.html b/guide/deep_learning/simulator/index.html new file mode 100644 index 00000000..f3e2c3e7 --- /dev/null +++ b/guide/deep_learning/simulator/index.html @@ -0,0 +1,638 @@ + + + + + + + + Donkey Simulator. - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Donkey Simulator

+

The Donkey Gym project is an OpenAI Gym wrapper around the Self Driving Sandbox donkey simulator (sdsandbox). When building the sim from source, check out the donkey branch of the sdsandbox project.

+

The simulator is built on the Unity game platform, uses its internal physics and graphics, and connects to a donkey Python process to use our trained model to control the simulated Donkey.

+

Installation Video:

+

Here are some videos to help you through the installation.

+

Linux: +https://youtu.be/J6Ll5Obtuxk

+

Windows: +https://youtu.be/wqQMmHVT8qw

+

My Virtual Donkey

+

There are many ways to use the simulator, depending on your goals. You can use the simulator to get to know and use the standard Donkeycar drive/train/test cycle by treating it as virtual hardware. You will collect data, drive, and train using the same commands as if you were using a real robot. We will walk through that use-case first.

+

sim_screen_shot

+

Install

+
    +
  • Download and unzip the simulator for your host pc platform from Donkey Gym Release.
  • +
  • Place the simulator where you like. For this example it will be ~/projects/DonkeySimLinux. Your dir will have a different name depending on platform.
  • +
  • Complete all the steps to install Donkey on your host pc.
  • +
  • Setup DonkeyGym:
  • +
+
cd ~/projects
+git clone https://github.com/tawnkramer/gym-donkeycar
+cd gym-donkeycar
+conda activate donkey
+pip install -e .[gym-donkeycar]
+
+
    +
  • You may use an existing ~/mycar donkey application, or begin a new one. Here we will start fresh:
  • +
+
donkey createcar --path ~/mysim
+cd ~/mysim
+
+
    +
  • Edit your myconfig.py to enable donkey gym simulator wrapper, replace <user-name> and the other parts of the path:
  • +
+
DONKEY_GYM = True
+DONKEY_SIM_PATH = "/home/<user-name>/projects/DonkeySimLinux/donkey_sim.x86_64"
+DONKEY_GYM_ENV_NAME = "donkey-generated-track-v0"
+
+
+

Note: your path to the executable will vary depending on platform and user.
 Windows: DonkeySimWin/donkey_sim.exe
 Mac OS: DonkeySimMac/donkey_sim.app/Contents/MacOS/donkey_sim
 Linux: DonkeySimLinux/donkey_sim.x86_64

+
+

Drive

+

You may use all the normal commands to manage.py at this point. Such as:

+
python manage.py drive
+
+

This should start the simulator and connect to it automatically. By default you will have a web interface to control the donkey. Navigate to http://localhost:8887/drive to see the control page.

+

On Ubuntu Linux only, you may plug in your joystick of choice. If it mounts as /dev/input/js0 then there's a good chance it will work. Modify myconfig.py to indicate your joystick model and use the --js arg to run.

+
python manage.py drive --js
+
+

As you drive, this will create a tub of records in your data dir as usual.

+

Train

+

You will not need to rsync your data, as it was recorded and resides locally. You can train as usual:

+
donkey train --tub ./data --model models/mypilot.h5
+
+

Test

+

You can use the model as usual:

+
python manage.py drive --model models/mypilot.h5
+
+

Then navigate to web control page. Set Mode and Pilot to Local Pilot(d). The car should start driving.

+

Sample Driving Data

+

Here is some sample driving data to get you started. Download it and unpack it into your data dir. This should train to a slow but stable driver.

+
+

API

+

Here is some info on the API to talk to the sim server. Make a TCP client and connect to port 9091 on whichever host the sim is running. The server sends and receives UTF-8 encoded JSON packets. Each message must have a "msg_type" field. The sim ends every JSON packet with a newline character for termination. You don't have to end each packet with a newline when sending to the server, but if the sim gets too many messages too quickly it may have trouble. Check the player log file for JSON parse errors if you are having trouble.

+
+
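The newline-framed JSON protocol described above can be handled with a small client sketch (the helper names and the localhost address are hypothetical; only the port and framing come from the text):

```python
import json
import socket

def encode_msg(msg):
    # Outgoing packets are UTF-8 JSON; the trailing newline is optional when sending.
    return (json.dumps(msg) + "\n").encode("utf-8")

def split_packets(buffer):
    # The sim terminates every packet it sends with a newline; split complete
    # packets off the receive buffer and return the unfinished remainder.
    *complete, rest = buffer.split(b"\n")
    return [json.loads(p) for p in complete if p.strip()], rest

# Hypothetical usage against a running sim:
# sock = socket.create_connection(("127.0.0.1", 9091))
# sock.sendall(encode_msg({"msg_type": "get_protocol_version"}))
```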

Get Protocol Version

+

Client=>Sim. Ask for the version of the protocol. This will help you know when changes are made to these messages.

+

Fields: None

+

Example:

+
    {
+    "msg_type" : "get_protocol_version" 
+    }
+
+
+

Protocol Version

+

Sim=>Client. Reply with the version of the protocol. Currently at version 2.

+

Fields:

+
    +
  • version : string integer
  • +
+

Example:

+
    {
+    "msg_type" : "protocol_version",
+    "version" : "2"
+    }
+
+
+

Scene Selection Ready

+

Sim=>Client. When the Menu scene is finished loading this will be sent. After this point, the sim can honor the Scene Loading message. (Menu only)

+

Fields: None

+

Example:

+
    {
+    "msg_type" : "scene_selection_ready" 
+    }
+
+
+

Get Scene Names

+

Client=>Sim. Ask for the names of the scenes you can load. (Menu only)

+

Fields: None

+

Example:

+
    {
+    "msg_type" : "get_scene_names" 
+    }
+
+
+

Scene Names

+

Sim=>Client. The sim will reply with a list of scene names.

+

Fields:

+
    +
  • scene_names : array of scene names
  • +
+

Example:

+
    {
+    "msg_type" : "scene_names",
+    "scene_names" : [ "generated_road", "warehouse", "sparkfun_avc", "generated_track" ]
+    }
+
+
+

Load Scene

+

Client=>Sim. Asks the sim to load one of the scenes from the Menu screen. (Menu only)

+

Fields:

+

scene_name : generated_road | warehouse | sparkfun_avc | generated_track (or whatever list the sim returns from get_scene_names)

+

Example:

+
    {
+        "msg_type" : "load_scene",
+        "scene_name" : "generated_track"
+    }
+
+
+

Scene Loaded

+

Sim=>Client. Once scene is loaded, in reply, you will get a:

+
    {
+        "msg_type" : "scene_loaded"
+    }
+
+
+

Car Loaded

+

Sim=>Client. Once the sim finishes loading your car, it sends this message. The car is loaded for you automatically once the scene is loaded with an active client, or when a client makes a connection.

+

Fields: None

+

Example:

+
    {
+    "msg_type" : "car_loaded" 
+    }
+
+
+

Car Config

+

Client=>Sim. Once loaded, you may configure your car's visual details. (scene only)

+

Fields:

+
    +
  • body_style : donkey | bare | car01 | cybertruck | f1
  • +
  • body_r : string value of integer between 0-255
  • +
  • body_g : string value of integer between 0-255
  • +
  • body_b : string value of integer between 0-255
  • +
  • car_name : string value car name to display over car. Newline accepted for multi-line.
  • +
  • font_size : string value of integer between 10-100 to set size of car name text
  • +
+

Example:

+
    {
+        "msg_type" : "car_config",
+        "body_style" : "car01",
+        "body_r" : "128",
+        "body_g" : "0",
+        "body_b" : "255",
+        "car_name" : "Your Name",
+        "font_size" : "100"
+    }
+
+
+

Camera Config

+

Client=>Sim. Once the scene is loaded, you may configure your car camera sensor details

+

Fields:

+
    +
  • fov : string value of float between 10-200. Sets the camera field of view in degrees.
  • +
  • fish_eye_x : string value of float between 0-1. Causes distortion warping in x axis.
  • +
  • fish_eye_y : string value of float between 0-1. Causes distortion warping in y axis.
  • +
  • img_w : string value of integer between 16-512. Sets camera sensor image width.
  • +
  • img_h : string value of integer between 16-512. Sets camera sensor image height.
  • +
  • img_d : string value of integer 1 or 3. Sets camera sensor image depth. In the case of 1, you get 3 channels, but all identical, with greyscale conversion done in the sim.
  • +
  • img_enc : Image format of data JPG | PNG | TGA
  • +
  • offset_x : string value of float. Moves the camera left and right axis.
  • +
  • offset_y : string value of float. Moves the camera up and down.
  • +
  • offset_z : string value of float. Moves the camera forward and back.
  • +
  • rot_x : string value of float. Degrees. Rotates camera around X axis.
  • +
+

Example:

+
    {
+    "msg_type" : "cam_config",
+    "fov" : "150", 
+    "fish_eye_x" : "1.0",
+    "fish_eye_y" : "1.0",
+    "img_w" : "255",
+    "img_h" : "255",
+    "img_d" : "1",
+    "img_enc" : "PNG",
+    "offset_x" : "0.0",
+    "offset_y" : "3.0",
+    "offset_z" : "0.0",
+    "rot_x" : "90.0"
+    }
+
+

Note: You can add another camera by changing the msg_type to "cam_config_b"

+
+

Control Car

+

Client=>Sim. Control throttle and steering.

+

Fields:

+
    +
  • steering : string value of float between -1 to 1. Maps to full left or right, 16 deg from center.
  • +
  • throttle : string value of float between -1 to 1. Full forward or reverse torque to wheels.
  • +
  • brake : string value of float between 0 to 1.
  • +
+

Example:

+
    {
+    "msg_type" : "control",
+    "steering" : "0.0",
+    "throttle" : "0.3",
+    "brake" : "0.0"
+    }
+
+
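Note that, as the field descriptions say, every value in a control packet is a string even when it is numeric. A small helper (hypothetical, not part of the Donkey Car API) makes that explicit:

```python
import json

def control_packet(steering, throttle, brake=0.0):
    # The sim expects all field values as strings, even numeric ones.
    return {
        "msg_type": "control",
        "steering": str(steering),
        "throttle": str(throttle),
        "brake": str(brake),
    }

# Send with e.g.:
# sock.sendall((json.dumps(control_packet(0.0, 0.3)) + "\n").encode("utf-8"))
```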
+

Telemetry

+

Sim=>Client. The sim sends this message containing the camera image and details about vehicle state. These come at a regular rate set in the sim, usually about 20 Hz.

+

Fields:

+
    +
  • steering_angle : Last steering applied. Why not just steering like control? idk.
  • +
  • throttle : Last throttle applied.
  • +
  • speed : magnitude of linear velocity.
  • +
  • image : a BinHex encoded binary image. Use PIL.Image.open(BytesIO(base64.b64decode(imgString)))
  • +
  • imageb : (optional) same as above but for the second camera
  • +
  • lidar : (optional) list of lidar points in the following format: {d: distanceToObject, rx: rayRotationX, ry: rayRotationY}
  • +
  • hit : name of the last object struck. Or None if no object hit.
  • +
  • accel_x : x acceleration of vehicle.
  • +
  • accel_y : y acceleration of vehicle.
  • +
  • accel_z : z acceleration of vehicle.
  • +
  • gyro_x : x gyro acceleration.
  • +
  • gyro_y : y gyro acceleration.
  • +
  • gyro_z : z gyro acceleration.
  • +
  • gyro_w : w gyro acceleration.
  • +
  • pitch : pitch of the car in degrees.
  • +
  • roll : roll of the car degrees.
  • +
  • yaw : yaw of the car degrees.
  • +
  • activeNode : Progress on track (not working properly with multiple cars for the moment)
  • +
  • totalNodes : number of nodes on track
  • +
  • pos_x : (training only) x world coordinate of vehicle.
  • +
  • pos_y : (training only) y world coordinate of vehicle.
  • +
  • pos_z : (training only) z world coordinate of vehicle.
  • +
  • vel_x : (training only) x velocity of vehicle.
  • +
  • vel_y : (training only) y velocity of vehicle.
  • +
  • vel_z : (training only) z velocity of vehicle.
  • +
  • cte : (training only) Cross track error. The distance from the car to the path in the center of the rightmost lane, or the center of the track (depends on the track)
  • +
+

Example:

+
    {
+    "msg_type" : "telemetry", 
+    "steering_angle" : "0.0", 
+    "throttle" : "0.0", 
+    "speed" : "1.0", 
+    "image" : "0x123...", 
+    "hit" : "None", 
+    "pos_x" : "0.0", 
+    "pos_y" : "0.0", 
+    "pos_z" : "0.0", 
+    "accel_x" : "0.0", 
+    "accel_y" : "0.0", 
+    "accel_z" : "0.0", 
+    "gyro_x" : "0.0", 
+    "gyro_y" : "0.0", 
+    "gyro_z" : "0.0", 
+    "gyro_w" : "0.0",
+    "pitch" : "0.0", 
+    "roll" : "0.0", 
+    "yaw" : "0.0",
+    "activeNode" : "5",
+    "totalNodes" : "26",
+    "cte" : "0.5"
+    }
+
+
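The image field can be decoded with Pillow exactly as the field description above suggests; a minimal sketch:

```python
import base64
from io import BytesIO

from PIL import Image  # pip install pillow

def decode_telemetry_image(img_string):
    # "image" is base64-encoded binary image data (JPG by default).
    return Image.open(BytesIO(base64.b64decode(img_string)))
```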
+

Reset Car

+

Client=>Sim. Return the car to the start point.

+

Fields: None

+

Example:

+
    {
+    "msg_type" : "reset_car" 
+    }
+
+
+

Set Car Position

+

Client=>Sim. Move the car to the given position (training only)

+

Fields:

+
    +
  • pos_x : x world coordinate.
  • +
  • pos_y : y world coordinate.
  • +
  • pos_z : z world coordinate.
  • +
  • qx : (optional) quaternion x
  • +
  • qy : (optional) quaternion y
  • +
  • qz : (optional) quaternion z
  • +
  • qw : (optional) quaternion w
  • +
+

Example:

+
    {
+    "msg_type" : "set_position",
+    "pos_x" : "0.0", 
+    "pos_y" : "0.0", 
+    "pos_z" : "0.0"
+    }
+
+

or:

+
    {
+    "msg_type" : "set_position",
+    "pos_x" : "0.0", 
+    "pos_y" : "0.0", 
+    "pos_z" : "0.0",
+    "qx" : "0.0",
+    "qy" : "0.2",
+    "qz" : "0.0",
+    "qw" : "1.0"
+    }
+
+
+

Get node position and rotation

+

Client=>Sim. Ask for a node_position packet

+

Fields:

+
    +
  • index : node index
  • +
+

Example:

+
    {
+    "msg_type": "node_position",
+    "index": "0"
+    }
+
+
+

Node position and rotation

+

Sim=>Client. node_position packet (received after sending a node_position packet)

+

Fields:

+
    +
  • pos_x : x world coordinate.
  • +
  • pos_y : y world coordinate.
  • +
  • pos_z : z world coordinate.
  • +
  • qx : (optional) quaternion x
  • +
  • qy : (optional) quaternion y
  • +
  • qz : (optional) quaternion z
  • +
  • qw : (optional) quaternion w
  • +
+

Example:

+
    {
+    "msg_type": "node_position",
+    "Qx": "0",
+    "Qy": "0",
+    "Qz": "0",
+    "Qw": "1",
+    "pos_x": "0",
+    "pos_y": "0",
+    "pos_z": "0"
+    }
+
+
+

Exit Scene

+

Client=>Sim. Leave the scene and return to the main menu screen.

+

Fields: None

+

Example:

+
    {
+    "msg_type" : "exit_scene" 
+    }
+
+
+

Quit App

+

Client=>Sim. Close the sim executable. (Menu only)

+

Fields: None

+

Example:

+
    {
+    "msg_type" : "quit_app" 
+    }
+
+ +
+
+ +
+
+ +
+ +
+ +
+ + + + + + + + diff --git a/guide/deep_learning/train_autopilot/index.html b/guide/deep_learning/train_autopilot/index.html new file mode 100644 index 00000000..12b19c35 --- /dev/null +++ b/guide/deep_learning/train_autopilot/index.html @@ -0,0 +1,335 @@ + + + + + + + + Train an autopilot with Keras - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Train an autopilot with Keras

+

Now that you're able to drive your car reliably you can use Keras to train a +neural network to drive like you. Here are the steps.

+

Collect Data

+

Make sure you collect good, clean data. The neural network will learn what is in the data it is trained on, so bad data results in a bad model. If your data has segments where you drive off the track, then there will be an aspect of the trained model that reflects that. Ideally you would drive perfectly and no bad data would be recorded. That is not realistic, but there are ways to easily remove errors as they happen.

+

A useful technique to avoid collecting bad data is to use the erase-records button on the controller when you make a mistake. If you crash, immediately toggle recording off and select the erase-records button to delete the last 100 records. It gets even easier if you set AUTO_RECORD_ON_THROTTLE = True in your myconfig.py file; in that case recording turns on when you apply throttle and turns off when you stop applying throttle. Between that and the erase-records button you can record tens of thousands of records without having to go back and clean data with the donkey tubclean command.

+

Beyond actual mistakes, the consistency of the data you record matters. The more variation there is in the data, the more data you will need to get a good model. The more consistently you drive, the more uniform the data will be, and the neural network will be able to correlate the inputs to the outputs more effectively. Consistency is particularly hard to achieve for throttle; it is very hard to replicate throttle exactly at each place on the track. One strategy (a strategy I use) is to find the maximum throttle you can maintain around the entire track, or at least most of it, and then just use that; set JOYSTICK_MAX_THROTTLE in your myconfig.py file to that throttle, then you can put the pedal to the metal around most of the course when you are collecting data.

+
    +
  1. Practice driving around the track a couple times.
  2. +
  3. When you're confident you can drive 10 laps with few mistakes, restart the python manage.py process to create a new data recording session. Press Start Recording if using the web controller, or use AUTO_RECORD_ON_THROTTLE = True as described above so the joystick will auto record with any non-zero throttle.
  4. +
  5. If you crash or run off the track press Stop Car immediately and stop recording. If you are using a joystick tap the button that erases the last 100 records (5 seconds at 20 hz drive loop).
  6. +
  7. After you've collected 10-20 laps of good data (5-20k images) you can stop +your car with Ctrl-c in the ssh session for your car.
  8. +
  9. The data you've collected is in the mycar data folder.
  10. +
  11. If you recorded mistakes then you can use the donkey tubclean to edit your data and remove the mistakes.
  12. +
+

Transfer data from your car to your computer

+

Since the Raspberry Pi is not very powerful, we need to transfer the data to a PC to train. The Jetson Nano is more powerful, but still quite slow to train. If desired, skip this transfer step and train on the Nano.

+

Training the easy, GUI way

+

The easiest way to do the training on your desktop is by using the Donkey UI application. + Tub_manager UI

+

If, however, you want to do the training with the command line, read on....

+

Training with the command line

+

In a new terminal session on your host PC, use rsync to copy your car's data folder from the Raspberry Pi.

+
rsync -rv --progress --partial pi@<your_pi_ip_address>:~/mycar/data/  ~/mycar/data/
+
+

Train a model

+
    +
  • In the same terminal you can now run the training script on the latest data by passing the path to that data as an argument. You can optionally pass path masks, such as ./data/* to gather multiple manifests. For example, from your mycar folder on your host PC:
  • +
+
~\mycar$ donkey train --tub ./data --model ./models/mypilot.h5
+
+

You may specify more than one tub using a comma separated list --tub=foo/data,bar/data or just leaving spaces like --tub foo/data bar/data. See Train the Model

+
    +
  • +

    You can create different model types with the --type argument during training. You may also choose to change the default model type in myconfig.py DEFAULT_MODEL_TYPE. When specifying a new model type, be sure to provide that type when running the model, or when using the model in other tools like plotting or profiling. For more information on the different model types, look here for Keras Parts. The model will be placed into the folder models/. You can also omit the --model flag and the model name will be auto-created using the pattern pilot_YY-MM-DD_N.h5.

    +
  • +
  • +

    If you run with version >= 4.3.0, the model will be automatically created in tflite format for fast inferencing, generating a ./models/mypilot.tflite file, too. Tflite creation can be suppressed by setting CREATE_TF_LITE = False in your myconfig.py file. In addition, a tensorrt model is produced if you set CREATE_TENSOR_RT = True, which is False by default. That setting produces a ./models/mypilot.trt file that should work on all platforms. On RPi, the tflite model will be the fastest.

    +
  • +
+
+

Note: There was a regression in version 4.2 where you only had to provide the model name in the model argument, like --model mypilot.h5. This got resolved in version 4.2.1. Please update to that version.

+
+
    +
  • +

    Image Augmentation With version >= 4.3.0 you also have access to image augmentations for training. Image augmentation is a technique where data, in this case images, is changed (augmented) to create variation in the data. The purpose is two-fold. First it can help extend your data when you don't have a lot of data. Second it can create a model that is more resilient to variations in the data at inference time. In our case, we want to handle various lighting conditions. Currently supported are AUGMENTATIONS = ['MULTIPLY', 'BLUR'] in the settings which generate brightness modifications and apply a Gaussian blur. These can be used individually or together. Augmentations are only applied during training; they are not applied when driving on autopilot.

    +
  • +
  • +

    Image Transformation With version >= 4.3.0 you also have access to image transformations like cropping or trapezoidal masking. Cropping and masking are similar; both 'erase' pixels on the image. This is done to remove pixels that are not important and that may add unwanted detail that can make the model perform poorly under conditions where that unwanted detail is different. Cropping can erase pixels on the top, bottom, left and/or right of the image. Trapezoidal masking is a little more flexible in that it can mask pixels using a trapezoidal mask that can account for perspective in the image. To crop the image or apply a trapezoidal mask you can provide TRANSFORMATIONS = ['CROP'] or TRANSFORMATIONS = ['TRAPEZE']. Generally you will use either cropping or trapezoidal masking but not both. Transformations must be applied in the same way in training and when driving on autopilot; make sure the transformation configuration is the same on your training machine and on your Donkey Car.
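As a myconfig.py fragment (pick one transformation, and keep it identical on the training machine and on the car):

```python
# myconfig.py -- image transformations; generally use CROP or TRAPEZE, not both
TRANSFORMATIONS = ['CROP']  # or ['TRAPEZE'] for a trapezoidal mask
```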

    +
  • +
+

Copy model back to car

+
    +
  • +

    In the previous step we trained a model on the data. Now it is time to move the model back to the Raspberry Pi, so we can test whether it will drive itself.

    +
  • +
  • +

    Use rsync again to move your trained model pilot back to your car.

    +
  • +
+
rsync -rv --progress --partial ~/mycar/models/ pi@<your_ip_address>:~/mycar/models/
+
+
    +
  • +

    Place the car on the track so that it is ready to drive.

    +
  • +
  • +

    Now you can start your car again and pass it your model to drive.

    +
  • +
+
python manage.py drive --model ~/mycar/models/mypilot.h5
+
+
    +
  • However, you will see better performance if you start with the tflite model.
  • +
+
python manage.py drive --model ~/mycar/models/mypilot.tflite --type tflite_linear
+
+
    +
  • The car should start to drive on its own, congratulations!
  • +
+

[Optional] Use TensorRT on the Jetson Nano

+

Read this for more information.

+

Training Tips

+
    +
  1. Mode & Pilot: Congratulations on getting this far. The first thing to do after running the command above is to look at the options in the Mode & Pilot menu. They can be pretty confusing, so here's what the different options mean:
  2. +
+

a. User : As you guessed, this is where you are in control of both the steering and throttle control.

+

b. Local Angle : Not too obvious, but this is where the trained model (mypilot from above) controls the steering. The Local refers to the trained model being hosted locally on the Raspberry Pi.

+

c. Local Pilot : This is where the trained model (mypilot) assumes control of both the steering and the throttle. As of now, it's purportedly not very reliable.

+

Be sure to also check out the Max Throttle and Throttle Mode options, and play around with a few settings. They can help with training quite a lot.

+
    +
  1. +

    Build a Simple Track : This isn't very well-documented, but the car should (theoretically) be able to train against any kind of track. To start off, it might not be necessary to build a two-lane track with a striped center lane. Try a single lane with no center line, or just a single strip that makes a circuit! At the least, you'll be able to do end-to-end testing and verify that the software pipeline is fully functional. Of course, as the next step, you'll want to create a more standard track and compete at the meetup nearest you!

    +
  2. +
  3. +

    Get help : Try to get a helping hand from a friend or two. Again, this helps immensely with building the track, because it is harder than it looks to build a two-lane track on your own! You can also save on resources (and tape) by using ribbon instead of tape. It will still need a bit of tape to hold it down, but you can reuse it, and it can be laid down with a lot less effort (although wind, if you're working outside, might make it difficult to lay down initially).

    +
  4. +
+

Training Behavior Models

+

How to train a Behavior model

+
    +
  • +

    Make sure TRAIN_BEHAVIORS = True in myconfig.py when training and when running on the robot.

    +
  • +
  • +

    Set up an RGB LED on the robot to indicate which state is active. Enable it in config.py. Verify when running the robot that the L1 PS3 button (the left upper shoulder button) changes the state LED indicator.

    +
  • +
  • +

    By default there are two states. If you like, adjust the number of states at the bottom of config.py. Rename or change BEHAVIOR_LIST to an arbitrary number of labels. Make sure BEHAVIOR_LED_COLORS contains the same number of RGB colors. Make sure to reflect any changes on both the PC and the robot.
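A hypothetical two-state setup might look like the following config fragment; the label names and colors are illustrative, and the two lists must stay the same length:

```python
# myconfig.py -- hypothetical behavior configuration
TRAIN_BEHAVIORS = True
BEHAVIOR_LIST = ['Left_Lane', 'Right_Lane']     # one label per state
BEHAVIOR_LED_COLORS = [(0, 10, 0), (10, 0, 0)]  # one RGB color per state
```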

    +
  • +
  • +

    Now for training: activate a state with the L1 shoulder button, then drive as you wish the car to drive while in that state. Switch states, and then transition to the new steady-state behavior.

    +
  • +
  • +

    For the two-lane case, drive 33% of the time in one lane, 33% in the other, and 33% transitioning between them. It's important to trigger the state transition before changing lanes.

    +
  • +
  • +

    Check the records in the data file. Open a .json file. In addition to steering and throttle, you should also see some additional state information about your behavior vector and which behavior was active on that frame. This is crucial to training correctly.
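A record might look roughly like the following; the exact key names can vary between donkeycar versions, so treat this as an illustration of the extra behavior fields, not a spec:

```python
import json

# Illustrative record; the key names are an assumption, not guaranteed
record = json.loads("""
{
  "user/angle": -0.12,
  "user/throttle": 0.3,
  "behavior/state": 1,
  "behavior/label": "Right_Lane",
  "behavior/one_hot_state_array": [0.0, 1.0]
}
""")
# The one-hot vector marks which behavior state was active on this frame
assert record["behavior/one_hot_state_array"][record["behavior/state"]] == 1.0
```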

    +
  • +
  • +

    Move the data to the PC and train as normal, ensuring TRAIN_BEHAVIORS = True in myconfig.py on the PC; otherwise the extra state information will be ignored.

    +
  • +
  • +

    Move the trained model back to the robot. Now place the robot in the location of the initial state and start the robot with the given model:

    +
  • +
+
python manage.py drive --model=models/my_beh.h5 --type=behavior
+
+
    +
  • Now press Select to switch to the desired AI mode. Constant throttle is available, as well as trained throttle.
  • +
+

As it drives, you can now toggle states with L1 and see whether and how much it can replicate your steady state behaviors and transitions.

+

Be sure to include quite a lot of examples of transitions from one state to another: at least 50, and ideally closer to 100.


Virtual Race League

+

We've taken the next step in DIY Robocars competitions. We are now hosting special events online! We welcome competitors from all over the world. Events will be scheduled by Chris Anderson and the Donkeycar maintainers. But this is by no means donkeycar-only. Please read on; we provide two paths depending on whether you decide to use the donkeycar framework to race.

+


+

We will be broadcasting the race stream over Twitch; check the event announcement for the URL. Race competitors will join a group Zoom chat event. Tawn will host the race server and share the stream over Zoom/Twitch. And we will see how things go.

+

About the Sim server

+

We are using the SDSandbox open source project as the racing sim. It creates a 3D environment using the Unity game engine and uses the NVidia PhysX open source physics engine to simulate four-wheeled vehicle dynamics. The sim also acts as a server, listening on TCP port 9091 and sending and receiving JSON packets. More on the API later.

+

We use an OpenAI GYM style wrapper to interface with the server. The project for this wrapper is gym-donkeycar.

+

You can build the server from the source project above, or use pre-built binaries for Ubuntu, Mac, and Windows. This has been tested on Ubuntu 18.04, Mac 10.13, and Windows 10.

+

Setup for Donkeycar users

+

If you are using the donkeycar framework to race, you can follow the guide to set up the simulator. If visual directions help, check out the Windows Sim Setup screencast on YouTube. Use this to practice before the race. When it comes time to race, make these two changes in your myconfig.py:

+
DONKEY_SIM_PATH = "remote"
+SIM_HOST = "trainmydonkey.com"
+
+

This racing server will not always be running. We will bring it up for testing events and on race day. We aim to have it up from 7pm-9pm Pacific every night during the week before race day. If it's not up, ask on Discord and we will try to get things running.

+
+

Note: If you trained a donkey model but wish to run it on a Jetson Nano or some platform where you are having trouble installing all the dependencies, here's a single script you can use to run without any donkeycar or gym-donkeycar dependencies. Just pass it the model file name, the host name, and the car name, and it will run as a client to the race sim.

+
+

Setup for Non-Donkeycar users

+

If you would like to roll your own client, we have some python code to get you started.

+
    +
  • +

    You will first want to download the sim pre-built binary for your platform. Extract that where you like.

    +
  • +
  • +

    Then clone the gym-donkeycar python project and install. If you are using a virtual environment, don't forget to activate it first.

    +
  • +
+
git clone https://github.com/tawnkramer/gym-donkeycar
+pip install -e gym-donkeycar
+
+
    +
  • Get the test client. Download it via wget on Mac or Linux:
  • +
+
wget https://raw.githubusercontent.com/tawnkramer/sdsandbox/master/src/test_client.py
+
+
    +
  • or on Windows open a browser to https://github.com/tawnkramer/sdsandbox/tree/master/src
  • +
  • +

    then right click on test_client.py and choose "Save link as..." and choose a location on your PC.

    +
  • +
  • +

    start up the simulator and let it get to the menu screen.

    +
  • +
  • run the test client like
  • +
+

python3 test_client.py

+

Check out test_client.py to see what's going on there. The SimpleClient class connects to the host of your choosing. It then sends a load-scene command depending on which course you want to try, followed by some car visual configuration and some camera configuration, and then enters an update loop.
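The wire format can be sketched like this; the field names follow what test_client.py sends, but treat them as illustrative rather than a protocol spec:

```python
import json

# Outgoing control message: newline-terminated JSON over TCP port 9091
control = {"msg_type": "control", "steering": "0.15",
           "throttle": "0.2", "brake": "0.0"}
wire = json.dumps(control) + "\n"

# Incoming telemetry is parsed the same way (handled in on_msg_recv)
telemetry = json.loads('{"msg_type": "telemetry", "speed": 1.8}')
```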

+

You can try changing the num_clients variable to 2 or more clients and see how the sim handles them.

+

The test client will send random steering commands for time_to_drive = 1.0 seconds, then quit.

+

During that time, the telemetry messages will come into SimpleClient::on_msg_recv and are printed out for you. Also take a look at the test.png that it writes to get a feel for what the camera sees.

+

There are some comments in there explaining the camera configuration in detail. If you have a custom camera setup, hopefully we can come close to matching it with these controls.

+

When it's time to race, change the variable:

+
host = "trainmydonkey.com"
+
+

Be sure to enable controls to start the car on your command. We will likely go old-school, calling 3, 2, 1, GO! over video chat.

+

Getting Help

+

There's a lot to learn, so come to Discord to get some help. Check out the #virtual-racing-league channel there.


Drive your car

+

After you've calibrated your car you can start driving it.

+

If you are not already, please ssh into your vehicle.

+

Start your car

+
+

*** Put your car in a safe place where the wheels are off the ground ***.

+
+

This is the step where the car can take off.

+

Open your car's folder and start your car.

+
cd ~/mycar
+python manage.py drive
+
+

This script will start the drive loop in your car which includes a part that +is a web server for you to control your car. You can now control your car +from a web browser at the URL: <your car's hostname.local>:8887

+

drive UI

+

Driving with Web Controller

+

There are 3 ways to move the car using the web controller:
- Device Tilt: Select Device Tilt in the Control Mode section of the web controller, then select User in the Mode section. You can then tilt your phone forward to increase throttle and tilt it side to side to turn the steering.
- Joystick: Select Joystick in the Control Mode section of the web controller, then select User in the Mode section. You can then touch and drag on the virtual joystick area that appears. Moving up increases throttle, moving down decreases or reverses; moving left turns left, moving right turns right. Releasing your finger stops the car.
- Gamepad: If you have a game controller attached (either by cable or by bluetooth) to the machine on which you are viewing the web UI, and that game controller is compatible with the HTML5 Gamepad API, then you can choose Gamepad in the Control Mode section of the web controller. Then select User in the Mode section. You can then drive the donkeycar using the game controller.

+

Features

+
    +
  • Recording - Press record data to start recording images, steering angles and throttle values.
  • +
  • Throttle mode - Option to set the throttle as constant. This is used in +races if you have a pilot that will steer but doesn't control throttle.
  • +
  • Pilot mode - Choose this if the pilot should control the angle and/or throttle.
  • +
  • Max throttle - Select the maximum throttle.
  • +
+

Keyboard shortcuts

+
    +
  • space : stop car and stop recording
  • +
  • r : toggle recording
  • +
  • i : increase throttle
  • +
  • k : decrease throttle
  • +
  • j : turn left
  • +
  • l : turn right
  • +
+
+

If you don't have a joystick then you can skip to next section - train an autopilot.

+
+

Driving with Physical Joystick Controller

+

You may find that it helps to use a physical joystick device to control your vehicle.

+

Setup Bluetooth and pair joystick

+

Check the Controllers section to read about setting up the bluetooth connection.

+

Start car

+
cd ~/mycar
+python manage.py drive --js
+
+

Optionally, if you want joystick use to be sticky and don't want to add the --js each time, modify your myconfig.py so that USE_JOYSTICK_AS_DEFAULT = True

+
nano myconfig.py
+
+

Joystick Controls

+
    +
  • Left analog stick - Left and right to adjust steering
  • +
  • Right analog stick - Forward to increase forward throttle
  • +
  • Pull back twice on right analog to reverse
  • +
+
+

Whenever the throttle is not zero, driving data will be recorded - as long as you are in User mode!

+
+
    +
  • Select button switches modes - "User, Local Angle, Local(angle and throttle)"
  • +
  • Triangle - Increase max throttle
  • +
  • X - Decrease max throttle
  • +
  • Circle - Toggle recording (disabled by default. auto record on throttle is enabled by default)
  • +
  • dpad up - Increase throttle scale
  • +
  • dpad down - Decrease throttle scale
  • +
  • dpad left - Increase steering scale
  • +
  • dpad right - Decrease steering scale
  • +
  • Start - Toggle constant throttle. Sets to max throttle (modified by X and Triangle).
  • +
+
+

Next let's train an autopilot.


Install Donkeycar on Mac

+

donkey

+ +

Setup your donkey conda env with:

+
conda create -n donkey python=3.9
+conda activate donkey
+
+

Now there are two different installations possible. Very likely you will want to do the user install, described in the User install step below. If you want to debug or edit the source code, you will need the more advanced Developer install. You can do only one.

+
+

Note: Only do User install or Developer install but not both!

+
+

User install

+

As you have activated the new donkey env already you simply type:

+
pip install donkeycar[pc]
+
+

if you are using an Intel Mac, or type:

+
pip install donkeycar[macos]
+
+

if you are on Apple Silicon. This will install the latest release.

+

Developer install

+

Here you can choose which branch or tag you want to install, and you can +edit and/or debug the code, by downloading the source code from GitHub.

+

Install git 64 bit +and change to a dir you would like to use as the head of your projects.

+
mkdir projects
+cd projects
+git clone https://github.com/autorope/donkeycar
+cd donkeycar
+git checkout main
+pip install -e .[pc]
+
+

Note: if you are using ZSH (you'll know if you are), you won't be able to +run pip install -e .[pc]. You'll need to escape the brackets and run +pip install -e .\[pc\].

+

Further steps

+
    +
  • If this is not your first install, update Conda and remove old donkey
  • +
+
conda update -n base -c defaults conda
+conda env remove -n donkey
+
+
    +
  • Tensorflow GPU
  • +
+

Currently, there is no NVidia gpu support for +tensorflow on mac.

+
    +
  • Create your local working dir:
  • +
+
donkey createcar --path ~/mycar
+
+
+

Note: After closing the Terminal, when you open it again, you will need to type conda activate donkey to re-enable the mappings to donkey-specific Python libraries.

+
+
+

Next let's install software on Donkeycar


Install Donkeycar on Linux

+
+

Note : tested on Ubuntu 20.04 LTS, 22.04 LTS

+
+
    +
  • +

    Open the Terminal application.

    +
  • +
  • +

    Install miniconda Python 3.9 64 bit.

    +
  • +
+
wget https://repo.anaconda.com/miniconda/Miniconda3-py39_23.3.1-0-Linux-x86_64.sh
+bash ./Miniconda3-py39_23.3.1-0-Linux-x86_64.sh
+
+

Setup your donkey conda env with:

+
conda create -n donkey python=3.11
+conda activate donkey
+
+

Now there are two different installations possible. Very likely you will want to do the user install, described in the User install step below. If you want to debug or edit the source code, you will need the more advanced Developer install. You can do only one.

+
+

Note: Only do User install or Developer install but not both!

+
+

User install

+

As you have activated the new donkey env already you simply type:

+
pip install donkeycar[pc]
+
+

This will install the latest release. Note, if you are using ZSH then +you have to escape the [ and ], i.e.

+
pip install donkeycar\[pc\]
+
+

Developer install

+

Here you can choose which branch or tag you want to install, and you can +edit and/or debug the code, by downloading the source code from GitHub.

+

Create a project directory you would like to use as the +head of your projects, change into it and download and install donkeycar +from GitHub.

+
mkdir projects
+cd projects
+git clone https://github.com/autorope/donkeycar
+cd donkeycar
+git checkout main
+pip install -e .[pc]
+
+

Note: if you are using ZSH (you'll know if you are), you won't be able to +run pip install -e .[pc]. You'll need to escape the brackets and run +pip install -e .\[pc\].

+
    +
  • If this is not your first install, update Conda and remove old donkey
  • +
+
conda update -n base -c defaults conda
+conda env remove -n donkey
+
+

The newer version of Tensorflow is already built with GPU support. If you have an Nvidia GPU, install Cuda 12 following the instructions on Nvidia's page here.

+
    +
  • Optional Install Coral edge tpu compiler
  • +
+

If you have a Google Coral edge TPU, you may wish to compile models. You will need to install the edgetpu_compiler executable. Follow their instructions.

+
    +
  • Optionally configure PyTorch to use GPU - only for NVidia Graphics cards
  • +
+

If you have an NVidia card, you should update to the latest drivers and +install Cuda SDK. +You will also need to change the code to use the GPU in a few places, so +you need the developer install.

+
conda install cudatoolkit=11 -c pytorch
+
+

Replace the CUDA version in the command above with your installed CUDA version; any version above 10.0 should work. You can find your CUDA version by running nvcc --version or nvidia-smi (if those commands don't work, they are not installed; follow the directions given by the error to install them). If the versions reported by the two commands don't match, go with the version given by nvidia-smi.

+
    +
  • Create your local working dir:
  • +
+
donkey createcar --path ~/mycar
+
+
+

Note: After closing the Anaconda Prompt, when you open it again, you will need to +type conda activate donkey to re-enable the mappings to donkey specific +Python libraries

+
+
+

Next let's install software on Donkeycar


Windows

+

Donkey Car used to support a native Windows installation but this has been +deprecated in favor of the WSL install.

+

Install Donkeycar on Windows (WSL)

+

The Windows Subsystem for Linux (WSL) lets developers run a GNU/Linux environment -- including most command-line tools, utilities, and applications -- directly on Windows, unmodified, without the overhead of a traditional virtual machine or dualboot setup.

+
    +
  • Install Windows Subsystem for Linux.
  • +
  • If using Windows 10 (this is not necessary for Windows 11), turn on Windows 10 "Windows Subsystem for Linux" Feature (Settings > Apps > Programs and Features > Turn Windows features on or off)
  • +
  • Download a Linux Distribution from the Microsoft Store (recommend Ubuntu Latest)
  • +
  • +

    Open the Ubuntu App and configure.

    +
  • +
  • +

    Open the Ubuntu App to get a prompt window via Start Menu | Ubuntu

    +
  • +
  • +

    Refresh list of packages and install pip and xclip:

    +
  • +
+
sudo apt-get update
+sudo apt install python3-pip
+sudo apt-get install libmtdev1 libgl1 xclip
+
+
    +
  • Add export LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libstdc++.so.6 to your .bashrc and re-source it with source ~/.bashrc.
  • +
+

At this point switch to the Ubuntu instructions and continue the setup there.

+
    +
  • Possible problems when running the UI
  • +
+

If you use the Donkey UI with an NVIDIA graphics card and you see a blurred window it might be due to some settings on +your PC. In the settings, switch the NVIDIA graphics card 3D rendering mode from +let the running program decide the 3D rendering mode +to let me decide on the 3D rendering mode: Quality.

+
+

Next let's install software on Donkeycar


Install Software

+ +

Overview

+

Donkeycar has components to install on a host PC. This can be a laptop or desktop machine. The machine doesn't have to be powerful, but it will benefit from a faster CPU, more RAM, and an NVidia GPU. An SSD will greatly reduce your training times.

+

Donkeycar software components need to be installed on the robot platform of your choice. Raspberry Pi and Jetson Nano have setup docs, but it has been known to work on the Jetson TX2, Friendly Arm SBCs, and almost any Debian-based SBC (single board computer).

+

After install, you will create the Donkeycar application from a template. This contains code that is designed for you to customize for your particular case. Don't worry, we will get you started with some useful defaults.

+

Next we will train the Donkeycar to drive on its own based on your driving style! This uses a supervised learning technique often referred to as behavioral cloning.

+ + +

This is not the only method for getting your Donkeycar to drive itself, but it requires the least hardware and the least technical knowledge. From there you can explore other techniques in this AI mobile laboratory called Donkeycar!

+

Step 1: Install Software on Host PC

+

When controlling your Donkey via behavioral cloning, you will need to setup a host pc to train your machine learning model from the data collected on the robot. Choose a setup that matches your computer OS.

+ +

Step 2: Install Software On Donkeycar

+

This guide will help you to setup the software to run Donkeycar on your Raspberry Pi or Jetson Nano. Choose a setup that matches your SBC type. (SBC = single board computer)

+ +

[Optional] Use the Intel Realsense T265 localization sensor instead of a RPi camera

+

Read this for more information.

+

[Optional] Use TensorRT on the Jetson Nano

+

Read this for more information.

+

Next:

+

Create Your Donkeycar Application.


The Path Follow Template

+

The path follow template is an alternative to the deep learning template. The deep learning template is great for an indoor track where lighting conditions and the details of the room can be controlled, but it can be more difficult to get working outside where lighting conditions are variable and things change in the environment. Outside we have access to GPS; the path_follow template allows you to record a path using a GPS receiver and then configure an autopilot that can follow that path.

+

GPS positions are read from the GPS receiver over a serial port. We read these as NMEA sentences, a line-oriented protocol that most GPS receivers use by default. The NMEA sentences include positions as latitude and longitude; we then convert those to a local coordinate system in meters.
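The parse-and-project step can be sketched as follows; the helper names, the simplified (checksum-free) GGA sentence, and the equirectangular projection are assumptions for illustration, not donkeycar's exact implementation:

```python
import math

def nmea_to_degrees(value, hemi):
    # NMEA packs degrees and minutes together: "4807.038" means 48 deg 7.038 min
    dot = value.index('.')
    deg = float(value[:dot - 2]) + float(value[dot - 2:]) / 60.0
    return -deg if hemi in ('S', 'W') else deg

def to_local_meters(lat, lon, origin_lat, origin_lon):
    # Equirectangular approximation: fine over the small area of a track
    R = 6371000.0  # mean earth radius in meters
    x = math.radians(lon - origin_lon) * R * math.cos(math.radians(origin_lat))
    y = math.radians(lat - origin_lat) * R
    return x, y

fields = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M".split(',')
lat = nmea_to_degrees(fields[2], fields[3])
lon = nmea_to_degrees(fields[4], fields[5])
```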

+

When we record a path, we save each (x, y) coordinate pair (each waypoint) we get from the GPS receiver into an array in memory. We can then save that to a CSV file which has one (x, y) coordinate pair per line. Later, we can read this csv file back into memory. Once we have our waypoints in memory, we can enter autopilot mode and follow those points.
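The save/load cycle can be sketched as below; the function names are illustrative, and the template's actual file name comes from PATH_FILENAME in myconfig.py:

```python
import os
import tempfile

def save_path(waypoints, filename):
    # One "x, y" coordinate pair per line, as described above
    with open(filename, 'w') as f:
        for x, y in waypoints:
            f.write(f"{x}, {y}\n")

def load_path(filename):
    # Read the CSV back into a list of (x, y) tuples
    with open(filename) as f:
        return [tuple(float(v) for v in line.split(','))
                for line in f if line.strip()]

# Round-trip a tiny recorded path through a temporary CSV file
waypoints = [(0.0, 0.0), (1.5, -2.25), (3.0, -4.0)]
csv_file = os.path.join(tempfile.mkdtemp(), "donkey_path.csv")
save_path(waypoints, csv_file)
```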

+

Similar to the deep learning template, we have 3 modes of operation:

+
    +
  • In User driving mode you manually control the car. Again similar to the deep learning template, you can use the web controller and/or a game controller to steer, apply throttle and choose actions using buttons.
  • +
  • In Autosteering mode the car will try to follow the set of recorded waypoints, but it will only control steering; you still control throttle manually. This is a good mode to start in when following the path as you can safely stop the car by letting off the throttle. It's also helpful in determining the maximum speed at which the car can reliably follow the waypoints.
  • +
  • In Autopilot mode the car will try to follow the set of recorded waypoints by controlling both steering and throttle. This is fully autonomous. To stop the car use your controller to end User mode.
  • +
+
+ +

Path Follow Autopilot in Action

+
+

Before we can record or follow a path, we need to create an application and do a little configuration.

+

Create a path follow Application

+

You can create a path follow application similarly to how we create a deep learning application; we just tell it to use the path_follow template instead of the default template. First, make sure your donkeycar python environment is activated, then use the createcar command to create your application folder.

+
donkey createcar --template=path_follow --path=~/mycar
+
+

When updating to a new version of donkeycar, you will want to refresh your application folder. You can do this with the same command, but add --overwrite so that it does not erase your myconfig.py file.

+
donkey createcar --template=path_follow --path=~/mycar --overwrite
+
+

Configuration

+

Again, like the deep learning template, we can change default configuration values by editing the myconfig.py file in the mycar folder you created with the createcar command.

+

You will need to calibrate and configure the drivetrain as described in Calibrate your Car. If you have a game controller paired to your car, then you will want to configure it as described in Controllers.

+

Configuring GPS

+

In myconfig.py, search for the 'gps' section. Make sure HAVE_GPS = True is set. You will need to determine the serial port that the GPS receiver is connected to and the baud rate to use. If possible, set your serial port to 115200 baud to get good throughput.

+
    +
  • GPS_SERIAL = <serialport>
      +
    • The <serialport> value differs depending on how you have your gps receiver connected (by usb or gpio serial) and by SBC (RPi vs Nano)
    • +
    • You can list all potential serial ports; ls /dev/tty*. Note that most of these are actually not usable.
    • +
    • If connecting to the Nano USB port, use /dev/ttyUSB0.
    • +
    • If connecting to the RPi USB port, use /dev/ttyACM0.
    • +
    • If connecting to the default RPi gpio serial port (board pins 8&10) use /dev/ttyAMA0.
    • +
    • If connecting to the default Jetson Nano gpio serial port (board pins 8&10) use /dev/ttyTHS1.
    • +
    +
  • +
  • GPS_SERIAL_BAUDRATE = <baudrate>
      +
    • The <baudrate> value differs depending on your gps and if you have changed it using U-Center.
    • +
    • When connecting the SBC's USB port to the USB port on the GPS receiver, the baud rate is negotiated over USB, so choose 115200 for a fast connection.
    • +
    • The ZED-F9P's other serial ports default to 38400 baud.
    • +
    • Cheap gps receivers generally default to 9600 baud.
    • +
    • See this video on how to use UBlox' U-Center to change the baudrate of the uarts on a UBlox GPS receiver.
    • +
    +
  • +
+
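Putting those two settings together, a myconfig.py fragment might look like this (the port and baud rate values depend on your wiring and receiver):

```python
# myconfig.py -- example GPS settings; adjust port and baud to your hardware
HAVE_GPS = True
GPS_SERIAL = "/dev/ttyUSB0"   # e.g. USB serial on a Jetson Nano
GPS_SERIAL_BAUDRATE = 115200
```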

Note that both the RPi and Jetson Nano may be using the default gpio serial port as a login console (you can connect up a serial 'terminal' and login). If using the gpio serial ports you need to disable the login console. See Writing to a serial port for details.

+

Those two settings are the only ones related to the GPS receiver that need to be set in myconfig.py. Most GPS Receivers can also be directly configured to change things like the baudrate of the serial ports or how fast position estimates are sent to the computer. Ideally the rate of position estimates should be as fast as possible, but different receivers have different upper limits and there is some tradeoff between the rate of updates and how accurate they are. U-Blox based GPS receivers can be configured with U-Blox U-Center software; see the U-Center section of Donkeycar Meets RTK GPS for some details. Other chipset manufacturers have their own software; you will have to check your GPS receiver to determine the manufacturer. If you are using RTK high resolution GPS then you need to do a lot more configuration and wiring outside of Donkeycar. See Donkeycar meets RTK GPS for a detailed discussion of one way to setup an RTK GPS receiver for use with Donkeycar. Here is a related video that goes over the same information.

+

Configuring Encoders and Kinematics

+

An encoder setup can be used to estimate not only the vehicle's speed but also its position; that allows encoders to be used with the Path Follow template in place of GPS, so it can be used indoors. This requires a few configuration values to be set in myconfig.py; basically measurements of the wheel diameter, the length of the wheel base and the length of the axle. See Odometer Software Setup for details.

+

Configure button actions

+

You can use either the web controller or a game controller. You can assign a game pad button OR web ui button to an action by editing the button assignments in myconfig.py. The names of the game pad buttons depend on the game controller you have configured (NOTE: one button is reserved for the emergency stop; you can see which one is assigned by looking at the console output when you start the car using the python manage.py drive command). The 5 available web ui buttons are named web/w1 to web/w5. If you assign the None action to a button then it is ignored.

+
    +
  • SAVE_PATH_BTN is the button to save the in-memory path to a file.
  • +
  • LOAD_PATH_BTN is the button to (re)load path from the csv file into memory.
  • +
  • RESET_ORIGIN_BTN is the button to set the current position as the origin.
  • +
  • ERASE_PATH_BTN is the button to erase path from memory and reset the origin.
  • +
  • TOGGLE_RECORDING_BTN is the button to toggle recording mode on or off. Note that there is a pre-assigned button in the web ui, so there is no need to assign this action to one of the web/w* buttons if you are using the web ui.
  • +
  • INC_PID_D_BTN is the button to change PID 'D' constant by PID_D_DELTA.
  • +
  • DEC_PID_D_BTN is the button to change PID 'D' constant by -PID_D_DELTA.
  • +
  • INC_PID_P_BTN is the button to change PID 'P' constant by PID_P_DELTA.
  • +
  • DEC_PID_P_BTN is the button to change PID 'P' constant by -PID_P_DELTA.
  • +
+
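For example, a set of button assignments in myconfig.py might look like the following sketch (the game pad button names shown are purely illustrative; valid names depend on which controller you have configured):

```python
# Illustrative myconfig.py button assignments.
# Game pad button names depend on your controller; web ui buttons
# are named web/w1 .. web/w5; assign None to ignore an action.
SAVE_PATH_BTN = "circle"        # save the in-memory path to a file
LOAD_PATH_BTN = "cross"         # (re)load path from the csv file
RESET_ORIGIN_BTN = "triangle"   # set the current position as the origin
ERASE_PATH_BTN = "square"       # erase path from memory, reset the origin
TOGGLE_RECORDING_BTN = None     # web ui already has a recording button
INC_PID_P_BTN = "web/w1"        # change PID 'P' by PID_P_DELTA
DEC_PID_P_BTN = "web/w2"        # change PID 'P' by -PID_P_DELTA
```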

Recording a path

+

The algorithm assumes we will be driving a continuous closed path such that the start and end are the same. You can adjust the spacing between recorded waypoints by editing the PATH_MIN_DIST value in myconfig.py. You can change the name and location of the saved file by editing the PATH_FILENAME value.

+

The workflow for recording a path is as follows:

+
    +
  • Enter User driving mode using either the web controller or a game controller.
  • +
  • Move the car to the desired starting point
  • +
  • Erase the path in memory (which will also reset the origin).
  • +
  • Toggle recording on.
  • +
  • Drive the car manually around the track until you reach the desired starting point again.
  • +
  • Toggle recording off.
  • +
  • If desired, save the path.
  • +
+

The path is saved as a comma-separated-values (.csv) file. Each line in the file contains 3 numbers separated by commas: x-position, y-position, throttle. The x and y positions are where the car was when the position was read and the throttle is the throttle value that was in effect at that time. Here is a section from a path file for illustration:

+
0.0033510593930259347, 7.996719985734671, 0.14
+0.11206169077195227, 9.325505392625928, 0.16
+0.20344207028392702, 10.525161047000438, 0.18
+0.311049185693264, 11.724678185302764, 0.14
+0.23874327179510146, 12.75951695209369, 0.13
+0.26568955020047724, 14.015127370599657, 0.15
+0.35580877534812316, 15.06704786233604, 0.18
+0.4303318051388487, 16.192974457982928, 0.15
+0.2126157897291705, 17.302927474025637, 0.17
+-0.37973403913201764, 18.24986434960738, 0.17
+-1.2822835729457438, 18.97783037694171, 0.17
+-2.4313870034529828, 19.338536370545626, 0.17
+-3.633584696042817, 19.182584955357015, 0.17
+-4.694471199880354, 18.471380048431456, 0.25
+-5.2241318183369, 17.256997687276453, 0.25
+-5.462499356712215, 15.947787401732057, 0.25
+-5.5869644057238474, 14.674541235901415, 0.25
+
+
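Because the format is so simple, a saved path can be read back with a few lines of Python. This is a sketch for illustration, not the loader Donkeycar itself uses:

```python
import csv

def load_path(filename):
    """Read a path file where each row is: x-position, y-position, throttle."""
    path = []
    with open(filename, newline="") as f:
        for x, y, throttle in csv.reader(f):
            # float() tolerates the spaces after the commas
            path.append((float(x), float(y), float(throttle)))
    return path
```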

Since the path is saved in a simple .csv file it can be visualized in many tools. A simple tool for visualizing your path is CSV Plot. Use the button in the upper-right (just to the left of the home button) to make the axis scale square. Here is an example path (rotated to fit a little better):

+

A CSV Path plotted in https://csvplot.com

+

Following a path

+

The current autopilot uses a constant throttle value. You can set this by editing the PID_THROTTLE value in myconfig.py.

+

The workflow for following a path is as follows:

+
    +
  • Enter User driving mode using either the web controller or a game controller.
  • +
  • Move the car to the desired starting point.
  • +
  • If you are following a saved path, then load the path into memory.
  • +
  • Reset the origin (be careful; don't erase the path, just reset the origin).
  • +
  • Enter Autosteering or Autopilot driving mode. If you are in Autosteering mode you will need to manually provide throttle for the car to move. If you are in Autopilot mode the car should drive itself completely.
  • +
  • Re-enter User mode to stop the car.
  • +
+

The Path Follow Algorithm

+

The algorithm we use for following the path is extremely simple; it's the Hello World of path following.

+
    +
  • Get the vehicle's current GPS position
  • +
  • Find the nearest point in the list of waypoints; starting at the last nearest waypoint, search up to PATH_SEARCH_LENGTH points and choose the waypoint that is closest to the current position.
  • +
  • Choose the waypoint PATH_LOOK_AHEAD points ahead of the closest point on the path.
  • +
  • Choose the waypoint PATH_LOOK_BEHIND points behind the closest point on the path.
  • +
  • Use behind and ahead waypoints to create a line that represents the desired track.
  • +
  • Calculate the cross-track error between the vehicle's current position and the desired track. The cross-track error is a signed value that represents the distance from the line and which side of the line we are on.
  • +
  • Use the cross-track error as the error input into the PID controller that controls steering.
  • +
  • The PID controller outputs a new steering value.
  • +
+
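The heart of the steps above, finding the closest waypoint and computing the signed cross-track error, can be sketched as follows. This is a simplified illustration, not Donkeycar's actual implementation; the signed distance comes from the 2D cross product of the track direction and the vector from the track to the car:

```python
import math

def cross_track_error(behind, ahead, pos):
    """Signed distance from pos to the line through behind -> ahead.
    Positive on one side of the track, negative on the other."""
    (x1, y1), (x2, y2), (px, py) = behind, ahead, pos
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy)
    # 2D cross product of the track direction and the vector to the car,
    # normalized by the length of the track segment
    return ((px - x1) * dy - (py - y1) * dx) / length

def closest_waypoint(path, pos, start, search_length):
    """Search up to search_length points from the last nearest waypoint."""
    candidates = range(start, start + search_length)
    return min(candidates,
               key=lambda i: math.dist(path[i % len(path)], pos))
```

The signed error is then fed to the steering PID controller.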

In addition to steering, the path follow controller will set the throttle to the throttle saved with the closest point on the path scaled by the PID_THROTTLE value in the myconfig.py file. That can be overridden if USE_CONSTANT_THROTTLE = True in the myconfig.py, in which case it will use PID_THROTTLE as the constant throttle.
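That throttle logic amounts to the following sketch (illustrative, with names borrowed from the myconfig.py settings):

```python
def choose_throttle(recorded_throttle, pid_throttle, use_constant_throttle):
    """Throttle for the current step: either a constant value, or the
    throttle recorded at the closest waypoint scaled by PID_THROTTLE."""
    if use_constant_throttle:       # USE_CONSTANT_THROTTLE = True
        return pid_throttle
    return recorded_throttle * pid_throttle
```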

+

Configuring Path Follow Parameters

+

So the algorithm uses the cross-track error between a desired line and the vehicle's measured position to decide how much and which way to steer. But the path we recorded is not a simple line; it is a list of points that typically forms some kind of circuit. As described above, we use the vehicle's current position to choose a short segment of the path to use as our desired track. That short segment is recalculated every time we get a new measured car position. There are a few configuration parameters that determine exactly which two points on the path we use to calculate the desired track line.

+
PATH_SEARCH_LENGTH = None   # number of points to search for closest point, None to search entire path
+PATH_LOOK_AHEAD = 1         # number of points ahead of the closest point to include in cte track
+PATH_LOOK_BEHIND = 1        # number of points behind the closest point to include in cte track   
+
+

Generally, if you are driving very fast you might want the look ahead to be larger than if driving slowly so that your steering can anticipate upcoming curves. Increasing the length of the resulting track line, by increasing the look behind and/or look ahead, also acts as a noise filter; it smooths out the track. This reduces the amount of jitter in the controller. However, this must be balanced with the true curves in the path; longer track segments effectively 'flatten' curves and so can result in understeer; not steering enough when on a curve.

+

What is a PID Controller?

+

A PID controller is a function that takes two parameters: 1) a target value to be achieved and 2) the current measured value. The PID function uses the difference between the target value and the measured value (the error) to calculate a control value that can be used to achieve the target value (that is, to drive the error between the desired value and the measured value to zero).

+

In our case, we want to stay on the desired track; we want the cross-track error (the distance between the desired line and the vehicle's measured position) to be zero; the control value that is output is a steering value that should move the vehicle closer to the desired line. So our PID controller is controlling steering based on which side of the line and how far from the desired line the car is.

+

The algorithm uses the sign of the cross track error to determine which way to steer. Naturally, if the cross-track error indicates the vehicle is to the left of the desired track, then the vehicle should turn right to move towards the desired track. If the cross-track error indicates the vehicle is to the right of the desired track, then the vehicle should turn left to move towards the desired track. If the vehicle is on the desired track, then the steering should be neutral.

+

But how much should we steer; should we turn only slightly or should we turn very hard? The PID controller will output a steering value that is proportional to the magnitude of the cross-track error. So if we are near the desired track it will steer slightly. If we are far off the desired track it will turn harder.
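A minimal PID controller of this kind can be written in a few lines. This is a sketch for illustration, with coefficient names mirroring the PID_P, PID_I and PID_D settings in myconfig.py; the overall sign of the output depends on your steering convention:

```python
class PID:
    def __init__(self, p, i, d):
        self.p, self.i, self.d = p, i, d
        self.total_error = 0.0   # accumulated error, for the I term
        self.prev_error = None   # previous error, for the D term

    def run(self, error):
        """Turn a cross-track error into a steering value."""
        delta = 0.0 if self.prev_error is None else error - self.prev_error
        self.prev_error = error
        self.total_error += error
        # steering is proportional to the error, its rate of change,
        # and its accumulated total
        return -(self.p * error + self.d * delta + self.i * self.total_error)
```

With the I and D coefficients at zero this reduces to pure proportional steering, which in some cases is all that is needed to follow the line.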

+

Determining PID Coefficients

+

The PID coefficients are the most important (and time consuming) parameters to configure. If they are not correct for your car then it will not follow the path. The coefficients can be changed by editing their values in the myconfig.py file.

+
    +
  • PID_P is the proportional coefficient; it is multiplied with the cross-track error. This is the most important parameter; it contributes the most to the output steering value and in some cases may be all that is needed to follow the line. If this is too small then the car will not turn enough when it reaches a curve. If this is too large then it will over-react to small changes in the path and may start turning in circles; especially when it gets to a curve.
  • +
  • PID_D is the differential coefficient; it is multiplied with the change in the cross-track error. This parameter can be useful in reducing oscillations and overshoot.
  • +
  • PID_I is the integral coefficient; it is multiplied with the total accumulated cross-track error. This may be useful in reducing offsets caused by accumulated error; such as if one wheel is slightly smaller in diameter than another.
  • +
+

As described in the Configuring Button Actions section above, you can also assign functions like INC_PID_P_BTN or DEC_PID_P_BTN to the game controller or web ui buttons to modify the PID parameters 'on the fly'. This helps when you are figuring out the best coefficients. The button functions allow you to change values without having to stop the car, edit myconfig.py and restart the car.

+

Determining PID Coefficients can be difficult. One approach is:

+
    +
  • First determine the P coefficient.
      +
    • zero out the D and the I coefficients.
    • +
    • Use a kind of 'binary' search to find a value where the vehicle will roughly follow a recorded straight line; probably oscillating around it. It will be weaving like it is under the influence.
    • +
    • To do this, record a short straight line, maybe 6 meters. You can do this by putting the car into recording mode and walking with the car (so you can keep the throttle at zero). Once the short line is recorded put the car in autopilot mode and stand in the middle of the line holding the car parallel to the line; the car's front wheels should stay stable and straight. Now slowly move the car off the line, keeping the car parallel to the line; the car should start to turn back towards the line. The farther off the line you move it, the more the car should turn. Try both sides of the line.
        +
      • If the car turns away from the line rather than towards the line then change the sign of the P value.
      • +
      • If the car turns very little then increase the P value.
      • +
      • If the car turns very abruptly when off the line then reduce the P value.
      • +
      • Play with the P value until you get the car to turn back to the line smoothly and proportional to how far from the line it is held.
      • +
      • Now try actually driving the line in autopilot mode. The car may oscillate around the line; if it oscillates a lot then reduce the P value. Adjust the P value so it can actually drive that line from one end to the other. It will likely go out of control at the end of the line; that is normal because the path is not closed.
      • +
      • Once you can drive a short straight line then drive the car in autopilot on a full closed path with only the P value set. Make sure there is a fairly tight turn in the path. Adjust the P value until you get acceptable performance. Once you get that working then you can refine things with the D value.
      • +
      +
    • +
    +
  • +
  • Next find a D coefficient that reduces the weaving (oscillations) on a straight line. Then record a path with a tight turn. Find a D coefficient that reduces the overshoot when turning.
  • +
  • You may not even need the I value. If the car becomes unstable after driving for a while then you may want to start to set this value. It will likely be much smaller than the other values.
  • +
+

Be patient. Start with a reasonably slow speed. Change one thing at a time and test the change; don't make many changes at once. Write down what is working.

+

Once you have a stable PID controller, you can figure out just how fast you can go before the autopilot becomes unstable. If you want to go faster then set the desired speed and start tweaking the values again using the method suggested above.

+


+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + + +
+ + + + + + + + diff --git a/guide/robot_sbc/intel_t265/index.html b/guide/robot_sbc/intel_t265/index.html new file mode 100644 index 00000000..5fbdee4e --- /dev/null +++ b/guide/robot_sbc/intel_t265/index.html @@ -0,0 +1,224 @@ + + + + + + + + A Guide to using the Intel Realsense T265 sensor with Donkeycar - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+
    +
  • + +
  • + Edit on GitHub +
  • +
+
+
+
+
+ +

A Guide to using the Intel Realsense T265 sensor with Donkeycar

+
+
    +
  • Note: Although the Realsense T265 can be used with a Nvidia Jetson Nano, it's a bit easier to set up with a Raspberry Pi (we recommend the RPi 4, with at least 4GB memory). The Intel Realsense D4XX series can also be used with Donkeycar as a regular camera (with use of its depth sensing data coming soon), and we'll add instructions for that when it's ready.
  • +
+

Original T265 path follower code by Tawn Kramer

+

Step 1: Setup Librealsense on Ubuntu Machine

+

Using the latest version of Raspbian (tested with Raspbian Buster) on the RPi, follow these instructions to set up Intel's Realsense libraries (Librealsense) and dependencies.

+

Step 2: Setup Donkeycar

+

Follow the standard instructions here

+

Step 3: Run the Donkeycar path follower app

+

After you’ve done that, set up the directory with this:

+

donkey createcar --path ~/follow --template path_follower

+

Running:

cd ~/follow
python3 manage.py drive

+

Once it’s running, open a browser on your laptop and enter this in the URL bar: http://<your car’s IP address>:8890

+

The rest of the instructions from Tawn’s repo:

+

When you drive, this will draw a red line for the path, a green circle for the robot location.

+

1) Mark a nice starting spot for your robot. Be sure to put it right back there each time you start.
2) Drive the car in some kind of loop. You'll see the red line show the path.
3) Hit X on the PS3/4 controller to save the path.
4) Put the bot back at the start spot.
5) Then hit the “select” button (on a PS3 controller) or “share” (on a PS4 controller) twice to go to pilot mode. This will start driving on the path. If you want it to go faster or slower, change this line in the myconfig.py file: THROTTLE_FORWARD_PWM = 530

+

Check the bottom of myconfig.py for some settings to tweak: PID values, map offsets and scale, things like that. You might want to start by downloading and using the myconfig.py file from my repo, which has some known-good settings and is otherwise a good place to start.
Some tips:

+

When you start, the green dot will be in the top left corner of the box. You may prefer to have it in the center. If so, change PATH_OFFSET = (0, 0) in the myconfig.py file to PATH_OFFSET = (250, 250)

+

For a small course, you may find that the path is too small to see well. In that case, change PATH_SCALE = 5.0 to PATH_SCALE = 10.0 (or more, if necessary)

+

If you’re not seeing the red line, that means that a path file has already been written. Delete “donkey_path.pkl” (rm donkey_path.pkl) and the red line should show up

+

When you're running in auto mode, the green dot will change to blue

+

It defaults to recording a path point every 0.3 meters. If you want it to be smoother, you can change to a smaller number in myconfig.py with this line: PATH_MIN_DIST = 0.3

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + + +
+ + + + + + + + diff --git a/guide/robot_sbc/intelt265/index.html b/guide/robot_sbc/intelt265/index.html new file mode 100644 index 00000000..ae55ff3e --- /dev/null +++ b/guide/robot_sbc/intelt265/index.html @@ -0,0 +1,232 @@ + + + + + + + + A Guide to using the Intel Realsense T265 sensor with Donkeycar - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+
    +
  • + +
  • + Edit on GitHub +
  • +
+
+
+
+
+ +

A Guide to using the Intel Realsense T265 sensor with Donkeycar

+
+
    +
  • Note: Although the Realsense T265 can be used with a Nvidia Jetson Nano, it's a bit easier to set up with a Raspberry Pi (we recommend the RPi 4, with at least 4GB memory). The Intel Realsense D4XX series can also be used with Donkeycar as a regular camera (with use of its depth sensing data coming soon), and we'll add instructions for that when it's ready.
  • +
+

Original T265 path follower code by Tawn Kramer

+
    +
  • +

    Step 1: Setup Donkeycar

    +
  • +
  • +

    Step 2: Setup Librealsense on Ubuntu Machine.
Using the latest version of Raspbian (tested with Raspbian Buster) on the RPi, follow these instructions to set up Intel's Realsense libraries (Librealsense) and dependencies.

    +
  • +
  • +

    Step 3: Setup TensorRT on your Jetson Nano

    +
  • +
+

After you’ve done that, set up the directory with this:

+
donkey createcar --path ~/follow --template path_follower
+
+

Running

+
cd ~/follow 
+python3 manage.py drive
+
+

Once it’s running, open a browser on your laptop and enter this in the URL bar: http://<your nano’s IP address>:8887

+

The rest of the instructions from Tawn’s repo:

+

When you drive, this will draw a red line for the path, a green circle for the robot location.
1) Mark a nice starting spot for your robot. Be sure to put it right back there each time you start.
2) Drive the car in some kind of loop. You'll see the red line show the path.
3) Hit X on the PS3/4 controller to save the path.
4) Put the bot back at the start spot.
5) Then hit the “select” button (on a PS3 controller) or “share” (on a PS4 controller) twice to go to pilot mode. This will start driving on the path. If you want it to go faster or slower, change this line in the myconfig.py file: THROTTLE_FORWARD_PWM = 530
Check the bottom of myconfig.py for some settings to tweak: PID values, map offsets and scale, things like that. You might want to start by downloading and using the myconfig.py file from my repo, which has some known-good settings and is otherwise a good place to start.
Some tips:

+

When you start, the green dot will be in the top left corner of the box. You may prefer to have it in the center. If so, change PATH_OFFSET = (0, 0) in the myconfig.py file to PATH_OFFSET = (250, 250)

+

For a small course, you may find that the path is too small to see well. In that case, change PATH_SCALE = 5.0 to PATH_SCALE = 10.0 (or more, if necessary)

+

If you’re not seeing the red line, that means that a path file has already been written. Delete “donkey_path.pkl” (rm donkey_path.pkl) and the red line should show up

+

It defaults to recording a path point every 0.3 meters. If you want it to be smoother, you can change to a smaller number in myconfig.py with this line: PATH_MIN_DIST = 0.3

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + + +
+ + + + + + + + diff --git a/guide/robot_sbc/setup_jetson_nano/index.html b/guide/robot_sbc/setup_jetson_nano/index.html new file mode 100644 index 00000000..19cae8f7 --- /dev/null +++ b/guide/robot_sbc/setup_jetson_nano/index.html @@ -0,0 +1,463 @@ + + + + + + + + Get Your Jetson Nano/Xavier NX Working - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Get Your Jetson Nano/Xavier NX Working

+

donkey

+

We have different approaches to installing the software depending on the version of Donkey Car. For Donkey Car <= 4.5.X we are using Jetpack 4.5.X, which comes with Tensorflow 2.3.1. The Python installation uses a virtual env, i.e. it is based on the system Python, version 3.6. This is the only version that works on the old Jetson Nano.

+

For the main branch we have updated Tensorflow to 2.9 and Python to 3.8 or 3.9. This runs on a newer version of Jetpack. You will need a Jetson Xavier, or any of the newer Jetsons like the Orin, to use Jetpack 5.0.2. To decouple the Python installation from the system Python we are using Miniforge, which is a mamba-based version of Miniconda that works on the aarch64 (ARM) architecture.

+

For Donkey Car <= 4.5.X please go to the next section. For the latest +version on the main branch please jump +to this section.

+

We recommend using the 4GB version of the Jetson Nano or the Jetson Xavier to run the software without issues. It's also recommended to use a 128GB microSD card with U3 speed, for example this SanDisk SD Card.

+

These are the supported versions:

Jetson        Jetpack   Python   Donkey     Tensorflow
Nano          4.5.1     3.6      <= 4.5.X   2.3.1
Xavier/Orin   5.0.2     3.8      >= 5.X     2.9
+

Then Create your Donkeycar Application

+

Installation for Donkey Car <= 4.5.X

+
+

Note: These instructions work for DC 4.3.6 and DC 4.4.0 only at the moment. We are working on a patch to add support for 4.5.X too.

+
+ +

Step 1a: Flash Operating System

+

These instructions work for Jetpack 4.5.1.

+ +

This installs the official Nvidia build of Tensorflow 2.3.1; make sure you are +using the same version of Tensorflow on your host PC if you are using one. Using +a different version of Tensorflow to train your network may result in errors +when you attempt to use it as an autopilot.

+

Visit the official Nvidia Jetson Nano Getting Started Guide +or Nvidia Xavier NX Getting Started Guide. +Work through the Prepare for Setup, Writing Image to the microSD Card, +and Setup and First Boot instructions, then return here.

+

Once you're done with the setup, ssh into your vehicle. Use the terminal on Ubuntu or Mac, or PuTTY on Windows.

+

Remove Libre Office:

+
sudo apt-get remove --purge libreoffice*
+sudo apt-get clean
+sudo apt-get autoremove
+
+

And add an 8GB swap file:

+
git clone https://github.com/JetsonHacksNano/installSwapfile
+cd installSwapfile
+./installSwapfile.sh
+sudo reboot now 
+
+

Step 2a: Free up the serial port (optional. Only needed if you're using the Robohat MM1)

+
sudo usermod -aG dialout <your username>
+sudo systemctl disable nvgetty
+
+

Step 3a: Install System-Wide Dependencies

+

First install some packages with apt-get.

+
sudo apt-get update -y
+sudo apt-get upgrade -y
+sudo apt-get install -y libhdf5-serial-dev hdf5-tools libhdf5-dev zlib1g-dev zip libjpeg8-dev liblapack-dev libblas-dev gfortran
+sudo apt-get install -y python3-dev python3-pip
+sudo apt-get install -y libxslt1-dev libxml2-dev libffi-dev libcurl4-openssl-dev libssl-dev libpng-dev libopenblas-dev
+sudo apt-get install -y git nano
+sudo apt-get install -y openmpi-doc openmpi-bin libopenmpi-dev libopenblas-dev
+
+

Step 4a: Setup Python Environment.

+

Setup Virtual Environment

+
pip3 install virtualenv
+python3 -m virtualenv -p python3 env --system-site-packages
+echo "source ~/env/bin/activate" >> ~/.bashrc
+source ~/.bashrc
+
+

Setup Python Dependencies

+

Next, you will need to install packages with pip:

+
pip3 install -U pip testresources setuptools
+pip3 install -U futures==3.1.1 protobuf==3.12.2 pybind11==2.5.0
+pip3 install -U cython==0.29.21 pyserial
+pip3 install -U future==0.18.2 mock==4.0.2 h5py==2.10.0 keras_preprocessing==1.1.2 keras_applications==1.0.8 gast==0.3.3
+pip3 install -U absl-py==0.9.0 py-cpuinfo==7.0.0 psutil==5.7.2 portpicker==1.3.1 six requests==2.24.0 astor==0.8.1 termcolor==1.1.0 wrapt==1.12.1 google-pasta==0.2.0
+pip3 install -U gdown
+
+# This will install tensorflow as a system package
+pip3 install --pre --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v45 tensorflow==2.5
+
+

Install Donkeycar Python Code

+

Change to a dir you would like to use as the head of your projects. Assuming +you've already made the projects directory above, you can use that. Get +the latest 4.5.X release and install that into the venv.

+
mkdir projects
+cd ~/projects
+git clone https://github.com/autorope/donkeycar
+cd donkeycar
+git fetch --all --tags -f
+git checkout 4.5.1
+pip install -e .[nano]
+
+
+

Step 5a: (Optional) Install PyGame for USB camera

+

If you plan to use a USB camera, you will also want to setup pygame:

+
sudo apt-get install python-dev libsdl1.2-dev libsdl-image1.2-dev libsdl-mixer1.2-dev libsdl-ttf2.0-dev libsdl1.2-dev libsmpeg-dev python-numpy subversion libportmidi-dev ffmpeg libswscale-dev libavformat-dev libavcodec-dev libfreetype6-dev
+pip install pygame
+
+

Later on you can add the CAMERA_TYPE="WEBCAM" in myconfig.py.

+

Installation for Donkey Car >= 5.X

+

Instructions for the latest code from the main branch or newer releases >= +5.X. Note the installation differs between the two available OSs. On Jetson +you need to install Jetpack 5.0.2.

+

Installation on Jetson Xavier (or newer Jetson boards)

+ +

Step 1b: Flash Operating System

+

These instructions work for Jetpack 5.0.2.

+

Please install the Jetpack image from jetson-nx-developer-kit-sd-card-image.zip.

+

Visit the official Nvidia Xavier NX Getting Started Guide. +Work through the Prepare for Setup, Writing Image to the microSD Card, +and Setup and First Boot instructions, then return here.

+

Once you're done with the setup, ssh into your vehicle. Use the terminal on Ubuntu or Mac, or PuTTY on Windows.

+

Remove Libre Office:

+
sudo apt-get remove --purge libreoffice*
+sudo apt-get clean
+sudo apt-get autoremove
+
+

And add an 8GB swap file. Note, if you intend to run from an SSD, perform the swap file setup only after booting from the SSD:

+
git clone https://github.com/JetsonHacksNano/installSwapfile
+cd installSwapfile
+./installSwapfile.sh -s 8
+reboot 
+
+

Step 2b: Free up the serial port (optional. Only needed if you're using the Robohat MM1)

+
sudo usermod -aG dialout <your username>
+sudo systemctl disable nvgetty
+
+

Step 3b: Setup python environment

+
    +
  • Step 3b-1: Install tensorflow into the system python environment
  • +
+

To install tensorflow and its dependencies for JP5.1.2 follow the NVIDIA instructions here

+
    +
  • Step 3b-2: Set up a venv
  • +
+
python3 -m venv env --system-site-packages
+echo "source ~/env/bin/activate" >> ~/.bashrc
+source ~/.bashrc
+
+
    +
  • Step 3b-3: Install Donkey Car
  • +
+

There are two different installations possible. Very likely you will want to do the user install (Step 3b-4). In case you want to debug or edit the source code, you will need to do the more advanced developer install (Step 3b-5). But you can do only one.

+
+

Note: Only do Step 3b-4 or 3b-5 but not both!

+
+
    +
  • Step 3b-4: User install
  • +
+

With the new env environment already activated, type:

+
pip install donkeycar[nano]
+pip install -U albumentations --no-binary qudida,albumentations
+pip uninstall opencv-python-headless
+pip uninstall scikit-learn
+git clone https://github.com/scikit-learn/scikit-learn.git
+cd scikit-learn/
+python setup.py install
+sudo chmod 666 /dev/gpiochip*
+
+

This will install the latest release.

+
    +
  • Step 3b-5: Developer install
  • +
+

Only do this if you have not done the user install in 3b-4.

+

Here you can choose which branch or tag you want to install, and you can edit and/or debug the code, by downloading the source code from GitHub. Do this to get the latest version from the main branch.

+
mkdir projects
+cd projects
+git clone https://github.com/autorope/donkeycar
+cd donkeycar
+git checkout main
+pip install -e .[nano]
+pip install -U albumentations --no-binary qudida,albumentations
+pip uninstall opencv-python-headless
+pip uninstall scikit-learn
+git clone https://github.com/scikit-learn/scikit-learn.git
+cd scikit-learn/
+python setup.py install
+sudo chmod 666 /dev/gpiochip*
+
+
    +
  • Step 3b-6: Check the TF and OpenCV installation
  • +
+

Run python and verify that tensorflow is version 2.9 and trt is version 8.2.1. +To get the tensorrt shared libraries to load correctly we must set the +environment variable LD_PRELOAD as:

+
export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libnvinfer.so.8:/usr/lib/aarch64-linux-gnu/libgomp.so.1
+
+

Note: this has to be done every time you run donkeycar or tensorflow, unless you put the above line into your .bashrc.

+
python
+>>> import tensorflow as tf
+>>> tf.__version__
+>>> from tensorflow.python.compiler.tensorrt import trt_convert as trt
+>>> trt._check_trt_version_compatibility()
+>>> import cv2
+>>> print(cv2.getBuildInformation())
+
+

Step 4b: (Optional) Install PyGame for USB camera

+

If you plan to use a USB camera, you will also want to setup pygame:

+
pip install pygame
+
+

Later on you can add the CAMERA_TYPE="WEBCAM" in myconfig.py.

+

(Optional) Fix for pink tint on CSIC cameras

+

This applies to any installation you did above, either JP 4.6.X or 5.0.X. +If you're using a CSIC camera you may have a pink tint on the images. As +described here, +this fix will remove it.

+
wget https://www.dropbox.com/s/u80hr1o8n9hqeaj/camera_overrides.isp
+sudo cp camera_overrides.isp /var/nvidia/nvcam/settings/
+sudo chmod 664 /var/nvidia/nvcam/settings/camera_overrides.isp
+sudo chown root:root /var/nvidia/nvcam/settings/camera_overrides.isp
+
+
+

Next, create your Donkeycar application.

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + + +
+ + + + + + + + diff --git a/guide/robot_sbc/setup_raspberry_pi/index.html b/guide/robot_sbc/setup_raspberry_pi/index.html new file mode 100644 index 00000000..0702a090 --- /dev/null +++ b/guide/robot_sbc/setup_raspberry_pi/index.html @@ -0,0 +1,530 @@ + + + + + + + + Get Your Raspberry Pi Working - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Get Your Raspberry Pi Working

+

donkey

+

Please read this carefully as Donkey Car is now installed differently +depending on the version. The latest Donkey Car version is 5.x and requires +64-bit Raspberry Pi OS Bookworm.

+

If you're using an older version of Donkey Car, such as 4.X, then you need to use the older Raspberry Pi OS (Raspbian) version called Buster. Jump to those instructions here.

+

Tub data, car templates, etc. are compatible between the two versions, as are models in Keras format (.h5). However, Tensorflow Lite models (.tflite) are not and need to be regenerated.

+

In general, we recommend the RPi 4 or 5 with 4GB of RAM. It's also recommended to use a 128GB microSD card with U3 speed, for example this SanDisk SD Card.

+

Installation for latest Donkey Car (>= 5.0) using Raspberry Pi OS Bookworm

+

This installation is using Raspberry Pi OS Bookworm (64 bit).

+ +

Step 1: Install Raspberry Pi OS

+

Raspberry Pi OS can be installed with the graphical installer Raspberry Pi +Imager which can be downloaded from here. +Please download and start the application, with the SD card you'll be using for your RaspberryPi inserted into your computer's SD card reader.

+

First choose the device you'll be using: Raspberry Pi 5 or Raspberry Pi 4

+

Then click on 'Operating System' and select 'Raspberry Pi OS (64 bit)'

+

Then click on 'Storage' and select your SD card.

+

Press 'NEXT' and you will be given the option to apply 'OS customization settings'. Select 'Edit Settings'

+

Here you can enter your username, password and wifi details. Set a hostname (here chosen to be +'donkeycar'), desired password, your wifi, region, etc.

+

It should look like this: +imager_advanced_dialog

+

Everything else can be left at the default. When you're done, click on 'Save' which will bring you back to the OS customization dialog. Click on 'Yes' and it will write the OS to your SD card.

+

When it's done, you can place your SD card in the Pi and power it on. It will take a minute or so to boot the first time; once it has done so, the green light stops flashing.

+

You should be able to ssh into the Pi through your network using the hostname 'donkeycar.local' (or whatever +you chose in the menu) like this: ssh username@hostname.local. So in the above example it would be ssh mydonkey@donkeycar.local.

+

Step 2: Update and Upgrade

+
sudo apt-get update --allow-releaseinfo-change
+sudo apt-get upgrade
+
+

Step 3: Raspi-config

+

Launch the Raspi config utility:

+
sudo raspi-config
+
+
  • Enable Interfacing Options - I2C
  • Select Advanced Options - Expand Filesystem so you can use your whole sd-card storage
  • Do not enable the legacy camera (it's disabled by default, so don't change anything)
+

Choose <Finish> and hit enter.

+
+

Note: Reboot after changing these settings. This should happen automatically if you select 'yes' when prompted.

+
+

Alternatively if you connect to the full desktop using VNC and are running +the desktop, go to 'Raspberry -> Preferences -> Raspberry Pi Configuration' +and apply the settings there.

+
+

Note: If you prefer to install the headless version of Raspberry Pi OS, +please follow the steps here. +You will need to run sudo apt -y install pip git afterwards.

+
+

Step 4: Create a virtual environment for donkeycar

+

To create a virtual environment, run the following from your home directory:

+
python3 -m venv env --system-site-packages
+echo "source ~/env/bin/activate" >> ~/.bashrc
+source ~/.bashrc
+
+

Install required libraries

+
sudo apt install libcap-dev
+
+

Step 5: Install Donkeycar Python Code

+

Create a project directory you would like to use as the +head of your projects, change into it and download and install donkeycar +from GitHub. Make sure your donkey env is activated.

+
mkdir projects
+cd projects
+git clone https://github.com/autorope/donkeycar
+cd donkeycar
+git checkout main
+pip install -e .[pi]
+
+

Further steps

+

You can validate your tensorflow install with

+
python -c "import tensorflow; print(tensorflow.__version__)"
+
+

+
+

Installation for Donkeycar <= 4.5 using Raspberry Pi OS Buster

+

This installation is using Raspberry Pi OS Buster (32 bit).

+ +

Step 1: Flash Operating System

+
+

Note: If you plan to use the mobile app, consider using the pre-built image. +Refer to the mobile app user guide for +details.

+
+

You need to flash a micro SD image with an operating system.

+
+

Note: Raspbian Latest (bullseye) is not compatible with the Python camera +bindings. The underlying camera system has changed. Please follow steps +below for installing the latest version from the main branch.

+
+
  1. Download Raspbian Legacy (Buster).
  2. Follow OS specific guides here.
  3. Leave the micro SD card in your machine and edit/create some files as below:
+

Step 2: Setup the WiFi for first boot

+

We can create a special file which will be used to login to wifi on first boot. +More +reading here, +but we will walk you through it.

+

On Windows, with your memory card image burned and the memory card still inserted, +you should see two drives, which are actually two partitions on the card. +One is labeled boot. On Mac and Linux, you should also have access to the +boot partition of the card. +This is formatted with the common FAT type and is where we will edit some files +to help it find and log on to your wifi on its first boot.

+
+

Note: If boot is not visible right away, try unplugging and re-inserting +the memory card reader.

+
+
  • Start a text editor:
    • gedit on Linux.
    • Notepad++ on Windows.
    • VI on Mac (type vi /Volumes/boot/wpa_supplicant.conf where boot is the name of the SD card).
  • Possible country codes to use can be found here.
  • Paste and edit these contents to match your wifi, adjusting as needed:
+
country=US
+ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
+update_config=1
+
+network={
+    ssid="<your network name>"
+    psk="<your password>"
+}
+
+
+

Note: country defines the allowed wifi channels; be sure to set it properly for +your location and hardware.

+

Replace <your network name> with the ID of your network. Leave the quotes. +I've seen problems when the network name contained an apostrophe, like "Joe's +iPhone". +Replace <your password> with your password, leaving it surrounded by quotes. +If it bothers you to leave your password unencrypted, you may change +the contents later +once you've gotten the pi to boot and log-in.

+
  • Save this file to the root of the boot partition with the filename wpa_supplicant.conf. On first boot, this file will be moved to /etc/wpa_supplicant/wpa_supplicant.conf, where it may be edited later. If you are using Notepad on Windows, make sure the filename doesn't end in .txt.
+

Step 3: Setup Pi's Hostname

+
+

Note: This step is only possible on a Linux host PC. Otherwise you can set it up +later in raspi-config after logging in to your Pi.

+
+

We can also set up the hostname so that your Pi is easier to find once on the +network. If yours is the only Pi on the network, then you can find it with

+
ping raspberrypi.local
+
+

once it's booted. If there are many other Pis on the network, then this will +be ambiguous. +If you are on a Linux machine, or are able to edit the UUID partition, then you +can edit the /etc/hostname and /etc/hosts files now to make finding your Pi +on the network easier after boot. +Edit those to replace raspberrypi with a name of your choosing: +all lower case, no special characters and no underscores; hyphens are fine. +A good idea is to use something like pi-<MAC_ADDRESS>, such as pi-deadbeef, +especially if you have more Pi devices on the same network.

+
sudo vi /media/userID/UUID/etc/hostname
+sudo vi /media/userID/UUID/etc/hosts
+
+

Step 4: Enable SSH on Boot

+

Put a file named ssh in the root of your boot partition. On Mac or Linux +this can be done using the touch command. For example, on the +Mac, touch /Volumes/boot/ssh where boot is the name of the SD card.

+

Now your SD card is ready. Eject it from your computer - wait until system shows +the writing is done +and it is safe to remove card. Ensure Pi is turned off, put the card in the Pi +and power on the Pi.

+

Step 5: Connecting to the Pi

+

If you followed the above instructions to add wifi access, your Pi should +now be connected to your wifi network. Now you need to find its IP address +so you can connect to it via SSH.

+

The easiest way (on Ubuntu) is to use the donkey findcar command. +Otherwise, you can try ping raspberrypi.local. If you've modified the hostname, then you +should try:

+
ping <your hostname>.local
+
+

This will fail on a Windows machine. Windows users will need the full IP +address (unless using Cygwin).

+

If you are having troubles locating your Pi on the network, you will want to +plug in an HDMI monitor and USB keyboard into the Pi. Boot it. Login with:

+
  • Username: pi
  • Password: raspberry
+

Then try the command:

+
ifconfig wlan0
+
+

or list all IP addresses assigned to the Pi (wifi or cable):

+
ip -br a
+
+

If this shows a valid IPv4 address, 4 groups of numbers separated by dots, then +you can try that address with your SSH command. If you don't see anything like that, +then your wifi config might have a mistake. You can try to fix it with

+
sudo nano /etc/wpa_supplicant/wpa_supplicant.conf
+
+

If you don't have a HDMI monitor and keyboard, you can plug-in the Pi with a +CAT5 cable to a router with DHCP. +If that router is on the same network as your PC, you can try:

+
ping raspberrypi.local
+
+

Hopefully, one of those methods worked and you are now ready to SSH into your +Pi. On Mac and Linux, you can open Terminal. +On Windows you can +install Putty, one of the alternatives, +or on Windows 10 you may have ssh via the command prompt.

+

If you have a command prompt, you can try:

+
ssh pi@raspberrypi.local
+
+

or

+
ssh pi@<your pi ip address>
+
+

or via Putty.

+
  • Username: pi
  • Password: raspberry
  • Hostname: <your pi IP address>
+

Step 6: Update and Upgrade

+
sudo apt-get update --allow-releaseinfo-change
+sudo apt-get upgrade
+
+

Step 7: Raspi-config

+
sudo raspi-config
+
+
  • Change the default password for pi
  • Change the hostname
  • Enable Interfacing Options - I2C
  • Enable Interfacing Options - Camera
  • Select Advanced Options - Expand Filesystem so you can use your whole sd-card storage
+

Choose <Finish> and hit enter.

+
+

Note: Reboot after changing these settings. This should happen automatically if you select 'yes' when prompted.

+
+

Step 8: Install Dependencies

+
sudo apt-get install build-essential python3 python3-dev python3-pip python3-virtualenv python3-numpy python3-picamera python3-pandas python3-rpi.gpio i2c-tools avahi-utils joystick libopenjp2-7-dev libtiff5-dev gfortran libatlas-base-dev libopenblas-dev libhdf5-serial-dev libgeos-dev git ntp
+
+

Step 9: (Optional) Install OpenCV Dependencies

+

If you are going for a minimal install, you can get by without these. But it can +be handy to have OpenCV.

+
sudo apt-get install libilmbase-dev libopenexr-dev libgstreamer1.0-dev libjasper-dev libwebp-dev libatlas-base-dev libavcodec-dev libavformat-dev libswscale-dev
+
+

Step 10: Setup Virtual Env

+

This needs to be done only once:

+
python3 -m virtualenv -p python3 env --system-site-packages
+echo "source ~/env/bin/activate" >> ~/.bashrc
+source ~/.bashrc
+
+

Modifying your .bashrc in this way will automatically enable this environment +each time you login. To return to the system python you can type deactivate.

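As a sketch of what the activation line does (using a throw-away /tmp/demo-env path for illustration rather than your real ~/env):

```shell
# Create and enter a virtual environment, then leave it again.
# /tmp/demo-env is a throw-away path for this demo; the instructions
# above use ~/env and activate it from ~/.bashrc on every login.
python3 -m venv /tmp/demo-env --system-site-packages
source /tmp/demo-env/bin/activate
command -v python            # now resolves inside /tmp/demo-env
deactivate                   # back to the system python
```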
+

Step 11: Install Donkeycar Python Code

+
  • Create and change to a directory you would like to use as the head of your projects.
+
mkdir projects
+cd projects
+
+
  • Get the latest stable release (which will be a 4.4 version):
+
git clone https://github.com/autorope/donkeycar
+cd donkeycar
+git fetch --all --tags -f
+latestTag=$(git describe --tags `git rev-list --tags --max-count=1`)
+git checkout $latestTag
+pip install -e .[pi]
+pip install https://github.com/lhelontra/tensorflow-on-arm/releases/download/v2.2.0/tensorflow-2.2.0-cp37-none-linux_armv7l.whl
+
+

You can validate your tensorflow install with

+
python -c "import tensorflow; print(tensorflow.__version__)"
+
+

Step 12: (Optional) Install OpenCV

+

If you've opted to install the OpenCV dependencies earlier, you can install +Python OpenCV bindings now with command:

+
sudo apt install python3-opencv
+
+

If that failed, you can try pip:

+
pip install opencv-python
+
+

Then test to see if import succeeds.

+
python -c "import cv2"
+
+

And if no errors, you have OpenCV installed!

+

Step 13: (Optional) Install Mobile App

+

There is a mobile application available on the iPhone and Android that provides +an alternative user experience. It can be installed manually or by downloading +an SD card image. Follow +these instructions +to install manually.

+
+

Note: The server component currently supports the Raspberry Pi 4B only.

+
+

Next, create your Donkeycar application.

+ + + + + + + + diff --git a/guide/robot_sbc/tensorrt_jetson_nano/index.html b/guide/robot_sbc/tensorrt_jetson_nano/index.html new file mode 100644 index 00000000..b6ee0540 --- /dev/null +++ b/guide/robot_sbc/tensorrt_jetson_nano/index.html @@ -0,0 +1,280 @@ + + + + + + + + A Guide to using TensorRT on the Nvidia Jetson Nano - Donkey Car + + + + + + + + + + + + + + +

A Guide to using TensorRT on the Nvidia Jetson Nano

+
+
  • Note: This guide assumes that you are using Ubuntu 18.04. If you are using Windows, refer to these instructions on how to set up your computer to use TensorRT.
+
+

Step 1: Setup TensorRT on Ubuntu Machine

+

Follow the instructions here. +Make sure you use the tar file instructions unless you have previously +installed CUDA using .deb files.

+

Step 2: Setup TensorRT on your Jetson Nano

+
  • Set up some environment variables so nvcc is on the $PATH. Add the following lines to your ~/.bashrc file.
+
# Add this to your .bashrc file
+export CUDA_HOME=/usr/local/cuda
+# Adds the CUDA compiler to the PATH
+export PATH=$CUDA_HOME/bin:$PATH
+# Adds the libraries
+export LD_LIBRARY_PATH=$CUDA_HOME/lib64:$LD_LIBRARY_PATH
+
+
  • Test the changes to your .bashrc.
+
source ~/.bashrc
+nvcc --version
+
+

You should see something like:

+
nvcc: NVIDIA (R) Cuda compiler driver
+Copyright (c) 2005-2018 NVIDIA Corporation
+Built on ...
+Cuda compilation tools, release 10.0, Vxxxxx
+
+
  • Switch to your virtualenv and install PyCUDA.
+
# This takes a while.
+pip install pycuda
+
+
  • After this you will also need to set up PYTHONPATH so that your dist-packages are included as part of your virtualenv. Add this to your .bashrc. This is needed because the Python bindings to tensorrt live in dist-packages, and this folder is usually not visible to your virtualenv; adding it to PYTHONPATH makes it visible.
+
export PYTHONPATH=/usr/lib/python3.6/dist-packages:$PYTHONPATH
+
+
  • Test this change by switching to your virtualenv and importing tensorrt.
+
>>> import tensorrt as trt
+>>> # This import should succeed
+
+

Step 3: Train, Freeze and Export your model to TensorRT format (uff)

+

After you train the linear model you end up with a file with a .h5 +extension.

+
# You end up with a Linear.h5 in the models folder
+python manage.py train --model=./models/Linear.h5 --tub=./data/tub_1_19-06-29,...
+
+# (optional) copy './models/Linear.h5' from your desktop computer to your Jetson Nano in your working dir (~mycar/models/)
+
+# Freeze model using freeze_model.py in donkeycar/scripts ; the frozen model is stored as protocol buffers.
+# This command also exports some metadata about the model which is saved in ./models/Linear.metadata
+python ~/projects/donkeycar/scripts/freeze_model.py --model=~/mycar/models/Linear.h5 --output=~/mycar/models/Linear.pb
+
+# Convert the frozen model to UFF. The command below creates a file ./models/Linear.uff
+cd /usr/lib/python3.6/dist-packages/uff/bin/
+python convert_to_uff.py ~/mycar/models/Linear.pb
+
+

Now copy the converted uff model and the metadata to your Jetson Nano.

+

Step 4

+
  • In myconfig.py pick the model type as tensorrt_linear.
+
DEFAULT_MODEL_TYPE = "tensorrt_linear"
+
+
  • Finally you can do:
+
# After you scp your `uff` model to the Nano
+python manage.py drive --model=./models/Linear.uff
+
+ + + + + + + + diff --git a/guide/simulator/index.html b/guide/simulator/index.html new file mode 100644 index 00000000..4e672fda --- /dev/null +++ b/guide/simulator/index.html @@ -0,0 +1,15 @@ + + + + + + Redirecting... + + + + + + +Redirecting... + + diff --git a/guide/train_autopilot/index.html b/guide/train_autopilot/index.html new file mode 100644 index 00000000..6f6a6278 --- /dev/null +++ b/guide/train_autopilot/index.html @@ -0,0 +1,242 @@ + + + + + + + + Create an autopilot. - Donkey Car + + + + + + + + + + + + + + +

Create an Autopilot

+

Donkey supports three kinds of autopilots: a deep learning autopilot, a path follow autopilot and a computer vision autopilot.

+

If you followed along with the Create Donkeycar App section, then you know that you choose which template to use when you create your mycar application folder using the createcar command.

+

This section will talk about what the templates are for, then we can get onto training an autopilot.

+

Deep Learning Autopilot

+

The deep learning autopilot uses a single forward-facing camera and a convolutional neural network to implement an autopilot using a technique known as Behavioral Cloning (also known as Imitation Learning). The technique is called Behavioral Cloning because the goal is to create an autopilot that imitates the actions of a human. This is the first kind of autopilot that Donkeycar supported and what it is best known for. For driving a car, the overall process looks like this:

+
  • A human drives the car to gather data. As you manually drive around the track, Donkeycar records data 20 times per second. Each piece of data has 3 components: a camera image plus the throttle value and the steering value that were in place at the time the image was taken. We want about 10,000 of these.
  • Clean the data. We don't want driving mistakes in the data, such as driving off the track or crashing into an obstacle. Alternatively, we can delete such data while we are driving so it never gets into the data set.
  • Use the collected data to calculate (train) a Convolutional Neural Network.
  • Use the trained CNN to infer the throttle and steering values given an image. So when we are in autopilot mode, we take an image, give it to the CNN, get the throttle and steering and put those into the car's hardware. We do that 20 times per second and now we are driving!
+
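The inference loop above can be sketched in a few lines. This is a minimal stand-in, not Donkeycar's actual pilot class: the `predict` stub here takes the place of the trained Keras CNN, and the clamping ranges are assumptions about what the car hardware expects.

```python
# Minimal sketch of the behavioral-cloning drive loop: image in,
# (steering, throttle) out, repeated about 20 times per second.

def predict(image):
    """Stand-in for the trained CNN; a real pilot would call the Keras model."""
    brightness = sum(sum(row) for row in image) / (len(image) * len(image[0]))
    steering = (brightness - 0.5) * 2.0   # pretend the model learned this
    throttle = 0.3
    return steering, throttle

def drive_step(camera_image):
    steering, throttle = predict(camera_image)
    # clamp to the ranges the car hardware expects
    steering = max(-1.0, min(1.0, steering))
    throttle = max(0.0, min(1.0, throttle))
    return steering, throttle

image = [[0.6, 0.7], [0.5, 0.8]]          # fake 2x2 "camera image"
steering, throttle = drive_step(image)
```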

Because the deep learning autopilot depends on a camera image, lighting conditions are important. The deep learning template is great for an indoor track where lighting conditions and the details of the room can be controlled, but it can be more difficult to get working outside where lighting conditions are variable and things change in the environment.

+

Aside: The Donkeycar approach to deep learning driving was inspired by an Nvidia research paper entitled End to End Learning for Self-Driving Cars.

+

Train a deep learning autopilot

+

Path Follow Autopilot (using GPS, wheel encoders, etc)

+

The path follow template is an alternative to the deep learning template. Outside we have access to GPS; the path follow template allows you to record a path using a GPS receiver and then configure an autopilot that can follow that path. The overall process looks like this;

+
  • A human drives the car to gather data. The data is acquired from a GPS receiver and represents an (x,y) position in meters. Each (x,y) position is called a waypoint. The user drives the course once to collect waypoints. The complete set of waypoints is called a path.
  • The autopilot gets the car's current (x,y) position from the GPS receiver, then finds the two closest points in the path and adjusts the car's steering to drive towards the path. It does this 20 times per second and now we are driving!
+
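The steering step above is essentially geometry. The following is a hypothetical sketch of that idea, not the actual Donkeycar path-follow part: find the waypoint nearest the car, aim at the next one, and steer proportionally to the heading error.

```python
import math

# Sketch of waypoint following: position and waypoints are (x, y) in meters,
# heading is in radians, and the returned steering is clamped to [-1, 1].

def steer_toward_path(position, heading, waypoints):
    # index of the waypoint closest to the car
    i = min(range(len(waypoints)),
            key=lambda k: math.dist(position, waypoints[k]))
    target = waypoints[min(i + 1, len(waypoints) - 1)]
    # bearing from the car to the target waypoint
    desired = math.atan2(target[1] - position[1], target[0] - position[0])
    # heading error wrapped to (-pi, pi], scaled to a steering value
    error = (desired - heading + math.pi) % (2 * math.pi) - math.pi
    return max(-1.0, min(1.0, error / math.pi))

path = [(0, 0), (1, 0), (2, 0), (3, 0)]
# car just right of the path, heading along it: expect a left correction
steering = steer_toward_path((0.5, -0.5), 0.0, path)
```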

There is a lot more detail on this in the next section.

+

Train a path follow autopilot

+

Computer Vision Autopilot

+

The computer vision autopilot uses traditional computer vision techniques, such as color space conversion and edge detection algorithms, to identify features in the camera image and turn those into steering and throttle values. This autopilot has an advantage over the other autopilots in that it does not require manual driving to gather data. Instead you will choose or write a computer vision algorithm and modify the algorithm parameters to suit the track. This autopilot is specifically designed to make it easy to write your own algorithm using the OpenCV library and the many Donkeycar-provided primitives.

+
  • A human chooses the CV algorithm and modifies its parameters until it delivers accurate and reliable steering and throttle values.
  • A human places the car on the track and changes from user to autopilot mode.
  • When in autopilot mode, an image from the camera is passed to the CV algorithm, which interprets it and outputs steering and throttle values. It does this 20 times per second and now we are driving!
+
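As a hedged illustration of the kind of pipeline such an algorithm might use (a pure-Python stand-in, not a Donkeycar-provided primitive): threshold the image to mark "line" pixels, then steer toward the line's centroid. A real pilot would build the mask with OpenCV, e.g. color-space conversion and thresholding.

```python
# Steer toward the horizontal centroid of detected line pixels.

def line_centroid_steering(mask):
    """mask: 2D list of 0/1, where 1 marks a detected line pixel.
    Returns steering in [-1, 1]; positive means the line is right of center."""
    cols = [x for row in mask for x, v in enumerate(row) if v]
    if not cols:
        return 0.0                       # no line found: go straight
    center = (len(mask[0]) - 1) / 2
    offset = (sum(cols) / len(cols) - center) / center
    return max(-1.0, min(1.0, offset))

# line pixels in the right half of a 5-pixel-wide image -> steer right
mask = [[0, 0, 0, 1, 1],
        [0, 0, 0, 1, 1]]
steering = line_centroid_steering(mask)
```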

There is a lot more detail in the next section about the built-in algorithm and how to write your own.

+

Computer Vision Autopilot

+ + + + + + + + diff --git a/guide/virtual_race_league/index.html b/guide/virtual_race_league/index.html new file mode 100644 index 00000000..68c93643 --- /dev/null +++ b/guide/virtual_race_league/index.html @@ -0,0 +1,15 @@ + + + + + + Redirecting... + + + + + + +Redirecting... + + diff --git a/img/favicon.ico b/img/favicon.ico new file mode 100644 index 00000000..e85006a3 Binary files /dev/null and b/img/favicon.ico differ diff --git a/index.html b/index.html new file mode 100644 index 00000000..9c06b230 --- /dev/null +++ b/index.html @@ -0,0 +1,268 @@ + + + + + + + + Donkey Car + + + + + + + + + + + + + + +

About Donkey®

+

Release +All Contributors +License +Discord

+
+

Donkey is a Self Driving Car Platform for hobby remote control cars. Donkey Car is made up of several components: +* It is a high level self driving library written in Python. It was developed with a focus on enabling fast experimentation and easy contribution. +* It is an Open Source Hardware design that makes it easy for you to build your own car +* It is a simulator that enables you to use Donkey without hardware +* It is a community of enthusiasts, developers and data scientists that enjoy racing, coding and discussing the future of ML, Cars and who will win the next race.

+

Enjoy

+
+

Build your own Donkey

+

Donkey is the standard car that most people build first. The parts cost about $250 to $300 and take 2 hours to assemble. Here are the main steps to build your own car:

+
  1. Assemble hardware.
  2. Install software.
  3. Create Donkey App.
  4. Calibrate your car.
  5. Start driving.
  6. Create an autopilot.
  7. Experiment with simulator.
+
+

Hello World.

+

Donkeycar is designed to make adding new parts to your car easy. Here's an +example car application that captures images from the camera and saves them.

+
import donkeycar as dk
+
+#initialize the vehicle
+V = dk.Vehicle()
+
+#add a camera part
+cam = dk.parts.PiCamera()
+V.add(cam, outputs=['image'], threaded=True)
+
+#add tub part to record images
+tub = dk.parts.Tub(path='~/mycar/data',
+                   inputs=['image'],
+                   types=['image_array'])
+V.add(tub, inputs=['image'])
+
+#start the vehicle's drive loop
+V.start(max_loop_count=100)
+
+
+

Installation

+

How to install

+
+

Why the name Donkey?

+

The ultimate goal of this project is to build something useful. Donkeys were +one of the first domesticated pack animals, they're notoriously stubborn, and +they are kid-safe. Until the car can navigate from one side of a city to the +other, we'll hold off naming it after some celestial being.

":{dir:"parentNode"},"+":{dir:"previousSibling",first:!0},"~":{dir:"previousSibling"}},preFilter:{ATTR:function(e){return e[1]=e[1].replace(te,ne),e[3]=(e[3]||e[4]||e[5]||"").replace(te,ne),"~="===e[2]&&(e[3]=" "+e[3]+" "),e.slice(0,4)},CHILD:function(e){return e[1]=e[1].toLowerCase(),"nth"===e[1].slice(0,3)?(e[3]||se.error(e[0]),e[4]=+(e[4]?e[5]+(e[6]||1):2*("even"===e[3]||"odd"===e[3])),e[5]=+(e[7]+e[8]||"odd"===e[3])):e[3]&&se.error(e[0]),e},PSEUDO:function(e){var t,n=!e[6]&&e[2];return G.CHILD.test(e[0])?null:(e[3]?e[2]=e[4]||e[5]||"":n&&X.test(n)&&(t=h(n,!0))&&(t=n.indexOf(")",n.length-t)-n.length)&&(e[0]=e[0].slice(0,t),e[2]=n.slice(0,t)),e.slice(0,3))}},filter:{TAG:function(e){var t=e.replace(te,ne).toLowerCase();return"*"===e?function(){return!0}:function(e){return e.nodeName&&e.nodeName.toLowerCase()===t}},CLASS:function(e){var t=m[e+" "];return t||(t=new RegExp("(^|"+M+")"+e+"("+M+"|$)"))&&m(e,function(e){return t.test("string"==typeof e.className&&e.className||"undefined"!=typeof e.getAttribute&&e.getAttribute("class")||"")})},ATTR:function(n,r,i){return function(e){var t=se.attr(e,n);return null==t?"!="===r:!r||(t+="","="===r?t===i:"!="===r?t!==i:"^="===r?i&&0===t.indexOf(i):"*="===r?i&&-1:\x20\t\r\n\f]*)[\x20\t\r\n\f]*\/?>(?:<\/\1>|)$/i;function j(e,n,r){return m(n)?S.grep(e,function(e,t){return!!n.call(e,t,e)!==r}):n.nodeType?S.grep(e,function(e){return e===n!==r}):"string"!=typeof n?S.grep(e,function(e){return-1)[^>]*|#([\w-]+))$/;(S.fn.init=function(e,t,n){var r,i;if(!e)return this;if(n=n||D,"string"==typeof e){if(!(r="<"===e[0]&&">"===e[e.length-1]&&3<=e.length?[null,e,null]:q.exec(e))||!r[1]&&t)return!t||t.jquery?(t||n).find(e):this.constructor(t).find(e);if(r[1]){if(t=t instanceof S?t[0]:t,S.merge(this,S.parseHTML(r[1],t&&t.nodeType?t.ownerDocument||t:E,!0)),N.test(r[1])&&S.isPlainObject(t))for(r in t)m(this[r])?this[r](t[r]):this.attr(r,t[r]);return this}return(i=E.getElementById(r[2]))&&(this[0]=i,this.length=1),this}return 
e.nodeType?(this[0]=e,this.length=1,this):m(e)?void 0!==n.ready?n.ready(e):e(S):S.makeArray(e,this)}).prototype=S.fn,D=S(E);var L=/^(?:parents|prev(?:Until|All))/,H={children:!0,contents:!0,next:!0,prev:!0};function O(e,t){while((e=e[t])&&1!==e.nodeType);return e}S.fn.extend({has:function(e){var t=S(e,this),n=t.length;return this.filter(function(){for(var e=0;e\x20\t\r\n\f]*)/i,he=/^$|^module$|\/(?:java|ecma)script/i;ce=E.createDocumentFragment().appendChild(E.createElement("div")),(fe=E.createElement("input")).setAttribute("type","radio"),fe.setAttribute("checked","checked"),fe.setAttribute("name","t"),ce.appendChild(fe),y.checkClone=ce.cloneNode(!0).cloneNode(!0).lastChild.checked,ce.innerHTML="",y.noCloneChecked=!!ce.cloneNode(!0).lastChild.defaultValue,ce.innerHTML="",y.option=!!ce.lastChild;var ge={thead:[1,"","
"],col:[2,"","
"],tr:[2,"","
"],td:[3,"","
"],_default:[0,"",""]};function ve(e,t){var n;return n="undefined"!=typeof e.getElementsByTagName?e.getElementsByTagName(t||"*"):"undefined"!=typeof e.querySelectorAll?e.querySelectorAll(t||"*"):[],void 0===t||t&&A(e,t)?S.merge([e],n):n}function ye(e,t){for(var n=0,r=e.length;n",""]);var me=/<|&#?\w+;/;function xe(e,t,n,r,i){for(var o,a,s,u,l,c,f=t.createDocumentFragment(),p=[],d=0,h=e.length;d\s*$/g;function je(e,t){return A(e,"table")&&A(11!==t.nodeType?t:t.firstChild,"tr")&&S(e).children("tbody")[0]||e}function De(e){return e.type=(null!==e.getAttribute("type"))+"/"+e.type,e}function qe(e){return"true/"===(e.type||"").slice(0,5)?e.type=e.type.slice(5):e.removeAttribute("type"),e}function Le(e,t){var n,r,i,o,a,s;if(1===t.nodeType){if(Y.hasData(e)&&(s=Y.get(e).events))for(i in Y.remove(t,"handle events"),s)for(n=0,r=s[i].length;n").attr(n.scriptAttrs||{}).prop({charset:n.scriptCharset,src:n.url}).on("load error",i=function(e){r.remove(),i=null,e&&t("error"===e.type?404:200,e.type)}),E.head.appendChild(r[0])},abort:function(){i&&i()}}});var _t,zt=[],Ut=/(=)\?(?=&|$)|\?\?/;S.ajaxSetup({jsonp:"callback",jsonpCallback:function(){var e=zt.pop()||S.expando+"_"+wt.guid++;return this[e]=!0,e}}),S.ajaxPrefilter("json jsonp",function(e,t,n){var r,i,o,a=!1!==e.jsonp&&(Ut.test(e.url)?"url":"string"==typeof e.data&&0===(e.contentType||"").indexOf("application/x-www-form-urlencoded")&&Ut.test(e.data)&&"data");if(a||"jsonp"===e.dataTypes[0])return r=e.jsonpCallback=m(e.jsonpCallback)?e.jsonpCallback():e.jsonpCallback,a?e[a]=e[a].replace(Ut,"$1"+r):!1!==e.jsonp&&(e.url+=(Tt.test(e.url)?"&":"?")+e.jsonp+"="+r),e.converters["script json"]=function(){return o||S.error(r+" was not called"),o[0]},e.dataTypes[0]="json",i=C[r],C[r]=function(){o=arguments},n.always(function(){void 0===i?S(C).removeProp(r):C[r]=i,e[r]&&(e.jsonpCallback=t.jsonpCallback,zt.push(r)),o&&m(i)&&i(o[0]),o=i=void 0}),"script"}),y.createHTMLDocument=((_t=E.implementation.createHTMLDocument("").body).innerHTML="
",2===_t.childNodes.length),S.parseHTML=function(e,t,n){return"string"!=typeof e?[]:("boolean"==typeof t&&(n=t,t=!1),t||(y.createHTMLDocument?((r=(t=E.implementation.createHTMLDocument("")).createElement("base")).href=E.location.href,t.head.appendChild(r)):t=E),o=!n&&[],(i=N.exec(e))?[t.createElement(i[1])]:(i=xe([e],t,o),o&&o.length&&S(o).remove(),S.merge([],i.childNodes)));var r,i,o},S.fn.load=function(e,t,n){var r,i,o,a=this,s=e.indexOf(" ");return-1").append(S.parseHTML(e)).find(r):e)}).always(n&&function(e,t){a.each(function(){n.apply(this,o||[e.responseText,t,e])})}),this},S.expr.pseudos.animated=function(t){return S.grep(S.timers,function(e){return t===e.elem}).length},S.offset={setOffset:function(e,t,n){var r,i,o,a,s,u,l=S.css(e,"position"),c=S(e),f={};"static"===l&&(e.style.position="relative"),s=c.offset(),o=S.css(e,"top"),u=S.css(e,"left"),("absolute"===l||"fixed"===l)&&-1<(o+u).indexOf("auto")?(a=(r=c.position()).top,i=r.left):(a=parseFloat(o)||0,i=parseFloat(u)||0),m(t)&&(t=t.call(e,n,S.extend({},s))),null!=t.top&&(f.top=t.top-s.top+a),null!=t.left&&(f.left=t.left-s.left+i),"using"in t?t.using.call(e,f):c.css(f)}},S.fn.extend({offset:function(t){if(arguments.length)return void 0===t?this:this.each(function(e){S.offset.setOffset(this,t,e)});var e,n,r=this[0];return r?r.getClientRects().length?(e=r.getBoundingClientRect(),n=r.ownerDocument.defaultView,{top:e.top+n.pageYOffset,left:e.left+n.pageXOffset}):{top:0,left:0}:void 0},position:function(){if(this[0]){var 
e,t,n,r=this[0],i={top:0,left:0};if("fixed"===S.css(r,"position"))t=r.getBoundingClientRect();else{t=this.offset(),n=r.ownerDocument,e=r.offsetParent||n.documentElement;while(e&&(e===n.body||e===n.documentElement)&&"static"===S.css(e,"position"))e=e.parentNode;e&&e!==r&&1===e.nodeType&&((i=S(e).offset()).top+=S.css(e,"borderTopWidth",!0),i.left+=S.css(e,"borderLeftWidth",!0))}return{top:t.top-i.top-S.css(r,"marginTop",!0),left:t.left-i.left-S.css(r,"marginLeft",!0)}}},offsetParent:function(){return this.map(function(){var e=this.offsetParent;while(e&&"static"===S.css(e,"position"))e=e.offsetParent;return e||re})}}),S.each({scrollLeft:"pageXOffset",scrollTop:"pageYOffset"},function(t,i){var o="pageYOffset"===i;S.fn[t]=function(e){return $(this,function(e,t,n){var r;if(x(e)?r=e:9===e.nodeType&&(r=e.defaultView),void 0===n)return r?r[i]:e[t];r?r.scrollTo(o?r.pageXOffset:n,o?n:r.pageYOffset):e[t]=n},t,e,arguments.length)}}),S.each(["top","left"],function(e,n){S.cssHooks[n]=Fe(y.pixelPosition,function(e,t){if(t)return t=We(e,n),Pe.test(t)?S(e).position()[n]+"px":t})}),S.each({Height:"height",Width:"width"},function(a,s){S.each({padding:"inner"+a,content:s,"":"outer"+a},function(r,o){S.fn[o]=function(e,t){var n=arguments.length&&(r||"boolean"!=typeof e),i=r||(!0===e||!0===t?"margin":"border");return $(this,function(e,t,n){var r;return x(e)?0===o.indexOf("outer")?e["inner"+a]:e.document.documentElement["client"+a]:9===e.nodeType?(r=e.documentElement,Math.max(e.body["scroll"+a],r["scroll"+a],e.body["offset"+a],r["offset"+a],r["client"+a])):void 0===n?S.css(e,t,i):S.style(e,t,n,i)},s,n?e:void 0,n)}})}),S.each(["ajaxStart","ajaxStop","ajaxComplete","ajaxError","ajaxSuccess","ajaxSend"],function(e,t){S.fn[t]=function(e){return this.on(t,e)}}),S.fn.extend({bind:function(e,t,n){return this.on(e,null,t,n)},unbind:function(e,t){return this.off(e,null,t)},delegate:function(e,t,n,r){return this.on(t,e,n,r)},undelegate:function(e,t,n){return 
1===arguments.length?this.off(e,"**"):this.off(t,e||"**",n)},hover:function(e,t){return this.mouseenter(e).mouseleave(t||e)}}),S.each("blur focus focusin focusout resize scroll click dblclick mousedown mouseup mousemove mouseover mouseout mouseenter mouseleave change select submit keydown keypress keyup contextmenu".split(" "),function(e,n){S.fn[n]=function(e,t){return 0"),n("table.docutils.footnote").wrap("
"),n("table.docutils.citation").wrap("
"),n(".wy-menu-vertical ul").not(".simple").siblings("a").each((function(){var t=n(this);expand=n(''),expand.on("click",(function(n){return e.toggleCurrent(t),n.stopPropagation(),!1})),t.prepend(expand)}))},reset:function(){var n=encodeURI(window.location.hash)||"#";try{var e=$(".wy-menu-vertical"),t=e.find('[href="'+n+'"]');if(0===t.length){var i=$('.document [id="'+n.substring(1)+'"]').closest("div.section");0===(t=e.find('[href="#'+i.attr("id")+'"]')).length&&(t=e.find('[href="#"]'))}if(t.length>0){$(".wy-menu-vertical .current").removeClass("current").attr("aria-expanded","false"),t.addClass("current").attr("aria-expanded","true"),t.closest("li.toctree-l1").parent().addClass("current").attr("aria-expanded","true");for(let n=1;n<=10;n++)t.closest("li.toctree-l"+n).addClass("current").attr("aria-expanded","true");t[0].scrollIntoView()}}catch(n){console.log("Error expanding nav for anchor",n)}},onScroll:function(){this.winScroll=!1;var n=this.win.scrollTop(),e=n+this.winHeight,t=this.navBar.scrollTop()+(n-this.winPosition);n<0||e>this.docHeight||(this.navBar.scrollTop(t),this.winPosition=n)},onResize:function(){this.winResize=!1,this.winHeight=this.win.height(),this.docHeight=$(document).height()},hashChange:function(){this.linkScroll=!0,this.win.one("hashchange",(function(){this.linkScroll=!1}))},toggleCurrent:function(n){var e=n.closest("li");e.siblings("li.current").removeClass("current").attr("aria-expanded","false"),e.siblings().find("li.current").removeClass("current").attr("aria-expanded","false");var t=e.find("> ul li");t.length&&(t.removeClass("current").attr("aria-expanded","false"),e.toggleClass("current").attr("aria-expanded",(function(n,e){return"true"==e?"false":"true"})))}},"undefined"!=typeof window&&(window.SphinxRtdTheme={Navigation:n.exports.ThemeNav,StickyNav:n.exports.ThemeNav}),function(){for(var n=0,e=["ms","moz","webkit","o"],t=0;t + + + + + + + About - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Donkeycar Software Architecture

+

Donkeycar is very simple; code is organized into parts that take inputs and return outputs. These parts are added to a vehicle. The vehicle loop, once started, runs the parts in order. The parts effectively communicate by reading and mutating the vehicle memory.

+

A 'template' is a Python file that contains code to construct a 'vehicle' and one or more 'parts'. A part is a Python class that wraps a functional component of a vehicle. The parts are added to the vehicle, and they can take values from the vehicle's memory as inputs and write values to the vehicle's memory as outputs. When the vehicle loop is started the parts are run in the order that they were added, getting their inputs from memory and outputting their results to memory. This continues in a loop until the vehicle is stopped; then all the parts are shut down and the template exits.

+

Templates

+

When you create your car application using the donkey createcar ... command as described in the Create Donkeycar App section of the docs, what happens under the hood is that a few files are copied from the donkeycar/templates folder into your mycar folder. The two we need to talk about are manage.py and myconfig.py.

+

The files that are copied to the mycar folder are renamed versions of a pair of template files in the templates folder. The files are chosen based on the template name you passed in the --template argument to the createcar command; if you pass nothing then the default is --template=complete. So donkey createcar --path=~/mycar is the same as donkey createcar --path=~/mycar --template=complete. In this case the files that are renamed and copied to ~/mycar/manage.py and ~/mycar/myconfig.py are donkeycar/templates/complete.py and donkeycar/templates/cfg_complete.py respectively. If you create a path-follow application by passing --template=path_follow to createcar, then the files that are copied are donkeycar/templates/path_follow.py and donkeycar/templates/cfg_path_follow.py.

+

Now technically another copy of the donkeycar/templates/cfg_xxxx.py is copied to the mycar folder as config.py; this contains the default configuration and should not be edited. The myconfig.py file is really a commented-out version of config.py. To change your app's configuration (for example, to choose the kind of camera or drivetrain), uncomment the section you care about in myconfig.py and edit it.

+

The manage.py file is where the action really is; this is the code that runs your car. It is organized into a 'vehicle loop' that runs at the rate specified by the DRIVE_LOOP_HZ value in your myconfig.py file; that is how often the vehicle loop's 'parts' will get updated. The donkeycar vehicle loop is a pipeline of what we call 'parts' that get and set state in a hashmap we call 'memory'.
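For example, changing the loop rate is just a matter of uncommenting and editing the corresponding setting in myconfig.py (the value shown here is illustrative; check the generated config.py for the actual default on your install):

```python
# myconfig.py (excerpt) -- uncomment a setting to override the default in config.py
DRIVE_LOOP_HZ = 20   # run the vehicle loop (and so each part) 20 times per second
```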

+

The complete.py and path_follow.py templates are fairly complex because they are very configurable. However, they are not in any way special. You can create your own template to do what you want; or you don't have to create or use a template at all; you can write your own manage.py directly. Here is an example of a vehicle loop with a single part that accepts a number, multiplies it by a random number, and returns the result. As the vehicle loop runs, the value will continue to get randomized.

+
import random
+import donkeycar as dk
+
+# the randomizer part
+class RandPercent:
+    def run(self, x):
+        value = x * random.random()
+        print(f"{x} -> {value}")
+        return value
+
+# create the vehicle and its internal memory
+V = dk.Vehicle()
+
+# initialize the value in the vehicle's memory and give it a name
+V.mem['var'] = 4
+
+# add the part to read and write to the same value.
+V.add(RandPercent(), inputs=['var'], outputs=['var'])
+
+# start the vehicle loop running; quit after 5 loops
+V.start(max_loops=5)
+
+

Parts

+

A part is a Python class that wraps a functional component of a vehicle.

+

These include:

+
    +
  • Sensors - Cameras, Lidar, Odometers, GPS ...
  • +
  • Actuators - Motor Controllers
  • +
  • Pilots - Lane Detectors, Behavioral Cloning models, ...
  • +
  • Controllers - Web based or Bluetooth.
  • +
  • Stores - Tub, or a way to save data.
  • +
+

Tawn Kramer has created a video (actually two parts) that walks through how to make a part. Also, there is a video of a presentation at the Arm AIoT conference that shows how the OLED part was created.

+

Each part is constructed and then added to the vehicle loop with its named inputs, named outputs, and an optional run_condition. The vehicle's parts are (for the most part) executed in the order that they are added to the vehicle loop. Each time the vehicle loop runs, the part's inputs are read from vehicle memory and passed to the part's run() method; the run() method does its work, and its return values are assigned to the output values. If there is a run_condition, then the part's run() method is only called if the value of the run_condition property is True; so if the run_condition property is False then the part is 'turned off'.

+
    +
  • memory: Vehicle memory is a hash map of named values. It is the 'state' of the vehicle. It includes values used as inputs, outputs, and conditions. It is shared by all parts.
  • +
  • inputs: inputs are memory values passed to the run() method of a part; they are declared when the part is added to the vehicle loop. So for the aiLauncher example, when we add the part we include the argument, inputs=['user/mode', 'pilot/throttle']. Just before the run() method is called, the vehicle loop looks up the input values and then passes them to the part's run() method as arguments. So when the aiLauncher part's run() method is called it will be passed two arguments; the first will be the value of the user/mode property in vehicle memory and the second will be the value of the pilot/throttle property. Note that the number of inputs declared when the part is added must match the number of arguments in the part's run() method otherwise a runtime error results.
  • +
  • outputs: outputs are memory values that are returned by the run() method of the part; they are declared when the part is added to the vehicle loop. After the part's run() method is called, the return values are assigned to named output properties. So for the aiLauncher example, when we add the part we include the argument, outputs=['pilot/throttle']. When the aiLauncher part finishes running, it will return a single value and that value will be assigned to the 'pilot/throttle' property in vehicle memory. Note that the number of outputs declared when the part is added must match the number of returned values in the part's run() method otherwise a runtime error results.
  • +
  • run_condition: the run_condition is a boolean memory value that can be used to decide if a part's run() method is called. If the condition is True then the part's run() method is called; otherwise it is not called. This is a way to turn a part on and off. So for instance, if we only ever wanted aiLauncher to run when in autopilot mode, we would maintain a named memory value, let's say 'run_pilot', that was True when running in autopilot mode and False when running in user (manual) mode. Then we would pass run_condition='run_pilot' to the V.add() method when we added the aiLauncher part to the vehicle. The aiLauncher's run() method would only be called if the named memory value 'run_pilot' was True.
  • +
+

So you can see that you can control how a part operates by changing the value of its input properties. One part can affect other parts by outputting values (and so changing them) that other parts use as inputs or run_conditions.
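That chaining can be sketched in plain Python with a minimal stand-in for the vehicle loop (no donkeycar import; the part names and memory keys here are hypothetical, invented for illustration): one part writes a boolean to memory, and that value gates a second part via its run_condition.

```python
# A minimal stand-in for the vehicle loop, showing how parts communicate
# through shared memory and how a run_condition gates a part.
# Part names and memory keys are hypothetical.

class ModeTracker:
    """Outputs True when the (simulated) mode is 'local' (autopilot)."""
    def run(self, mode):
        return mode == "local"

class ThrottleBooster:
    """Doubles the throttle; only runs when its run_condition is True."""
    def run(self, throttle):
        return throttle * 2.0

memory = {"user/mode": "local", "throttle": 0.25}
parts = [
    # (part, inputs, outputs, run_condition)
    (ModeTracker(),     ["user/mode"], ["run_pilot"], None),
    (ThrottleBooster(), ["throttle"],  ["throttle"],  "run_pilot"),
]

for _ in range(3):  # a few iterations of the 'vehicle loop'
    for part, inputs, outputs, run_condition in parts:
        if run_condition is not None and not memory.get(run_condition):
            continue  # part is 'turned off' this iteration
        result = part.run(*(memory[k] for k in inputs))
        memory[outputs[0]] = result

print(memory["throttle"])  # 0.25 doubled on each of 3 loops -> 2.0
```

If `user/mode` were anything other than "local", ModeTracker would write False to 'run_pilot' and ThrottleBooster would be skipped, leaving the throttle unchanged.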

+

Here is an example of adding a part; the AiLaunch part overrides the throttle when the driving mode transitions from manual driving to autopilot; it is used to provide a high throttle for a short time at the very start of a race. In this case it does not have an explicit run_condition argument, so the part is always run.

+
    aiLauncher = AiLaunch(cfg.AI_LAUNCH_DURATION, cfg.AI_LAUNCH_THROTTLE, cfg.AI_LAUNCH_KEEP_ENABLED)
+    V.add(aiLauncher,
+          inputs=['user/mode', 'pilot/throttle'],
+          outputs=['pilot/throttle'])
+
+

To implement this 'launch' it needs to know the current driving mode and the current autopilot throttle value; those are its inputs. If it is not launching then it just passes the throttle value through without modifying it, but when it is launching it outputs a throttle value equal to cfg.AI_LAUNCH_THROTTLE. So the throttle is its only output. The part's run() method must take these two inputs in the correct order and return the one output. You can see this in the part's code:

+
import time
+
+class AiLaunch():
+    '''
+    This part will apply a large thrust on initial activation. This is to help
+    in racing to start fast and then the ai will take over quickly when it's
+    up to speed.
+    '''
+
+    def __init__(self, launch_duration=1.0, launch_throttle=1.0, keep_enabled=False):
+        self.active = False
+        self.enabled = False
+        self.timer_start = None
+        self.timer_duration = launch_duration
+        self.launch_throttle = launch_throttle
+        self.prev_mode = None
+        self.trigger_on_switch = keep_enabled
+
+    def enable_ai_launch(self):
+        self.enabled = True
+        print('AiLauncher is enabled.')
+
+    def run(self, mode, ai_throttle):
+        new_throttle = ai_throttle
+
+        if mode != self.prev_mode:
+            self.prev_mode = mode
+            if mode == "local" and self.trigger_on_switch:
+                self.enabled = True
+
+        if mode == "local" and self.enabled:
+            if not self.active:
+                self.active = True
+                self.timer_start = time.time()
+            else:
+                duration = time.time() - self.timer_start
+                if duration > self.timer_duration:
+                    self.active = False
+                    self.enabled = False
+        else:
+            self.active = False
+
+        if self.active:
+            print('AiLauncher is active!!!')
+            new_throttle = self.launch_throttle
+
+        return new_throttle
+
+

It is common for configuration values to be passed as arguments to a part's constructor, as they are in this example. Also, if the part grabs some hardware resource, such as a camera or a serial port, it should also have a shutdown() function that releases those resources properly when donkey is stopped.
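A part skeleton following that pattern might look like the following sketch (the class name, config value, and 'resource' here are hypothetical stand-ins, not part of donkeycar):

```python
class ExampleSensor:
    """Sketch of a part: configuration via the constructor, cleanup in shutdown()."""

    def __init__(self, scale=1.0):
        self.scale = scale          # configuration passed in, e.g. cfg.SENSOR_SCALE
        self.resource_open = True   # stands in for opening a camera or serial port

    def run(self, raw_value):
        # called once per vehicle loop with the part's declared inputs
        return raw_value * self.scale

    def shutdown(self):
        # called when the vehicle stops; release any hardware resources here
        self.resource_open = False

sensor = ExampleSensor(scale=2.0)
print(sensor.run(3.0))   # 6.0
sensor.shutdown()
```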

+

So as we said, the part's run() method is called each time the vehicle loop runs; the input values are read from vehicle memory and passed as arguments to the run() method, which does its work and then returns values that are written back to vehicle memory as outputs. Since parts are run in the order they are added (for the most part), you must add a part that provides an output ahead of any part that needs that value as an input.

+

Threaded Parts

+

I say parts run in the order they were added 'for the most part' because you can also specify that a part runs in its own thread so it can operate at its own rate. A threaded part has a run_threaded() method rather than a run() method; the inputs are arguments and the return values are outputs, just like the run() method. Also like the run() method, the run_threaded() method is called once each time the vehicle loop runs.

+

So if run_threaded() is called each time through the vehicle loop, just like the run() method, and its inputs and outputs are organized just like a non-threaded part's, then what is the difference between a threaded part and a non-threaded part? One difference you can see below is that when you add a threaded part you pass threaded=True. But the most important difference is that a threaded part must have a no-argument update() method. When the threaded part is launched, a thread is created and the part's update() method is registered as the method that is executed on the thread. The update() method runs separately from the vehicle loop, as fast as Python's scheduler will allow; generally much faster than the vehicle loop runs. The update() method should not return until the part is told to shutdown(); it should run a loop that does its work over and over, such as reading from a device like the TFMini part does. In a threaded part the run_threaded() method is usually quite simple; it typically just sets class properties used by the update() method and returns class properties that are maintained by the update() method.

+

Here is an example of adding a threaded part to the vehicle loop. This part interfaces to a TF-Mini single-beam lidar via a serial port; it reports a distance. The part takes no input arguments and outputs just the distance value. Note that the argument inputs=[] is not really necessary; that is the default for inputs so it can be left off.

+
    if cfg.HAVE_TFMINI:
+        from donkeycar.parts.tfmini import TFMini
+        lidar = TFMini(port=cfg.TFMINI_SERIAL_PORT)
+        V.add(lidar, inputs=[], outputs=['lidar/dist'], threaded=True)
+
+

Here is a listing of the TFMini part;

+
class TFMini:
+    """
+    Class for TFMini and TFMini-Plus distance sensors.
+    See wiring and installation instructions at https://github.com/TFmini/TFmini-RaspberryPi
+
+    Returns distance in centimeters. 
+    """
+
+    def __init__(self, port="/dev/serial0", baudrate=115200, poll_delay=0.01, init_delay=0.1):
+        self.ser = serial.Serial(port, baudrate)
+        self.poll_delay = poll_delay
+
+        self.dist = 0
+
+        if not self.ser.is_open:
+            self.ser.close() # in case it is still open, we do not want to open it twice
+            self.ser.open()
+
+        self.logger = logging.getLogger(__name__)
+
+        self.logger.info("Init TFMini")
+        time.sleep(init_delay)
+
+    def update(self):
+        while self.ser.is_open:
+            self.poll()
+            if self.poll_delay > 0:
+                time.sleep(self.poll_delay)
+
+    def poll(self):
+        try:
+            count = self.ser.in_waiting
+            if count > 8:
+                recv = self.ser.read(9)   
+                self.ser.reset_input_buffer() 
+
+                if recv[0] == 0x59 and recv[1] == 0x59:     
+                    dist = recv[2] + recv[3] * 256
+                    strength = recv[4] + recv[5] * 256
+
+                    if strength > 0:
+                        self.dist = dist
+
+                    self.ser.reset_input_buffer()
+
+        except Exception as e:
+            self.logger.error(e)
+
+
+    def run_threaded(self):
+        return self.dist
+
+    def run(self):
+        self.poll()
+        return self.dist
+
+    def shutdown(self):
+        self.ser.close()
+
+
+

NOTE: The TFMini part manages a serial port itself; it is recommended to use the SerialPort part to read line-oriented data from a serial port instead of managing the port in your part. The SerialPort part can handle all the details of the serial port and outputting the resulting data; then your part only needs to take that data as input and use it.

+
+

In the TFMini part the update() method runs a loop as long as the serial port remains open. The serial port is opened in the constructor and closed when the shutdown() method is called. In a threaded part, the update() method is effectively an infinite loop, running over and over as often as Python gives it time. This is the section of code that can run much faster than the vehicle loop.

+

The reason to use a threaded part is if your part needs to run faster than the vehicle loop or needs to respond to a device in close to real time. The loop in the update() method will run as fast as the Python interpreter allows, which will usually be much faster than the vehicle loop. It's important to understand that the update() method is called by the part's thread BUT the run_threaded() method is called by the main vehicle loop thread. This means that these two methods may interrupt each other in the middle of what they are doing.

+

You should use appropriate thread-safe patterns, such as locks, to make sure that data updates and/or reads and other critical sections of code are safely isolated and atomic. In some cases this requires a Lock to make sure resources are accessed safely from threads or that multiple lines of code are executed atomically. It is worth remembering that assignment in Python is atomic (so there is one good thing about that Global Interpreter Lock, GIL). So while this is NOT atomic:

+
x = 12.34
+y = 34.56
+angle = 1.34
+
+

because your code could be interrupted between those assignments. This IS atomic:

+
pose = (12.34, 34.56, 1.34)
+
+

So if you have aggregate internal state that may be mutated in a thread, then put it in a tuple and you can read and write it atomically without locks.
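The tuple pattern in a threaded part might look like the following sketch (a hypothetical part, not from donkeycar; it mimics the update()/run_threaded() split described above, with the aggregate pose kept in one tuple so each write and read is a single atomic operation):

```python
import threading
import time

class PoseTracker:
    """Sketch of a threaded part keeping aggregate state in one tuple."""

    def __init__(self):
        self.pose = (0.0, 0.0, 0.0)  # (x, y, angle): written in one assignment
        self.running = True

    def update(self):
        # runs on the part's own thread, as fast as Python allows
        while self.running:
            x, y, angle = self.pose
            # one atomic assignment; a reader never sees x updated without y
            self.pose = (x + 1.0, y + 1.0, angle)
            time.sleep(0.001)

    def run_threaded(self):
        # called from the main vehicle loop thread; reading one tuple is atomic
        return self.pose

    def shutdown(self):
        self.running = False

tracker = PoseTracker()
t = threading.Thread(target=tracker.update, daemon=True)
t.start()
time.sleep(0.05)                 # let the update thread run a while
x, y, angle = tracker.run_threaded()
tracker.shutdown()
t.join()
print(x == y)  # True: both advanced together because the tuple is written whole
```

If the update() method instead needed several separate statements to mutate shared state, you would wrap both the mutation and the read in a threading.Lock.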

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + « Previous + + + Next » + + +
+ + + + + + + + diff --git a/parts/actuators/index.html b/parts/actuators/index.html new file mode 100644 index 00000000..5a100fcf --- /dev/null +++ b/parts/actuators/index.html @@ -0,0 +1,547 @@ + + + + + + + + Actuators - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Actuators

+

A car needs a way to move forward and backward and to turn left and right. We commonly call devices that produce a physical movement in the robot 'actuators'. Common actuators are DC motors, servo motors, continuous servo motors, and stepper motors. There are many, many different ways that these actuators can be combined to propel and turn a robot. Donkeycar supports two common configurations that can be implemented with various actuators:

+
    +
  • Car-like vehicles steer by angling the front wheels left or right and move by turning the drive wheels forward or reverse. Common RC cars fall in this category.
  • +
  • Differential drive vehicles have two independently controlled drive wheels that provide both movement and steering. For instance, a differential drive car can be driven straight forward by turning the two drive wheels forward at the same speed. To turn, one motor is driven faster than the other and the car turns in an arc in the direction of the slower motor. Differential drive cars are discussed in a section near the end of this page.
  • +
+

Actuators take control signals from the Donkeycar to control their actions. There are several options for generating these control signals.

+
    +
  • PCA9685 Servo controller board
  • +
  • RPi/Jetson 40 pin GPIO header +
  • +
  • Arduino
  • +
+

Below we will describe the supported actuator setups and software configuration of their control signals.

+

Standard RC with ESC and Steering Servo.

+

A standard RC car is equipped with a steering servo for steering the front wheels and an ESC (Electronic Speed Controller) to control the speed of the DC motor driving the wheels. Both the steering servo and the ESC take a PWM (Pulse Width Modulation) control signal. A PWM signal is simply a square wave pulse of a certain duration and frequency. In the case of the steering servo the PWM signal determines the position of the servo's arm, which is generally between 0 degrees (full right) and 180 degrees (full left). In the case of the ESC the PWM signal determines the direction and speed of the drive motor, from full reverse, through stopped, to full forward.

+
    +
  • Standard RC servo pulses range from 1 millisecond (full reverse for ESC, full left for servo) to 2 milliseconds (full forward for ESC, full right for servo) with 1.5 milliseconds being neutral (stopped for ESC, straight for servo).
  • +
  • These pulses are typically sent at 50 hertz (one cycle every 20 milliseconds). Each cycle includes a period where the signal is high followed by a period where the signal is low. This means that, using the standard 50hz frequency, a 1 millisecond pulse (1 ms high followed by 19 ms low) represents a 5% duty cycle and a 2 millisecond pulse represents a 10% duty cycle.
  • +
  • The most important part is the length of the pulse; it must be in the range of 1 to 2 milliseconds.
  • +
+

A diagram showing typical PWM timing for a servomotor (Wikipedia)

+
    +
  • So this means that if a different frequency is used, then the duty cycle must be adjusted in order to get the 1ms to 2ms pulse.
  • +
  • For instance, if a 60hz frequency is used, then a 1 ms pulse requires a duty cycle of 0.05 * 60 / 50 = 0.06 (6%)
  • +
  • We default the frequency of our PCA9685 to 60 hz, so pulses in config are generally based on a 60hz frequency and 12 bit values. We use 12 bit values because the PCA9685 has 12 bit resolution. So a 1 ms pulse is 0.06 * 4096 ~= 246, a neutral pulse of 0.09 duty cycle is 0.09 * 4096 ~= 369 and a full forward pulse of 0.12 duty cycle is 0.12 * 4096 ~= 492
  • +
  • These are generalizations that are useful for understanding the underlying API call arguments and the values that are generated when calibrating. The final choice of duty-cycle/pulse length depends on your hardware and perhaps your strategy (you may not want to go too fast, and so you may choose a lower max throttle PWM)
  • +
+
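The arithmetic above can be captured in a small helper. This is an illustration only; pulse_to_12bit is a hypothetical name, not a Donkeycar function:

```python
def pulse_to_12bit(pulse_ms, frequency_hz=60):
    """Convert a servo pulse length in milliseconds to a 12-bit
    PCA9685 value at the given PWM frequency."""
    period_ms = 1000.0 / frequency_hz   # e.g. ~16.67 ms at 60 Hz
    duty_cycle = pulse_ms / period_ms   # fraction of the period spent high
    return round(duty_cycle * 4096)     # scale to the PCA9685's 12-bit range

# the values quoted above, at the default 60 Hz:
pulse_to_12bit(1.0)   # full reverse/left  -> 246
pulse_to_12bit(1.5)   # neutral            -> 369
pulse_to_12bit(2.0)   # full forward/right -> 492
```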

Generating PWM pulses with a PCA9685 Servo controller

+
    +
  • The hardware connection of the PCA9685 I2C servo driver board is described fully in the overall setup instructions here
  • +
  • The PCA9685 Servo controller is connected to the RaspberryPi or Jetson Nano via the I2C pins on the 40 pin bus, then the 3 pin cables from the ESC and Steering Servo are connected to the PCA9685, generally to channel 0 and channel 1 respectively. See Step 4: Connect Servo Shield. Connection of a PCA9685 to a Jetson Nano is the same.
  • +
+

Configuration

+
    +
  • Use DRIVE_TRAIN_TYPE = "PWM_STEERING_THROTTLE" in myconfig.py
  • +
  • Set the PCA9685 pin specifiers for PWM_STEERING_PIN and PWM_THROTTLE_PIN in the PWM_STEERING_THROTTLE section of myconfig.py. For example:
  • +
+
    DRIVE_TRAIN_TYPE = "PWM_STEERING_THROTTLE"
+
+    #
+    # PWM_STEERING_THROTTLE
+    #
+    # Drive train for RC car with a steering servo and ESC.
+    # Uses a PwmPin for steering (servo) and a second PwmPin for throttle (ESC)
+    # Base PWM frequency is presumed to be 60hz; use PWM_xxxx_SCALE to adjust pulse width for non-standard PWM frequencies
+    #
+    PWM_STEERING_THROTTLE = {
+        "PWM_STEERING_PIN": "PCA9685.1:40.0",   # PCA9685, I2C bus 1, address 0x40, channel 0
+        "PWM_STEERING_SCALE": 1.0,              # used to compensate for PWM frequency differences from 60hz; NOT for adjusting steering range
+        "PWM_STEERING_INVERTED": False,         # True if hardware requires an inverted PWM pulse
+        "PWM_THROTTLE_PIN": "PCA9685.1:40.1",   # PCA9685, I2C bus 1, address 0x40, channel 1
+        "PWM_THROTTLE_SCALE": 1.0,              # used to compensate for PWM frequency differences from 60hz; NOT for increasing/limiting speed
+        "PWM_THROTTLE_INVERTED": False,         # True if hardware requires an inverted PWM pulse
+        "STEERING_LEFT_PWM": 460,               # pwm value for full left steering
+        "STEERING_RIGHT_PWM": 290,              # pwm value for full right steering
+        "THROTTLE_FORWARD_PWM": 500,            # pwm value for max forward throttle
+        "THROTTLE_STOPPED_PWM": 370,            # pwm value for no movement
+        "THROTTLE_REVERSE_PWM": 220,            # pwm value for max reverse throttle
+    }
+
+
+

NOTE: the pwm values (STEERING_LEFT_PWM, etc.) differ from car to car and are derived by running the calibration procedure. See Calibrate your Car

+

See pins for a detailed discussion of pin providers and pin specifiers.

+
+

Generating PWM pulses from the 40 pin GPIO header

+
    +
  • Here the PWM signal is generated from the 40 pin GPIO header. The data pin on the 3-pin ESC and Servo connectors are connected to a PWM pin on the GPIO. The ground pins on the 3-pin connectors are connected to a common ground. The 5V pins on the 3-pin connectors are connected to the 5V pins on the GPIO: the 3-pin connector from the ESC will generally provide 5V that can then be used to power the Servo.
  • +
+

Configuration

+
    +
  • Use DRIVE_TRAIN_TYPE = "PWM_STEERING_THROTTLE" in myconfig.py
  • +
  • Set the pin specifiers for GPIO in the # PWM_STEERING_THROTTLE section. Note that each pin has both a BOARD mode and a BCM (Broadcom) mode identifier. You can use either mode, but all pins must use the same mode.
  • +
  • The RaspberryPi 4b has 4 pwm hardware outputs; 3 of which are mapped to pins on the 40 pin header (see https://linuxhint.com/gpio-pinout-raspberry-pi/); note that pins can be addressed either by their board number or by their internal gpio number (see http://docs.donkeycar.com/parts/pins/). In the case of the hardware PWM pins, board pin 12 ("RPI_GPIO.BOARD.12") is the same as GPIO18 ("RPI_GPIO.BCM.18"), board pin 33 ("RPI_GPIO.BOARD.33") is the same as GPIO13 ("RPI_GPIO.BCM.13") and board pin 32 ("RPI_GPIO.BOARD.32") is the same as GPIO12 ("RPI_GPIO.BCM.12"). So you should be setting up your myconfig.py so that DRIVE_TRAIN_TYPE = "PWM_STEERING_THROTTLE" and PWM_STEERING_PIN and PWM_THROTTLE_PIN are set to use one of the hardware pwm pins for output. For example:
  • +
+
    DRIVE_TRAIN_TYPE = "PWM_STEERING_THROTTLE"
+
+    #
+    # PWM_STEERING_THROTTLE
+    #
+    # Drive train for RC car with a steering servo and ESC.
+    # Uses a PwmPin for steering (servo) and a second PwmPin for throttle (ESC)
+    # Base PWM frequency is presumed to be 60hz; use PWM_xxxx_SCALE to adjust pulse width for non-standard PWM frequencies
+    #
+    PWM_STEERING_THROTTLE = {
+        "PWM_STEERING_PIN": "RPI_GPIO.BOARD.33",# GPIO board mode pin-33 == BCM mode pin-13
+        "PWM_STEERING_SCALE": 1.0,              # used to compensate for PWM frequency differences from 60hz; NOT for adjusting steering range
+        "PWM_STEERING_INVERTED": False,         # True if hardware requires an inverted PWM pulse
+        "PWM_THROTTLE_PIN": "RPI_GPIO.BOARD.12",# GPIO board mode pin-12 == BCM mode pin-18
+        "PWM_THROTTLE_SCALE": 1.0,              # used to compensate for PWM frequency differences from 60hz; NOT for increasing/limiting speed
+        "PWM_THROTTLE_INVERTED": False,         # True if hardware requires an inverted PWM pulse
+        "STEERING_LEFT_PWM": 460,               # pwm value for full left steering
+        "STEERING_RIGHT_PWM": 290,              # pwm value for full right steering
+        "THROTTLE_FORWARD_PWM": 500,            # pwm value for max forward throttle
+        "THROTTLE_STOPPED_PWM": 370,            # pwm value for no movement
+        "THROTTLE_REVERSE_PWM": 220,            # pwm value for max reverse throttle
+    }
+
+
+

NOTE: the pwm values (STEERING_LEFT_PWM, etc.) differ from car to car and are derived by running the calibration procedure. See Calibrate your Car

+

See pins for a detailed discussion of pin providers and pin specifiers.

+
+

Direct control with the RaspberryPi GPIO pins.

+

Please follow the instructions here

+

Control with the Robo HAT MM1 board.

+

Please follow the instructions here

+

Arduino

+

Arduino can be used in the following fashion to generate PWM signals to control the steering and throttle.

+

For now the Arduino mode is only tested on the Latte Panda Delta (LP-D) board. +However, it should be straightforward to use it with a Raspberry Pi / Jetson Nano (instead of a PCA9685).

+

Refer to the block diagram below to understand where things fit in.

+

block diagram

+

The Arduino board should be running the standard Firmata sketch (this sketch comes bundled with the Arduino IDE). Load the StandardFirmata sketch (from Examples > Firmata > StandardFirmata) onto the Arduino. +wiring diagram +Further, the pymata_aio Python package needs to be installed on the car computer via pip3 install pymata_aio.

+

As shown in the block diagram above, the LattePanda combines both the x86 CPU and the connected Arduino on a single board.

+

The following diagram shows how to connect the Arduino pins to steering servo and ESC.

+

wiring diagram +Note that the power for the servo is provided by the ESC's battery eliminator circuit (BEC), which most ESCs provide. +This is done to avoid supplying the entire servo power from the Arduino's 5v. +In large RC cars the servo can draw up to 2 amps, which could destroy the Arduino.

+

Calibration

+

Note that the calibration procedure/values are slightly different for the Arduino (than for the PCA9685). +Note that 90 is the usual midpoint (i.e. 1.5 ms pulse width at 50 Hz), so it is recommended to start with 90 and adjust +/- 5 until you find the desired range for steering / throttle.

+
(env1) jithu@jithu-lp:~/master/pred_mt/lp/001/donkey$ donkey calibrate --arduino --channel 6
+using donkey v2.6.0t ...
+
+pymata_aio Version 2.33 Copyright (c) 2015-2018 Alan Yorinks All rights reserved.
+
+Using COM Port:/dev/ttyACM0
+
+Initializing Arduino - Please wait...
+Arduino Firmware ID: 2.5 StandardFirmata.ino
+Auto-discovery complete. Found 30 Digital Pins and 12 Analog Pins
+
+
+Enter a PWM setting to test(0-180)95
+Enter a PWM setting to test(0-180)90
+Enter a PWM setting to test(0-180)85
+...
+
+

Note the --arduino switch passed to the calibrate command. Further, note that the Arduino pin being calibrated is passed via the --channel parameter.
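The 0-180 values entered during calibration correspond roughly to standard servo pulse widths. A sketch of that mapping (angle_to_pulse_ms is a hypothetical helper for illustration, not part of Donkeycar):

```python
def angle_to_pulse_ms(angle):
    """Map an Arduino Servo-style angle (0-180) to the approximate
    pulse width in milliseconds: 0 -> 1.0 ms, 90 -> 1.5 ms, 180 -> 2.0 ms."""
    return 1.0 + (angle / 180.0)
```

This is why starting at 90 (the 1.5 ms neutral pulse) and nudging by +/- 5 is a safe way to find your car's range.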

+

Using the arduino actuator part

+

The following snippet illustrates how to exercise the Arduino actuator in the drive() loop:

+
    #Drive train setup
+    arduino_controller = ArduinoFirmata(
+                                    servo_pin=cfg.STEERING_ARDUINO_PIN,
+                                    esc_pin=cfg.THROTTLE_ARDUINO_PIN)
+
+    steering = ArdPWMSteering(controller=arduino_controller,
+                        left_pulse=cfg.STEERING_ARDUINO_LEFT_PWM,
+                        right_pulse=cfg.STEERING_ARDUINO_RIGHT_PWM)
+
+    throttle = ArdPWMThrottle(controller=arduino_controller,
+                        max_pulse=cfg.THROTTLE_ARDUINO_FORWARD_PWM,
+                        zero_pulse=cfg.THROTTLE_ARDUINO_STOPPED_PWM,
+                        min_pulse=cfg.THROTTLE_ARDUINO_REVERSE_PWM)
+
+    V.add(steering, inputs=['user/angle'])
+    V.add(throttle, inputs=['user/throttle'])
+
+

Refer to templates/arduino_drive.py for more details.

+

HBridge Motor Controller and Steering Servo

+

In this configuration the DC motor that drives the wheels is controlled by an L298N HBridge motor controller or compatible. Steering the front wheels is accomplished with a steering servo that takes a PWM pulse. The motor driver is wired in one of two ways: 3 pin wiring or 2 pin wiring.

+

3-pin HBridge and Steering Servo

+

A single DC gear motor is controlled with an L298N using two TTL output pins to select direction and a PWM pin to control the power to the motor.

+

See https://www.electronicshub.org/raspberry-pi-l298n-interface-tutorial-control-dc-motor-l298n-raspberry-pi/ for a discussion of how the L298N HBridge module is wired in 3-pin mode to the RaspberryPi GPIO. This also applies to some other driver chips that emulate the L298N, such as the TB6612FNG motor driver.

+

Configuration

+
    +
  • use DRIVETRAIN_TYPE = "SERVO_HBRIDGE_3PIN" in myconfig.py
  • +
  • Example pin specifiers using 40 pin GPIO header to generate signals:
  • +
+
HBRIDGE_3PIN_FWD = "RPI_GPIO.BOARD.18"   # ttl pin, high enables motor forward 
+HBRIDGE_3PIN_BWD = "RPI_GPIO.BOARD.16"   # ttl pin, high enables motor reverse
+HBRIDGE_3PIN_DUTY = "RPI_GPIO.BOARD.35"  # provides duty cycle to motor
+PWM_STEERING_PIN = "RPI_GPIO.BOARD.33"   # provides servo pulse to steering servo
+STEERING_LEFT_PWM = 460         # pwm value for full left steering (use `donkey calibrate` to measure value for your car)
+STEERING_RIGHT_PWM = 290        # pwm value for full right steering (use `donkey calibrate` to measure value for your car)
+
+

A PCA9685 could also be used to generate all control signals. See pins for a detailed discussion of pin providers and pin specifiers.

+

2-pin HBridge and Steering Servo

+

A single DC gear motor is controlled with a 'mini' L298N HBridge (or an L9110S HBridge) using 2 PWM pins; one PWM pin to enable and control forward speed and one to enable and control reverse speed.
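The 2-pin control scheme can be sketched as follows. This is an illustration only; two_pin_motor_duty is a hypothetical helper, and the actual drivetrain part handles this internally:

```python
def two_pin_motor_duty(throttle):
    """Map a throttle value in [-1, 1] to (forward_duty, reverse_duty)
    for a 2-pin H-bridge; only one pin carries a duty cycle at a time."""
    throttle = max(-1.0, min(1.0, throttle))  # clamp to the valid range
    if throttle > 0:
        return (throttle, 0.0)   # forward pin gets the duty cycle
    if throttle < 0:
        return (0.0, -throttle)  # reverse pin gets the duty cycle
    return (0.0, 0.0)            # stopped: both pins low
```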

+

See https://www.instructables.com/Tutorial-for-Dual-Channel-DC-Motor-Driver-Board-PW/ for how an L298N mini-hbridge module is wired in 2-pin mode.
+See https://electropeak.com/learn/interfacing-l9110s-dual-channel-h-bridge-motor-driver-module-with-arduino/ for how an L9110S/HG7881 motor driver module is wired.

+

Configuration

+
    +
  • use DRIVETRAIN_TYPE = "SERVO_HBRIDGE_2PIN" in myconfig.py
  • +
  • Example pin specifiers using 40 pin GPIO header to generate signals:
  • +
+
  HBRIDGE_2PIN_DUTY_FWD = "RPI_GPIO.BOARD.18"  # provides forward duty cycle to motor
+  HBRIDGE_2PIN_DUTY_BWD = "RPI_GPIO.BOARD.16"  # provides reverse duty cycle to motor
+  PWM_STEERING_PIN = "RPI_GPIO.BOARD.33"       # provides servo pulse to steering servo
+  STEERING_LEFT_PWM = 460         # pwm value for full left steering (use `donkey calibrate` to measure value for your car)
+  STEERING_RIGHT_PWM = 290        # pwm value for full right steering (use `donkey calibrate` to measure value for your car)
+
+

A PCA9685 could also be used to generate all control signals. See pins for a detailed discussion of pin providers and pin specifiers.

+

HBridge for both Steering and Throttle

+

Some very inexpensive toy cars use a DC motor to drive the back wheels forward and reverse and another DC motor to steer the front wheels left or right. A single L298N HBridge (or L9110S HBridge) can be used to control these two motors. This driver assumes 2-pin wiring where each motor uses two PWM pins, one for each direction.

+

Configuration

+
    +
  • use DRIVETRAIN_TYPE = "DC_STEER_THROTTLE" in myconfig.py
  • +
  • Example pin specifiers using 40 pin GPIO header to generate signals:
  • +
+
  HBRIDGE_PIN_LEFT = "RPI_GPIO.BOARD.18"   # pwm pin produces duty cycle for steering left
+  HBRIDGE_PIN_RIGHT = "RPI_GPIO.BOARD.16"  # pwm pin produces duty cycle for steering right
+  HBRIDGE_PIN_FWD = "RPI_GPIO.BOARD.15"    # pwm pin produces duty cycle for forward drive
+  HBRIDGE_PIN_BWD = "RPI_GPIO.BOARD.13"    # pwm pin produces duty cycle for reverse drive
+
+

A PCA9685 could also be used to generate all control signals. See pins for a detailed discussion of pin providers and pin specifiers.

+

VESC for both Steering and Throttle

+

A VESC is an advanced version of an ESC that provides a lot of customization options for how the ESC operates. It includes features such as regenerative braking, temperature control, etc.

+

This was tested with a VESC 6 and a Traxxas brushless motor. +Follow this F1Tenth tutorial to update your VESC firmware and calibrate it: https://f1tenth.readthedocs.io/en/stable/getting_started/firmware/firmware_vesc.html +It's important to use the servo-out firmware binary so that steering can be controlled through the VESC as well.

+

Requires installation of PyVESC from source for servo control (pip install git+https://github.com/LiamBindle/PyVESC.git@master)

Configuration

+
    +
  • use DRIVETRAIN_TYPE = "VESC" in myconfig.py
  • +
  • Example parameters
  • +
+
  VESC_MAX_SPEED_PERCENT = 0.2       # Max speed as a percent of the actual speed
+  VESC_SERIAL_PORT = "/dev/ttyACM0"  # Serial device to use for communication. Can check with ls /dev/tty*
+  VESC_HAS_SENSOR = True             # Whether or not the BLDC motor is using a hall effect sensor
+  VESC_START_HEARTBEAT = True        # Whether or not to automatically start the heartbeat thread that will keep commands alive
+  VESC_BAUDRATE = 115200             # baudrate for the serial communication. Shouldn't need to change this.
+  VESC_TIMEOUT = 0.05                # timeout for the serial communication
+  VESC_STEERING_SCALE = 0.5          # VESC accepts steering inputs from 0 to 1. Joystick is usually -1 to 1. This changes it to -0.5 to 0.5
+  VESC_STEERING_OFFSET = 0.5         # VESC accepts steering inputs from 0 to 1. Coupled with the scale above, this moves the joystick range to 0 to 1
+
+
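The combined effect of VESC_STEERING_SCALE and VESC_STEERING_OFFSET can be sketched as follows (to_vesc_steering is a hypothetical helper for illustration only):

```python
VESC_STEERING_SCALE = 0.5
VESC_STEERING_OFFSET = 0.5

def to_vesc_steering(joystick_angle):
    """Map a joystick steering value in [-1, 1] to the [0, 1] range
    that the VESC servo output expects."""
    return joystick_angle * VESC_STEERING_SCALE + VESC_STEERING_OFFSET
```

With the defaults above, full left (-1) maps to 0, center (0) maps to 0.5, and full right (1) maps to 1.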

Differential Drive cars

+

An inexpensive Donkeycar compatible robot can be constructed using a cheap smart car robot chassis that includes 2 DC gear motors and an L298N motor driver or compatible to run the motors. Steering is accomplished by running one motor faster than the other, causing the car to drive in an arc. The motor driver can be wired in one of two ways: 3 pin wiring or 2 pin wiring. The names of the Donkeycar drivetrains for differential drive all start with DC_TWO_WHEEL.
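The steering-by-speed-difference idea can be sketched as follows. This is an illustration only; differential_mix is a hypothetical helper, and the actual drivetrain parts implement their own mixing:

```python
def differential_mix(steering, throttle):
    """Slow the inside wheel in proportion to the steering value;
    steering and throttle are both in [-1, 1]."""
    left, right = throttle, throttle
    if steering > 0:        # turning right: right wheel is the inside wheel
        right *= (1.0 - steering)
    elif steering < 0:      # turning left: left wheel is the inside wheel
        left *= (1.0 + steering)
    return left, right
```

Straight ahead both wheels run at the same speed; at full steering the inside wheel stops entirely, producing a tight arc toward the slower side.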

+

3-pin HBridge Differential Drive

+

2 DC gear motors are controlled with an L298N, each motor using two TTL output pins to select direction and a PWM pin to control the power to the motor. Since each motor uses 3 pins, a total of 6 pins are used in a differential drive configuration. The advantage of this wiring scheme is that it only requires 2 PWM pins, which happens to be the maximum number of PWM pins on the Jetson Nano.

+

See https://www.electronicshub.org/raspberry-pi-l298n-interface-tutorial-control-dc-motor-l298n-raspberry-pi/ for a discussion of how the L298N HBridge module is wired in 3-pin mode to the RaspberryPi GPIO. This also applies to some other driver chips that emulate the L298N, such as the TB6612FNG motor driver.

+

Configuration

+
    +
  • use DRIVETRAIN_TYPE = "DC_TWO_WHEEL_L298N" in myconfig.py
  • +
  • Example pin specifiers using 40 pin GPIO header to generate signals:
  • +
+
DC_TWO_WHEEL_L298N = {
+    "LEFT_FWD_PIN": "RPI_GPIO.BOARD.16",        # TTL output pin enables left wheel forward
+    "LEFT_BWD_PIN": "RPI_GPIO.BOARD.18",        # TTL output pin enables left wheel reverse
+    "LEFT_EN_DUTY_PIN": "RPI_GPIO.BOARD.22",    # PWM pin generates duty cycle for left motor speed
+
+    "RIGHT_FWD_PIN": "RPI_GPIO.BOARD.15",       # TTL output pin enables right wheel forward
+    "RIGHT_BWD_PIN": "RPI_GPIO.BOARD.13",       # TTL output pin enables right wheel reverse
+    "RIGHT_EN_DUTY_PIN": "RPI_GPIO.BOARD.11",   # PWM pin generates duty cycle for right wheel speed
+}
+
+
    +
  • Example pin specifiers using a PCA9685 to generate signals:
  • +
+
DC_TWO_WHEEL_L298N = {
+    "LEFT_FWD_PIN": "PCA9685.1:40.3",        # TTL output pin enables left wheel forward
+    "LEFT_BWD_PIN": "PCA9685.1:40.2",        # TTL output pin enables left wheel reverse
+    "LEFT_EN_DUTY_PIN": "PCA9685.1:40.1",    # PWM pin generates duty cycle for left motor speed
+
+    "RIGHT_FWD_PIN": "PCA9685.1:40.6",       # TTL output pin enables right wheel forward
+    "RIGHT_BWD_PIN": "PCA9685.1:40.5",       # TTL output pin enables right wheel reverse
+    "RIGHT_EN_DUTY_PIN": "PCA9685.1:40.4",   # PWM pin generates duty cycle for right wheel speed
+}
+
+
    +
  • In the configuration, the xxxx_EN_DUTY_PIN pins determine how fast the motors spin. These pins must support PWM output. Remember that the Jetson Nano only supports 2 PWM output pins and only if they are enabled using /opt/nvidia/jetson-io/jetson-io.py. See Generating PWM from the Jetson Nano.
  • +
  • The xxxx_FWD_PIN and xxxx_BWD_PIN pins are TTL output pins that determine the direction the motors spin.
  • +
+
+

See pins for a detailed discussion of pin providers and pin specifiers.

+
+

2 Pin HBridge Differential Drive

+

2 DC motors are controlled with a 'mini' L293D HBridge, each motor using 2 PWM pins; one PWM pin to enable and control forward speed and one to enable and control reverse speed. The advantage of this wiring method is that it only requires a total of 4 pins; however, all of those pins must be able to output PWM.

+
    +
  • See L293 Tutorial for how an L293D mini-HBridge module is wired in 2-pin mode.
  • +
  • This driver can also be used with an L9110S/HG7881 motor driver. See Interfacing L9110S for how an L9110S motor driver module is wired.
  • +
  • The driver can also be used with a DRV8833. See DRV8833 HBridge for how to interface to an arduino.
  • +
+

Configuration

+
    +
  • use DRIVETRAIN_TYPE = "DC_TWO_WHEEL" in myconfig.py
  • +
  • example pin specifiers using the 40 pin GPIO to generate signals:
  • +
+
DC_TWO_WHEEL = {
+    "LEFT_FWD_DUTY_PIN": "RPI_GPIO.BCM.16",  # BCM.16 == BOARD.36, pwm pin produces duty cycle for left wheel forward
+    "LEFT_BWD_DUTY_PIN": "RPI_GPIO.BCM.20",  # BCM.20 == BOARD.38, pwm pin produces duty cycle for left wheel reverse
+    "RIGHT_FWD_DUTY_PIN": "RPI_GPIO.BCM.5",  # BCM.5 == BOARD.29, pwm pin produces duty cycle for right wheel forward
+    "RIGHT_BWD_DUTY_PIN": "RPI_GPIO.BCM.6",  # BCM.6 == BOARD.31, pwm pin produces duty cycle for right wheel reverse
+}
+
+
    +
  • example pin specifiers using a PCA9685 to generate signals:
  • +
+
DC_TWO_WHEEL = {
+    "LEFT_FWD_DUTY_PIN": "PCA9685.1:40.0",  # pwm pin produces duty cycle for left wheel forward
+    "LEFT_BWD_DUTY_PIN": "PCA9685.1:40.1",  # pwm pin produces duty cycle for left wheel reverse
+    "RIGHT_FWD_DUTY_PIN": "PCA9685.1:40.5",  # pwm pin produces duty cycle for right wheel forward
+    "RIGHT_BWD_DUTY_PIN": "PCA9685.1:40.6",  # pwm pin produces duty cycle for right wheel reverse
+}
+
+
+

See pins for a detailed discussion of pin providers and pin specifiers.

+

Cameras

+

Donkeycar supports a large number of cameras via the CAMERA_TYPE configuration. For most applications a wide field of view is important, so your camera should use a 120 degree wide angle lens or better. A 160 degree wide angle lens is recommended.

+

Camera Setup

+

If you are using the default deep learning template or the computer vision template then you will need a camera. By default myconfig.py assumes a RaspberryPi camera. You can change this by editing the CAMERA_TYPE value in the myconfig.py file in your ~/mycar folder.

+

If you are using the gps path follow template then you do not need, and may not want, a camera. In this case you can change the camera type to mock; CAMERA_TYPE = "MOCK".

+

Raspberry Pi:

+

If you are on a Raspberry Pi and using the recommended pi camera ("PICAM"), then no changes are needed to your myconfig.py.

+

This works with all Raspberry Pi cameras, including the original Raspberry Pi Camera Module based on the 5 megapixel OV5647 chipset and the Raspberry Pi Camera Module v2 based on the Sony IMX219 chip. These cameras are easily obtainable and are offered in generic (clone) versions by many vendors.

+

Jetson Nano:

+

The Jetson does not have a driver for the original 5 megapixels OV5647 based Raspberry Pi Camera, but it does have a driver for the v2 camera based on the IMX219 chip. Indeed the recommended camera is based on the IMX219 chip.

+

The default setting CAMERA_TYPE = "PICAM" does not work on the Jetson, even if you are using an 8MP RaspberryPi camera or another camera based on the Sony IMX219 chip. In either of these cases you will want to edit your myconfig.py to have: CAMERA_TYPE = "CSIC".

+

To flip the image vertically, set CSIC_CAM_GSTREAMER_FLIP_PARM = 6 - this is helpful if you have to mount the camera in a rotated position.

+

USB Cameras

+

CAMERA_TYPE = "CVCAM" is a camera type that has worked for USB cameras when OpenCV is set up. This requires additional setup of OpenCV for Nano or OpenCV for Raspberry Pi.

+

If you have installed the optional pygame library then you can connect to the camera by editing the camera type to CAMERA_TYPE = "WEBCAM". See the required additional setup for pygame.

+

If you have more than one camera then you may need to set the CAMERA_INDEX configuration value. By default it is zero.

+
+
+

NOTE: CAMERA_TYPE = "CVCAM" depends upon a version of OpenCV that has GStreamer support compiled in. This is the default on the Jetson computers and is supported in the recommended version of OpenCV for the Raspberry Pi.

+
+
+

Intel Realsense D435

+

The Intel Realsense cameras are RGBD cameras; they provide RGB images and Depth. You can use them as an RGB camera to provide images for the Deep Learning template or the Computer Vision template by setting CAMERA_TYPE = "D435" in your myconfig.py settings. You will also want to review the settings that are specific to the Intel Realsense cameras;

+
# Intel Realsense D435 and D435i depth sensing camera
+REALSENSE_D435_RGB = True       # True to capture RGB image
+REALSENSE_D435_DEPTH = True     # True to capture depth as image array
+REALSENSE_D435_IMU = False      # True to capture IMU data (D435i only)
+REALSENSE_D435_ID = None        # serial number of camera or None if you only have one camera (it will autodetect)
+
+

If you are not using depth then you will want to set REALSENSE_D435_DEPTH = False so it does not save the depth data.

+

Troubleshooting

+

If the colors look wrong it may be that the camera is outputting BGR colors rather than RGB. You can set BGR2RGB = True to convert from BGR to RGB.
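Under the hood this is just a channel swap. The effect of the BGR to RGB conversion can be sketched with a NumPy slice (Donkeycar may use OpenCV's conversion internally; this is only an illustration of what the setting does):

```python
import numpy as np

# a 1x1 "image" holding a pure-red pixel in BGR channel order
bgr = np.array([[[0, 0, 255]]], dtype=np.uint8)

# reversing the last axis swaps the B and R channels, yielding RGB
rgb = bgr[:, :, ::-1]
```

After the swap, the red value sits in channel 0 as an RGB consumer expects.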

+

We are adding other cameras over time, so read the camera section in myconfig.py to see what options are available.

+

If you are having troubles with your camera, check out our Discord hardware channel for more help.


Controller Parts

+

Web Controller

+

The default controller to drive the car with your phone or browser. This has a live web preview of the camera. Control options include:

+
    +
  1. A virtual joystick
  2. +
  3. Tilt control, when using a mobile device with a supported accelerometer
  4. +
  5. A physical joystick using the web adapter. Support varies per browser, OS, and joystick combination.
  6. +
  7. Keyboard input via the 'ikjl' keys.
  8. +
+
+

Note: Recently iOS has disabled default Safari access to motion control.

+
+

RC Controller

+

If you bought an RC car then it might have come with a standard 2.4GHz car radio and receiver as shown in the picture below. This can be used to drive the car. There are a few ways this can be accomplished.

+
    +
  • +

    You can use a Teensy or Arduino microcontroller that can emulate a USB HID device and use this arduino sketch to make your RC controller emulate a game controller. Instructions for wiring and using the sketch can be found in the associated wiki. Once that is setup you would choose CONTROLLER_TYPE = "rc3" as the controller type in your myconfig.py configuration.

    +
  • +
  • +

You can wire your RC car's receiver directly to the Raspberry Pi's gpio pins to read the length of the PWM steering and throttle signals being sent by your RC controller. Note that we don't recommend this for Jetson Nano users; the gpio support is not adequate. You can also use the RaspberryPi pins to output PWM directly to the car's servo and ESC, without the need for an I2C servo driver board. You will need to install the PiGPIO driver on your RaspberryPi to make this work; it is not installed by default. If you wire your RC receiver to gpio pins, then you would choose CONTROLLER_TYPE = "pigpio_rc" in your myconfig.py configuration. A full tutorial on implementing the RC controller and servo/esc control via gpio pins is here.

    +
  • +
  • +

Finally you can use the Donkeycar RC Hat. This board plugs into your RaspberryPi's gpio header and exposes connections for your RC receiver's servo and throttle channels. It can also be used to control the car's steering servo and ESC, so you don't need a PCA9685 board. It includes a very handy OLED display that can be used to show your car's IP address on startup; see this DIYRobocars article. See the docs for how to setup the RC Hat. The Donkeycar RC Hat can be purchased in the Donkeycar Store.

    +
  • +
+

RC Hat for RaspberryPi

+

Joystick Controller

+

Many people find it easier to control the car using a game controller. There are several parts that provide this option.

+

The default web controller may be replaced with a one line change to use a physical joystick part for input. This uses the OS device /dev/input/js0 by default. In theory, any joystick device that the OS mounts like this can be used. In practice, the behavior will change depending on the model of joystick (Sony or knockoff) or Xbox controller and the Bluetooth driver used to support it. The default code has been written and tested with a Sony brand PS3 Sixaxis controller. Other controllers may work, but will require alternative Bluetooth installs and tweaks to the software for correct axes and buttons.

+

These joysticks are known to work:

+ +

These can be enabled by finding the CONTROLLER_TYPE in your myconfig.py and setting it to the correct string identifier (after uncommenting the line).

+

These can be used plugged in with a USB cable, but it is much more convenient to set up Bluetooth for wireless control.

+

There are controller specific setup details below.

+
+

Note: If you have a controller that is not listed below, or you are having troubles getting your controller to work or you want to map your controller differently, see Creating a New or Custom Game Controller.

+
+
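Under the hood, every controller listed here is read through the same /dev/input/js0 device mentioned above. As an illustrative aside (the helper below is not part of Donkeycar), each joystick event is a fixed 8-byte record defined by the Linux joystick API, which can be decoded like this:

```python
import struct

# Linux joystick events are 8-byte records (see linux/joystick.h):
# u32 timestamp (ms), s16 value, u8 event type, u8 axis/button number.
JS_EVENT_FORMAT = "IhBB"
JS_EVENT_SIZE = struct.calcsize(JS_EVENT_FORMAT)  # 8 bytes

JS_EVENT_BUTTON = 0x01
JS_EVENT_AXIS = 0x02
JS_EVENT_INIT = 0x80  # synthetic events sent once when the device is opened

def parse_event(record: bytes):
    """Decode one raw joystick record into a small dict."""
    tv_msec, value, etype, number = struct.unpack(JS_EVENT_FORMAT, record)
    kind = "button" if etype & JS_EVENT_BUTTON else "axis"
    return {
        "time_ms": tv_msec,
        "kind": kind,
        "number": number,
        # axes report -32767..32767; normalize to -1.0..1.0
        "value": value / 32767.0 if kind == "axis" else value,
        "is_init": bool(etype & JS_EVENT_INIT),
    }

# Reading loop (requires a connected controller):
# with open("/dev/input/js0", "rb") as js:
#     while True:
#         print(parse_event(js.read(JS_EVENT_SIZE)))
```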

Change myconfig.py or run with --js

python manage.py drive --js

This will enable driving with the joystick. It disables the live preview of the camera and the web page features. If you modify myconfig.py to set USE_JOYSTICK_AS_DEFAULT = True, then you do not need to run with --js.

PS3 Controller

Bluetooth Setup

Follow this guide. You can ignore steps past the 'Accessing the SixAxis from Python' section. I will include steps here in case the link becomes stale.

sudo apt-get install bluetooth libbluetooth3 libusb-dev
sudo systemctl enable bluetooth.service
sudo usermod -G bluetooth -a pi

Reboot after changing the user group.

Plug in the PS3 controller with a USB cable. Hit the center PS logo button. Get and build the command-line pairing tool, then run it:

wget http://www.pabr.org/sixlinux/sixpair.c
gcc -o sixpair sixpair.c -lusb
sudo ./sixpair

Use bluetoothctl to pair:

bluetoothctl
agent on
devices
trust <MAC ADDRESS>
default-agent
quit

Unplug the USB cable and hit the center PS logo button.

To test that the Bluetooth PS3 remote is working, verify that /dev/input/js0 exists:

ls /dev/input/js0

Troubleshooting

In case the Bluetooth connection on the Raspberry Pi does not work, you might see something like this in bluetoothctl:

[NEW] Controller 00:11:22:33:44:55 super-donkey [default]
[NEW] Device AA:BB:CC:DD:EE:FF PLAYSTATION(R)3 Controller
[CHG] Device AA:BB:CC:DD:EE:FF Connected: yes
[CHG] Device AA:BB:CC:DD:EE:FF Connected: no
[CHG] Device AA:BB:CC:DD:EE:FF Connected: yes
[CHG] Device AA:BB:CC:DD:EE:FF Connected: no
[CHG] Device AA:BB:CC:DD:EE:FF Connected: yes
...
[CHG] Device AA:BB:CC:DD:EE:FF Connected: yes
[CHG] Device AA:BB:CC:DD:EE:FF Connected: no
[bluetooth]#

Try updating the Linux kernel and firmware by running:

sudo rpi-update

And then reboot:

sudo reboot

Charging PS3 Sixaxis Joystick

For some reason, these controllers don't like to charge from a powered USB port that doesn't have an active Bluetooth controller and OS driver. This means a phone-type USB charger will not work. Try a powered Linux or Mac laptop USB port. You should see the lights blink after plugging in and hitting the center PS logo button.

After charging, plug the controller into the Pi again, hit the PS logo, then unplug to pair again.

New Battery for PS3 Sixaxis Joystick

Sometimes these controllers can be quite old. Here's a link to a new battery. Be careful when taking off the cover. Remove the 5 screws. There's a tab on the top half between the hand grips. You'll want to split/open it from the front, pulling the bottom forward as you do, or you'll break the tab off as I did.

PS3 Mouse problems on Linux

Sometimes when you plug in the PS3 joystick it starts taking over your mouse. To prevent that, you can run this:

xinput set-prop "Sony PLAYSTATION(R)3 Controller" "Device Enabled" 0

PS4 DualShock 4 Wireless Gamepad Controller

The following instructions are intended for use with a Raspberry Pi 3 or 4 running Raspberry Pi OS Buster. The DS4 gamepad will be connected via Bluetooth without installing any additional software. bluetoothd is a system service that runs as a daemon automatically on boot; bluetoothctl is a program to manage pairing and connecting devices.

Configure your user account to use bluetoothctl without sudo

Add the pi user to the bluetooth group, then reboot so that the change takes effect:

sudo usermod -a -G bluetooth pi
sudo reboot
+

Scan for your PS4 gamepad

After the reboot, run bluetoothctl and turn on scanning to find Bluetooth devices. See below for an example response. Note that the actual HEX characters will be different for your devices!

bluetoothctl
<response> Agent registered
<response> [bluetooth]#
scan on
<response>
[CHG] Controller BB:22:EE:77:BB:CC Discovering: yes
[NEW] Device 10:20:30:40:50:60 10-20-30-40-50-60
[NEW] Device 10:20:30:40:50:70 10-20-30-40-50-70
[NEW] Device 10:20:30:40:50:80 10-20-30-40-50-80
[NEW] Device 10:20:30:40:50:90 10-20-30-40-50-90
[NEW] Device 20:AA:88:44:BB:10 WHSCL1

Wait a couple of minutes for the scanner to find all your existing Bluetooth devices. Now set your gamepad in pairing mode by holding down the Share button and the PlayStation button together until the light double-flashes. You should see a new entry for a Wireless Controller.

<response>
[NEW] Device 1C:AA:BB:99:DD:AA Wireless Controller

Turn off scanning to stop the status reporting:

scan off

Connect to your PS4 gamepad

You will now connect, pair, and trust the PS4 gamepad wireless controller. Trusting the paired device allows it to reconnect after the Raspberry Pi reboots. Copy the wireless controller's address; you will type connect <your wireless controller address> and then trust <your wireless controller address>. In this example, the address is 1C:AA:BB:99:DD:AA.

connect 1C:AA:BB:99:DD:AA
<response>
Attempting to connect to 1C:AA:BB:99:DD:AA
[CHG] Device 1C:AA:BB:99:DD:AA Connected: yes
[CHG] Device 1C:AA:BB:99:DD:AA UUIDs: 00001124-0000-1000-8000-00805f9b34fb
[CHG] Device 1C:AA:BB:99:DD:AA UUIDs: 00001200-0000-1000-8000-00805f9b34fb
[CHG] Device 1C:AA:BB:99:DD:AA ServicesResolved: yes
[CHG] Device 1C:AA:BB:99:DD:AA Paired: yes
Connection successful

The PS4 gamepad light should now be solid. Now trust the PS4 gamepad wireless controller:

trust 1C:AA:BB:99:DD:AA
<response>
[CHG] Device 1C:AA:BB:99:DD:AA Trusted: yes
Changing 1C:AA:BB:99:DD:AA trust succeeded

Type paired-devices to see the paired devices:

paired-devices
<response>
Device 1C:AA:BB:99:DD:AA Wireless Controller

Type quit or exit to quit the program bluetoothctl:

quit

Use your PS4 gamepad wireless controller

After booting your Pi, press the PlayStation button once. The light will flash for about 5 seconds and then turn solid. If the light goes off, try again. If this does not work, run bluetoothctl and verify the controller appears under devices and paired-devices.

devices
<response>
Device 1C:AA:BB:99:DD:AA Wireless Controller

paired-devices
<response>
Device 1C:AA:BB:99:DD:AA Wireless Controller

If it fails to connect, press the PlayStation button once while bluetoothctl is running. A good response will be:

<response>
[CHG] Device 1C:AA:BB:99:DD:AA Connected: yes

To disconnect the controller from the Raspberry Pi, press and hold the PlayStation button for 10 seconds.

PS4 Controller (for Raspbian Stretch)

The following instructions are based on RetroPie and ds4drv.

Install ds4drv

Running on your Pi over ssh, you can install it directly:

sudo /home/pi/env/bin/pip install ds4drv

Grant permission to ds4drv

sudo wget https://raw.githubusercontent.com/chrippa/ds4drv/master/udev/50-ds4drv.rules -O /etc/udev/rules.d/50-ds4drv.rules
sudo udevadm control --reload-rules
sudo udevadm trigger

Run ds4drv

ds4drv --hidraw --led 00ff00

If you see Failed to create input device: "/dev/uinput" cannot be opened for writing, reboot and retry; the permission grant probably doesn't take effect until after a reboot. Some controllers don't work with --hidraw; if that's the case, try the command without it. --led 00ff00 changes the light bar color and is optional.

Start controller in pairing mode

Press and hold the Share button, then press and hold the PS button until the light bar starts blinking. If it goes green after a few seconds, pairing is successful.

Run ds4drv in background on startup once booted

sudo nano /etc/rc.local

Paste:

/home/pi/env/bin/ds4drv --led 00ff00

Save and exit. Again, use it with or without --hidraw, depending on the particular controller you are using.

To disconnect, kill the ds4drv process and hold PS for 10 seconds to power off the controller.

XBox One Controller

Bluetooth pairing

This code presumes the built-in Linux driver for the 'Xbox Wireless Controller'; this is pre-installed on Raspbian, so there is no need to install any other drivers. The controller will generally show up as /dev/input/js0. There is another userland driver called xboxdrv; this code has not been tested with that driver.

The XBox One controller requires that the Bluetooth disable_ertm parameter be set to true; to do this:

Jetson Nano

Adapted from: https://www.roboticsbuildlog.com/hardware/xbox-one-controller-with-nvidia-jetson-nano

+
1. Install the nano text editor (it is used below to edit configuration files), if you don't already have it:

sudo apt-get install nano

2. Add non-root access to your input folder:

sudo usermod -a -G dialout $USER
sudo reboot

3. Install sysfsutils:

sudo apt-get install sysfsutils

4. Edit the config to disable Bluetooth ertm:

sudo nano /etc/sysfs.conf

Append this to the end of the config:

/module/bluetooth/parameters/disable_ertm=1

5. Reboot your computer:

sudo reboot
+
6. Re-pair the Xbox One Bluetooth controller.

Unpair (forget) the controller first if you already tried to pair it, then pair it again. You can do this with the Bluetooth Manager GUI application that ships with JetPack, or from the command line using bluetoothctl:

  • Open a terminal and type: bluetoothctl
  • You should see the list of devices you have paired with and their corresponding MAC addresses. If you do not, type: paired-devices
  • To un-pair a device, type (replace aa:bb:cc:dd:ee:ff with the MAC address of the device to un-pair): remove aa:bb:cc:dd:ee:ff then exit
  • Pair your device using either the Bluetooth Manager GUI or bluetoothctl (see the RaspberryPi OS instructions starting with sudo bluetoothctl)

Once paired, you should have a solid light on the Xbox button and a stable Bluetooth connection.

RaspberryPi OS

  • Edit the file /etc/modprobe.d/xbox_bt.conf (this may create the file; it is commonly not there by default)
  • Add the line: options bluetooth disable_ertm=1
  • Reboot so that this takes effect.
  • After the reboot, you can verify that disable_ertm is set to true by entering this command in a terminal:

cat /sys/module/bluetooth/parameters/disable_ertm

  • The result should print 'Y'. If not, make sure the above steps have been done correctly.

Once that is done, you can pair your controller to your Raspberry Pi using the bluetooth tool. Enter the following command into a bash shell prompt:

sudo bluetoothctl

That will start Bluetooth pairing in interactive mode. The remaining commands will be entered in that interactive session. Enter the following commands:

agent on
default-agent
scan on

That last command will start the Raspberry Pi scanning for new Bluetooth devices. At this point, turn on your XBox One controller using the big round 'X' button on top, then start pairing mode by pressing the 'sync' button on the front of the controller. Within a few minutes, you should see the controller show up in the output, something like this:

[NEW] Device B8:27:EB:A4:59:08 XBox One Wireless Controller

Write down the MAC address; you will need it for the following steps. Enter this command to pair with your controller:

connect YOUR_MAC_ADDRESS

where YOUR_MAC_ADDRESS is the MAC address you copied previously. If it does not connect on the first try, try again. It can take a few tries. If your controller connects but then immediately disconnects, your disable_ertm setting might be wrong (see above).

Once your controller is connected, the big round 'X' button on the top of your controller should be solid white. Enter the following commands to finish:

trust YOUR_MAC_ADDRESS
quit

Now that your controller is trusted, it should automatically connect with your Raspberry Pi when they are both turned on. If your controller fails to connect, run the bluetoothctl steps again to reconnect.

Creating a New or Custom Game Controller

To discover or modify the button and axis mappings for your controller, you can use the Joystick Wizard. The Joystick Wizard will write a custom controller named 'my_joystick.py' to your mycar folder. To use the custom controller, set CONTROLLER_TYPE="custom" in your myconfig.py.


Fastai (PyTorch) Parts

These parts encapsulate models defined using the fastai high-level API. They are intended to be used with the PyTorch backend. This allows you to build models using PyTorch or transfer learning.

Note: This part is interchangeable with the Keras part but does not have TensorRT or TfLite support.

The parts are designed to use the trained artificial neural network to reproduce the steering and throttle given the image the camera sees. They are created by using the train command.

FastAi Linear

This model type is created with --type=fastai_linear.

The FastAILinear pilot uses one neuron to output a continuous value via a PyTorch linear (dense) layer, one each for steering and throttle. The output is not bounded.

Pros

  • Steers smoothly.
  • It has been very robust.
  • Performs well in a limited compute environment like the Pi3.
  • No arbitrary limits to steering or throttle.

Cons

  • May sometimes fail to learn throttle well.

Model Summary

Input: Image

Network: 5 Convolution layers followed by two dense layers before output

Output: Two dense layers with one scalar output each with linear activation for steering and throttle.


IMU

+

IMUs, or inertial measurement units, are parts that sense the inertial forces on a robot. What they measure varies by sensor, but commonly includes linear and rotational acceleration. Some also include a magnetometer to give a global compass heading. Temperature is frequently available as well, since it affects their sensitivity.

+

MPU6050/MPU9250

+

This is a cheap, small, and moderately precise IMU, commonly available on Amazon.

The MPU9250 additionally offers an integrated magnetometer.

+
  • Typically uses the I2C interface and can be chained off the default PWM PCA9685 board. This configuration will also provide power.
  • MPU6050: outputs acceleration X, Y, Z, gyroscope X, Y, Z, and temperature.
  • MPU9250: outputs acceleration X, Y, Z, gyroscope X, Y, Z, magnetometer X, Y, Z, and temperature.
  • The chip has a built-in 16-bit AD converter with 16-bit data output.
  • Gyroscope range: +/- 250, 500, 1000, 2000 degrees/sec
  • Acceleration range: ±2, ±4, ±8, ±16g

Software Setup

+

Install smbus

+
  • either from package:

sudo apt install python3-smbus

  • or from source:

sudo apt-get install i2c-tools libi2c-dev python-dev python3-dev
git clone https://github.com/pimoroni/py-smbus.git
cd py-smbus/library
python setup.py build
sudo python setup.py install
+

For MPU6050:

Install pip lib for mpu6050:

pip install mpu6050-raspberrypi

For MPU9250:

Install pip lib for mpu9250-jmdev:

pip install mpu9250-jmdev

Configuration

+

Add the following configuration to your myconfig.py:

#IMU
HAVE_IMU = True
IMU_SENSOR = 'mpu9250'          # (mpu6050|mpu9250)
IMU_DLP_CONFIG = 3

IMU_SENSOR can be either mpu6050 or mpu9250 based on the sensor you are using.

+

IMU_DLP_CONFIG allows you to change the digital low-pass filter settings for your IMU. Lower frequency settings (see below) can filter high frequency noise at the expense of increased latency in the IMU sensor data. Valid settings are from 0 to 6:

+
  • 0: 250Hz
  • 1: 184Hz
  • 2: 92Hz
  • 3: 41Hz
  • 4: 20Hz
  • 5: 10Hz
  • 6: 5Hz
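To get an intuition for the latency/noise trade-off these settings control, here is a software analogue of a low-pass filter. The real filtering happens inside the IMU chip; this first-order exponential filter is only an illustration, and the function names are hypothetical.

```python
import math

def lowpass_alpha(cutoff_hz, sample_hz):
    """Smoothing factor for a first-order low-pass filter.

    Lower cutoff -> smaller alpha -> smoother output but more lag,
    mirroring the behavior of lower IMU_DLP_CONFIG frequencies.
    """
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    dt = 1.0 / sample_hz
    return dt / (rc + dt)

def filter_series(samples, alpha):
    """Apply the filter to a list of readings, returning the smoothed list."""
    out, y = [], samples[0]
    for x in samples:
        y = y + alpha * (x - y)
        out.append(y)
    return out
```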

Notes on MPU9250

+

At startup the MPU9250 driver performs a calibration to zero the accelerometer and gyro bias. The process usually takes less than 10 seconds; avoid moving or touching the car during that time. Please place the car on the ground before starting Donkey.
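The zero-bias calibration idea can be sketched in a few lines. This is a simplified illustration, not the actual driver code: average readings taken while the car is still, then subtract that average from subsequent readings.

```python
def calibrate_bias(samples):
    """Average raw (x, y, z) readings taken while the sensor is stationary.

    The average of a stationary sensor's output is its bias.
    """
    n = len(samples)
    sums = [0.0, 0.0, 0.0]
    for x, y, z in samples:
        sums[0] += x
        sums[1] += y
        sums[2] += z
    return tuple(s / n for s in sums)

def apply_bias(reading, bias):
    """Subtract the calibrated bias from a live reading."""
    return tuple(r - b for r, b in zip(reading, bias))
```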


Keras Parts

These parts encapsulate models defined using the Keras high-level API. They are intended to be used with the TensorFlow backend. The parts are designed to use the trained artificial neural network to reproduce the steering and throttle given the image the camera sees. They are created by using the train command.

Keras Categorical

This model type is created with the --type=categorical.

The KerasCategorical pilot breaks the steering and throttle decisions into discrete angles and then uses categorical cross entropy to train the network to activate a single neuron for each steering and throttle choice. This can be interesting because we get the confidence value as a distribution over all choices. It uses dk.utils.linear_bin and dk.utils.linear_unbin to transform continuous real numbers into a range of discrete values for training and runtime. The input and output are therefore bounded and must be chosen wisely to match the data. The default ranges work for the default setup, but cars which go faster may want to enable a higher throttle range, and cars with larger steering throw may want more bins.
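The binning idea can be sketched as follows. This is a simplified stand-in for dk.utils.linear_bin and linear_unbin; the 15-bin default and the exact rounding here are illustrative, not a copy of the Donkeycar implementation.

```python
def linear_bin(a, n=15, offset=1, r=2.0):
    """Quantize a value in [-1, 1] into a one-hot vector of n bins."""
    a = a + offset                      # shift [-1, 1] to [0, 2]
    b = round(a / (r / (n - 1)))        # pick the nearest bin index
    arr = [0.0] * n
    arr[int(b)] = 1.0
    return arr

def linear_unbin(arr, n=15, offset=-1, r=2.0):
    """Recover a continuous value from a one-hot (or softmax) vector."""
    b = max(range(n), key=lambda i: arr[i])   # most activated bin
    return b * (r / (n - 1)) + offset
```

The round trip loses at most half a bin width of precision, which is the "arbitrary limitation" the Cons list below refers to.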

+

This model was the original model, with some modifications, when Donkey was first created.

+

Pros

  • It can show the confidence as a distribution via the makemovie command.
  • It has been very robust.
  • In some cases this model has learned throttle control better than other models.
  • Performs well in a limited compute environment like the Pi3.

Cons

  • Suffers from some arbitrary limitations of the chosen limits for the number of categories and the throttle upper limit.

Model Summary

Input: Image

Network: 5 Convolution layers followed by two dense layers before output

Output: Two dense layers (of 16 and 20 units) with categorical output

+

Keras Linear

This model type is created with the --type=linear.

The KerasLinear pilot uses one neuron to output a continuous value via the Keras Dense layer with linear activation, one each for steering and throttle. The output is not bounded.

+

Pros

  • Steers smoothly.
  • It has been very robust.
  • Performs well in a limited compute environment like the Pi3.
  • No arbitrary limits to steering or throttle.

Cons

  • May sometimes fail to learn throttle well.

Model Summary

+

Input: Image

+

Network: 5 Convolution layers followed by two dense layers before output

+

Output: Two dense layers with one scalar output each with linear activation for steering and throttle.

+

Keras IMU

This model type is created with the --type=imu.

The KerasIMU pilot is very similar to the KerasLinear model, except that it takes inertial measurement data in addition to images when learning to drive. This gives our stateless model some additional information about the motion of the vehicle.

This can be a good starting point example of ingesting more data into your models.

+

Pros

  • Steers very smoothly.
  • Performs well in a limited compute environment like the Pi3.
  • No arbitrary limits to steering or throttle.
  • Gives additional state to the model, which might help it come to a stop at a stop sign.

Cons

  • Driving quality will suffer if a noisy IMU is used.

Model Summary

Input: Image, vector of linear and angular acceleration

Network: 5 Convolution layers followed by two dense layers before output. The vector data passes through 3 dense layers and is concatenated with the conv2d output before the 2 dense control layers.

Output: Two dense layers with one scalar output each with linear activation for steering and throttle.

+

Keras Latent

+

This model type is created with the --type=latent.

+

The KerasLatent pilot tries to force the model to learn a latent vector in addition to driving. This latent vector is a bottleneck in a CNN that then tries to reproduce the given input image and produce driving commands. These dual tasks could produce a model that learns to distill the driving scene and perhaps better abstract to a new track.

+

Pros

  • Steers smoothly.
  • Performs well in a limited compute environment like the Pi3.
  • No arbitrary limits to steering or throttle.
  • The image output is a measure of what the model has deemed important in the scene.

Cons

  • Needs more testing to prove the theory.

Model Summary

Input: Image

Network: 5 Convolution layers bottlenecked to a 10x1x1 vector, followed by 6 Conv2dTranspose layers that output an image, plus 3 dense layers that produce the driving controls.

Output: Two dense layers with one scalar output each with linear activation for steering and throttle. Also outputs an image.

+

Keras RNN

+

This model type is created with the --type=rnn.

+

The KerasRNN pilot uses a sequence of images to control driving rather than just a single frame. The number of images used is controlled by the SEQUENCE_LENGTH value in myconfig.py.
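Conceptually, driving with a sequence model means feeding the model the last N frames instead of a single one. A minimal sketch of that bookkeeping (illustrative; this is not Donkeycar's actual implementation, and the class name is hypothetical):

```python
from collections import deque

SEQUENCE_LENGTH = 3  # mirrors the myconfig.py setting; value is illustrative

class FrameSequencer:
    """Keep the last SEQUENCE_LENGTH camera frames for a sequence model."""

    def __init__(self, seq_len=SEQUENCE_LENGTH):
        # deque with maxlen automatically drops the oldest frame
        self.frames = deque(maxlen=seq_len)

    def push(self, frame):
        """Add a frame; return True once a full sequence is available."""
        self.frames.append(frame)
        return self.ready()

    def ready(self):
        return len(self.frames) == self.frames.maxlen

    def sequence(self):
        """The frames oldest-first, ready to stack into a model input."""
        return list(self.frames)
```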

+

Pros

  • Steers very smoothly.
  • Can train to a lower loss.

Cons

  • Performs worse in a limited compute environment like the Pi3.
  • Takes longer to train.

Model Summary

+

Input: Image

+

Network: 4 time distributed Convolution layers, followed by 2 LSTM layers, 3 dense layers, and driving controls.

+

Output: One dense layer with two scalar outputs for steering and throttle.

+

Keras 3D

+

This model type is created with the --type=3d.

+

The Keras3D_CNN pilot uses a sequence of images to control driving rather than just a single frame. The number of images used is controlled by the SEQUENCE_LENGTH value in myconfig.py. Instead of 2d convolutions like most other models, this uses a 3D convolution across layers.

+

Pros

  • Steers very smoothly.
  • Can train to a lower loss.

Cons

  • Performs worse in a limited compute environment like the Pi3.
  • Takes longer to train.

Model Summary

+

Input: Image

+

Network: 4 3D Convolution layers each followed by max pooling, followed by 2 dense layers, and driving controls.

+

Output: One dense layer with two scalar outputs for steering and throttle.

+

Keras Behavior

This model type is created with the --type=behavior.

The KerasBehavioral pilot takes an image and a vector as input. The vector is a one-hot vector of commands. For example, it might be of length two, with one state for left-lane driving and one for right-lane driving; during training, one element of the vector is activated while the desired behavior is demonstrated. This vector is defined in myconfig.py as BEHAVIOR_LIST. BEHAVIOR_LED_COLORS must match the same length and can be useful for showing the current state. TRAIN_BEHAVIORS must be set to True.
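The one-hot behavior vector described above can be sketched like this (the BEHAVIOR_LIST values are examples, and the helper function is hypothetical, not part of Donkeycar):

```python
BEHAVIOR_LIST = ["left_lane", "right_lane"]  # example myconfig.py values

def behavior_one_hot(state, behaviors=BEHAVIOR_LIST):
    """Encode the currently demonstrated behavior as a one-hot vector.

    Exactly one element is 1.0; all others are 0.0.  Raises ValueError
    if the state is not in the behavior list.
    """
    vec = [0.0] * len(behaviors)
    vec[behaviors.index(state)] = 1.0
    return vec
```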

+

Pros

  • Can create a model which can perform multiple tasks.

Cons

  • Takes more effort to train.

Model Summary

+

Input: Image, Behavior vector

+

Network: 5 Convolution layers, followed by 2 dense layers, and driving controls.

+

Output: Categorical steering, throttle output similar to Categorical keras model.

+

Keras Localizer

This model type requires some code modification to create.

The KerasLocalizer pilot is very similar to the Keras Linear model, except that it also learns to output its location as a category. This category is arbitrary, but has only been tested as a 0-9 range of track segments. It requires that the driving data be marked up with a category label for location. This could supply some higher-level logic with track location, for driving strategy, lap counting, or other uses.
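Downstream logic consuming the location output would typically pick the highest-probability segment, and could count laps from the segment history. A hypothetical sketch (not part of Donkeycar):

```python
def most_likely_segment(location_probs):
    """Index of the highest-probability track segment (e.g. 0-9)."""
    return max(range(len(location_probs)), key=lambda i: location_probs[i])

def count_laps(segment_history, start_segment=0):
    """Count laps as the number of re-entries into the start segment."""
    laps, prev = 0, None
    for seg in segment_history:
        if seg == start_segment and prev not in (None, start_segment):
            laps += 1
        prev = seg
    return laps
```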

+

Pros

  • Steers smoothly.
  • Performs well in a limited compute environment like the Pi3.
  • No arbitrary limits to steering or throttle.
  • Provides a location output to supply some higher-level logic.

Cons

  • May sometimes fail to learn throttle well.

Model Summary

+

Input: Image

+

Network: 5 Convolution layers followed by two dense layers before output

+

Output: Two dense layers with one scalar output each with linear activation for steering and throttle. One categorical output for location.


Lidar

+

A Lidar sensor can be used with Donkeycar to provide obstacle avoidance or to help navigate on tracks with walls. It records data along with the camera during training, and this data can be used for training.

+

NOTE: Lidar is currently only supported in the Dev branch. To use it, after you git clone donkeycar, do a git checkout dev

+

Donkey lidar

+

Supported Lidars

+

We currently only support the RPLidar series of sensors, but will be adding support for the similar YDLidar series soon.

+

We recommend the $99 A1M8 (12m range)

+

Hardware Setup

+

Mount the Lidar underneath the camera canopy as shown above (the RPLidar A2M8 is used there, but the A1M8 mounting is the same). You can velcro the USB adapter under the Donkey plate and use a short USB cable to connect to one of your RPi or Nano USB ports. It can be powered by the USB port so there's no need for an additional power supply.

+

Software Setup

+

Lidar requires the glob library to be installed. If you don't already have that, install it with pip3 install glob2

+

Also install the Lidar driver: pip install Adafruit_CircuitPython_RPLIDAR

+

Then go to the lidarcar directory and edit the myconfig.py file to ensure that the Lidar is turned on. The upper and lower limits should be set to reflect the areas you want your Lidar to "look at", omitting the areas that are blocked by parts of the car body. An example is shown below. For the RPLidar series, 0 degrees is in the direction of the motor (in the case of the A1M8) or cable (in the case of the A2M8)

# LIDAR
USE_LIDAR = True
LIDAR_TYPE = 'RP' #(RP|YD)
LIDAR_LOWER_LIMIT = 90 # angles that will be recorded. Use this to block out obstructed areas on your car and/or to avoid looking backwards. Note that for the RP A1M8 Lidar, "0" is in the direction of the motor
LIDAR_UPPER_LIMIT = 270
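The lower/upper limit settings amount to an angular window test on each measurement. A hypothetical helper sketching the idea (including windows that wrap past 0 degrees, which this sketch supports even though the example config above does not need it):

```python
def in_lidar_window(angle, lower=90, upper=270):
    """True if a measurement angle (degrees) falls inside the recorded window.

    When lower > upper the window is treated as wrapping through 0
    (e.g. lower=315, upper=45 keeps the 90-degree arc around 0).
    """
    angle %= 360
    if lower <= upper:
        return lower <= angle <= upper
    return angle >= lower or angle <= upper
```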

Lidar limits

+

Template support

+

Neither the deep learning template nor the path follow template supports Lidar data directly. There is an issue to add Lidar data to the deep learning template. Lidar would also be very useful in the path follow template for obstacle detection and avoidance. If you are interested in working on such projects, please join the discord community and let us know; we will be happy to provide you with support.


Odometry

+

Odometry is a way to calculate the speed and distance travelled of the car by measuring the rotation of its wheels using a sensor called a rotary encoder. This encoder can be on the motor, on the main drive shaft, or on individual wheels. The advantage of using an encoder is that it "closes the loop" with your throttle, so your car can reliably command an actual velocity, rather than issuing a motor control that produces a faster or slower velocity depending on the slope of the track, the surface, or mechanical friction in your drive train while turning. In short, an encoder gives you much better control over your speed.
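Converting encoder ticks to distance and velocity is simple geometry: revolutions times wheel circumference. The sketch below uses illustrative constants (a 20-slot disc and 65mm wheels); you would measure these on your own car rather than take them from here.

```python
import math

# Illustrative constants; measure these for your own car.
ENCODER_TICKS_PER_REV = 20        # slots in the encoder disc
WHEEL_DIAMETER_M = 0.065          # wheel diameter in meters

def ticks_to_distance(ticks, ticks_per_rev=ENCODER_TICKS_PER_REV,
                      wheel_diameter=WHEEL_DIAMETER_M):
    """Convert an encoder tick count to distance traveled in meters."""
    revolutions = ticks / ticks_per_rev
    return revolutions * math.pi * wheel_diameter  # circumference per rev

def velocity(ticks, dt_seconds, **kw):
    """Average velocity (m/s) over an interval, from the tick delta."""
    return ticks_to_distance(ticks, **kw) / dt_seconds
```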

+

Encoders come in various forms

+
  • Quadrature encoders use dual sensors to measure pulses as the shaft turns. They have the advantage of being very precise, as well as being able to tell the difference between forward and reverse rotation. These may use hall-effect sensors that measure magnetic pulses, or optical sensors that pass light through a slotted disk and are fully enclosed to eliminate external interference.
  • Mono encoders use a single sensor to count pulses, so they cannot determine the direction of motion. They are typically smaller and cheaper than quadrature encoders. A common example is an optical version that uses an LED emitter/receiver sensor with a slotted disk attached to the output shaft. As the output shaft rotates, the slotted disk rotates with it, interrupting the light, and those interruptions are counted as pulses. These sensors are cheap and easy to install, but cannot determine the direction of rotation.

There are several ways to read encoders with Donkey:

+

Arduino: The recommended way is with an Arduino-compatible microcontroller running one of the Arduino sketches included with Donkeycar. Since the microcontroller is dedicated to counting pulses, it can maintain an accurate count even with very high resolution encoders. This is critical if you are using a high resolution encoder, so that it does not drop encoder pulses and undercount. There are two Arduino sketches available:

+
  • mono_encoder.ino supports single-channel encoders, like the simple encoders with the 20-slot encoder disk.
  • quadrature_encoder.ino supports two-channel quadrature encoders that can detect direction as well as encoder counts.

Both sketches support a single encoder or two encoders in a differential drive arrangement. They can be compiled using interrupts for high resolution encoders or as polled-encoders with robust debouncing. Both sketches transmit the count to the RPi via the USB serial port when requested by Donkeycar, which lightens the processing load for the Rpi.

+

GPIO: If you are using a low-resolution mono encoder attached to the output shaft of the motor or to the drive shaft, then the Raspberry Pi's GPIO pins may be adequate to count the pulses. Remember that the GPIO pins only support 3.3v devices; if you can supply your encoder's VCC at 3.3v then it will generally output 3.3v pulses, in which case you can directly connect it to the Raspberry Pi's GPIO pins.

+

Supported Encoders

+

Examples of rotary encoders that are supported:


Hardware Setup

+

How you attach your encoder is up to you and which kind of encoder you're using. For example, here's one way to put a quadrature encoder on the main drive shaft. Here is a more complex setup with dual encoders.

+

But this is the easiest way to do it, with a cheap and simple optical encoder on the main drive shaft of a standard Donkeycar chassis (if your chassis is different, the same overall approach should work, although you may have to find a different place to mount the sensor):

+

First, unscrew the plate over the main drive shaft. Tilt the rear wheels back a bit and you should be able to remove the shaft.

+

drive shaft

+

Now enlarge the hole in the optical encoder disc that came with your sensor (use a drill or Dremel grinding stone) so you can slip it onto the shaft. Stretch a rubber grommet (you can use the sort typically included with servos to mount them, but any one of the right size will do) over the shaft and push it into the encoder disc hole. If you don't have a grommet, you can wrap tape around the shaft until it's large enough to hold the disc firmly. Once you've ensured it's in the right place, use a few drops of superglue or hot glue to hold it in place.

+

drive shaft

+

drive shaft

+

Cut out a small notch (marked in pencil here) in the plate covering the drive shaft, so you can mount the encoder sensor there, ensuring that the disc can turn freely in the gap in front of the steering servo.

+

drive plate

+

Now replace the plate and drill two holes so you can screw in the encoder sensor. Slide the disc along the shaft so that it doesn't bind on the sensor.

+

drive plate

+

Use three female-to-female jumper cables to connect the sensor to your RPi GPIO pins as follows. Connect the GND, V+ (which might say 5V or 3.3V) and data pin (which will say "Out" or "D0") to the RPi Ground, 5V and GPIO 13 as shown here (if your sensor encoder has four pins, ignore the one that says "A0"): wiring diagram

+

Note: if you're already using GPIO 13 for another reason, such as RC input or output, you can use any other free GPIO pin. Just change the ODOM_PIN number accordingly in the myconfig.py file as shown below.

+

Software Setup

+

Enable odometry in myconfig.py.

+
#
+# ODOMETRY
+#
+HAVE_ODOM = False               # Do you have an odometer/encoder
+HAVE_ODOM_2 = False             # Do you have a second odometer/encoder as in a differential drive robot.
+                                # In this case, the 'first' encoder is the left wheel encoder and
+                                # the second encoder is the right wheel encoder.
+ENCODER_TYPE = 'GPIO'           # What kind of encoder? GPIO|arduino.
+                                # - 'GPIO' refers to direct connect of a single-channel encoder to an RPi/Jetson GPIO header pin.
+                                #   Set ODOM_PIN to the gpio pin, based on board numbering.
+                                # - 'arduino' generically refers to any microcontroller connected over a serial port.
+                                #   Set ODOM_SERIAL to the serial port that connects the microcontroller.
+                                #   See 'arduino/encoder/encoder.ino' for an Arduino sketch that implements both a continuous and
+                                #    on demand protocol for sending readings from the microcontroller to the host.
+ENCODER_PPR = 20                # encoder's pulses (ticks) per revolution of encoder shaft.
+ENCODER_DEBOUNCE_NS = 0         # nanoseconds to wait before integrating subsequent encoder pulses.
+                                # For encoders with noisy transitions, this can be used to reject extra interrupts caused by noise.
+                                # If necessary, the exact value can be determined using an oscilloscope or logic analyzer, or
+                                # simply by experimenting with various values.
+FORWARD_ONLY = 1
+FORWARD_REVERSE = 2
+FORWARD_REVERSE_STOP = 3
+TACHOMETER_MODE=FORWARD_REVERSE # FORWARD_ONLY, FORWARD_REVERSE or FORWARD_REVERSE_STOP
+                                # For dual channel quadrature encoders, 'FORWARD_ONLY' is always the correct mode.
+                                # For single-channel encoders, the tachometer mode depends upon the application.
+                                # - FORWARD_ONLY always increments ticks; effectively assuming the car is always moving forward
+                                #   and always has a positive throttle. This is best for racing on wide open circuits where
+                                #   the car is always under throttle and where we are not trying to model driving backwards or stopping.
+                                # - FORWARD_REVERSE uses the throttle value to decide if the car is moving forward or reverse and
+                                #   increments or decrements ticks accordingly.  In the case of a zero throttle, ticks will be
+                                #   incremented or decremented based on the last non-zero throttle; effectively modelling 'coasting'.
+                                #   This can work well in situations where the car will be making progress even when the throttle
+                                #   drops to zero.  For instance, in a race situation where the car may coast to slow down but not
+                                #   actually stop.
+                                # - FORWARD_REVERSE_STOP uses the throttle value to decide if the car is moving forward or reverse or stopped.
+                                #   This works well for a slower moving robot in situations where the robot is changing direction; for instance,
+                                #   when doing SLAM, the robot will explore the room slowly and may need to backup.
+MM_PER_TICK = WHEEL_RADIUS * 2 * 3.141592653589793 * 1000 / ENCODER_PPR           # How much travel with a single encoder tick, in mm. To calibrate: roll your car one meter and divide 1,000 mm by the total ticks measured.
+ODOM_SERIAL = '/dev/ttyACM0'    # serial port when ENCODER_TYPE is 'arduino'
+ODOM_SERIAL_BAUDRATE = 115200   # baud rate for serial port encoder
+ODOM_PIN = 13                   # if using ENCODER_TYPE=GPIO, which GPIO board mode pin to use as input
+ODOM_PIN_2 = 14                 # GPIO for second encoder in differential drivetrains
+ODOM_SMOOTHING = 1              # number of odometer readings to use when calculating velocity
+ODOM_DEBUG = False              # Write out values on vel and distance as it runs
+
+
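As a quick sanity check of the MM_PER_TICK formula above, here is the arithmetic worked through in Python, using the illustrative values WHEEL_RADIUS = 0.0315 m and ENCODER_PPR = 20 that appear elsewhere on this page:

```python
import math

# Worked example of the MM_PER_TICK formula (illustrative values).
WHEEL_RADIUS = 0.0315  # wheel radius in meters
ENCODER_PPR = 20       # encoder ticks per revolution

wheel_circumference_mm = WHEEL_RADIUS * 2 * math.pi * 1000  # ~197.9 mm per revolution
MM_PER_TICK = wheel_circumference_mm / ENCODER_PPR          # ~9.9 mm of travel per tick

print(round(MM_PER_TICK, 2))  # 9.9
```

So with a 20-slot disc on a 63mm-diameter wheel, each tick represents roughly a centimeter of travel.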

If you are using an Arduino-compatible microcontroller to read your encoder, set ENCODER_TYPE = 'arduino' in the myconfig.py file. The microcontroller should be flashed using the Arduino IDE with one of the sketches in the arduino folder. After flashing, the sketches can be checked in the Arduino IDE using the serial console. The sketches implement the r/p/c command protocol for on-demand sending of the encoder value and for continuous sending with a provided delay. Commands are sent one per line (ending in '\n'):

+
    +
  • r command resets position to zero
  • +
  • p command sends position immediately
  • +
  • c command starts/stops continuous mode
      +
    • if it is followed by an integer, then use this as the delay in ms between readings.
    • +
    • if it is not followed by an integer then stop continuous mode
    • +
    +
  • +
+

With a single encoder setup the encoder sends the tick count and a timestamp as a comma delimited pair over the serial/USB port:

+

{ticks},{ticksMs}

+

In a dual encoder setup the second encoder's values are separated from the first by a semicolon:

+

{ticks},{ticksMs};{ticks},{ticksMs}

+
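As an illustration of the message formats above, here is a small hypothetical Python helper (not part of donkeycar) that parses both the single-encoder and dual-encoder forms of a reading received over the serial port:

```python
# Hypothetical parser for the Arduino encoder protocol described above.
# Handles both "ticks,ticksMs" and "ticks,ticksMs;ticks,ticksMs" formats.
def parse_encoder_line(line):
    """Return a list of (ticks, milliseconds) tuples, one per encoder."""
    readings = []
    for reading in line.strip().split(';'):
        ticks, ms = reading.split(',')
        readings.append((int(ticks), int(ms)))
    return readings

print(parse_encoder_line("1024,5500"))            # [(1024, 5500)]
print(parse_encoder_line("1024,5500;1030,5500"))  # [(1024, 5500), (1030, 5500)]
```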

The tachometer.py file that implements the encoder parts also has a __main__ function, so it can be run directly. After activating the donkey python environment, the file can be run to check your hookup and to determine configuration parameters. Run this to see the available arguments:

+
python donkeycar/parts/tachometer.py 
+
+

Odometer and Kinematics for Pose Estimation and Path Following

+

An encoder setup can be used to estimate not only the vehicle's speed but also its position. This requires a few configuration values to be set in myconfig.py: essentially measurements of the wheel diameter, the wheel base and the axle length. This then allows encoders to be used with the Path Follow template in place of GPS, so it can be used indoors.

+
#
+# MEASURED ROBOT PROPERTIES
+#
+AXLE_LENGTH = 0.03     # length of axle; distance between left and right wheels in meters
+WHEEL_BASE = 0.1       # distance between front and back wheels in meters
+WHEEL_RADIUS = 0.0315  # radius of wheel in meters
+MIN_SPEED = 0.1        # minimum speed in meters per second; speed below which car stalls
+MAX_SPEED = 3.0        # maximum speed in meters per second; speed at maximum throttle (1.0)
+MIN_THROTTLE = 0.1     # throttle (0 to 1.0) that corresponds to MIN_SPEED, throttle below which car stalls
+MAX_STEERING_ANGLE = 3.141592653589793 / 4  # for car-like robot; maximum steering angle in radians (corresponding to tire angle at steering == -1)
+
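To illustrate how these measurements enable pose estimation, here is a minimal sketch of differential-drive forward kinematics using the AXLE_LENGTH above. This is an illustration only, not donkeycar's actual kinematics part:

```python
import math

# Illustrative differential-drive forward kinematics (a sketch, not
# donkeycar's implementation), using the AXLE_LENGTH value from above.
AXLE_LENGTH = 0.03  # meters between left and right wheels

def update_pose(x, y, theta, d_left, d_right):
    """Advance the pose (x, y in meters, theta in radians) given the
    distance in meters traveled by each wheel since the last update."""
    d = (d_left + d_right) / 2.0                # distance of the robot center
    d_theta = (d_right - d_left) / AXLE_LENGTH  # change in heading
    x += d * math.cos(theta + d_theta / 2.0)
    y += d * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

# Equal wheel travel: straight motion, heading unchanged.
print(update_pose(0.0, 0.0, 0.0, 0.1, 0.1))  # (0.1, 0.0, 0.0)
```

Each wheel's distance delta would come from its encoder ticks multiplied by MM_PER_TICK (converted to meters).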
+ + + + + + + + diff --git a/parts/oled/index.html b/parts/oled/index.html new file mode 100644 index 00000000..4d50ca51 --- /dev/null +++ b/parts/oled/index.html @@ -0,0 +1,248 @@ + + + + + + + + OLED - Donkey Car + + + + + + + + + + + + + + +

OLED Displays

+

OLED displays can be used to show information about the current state of the car. This is especially useful when collecting data for training, and when racing.

+

The OLED display currently displays the following information:
  • The IP address of the car (eth and wlan)
  • The number of records collected, for training
  • The driving mode

+

Supported displays

+

Examples of displays that are currently supported are:

+ +

Hardware Setup

+

Simply connect the display to the I2C pins on the Raspberry Pi or the Jetson Nano. Use bus 1 so the display can be inserted directly on the pins. Here is an example of what that looks like.

+

Software Setup

+

Enable the display in myconfig.py by uncommenting the line USE_SSD1306_128_32 = False (remove the # at the start) and changing False to True. If you have a 128x32 OLED select resolution 1; if you have a 128x64 OLED select resolution 2, and don't forget to remove the # in front of that line too, to make it active.

+

This part of your myconfig.py file should now look like this.

+
USE_SSD1306_128_32 = True    # Enable the SSD_1306 OLED Display
+# SSD1306_128_32_I2C_ROTATION = 0 # 0 = text is right-side up, 1 = rotated 90 degrees clockwise, 2 = 180 degrees (flipped), 3 = 270 degrees
+SSD1306_RESOLUTION = 2 # 1 = 128x32; 2 = 128x64
+
+

Showing your IP address on startup.

+

One of the cool things about having an OLED screen is that you can show your car's IP address on startup, so you can connect to it. Instructions to set that up are here

+

Troubleshooting

+

If you are unable to start the car, ensure that the Adafruit_SSD1306 package is installed in your virtual environment. It should be installed automatically if you are using a recent version of donkeycar.

+
pip install Adafruit_SSD1306
+
+

Known Issues

+
    +
  • The Adafruit_SSD1306 library is incompatible with steering/motor configurations where the duty cycle/PWM is supplied directly from the GPIO header using the RPI_GPIO pin provider. This is because the Adafruit library internally sets a GPIO pin mode that is incompatible with our GPIO library. In this case you have a couple of options:
  • +
  • Use a PCA9685 to generate the necessary duty cycle/PWM for throttle and steering.
  • +
  • Use the PIGPIO pin provider to generate the necessary duty cycle/PWM for throttle and steering from the GPIO. See PIGPIO for how to set up the pigpio library.
  • +
+ + + + + + + + diff --git a/parts/path_following/index.html b/parts/path_following/index.html new file mode 100644 index 00000000..6a9e6992 --- /dev/null +++ b/parts/path_following/index.html @@ -0,0 +1,249 @@ + + + + + + + + Path Following with the Intel Realsense T265 sensor - Donkey Car + + + + + + + + + + + + + + +

Path Following with the Intel Realsense T265 sensor

+

Rather than using a standard camera and training a network to drive, Donkeycar supports using the Intel Realsense T265 "tracking camera" to follow a path instead. In this application, you simply drive a path once manually, and Donkeycar will "remember" that path and repeat it autonomously.

+

The Intel T265 uses a combination of stereo cameras and an internal Inertial Measurement Unit (IMU), plus its own Myriad X processor, to do Visual Inertial Odometry. That's a fancy way of saying that it knows where it is by looking at the scene around it as it moves and correlating that with the IMU's sensing to localize itself, outputting an X,Y,Z position to Donkey much as a GPS sensor would (but ideally much more accurately, to a precision of centimeters).

+
+
    +
  • Note: Although the Realsense T265 can be used with an Nvidia Jetson Nano, it's a bit easier to set up with a Raspberry Pi (we recommend the RPi 4, with at least 4GB memory). Also, the Intel Realsense D4XX series can be used with Donkeycar as a regular camera (with the use of its depth sensing data coming soon), and we'll add instructions for that when it's ready.
  • +
+

Original T265 path follower code by Tawn Kramer

+

Step 1: Setup Librealsense on Ubuntu Machine

+

Using the latest version of Raspbian (tested with Raspbian Buster) on the RPi, follow these instructions to set up Intel's Realsense libraries (Librealsense) and dependencies. Although those instructions discuss another Realsense sensor, they work equally well for the T265. There are also video instructions.

+

Step 2: Setup Donkeycar

+

Follow the standard instructions here. With the Path Follower there is no need to install Tensorflow for this particular Donkeycar configuration; however, do install/upgrade numpy before running "pip install -e .[pi]".

+

Step 3: Create the Donkeycar path follower app

+

donkey createcar --path ~/follow --template path_follow

+

Step 4: Check/change your config settings

+

cd ~/follow
sudo nano myconfig.py

+

Make sure you agree with the default values or adjust them to your liking (i.e. "throttle", "steering", PIDs, etc.). Uncomment (remove the #) any line you've changed. In Nano, press ctrl-o to save the file and ctrl-x to exit.

+

Step 5: Run the Donkeycar path follower app

+

Running:
ssh pi@<your pi's IP address or "raspberrypi.local">
cd ~/follow
python3 manage.py drive

+

Keep the terminal open to see the printed output of the app while it is running.

+

If you get an error saying that it can't find the T265, unplug the sensor, plug it back in and try again. Ensure that your gamepad is on and connected too (the blue light is on the controller).

+

Once it's running, open a browser on your laptop and enter this in the URL bar: http://<your pi's IP address>:8890

+

When you drive, the Web interface will draw a red line for the path and a green circle for the robot location. If you're seeing the green dot but not the red line, that means a path file has already been written. Delete "donkey_path.pkl" (rm donkey_path.pkl), restart, and the red line should show up.

+

PS4 Gamepad controls are as follows:
+------------------+--------------------------+
| control          | action                   |
+------------------+--------------------------+
| share            | toggle auto/manual mode  |
| circle           | save_path                |
| triangle         | erase_path               |
| cross            | emergency_stop           |
| L1               | increase_max_throttle    |
| R1               | decrease_max_throttle    |
| options          | toggle_constant_throttle |
| square           | reset_origin             |
| L2               | dec_pid_d                |
| R2               | inc_pid_d                |
| left_stick_horz  | set_steering             |
| right_stick_vert | set_throttle             |
+------------------+--------------------------+

+

Step 6: Driving instructions

+

1) Mark a nice starting spot for your robot. Be sure to put it right back there each time you start.
2) Drive the car in some kind of loop. You'll see the red line showing the path.
3) Hit circle on the PS3/4 controller to save the path.
4) Put the bot back at the start spot.
5) Then hit the "select" button (on a PS3 controller) or "share" (on a PS4 controller) twice to go to pilot mode. This will start driving on the path. If you want it to go faster or slower, change this line in the myconfig.py file: THROTTLE_FORWARD_PWM = 400

+

Check the bottom of myconfig.py for some settings to tweak: PID values, map offsets and scale, things like that. You might want to start by downloading and using the myconfig.py file from the author's repo, which has some known-good settings and is otherwise a good place to start.

+

Some tips:

+

When you start, the green dot will be in the top left corner of the box. You may prefer to have it in the center. If so, change PATH_OFFSET = (0, 0) in the myconfig.py file to PATH_OFFSET = (250, 250)

+

For a small course, you may find that the path is too small to see well. In that case, change PATH_SCALE = 5.0 to PATH_SCALE = 10.0 (or more, if necessary)

+

When you're running in auto mode, the green dot will change to blue

+

It defaults to recording a path point every 0.3 meters. If you want the path to be smoother, you can set a smaller number in myconfig.py on this line: PATH_MIN_DIST = 0.3

+ + + + + + + + diff --git a/parts/pins/index.html b/parts/pins/index.html new file mode 100644 index 00000000..5ba11f38 --- /dev/null +++ b/parts/pins/index.html @@ -0,0 +1,353 @@ + + + + + + + + Pins - Donkey Car + + + + + + + + + + + + + + +

Pin Specifiers

+

Control signals are sent and received by pins on the Raspberry Pi, Jetson Nano and connected peripherals, like the PCA9685 servo controller. Starting with version 5.x, Donkeycar uses 'pin specs' to specify pins, including configuration that is specific to the underlying hardware or library implementation. This allows us to make the underlying logic, like how a motor controller takes throttle values and outputs them to the motor, more independent of the particular hardware or library used to generate the signals.

+

Types of Pins

+

PWM pins generate a square wave, sometimes called a PWM pulse. This is used to control servo motors, electronic speed controllers, and LEDs.

+

TTL Output pins can generate either a high value (1) or a low value (0)

+

TTL Input pins read values as either high (1) or low (0)

+

Pin Providers

+

Donkeycar supports several technologies for specifying pins. Pins are specified as a string that identifies the provider, the pin number and any technology-specific configuration.

+

PCA9685

+

The PCA9685 Servo controller supports 16 PWM and TTL output pins. The PCA9685 can only output signals; it does not support input pins. The pin specifier for a PCA9685 pin includes:

+
    +
  • the I2C bus to which the PCA9685 is attached
  • +
  • the address in hex of the PCA9685 on the I2C bus
  • +
  • the channel number, 0 to 15
  • +
+

For example, "PCA9685.1:40.13" specifies channel 13 on the PCA9685 on I2C bus 1 at address 0x40.

+

For example, "PCA9685.0:60.1" specifies channel 1 on the PCA9685 on I2C bus 0 at address 0x60.

+

RPI_GPIO

+

Donkeycar installs the RPi.GPIO library on the Raspberry Pi in the default installation. The Jetson.GPIO library is a compatible library installed by default on the Jetson Nano. Both of these libraries work in a similar fashion to support PWM, input and output pins on the 40 pin GPIO header of the Raspberry Pi or Jetson Nano respectively. The pin specifier includes:

+
    +
  • The pin addressing scheme
  • +
  • "BOARD" indicates the pin number is based on the physical pin numbers, 1 to 40, printed on the RaspberryPi/Jetson Nano circuit board.
  • +
  • "BCM" indicates the pin number is based on the Broadcom GPIO number scheme implemented in the RaspberryPi. This scheme is emulated in the Jetson library, so "BCM" pin numbers can be used on the Jetson.
  • +
  • The pin number, which depends upon the pin addressing scheme.
  • +
+

See details of the RaspberryPi 40 pin header here: https://www.raspberrypi.com/documentation/computers/os.html#gpio-and-the-40-pin-header

+

The Jetson Nano's 40 pin header uses the same board numbering scheme, although the header is physically flipped on the board, so pay attention to the numbers printed on the board. The Jetson Nano only supports 2 PWM pins, and these must be enabled. See Generating PWM from the Jetson Nano.

+

For example, "RPI_GPIO.BOARD.33" specifies board pin 33 using the Rpi.GPIO library.

+

For example, "RPI_GPIO.BCM.13" specifies Broadcom GPIO-13 using the Rpi.GPIO library. If you look at the header diagram linked above you will notice that this is the same physical pin as "RPI_GPIO.BOARD.33"; it is a synonym for physical pin 33.

+

When using the RPI_GPIO pin provider, you can choose to use the BOARD or BCM pin schemes, but all pins must use the same pin scheme. You cannot mix pin schemes.

+

PIGPIO

+

Raspberry Pi users can optionally install the PiGPIO library and daemon to manage the pins on the 40 pin GPIO header. Note that this library does NOT work on the Jetson Nano. The library supports PWM, input and output pins.

+
Installing and Starting PiGPIO
+
    +
  • Install the system daemon
  • +
+
sudo apt-get update
+sudo apt-get install pigpio
+
+
    +
  • Install python support (with donkey environment activated)
  • +
+
pip install pigpio
+
+
    +
  • Start the daemon
  • +
+
sudo systemctl start pigpiod
+
+
    +
  • Enable the daemon on startup
  • +
+
sudo systemctl enable pigpiod
+
+

The PIGPIO pin specifier includes:
  • "BCM": PiGPIO uses the Broadcom (BCM) pin numbering scheme exclusively, so that is baked into the pin specifier.
  • The BCM pin number

+

For example, "PIGPIO.BCM.13" specifies Broadcom GPIO-13. As discussed above and shown in the linked header diagram, this is exposed on board pin 33.

+
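The RPI_GPIO and PIGPIO specifiers share a simple three-part "provider.scheme.number" shape. A hypothetical helper (for illustration only) to split one apart might look like:

```python
# Hypothetical splitter for GPIO-style pin specifiers such as
# "RPI_GPIO.BOARD.33", "RPI_GPIO.BCM.13" or "PIGPIO.BCM.13".
def parse_pin_spec(spec):
    provider, scheme, number = spec.split('.')
    return provider, scheme, int(number)

print(parse_pin_spec("RPI_GPIO.BOARD.33"))  # ('RPI_GPIO', 'BOARD', 33)
print(parse_pin_spec("PIGPIO.BCM.13"))      # ('PIGPIO', 'BCM', 13)
```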

Generating PWM from the Jetson Nano

+

Both the Jetson Nano and RaspberryPi4 support two hardware PWM pins. On the Jetson Nano, these must be configured.

+

Configure Jetson Expansion Header for PWM

+
    +
  • +

    ssh into the donkeycar and run this command sudo /opt/nvidia/jetson-io/jetson-io.py. It should show the Jetson Expansion Header Tool that allows you to change GPIO pin functions (see below).

    +
  • +
  • +

    If your Jetson expansion header configuration does not show any PWM pins, then you will need to enable them.

    +
  • +
+
---
+     =================== Jetson Expansion Header Tool ===================
+     |                                                                    |
+     |                                                                    |
+     |                        3.3V ( 1)  ( 2) 5V                          |
+     |                        i2c2 ( 3)  ( 4) 5V                          |
+     |                        i2c2 ( 5)  ( 6) GND                         |
+     |                      unused ( 7)  ( 8) uartb                       |
+     |                         GND ( 9)  (10) uartb                       |
+     |                      unused (11)  (12) unused                      |
+     |                      unused (13)  (14) GND                         |
+     |                      unused (15)  (16) unused                      |
+     |                        3.3V (17)  (18) unused                      |
+     |                      unused (19)  (20) GND                         |
+     |                      unused (21)  (22) unused                      |
+     |                      unused (23)  (24) unused                      |
+     |                         GND (25)  (26) unused                      |
+     |                        i2c1 (27)  (28) i2c1                        |
+     |                      unused (29)  (30) GND                         |
+     |                      unused (31)  (32) unused                      |
+     |                      unused (33)  (34) GND                         |
+     |                      unused (35)  (36) unused                      |
+     |                      unused (37)  (38) unused                      |
+     |                         GND (39)  (40) unused                      |
+     |                                                                    |
+      ====================================================================
+---
+
+

Choose Configure the 40 pin expansion header to activate pwm0 and pwm2:

+
---
+     =================== Jetson Expansion Header Tool ===================
+     |                                                                    |
+     |                                                                    |
+     |                        3.3V ( 1)  ( 2) 5V                          |
+     |                        i2c2 ( 3)  ( 4) 5V                          |
+     |                        i2c2 ( 5)  ( 6) GND                         |
+     |                      unused ( 7)  ( 8) uartb                       |
+     |                         GND ( 9)  (10) uartb                       |
+     |                      unused (11)  (12) unused                      |
+     |                      unused (13)  (14) GND                         |
+     |                      unused (15)  (16) unused                      |
+     |                        3.3V (17)  (18) unused                      |
+     |                      unused (19)  (20) GND                         |
+     |                      unused (21)  (22) unused                      |
+     |                      unused (23)  (24) unused                      |
+     |                         GND (25)  (26) unused                      |
+     |                        i2c1 (27)  (28) i2c1                        |
+     |                      unused (29)  (30) GND                         |
+     |                      unused (31)  (32) pwm0                        |
+     |                        pwm2 (33)  (34) GND                         |
+     |                      unused (35)  (36) unused                      |
+     |                      unused (37)  (38) unused                      |
+     |                         GND (39)  (40) unused                      |
+     |                                                                    |
+      ====================================================================
+---
+
+

After enabling, pwm0 is board pin-32 and pwm2 is board pin-33.

+ + + + + + + + diff --git a/parts/rc/index.html b/parts/rc/index.html new file mode 100644 index 00000000..0fafbb7a --- /dev/null +++ b/parts/rc/index.html @@ -0,0 +1,256 @@ + + + + + + + + RC control - Donkey Car + + + + + + + + + + + + + + +

RC control

+

Donkey RC connections +(This only works with the RaspberryPi. The Jetson Nano does not provide the necessary GPIO pin support)

+

You can drive Donkey with nothing more than the RC controller your car probably came with! The secret is that, thanks to the cool Pigpio library, the RaspberryPi pins can read and generate the RC signals necessary to read your RC receiver and drive your servo and motor controllers.

+

To do this you need to either connect some jumper cables from your RC receiver to the RPi GPIO pins and then do the same for your steering servo and motor controller (it's a little fiddly but works fine) or use our Donkeycar RC Hat (shown above), which is plug and play and includes other nice stuff like an OLED screen, a fan, encoder support and even an e-stop option (like a remote kill switch) if you happen to have a 3-channel (or more) RC transmitter.

+

Hardware Setup

+

If you're using the RC Hat above, you can skip this hardware part -- the hat does it all for you!

+

Note that you will want your RC controller to be well trimmed prior to using it to control your Donkeycar. You want the throttle trim, steering trim and steering range to be well adjusted; see this video for how to do that.

+

You can use the GPIO pins for RC input, output or both. In the case of RC input, the RC controller replaces a bluetooth joystick. In the case of RC output, it replaces the I2C servo driver board.

+

The easiest way to connect RC is via the custom "hat" that we've designed (see above). But if you're doing it yourself, follow this wiring guide. It's a bit of a forest of jumper cables if you're doing both input and output, but remember that you only have to connect one ground and one V+ cable to the RC receiver (on any channel), rather than one for every channel.

+

Also note that the RC receiver should be connected to the 3.3V pins, while the output servo and motor controller are connected to the 5V pins.

+

Warning: The RC receiver PWM signal is generated from the receiver input voltage, so connecting the RC receiver to 5V or even 6V from the ESC will fry the RPi!

+

Donkey RC connections

+

Here's what the RC receiver connection should look like

+

Donkey RC connections

+

Software Setup

+

First, make sure PIGPIO is installed; see Pins. You probably want the PIGPIO daemon to always be started when the Raspberry Pi starts. On the command line, enter this to set the PIGPIO daemon to always run on startup:

+
sudo systemctl enable pigpiod && sudo systemctl start pigpiod
+
+

Next, in your mycar directory, edit the myconfig.py files as follows:

+
    +
  • For RC input, select pigpio_rc as your controller type in your myconfig.py file. Uncomment the line (remove the leading #) and edit it as follows:
  • +
+
CONTROLLER_TYPE = 'pigpio_rc'
+
+

Also set USE_JOYSTICK_AS_DEFAULT to True:

+
USE_JOYSTICK_AS_DEFAULT = True
+
+
    +
  • For RC output, select PWM_STEERING_THROTTLE as your drive train type in your myconfig.py file. Uncomment the line (remove the leading #) and edit it as follows:
  • +
+
DRIVE_TRAIN_TYPE =  "PWM_STEERING_THROTTLE"
+
+

For both of these, there are additional settings you can change, such as reversing the direction of output or the pins connected:

+

Input options:

+
#PIGPIO RC control
+STEERING_RC_GPIO = 26
+THROTTLE_RC_GPIO = 20
+DATA_WIPER_RC_GPIO = 19
+PIGPIO_STEERING_MID = 1500         # Adjust this value if your car cannot run in a straight line
+PIGPIO_MAX_FORWARD = 2000          # Max throttle to go forward. The bigger the faster
+PIGPIO_STOPPED_PWM = 1500
+PIGPIO_MAX_REVERSE = 1000          # Max throttle to go reverse. The smaller the faster
+PIGPIO_SHOW_STEERING_VALUE = False
+PIGPIO_INVERT = False
+PIGPIO_JITTER = 0.025   # threshold below which no signal is reported
+
+
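To see how settings like PIGPIO_STOPPED_PWM and PIGPIO_JITTER interact, here is an illustrative conversion (an assumption-laden sketch, not donkeycar's actual code) from an RC pulse width in microseconds to a normalized -1..1 throttle value:

```python
# Illustrative mapping of an RC pulse width (microseconds) to a -1..1
# throttle value, using the config values shown above.
PIGPIO_MAX_FORWARD = 2000  # pulse width for full forward
PIGPIO_STOPPED_PWM = 1500  # pulse width for stopped / center
PIGPIO_MAX_REVERSE = 1000  # pulse width for full reverse
PIGPIO_JITTER = 0.025      # threshold below which no signal is reported

def pulse_to_throttle(pulse_us):
    half_range = (PIGPIO_MAX_FORWARD - PIGPIO_MAX_REVERSE) / 2  # 500 us
    value = (pulse_us - PIGPIO_STOPPED_PWM) / half_range
    if abs(value) < PIGPIO_JITTER:
        return 0.0  # suppress receiver jitter around center
    return max(-1.0, min(1.0, value))

print(pulse_to_throttle(2000))  # 1.0  (full forward)
print(pulse_to_throttle(1500))  # 0.0  (stopped)
print(pulse_to_throttle(1000))  # -1.0 (full reverse)
```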

If you are using the RC hat then the PWM output pins shown below (and defaulted in myconfig.py) must be used.
If you are not using the RC hat then you are free to choose different PWM output pins.
NOTE: you must install pigpio to use this configuration. See PIGPIO.

+

Output options:

+
PWM_STEERING_PIN = "PIGPIO.BCM.13"           # PWM output pin for steering servo
+PWM_THROTTLE_PIN = "PIGPIO.BCM.18"           # PWM output pin for ESC
+
+STEERING_LEFT_PWM = int(4096 * 1 / 20)       # pwm value for full left steering (1ms pulse)
+STEERING_RIGHT_PWM = int(4096 * 2 / 20)      # pwm value for full right steering (2ms pulse)
+
+THROTTLE_FORWARD_PWM = int(4096 * 2 / 20)    # pwm value for max forward (2ms pulse)
+THROTTLE_STOPPED_PWM = int(4096 * 1.5 / 20)  # pwm value for no movement (1.5ms pulse)
+THROTTLE_REVERSE_PWM = int(4096 * 1 / 20)    # pwm value for max reverse throttle (1ms pulse)
+
+
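The int(4096 * ms / 20) expressions above assume a 12-bit (4096-step) duty-cycle range over a 20ms (50Hz) servo period, so each pulse width in milliseconds maps to 4096 * ms / 20 steps. The resulting values can be checked directly:

```python
# Sanity check of the PWM duty-cycle arithmetic used in the config above:
# a 12-bit (4096 step) range over a 20 ms (50 Hz) servo period.
print(int(4096 * 1 / 20))    # 204 -> 1.0 ms pulse (full left / full reverse)
print(int(4096 * 1.5 / 20))  # 307 -> 1.5 ms pulse (center / stopped)
print(int(4096 * 2 / 20))    # 409 -> 2.0 ms pulse (full right / full forward)
```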

Troubleshooting

+

If one channel is reversed (steering left goes right, etc.), either reverse that channel on your RC transmitter (that's usually a switch or setting) or change it in the output options shown above by changing the PWM_INVERTED value for that channel to True.

+ + + + + + + + diff --git a/parts/rc_hat/index.html b/parts/rc_hat/index.html new file mode 100644 index 00000000..dcd333c1 --- /dev/null +++ b/parts/rc_hat/index.html @@ -0,0 +1,278 @@ + + + + + + + + Donkeycar RC Hat - Donkey Car + + + + + + + + + + + + + + +

Donkeycar RC Hat

+

RC Hat for RaspberryPi

+

If you started with a ready-to-run RC car, it probably came with a RC controller. Good news: you can use it with Donkeycar, using the RC controller for manual driving. You can also plug in the car's servo and motor controller directly into the RaspberryPi without the need for a PCA9685 motor/servo controller board.

+

Note that you will want your RC controller to be well trimmed prior to using it with the RC hat. You want the throttle trim, steering trim and steering range to be well adjusted; see this video for how to do that.

+

To do so, you can either wire it up manually as shown in this tutorial (which works, but has a lot of fiddly wires that can fall off) or do it far more neatly with the Donkeycar RC hat, shown above, which handles all the wiring for you and also includes an OLED screen and a fan.

+

The Donkeycar RC hat can be purchased from the Donkeycar Store. Note that it only works with the RaspberryPi, not the Jetson Nano, due to limitations with the way the Jetson handles its I/O pins.

+

If you're using a standard wheel encoder, you can plug it into the "Encoder" pins. You can also power the RaspberryPi from this board if you have a 5V source, using the "Optional 5v power in" pins.

+

Once you've plugged in all the cables, you can move on to the software setup.

+

There are two parts to the software setup. The first part is setting up to read the RC controller using the RC hat. The second, optional, part is setting up the drive train so we can control the ESC and servo using the RC hat (so you don't need a PCA9685 anymore).

+

Install PiGPIO

+

In both cases we are going to use the PiGPIO library to control the I/O pins (remember, this only works on a RaspberryPi). Install PiGPIO from a command prompt as follows:

+
sudo apt-get update
+sudo apt-get install pigpio
+
+

Then, on the command line enter this to set the PIGPIO daemon to always run on startup:

+
sudo systemctl enable pigpiod && sudo systemctl start pigpiod
+
+

Reading RC Controller with RC Hat

+

The RC Hat can route the PWM signals generated by your RC receiver to the RaspberryPi's gpio pins, so software can measure the length of the PWM pulse and then use that to determine steering and throttle. This allows you to use the RC controller that came with your RC car rather than using a game controller.

+
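The measurement itself amounts to timestamping the rising and falling edges and subtracting. A minimal sketch using pigpio callbacks (assumptions: pigpio is installed with the pigpiod daemon running, and the pin number matches your configuration; pigpio reports ticks as unsigned 32-bit microsecond counters):

```python
def pulse_width_us(rise_tick, fall_tick):
    """Width of a PWM pulse in microseconds from two pigpio tick stamps.

    pigpio ticks are unsigned 32-bit microsecond counters that wrap
    roughly every 72 minutes; masking the difference handles the wrap.
    """
    return (fall_tick - rise_tick) & 0xFFFFFFFF

def watch_rc_channel(gpio=26):
    """Sketch: print pulse widths seen on one RC channel.

    Requires the pigpio library and a running pigpiod daemon, so the
    import is kept local; gpio=26 is just the default steering pin.
    """
    import pigpio
    pi = pigpio.pi()
    state = {"rise": None}

    def on_edge(gpio, level, tick):
        if level == 1:                    # rising edge: pulse starts
            state["rise"] = tick
        elif state["rise"] is not None:   # falling edge: pulse ends
            print(pulse_width_us(state["rise"], tick), "us")

    pi.set_mode(gpio, pigpio.INPUT)
    pi.callback(gpio, pigpio.EITHER_EDGE, on_edge)
    return pi
```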

Connection: +To use the RC hat to read your RC controller, use the included 3-wire cables to connect your RC receiver to the RC 1 and RC 2 pins (corresponding to the RC receiver's Channel 1 and Channel 2). In all cases, make sure you plug them in the right way, noting the +,- and S (Signal) markings. Typically the black wire is "-", the red wire in the middle is "+" and the white wire is "S".

+

Configuration: +Now edit your myconfig.py file to use the RC Hat to read the RC controller. In your mycar directory, edit the myconfig.py file as follows:

+

Use pigpio_rc as your controller type in your myconfig.py file. Uncomment the CONTROLLER_TYPE line (remove the leading #) and edit it as follows:

+
CONTROLLER_TYPE = 'pigpio_rc'
+
+

Also set USE_JOYSTICK_AS_DEFAULT to True:

+
USE_JOYSTICK_AS_DEFAULT = True
+
+

There are additional settings you can change in the #PIGPIO RC control section, such as reversing the direction of output or the pins connected, or adjusting the expected PWM pulse width (see Standard RC with ESC and Steering Servo for a discussion of Pulse Width Modulation). TL;DR: a 1000 microsecond pulse means full left/reverse, a 1500 microsecond pulse means straight/stopped and a 2000 microsecond pulse means full right/forward. The defaults are generally good and you can start with them. If you see any issues when calibrating then read the Troubleshooting section to see how you might change one or more of these values to compensate.

+
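As a sketch of that mapping (an illustration of the idea, not the exact donkeycar code): the pulse width is normalized around the mid point into a -1..1 value, with a small dead band approximating the PIGPIO_JITTER noise rejection:

```python
def pwm_to_unit(pulse_us, mid=1500, max_pulse=2000, min_pulse=1000,
                invert=False, jitter=0.025):
    """Map an RC pulse width in microseconds to the range -1..1.

    mid/max_pulse/min_pulse mirror PIGPIO_STEERING_MID /
    PIGPIO_MAX_FORWARD / PIGPIO_MAX_REVERSE; values within `jitter`
    of zero are clamped to 0.  Sketch only, not donkeycar's code.
    """
    if pulse_us >= mid:
        value = (pulse_us - mid) / (max_pulse - mid)
    else:
        value = (pulse_us - mid) / (mid - min_pulse)
    value = max(-1.0, min(1.0, value))    # clamp out-of-range pulses
    if abs(value) < jitter:               # dead band around center
        value = 0.0
    return -value if invert else value
```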

Input options for reading RC controller:

+
#PIGPIO RC control
+STEERING_RC_GPIO = 26              # gpio pin (in broadcom numbering) for reading the RC controller's steering
+THROTTLE_RC_GPIO = 20              # gpio pin (in broadcom numbering) for reading the RC Controller's throttle
+DATA_WIPER_RC_GPIO = 19            # gpio pin (in broadcom numbering) for reading the RC Controller's button
+PIGPIO_STEERING_MID = 1500         # PWM pulse in microseconds for 'straight' steering.  Adjust this value if your car cannot run in a straight line.
+PIGPIO_MAX_FORWARD = 2000          # PWM pulse in microseconds for max forward throttle.
+PIGPIO_STOPPED_PWM = 1500          # PWM pulse in microseconds for zero throttle
+PIGPIO_MAX_REVERSE = 1000          # PWM pulse in microseconds for max reverse throttle.
+PIGPIO_SHOW_STEERING_VALUE = False
+PIGPIO_INVERT = False              # rarely a controller uses an inverted pulse; if so then set to True
+PIGPIO_JITTER = 0.025              # threshold below which no signal is reported (debounce/noise rejection)
+
+

Controlling ESC and Steering Servo with RC Hat

+

Optionally, you can use the RaspberryPi to generate PWM (see Standard RC with ESC and Steering Servo) for controlling the motor speed and steering rather than using a PCA9685 board (see, the RC Hat just paid for itself!).

+

Connection: +The RC hat includes two 3-pin headers compatible with the servo cables that connect to the ESC and the steering servo. Plug your car's servo into the Servo pins and the Motor Controller into the Motor pins. In all cases, make sure you plug them in the right way, noting the +,- and S (Signal) markings. Typically the black wire is "-", the red wire in the middle is "+" and the white wire is "S".

+

Configuration: +For RC output, select PWM_STEERING_THROTTLE as your drive train type in your myconfig.py file. Uncomment the DRIVE_TRAIN_TYPE line (remove the leading #) and edit it as follows:

+
DRIVE_TRAIN_TYPE =  "PWM_STEERING_THROTTLE"
+
+

Then uncomment the entire PWM_STEERING_THROTTLE configuration block and make sure steering uses "PIGPIO.BCM.13" and throttle uses "PIGPIO.BCM.18" because that is how the pins on the RC Hat are connected to the RaspberryPi 40 pin header.

+
PWM_STEERING_THROTTLE = {
+    "PWM_STEERING_PIN": "PIGPIO.BCM.13",    # PWM output pin for steering servo
+    "PWM_STEERING_SCALE": 1.0,              # used to compensate for PWM frequency differences from 60Hz; NOT for adjusting steering range
+    "PWM_STEERING_INVERTED": False,         # True if hardware requires an inverted PWM pulse
+    "PWM_THROTTLE_PIN": "PIGPIO.BCM.18",    # PWM output pin for ESC
+    "PWM_THROTTLE_SCALE": 1.0,              # used to compensate for PWM frequency differences from 60Hz; NOT for increasing/limiting speed
+    "PWM_THROTTLE_INVERTED": False,         # True if hardware requires an inverted PWM pulse
+    "STEERING_LEFT_PWM": 400,               # pwm value for full left steering
+    "STEERING_RIGHT_PWM": 200,              # pwm value for full right steering
+    "THROTTLE_FORWARD_PWM": 400,            # pwm value for max forward throttle
+    "THROTTLE_STOPPED_PWM": 300,            # pwm value for no movement
+    "THROTTLE_REVERSE_PWM": 220,            # pwm value for max reverse throttle
+}
+
+
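Pin id strings like "PIGPIO.BCM.13" encode the pin provider, the numbering scheme, and the pin number. A hypothetical parser (donkeycar has its own pin factory; this just illustrates the naming convention):

```python
def parse_pin_id(pin_id):
    """Split a pin id like 'PIGPIO.BCM.13' into (provider, scheme, number).

    Hypothetical helper for illustration only; donkeycar resolves these
    strings through its own pin factory.
    """
    provider, scheme, number = pin_id.split(".")
    return provider, scheme, int(number)

print(parse_pin_id("PIGPIO.BCM.13"))  # ('PIGPIO', 'BCM', 13)
```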

Calibration

+

After configuring the RC hat to read the RC controller and optionally control the ESC and steering servo you should do the normal calibration step to figure out the correct steering and throttle PWM values for your car (and to make sure you've hooked things up correctly).

+

Troubleshooting

+

If one channel is reversed (steering left goes right, etc), either reverse that channel on your RC transmitter (that's usually a switch or setting) or change it in the output options shown above by changing the PWM_INVERTED value for that channel to True.

+

OLED setup

+

Enable the display in myconfig.py.

+
# SSD1306_128_32
+USE_SSD1306_128_32 = True     # Enable the SSD_1306 OLED Display
+SSD1306_128_32_I2C_BUSNUM = 1 # I2C bus number
+SSD1306_RESOLUTION = 1 # 1 = 128x32; 2 = 128x64
+
+

Showing your IP address on startup.

+

One of the cool things about having an OLED screen is that you can show your car's IP address on startup, so you can connect to it. Instructions to set that up are here.

+

Troubleshooting

+

If you are unable to start the car, ensure that the Adafruit_SSD1306 package is installed in your virtual environment. This should be installed automatically if you are using a recent version of donkeycar.

+
pip install Adafruit_SSD1306
+
+

Encoder

+

If you're using a standard wheel encoder, you can plug it into the "Encoder" pins, then set up the encoder configuration in your myconfig.py to use the pin that is exposed by the RC hat's encoder header.

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + + +
+ + + + + + + + diff --git a/parts/stop_sign_detection/index.html b/parts/stop_sign_detection/index.html new file mode 100644 index 00000000..fa9b67ee --- /dev/null +++ b/parts/stop_sign_detection/index.html @@ -0,0 +1,248 @@ + + + + + + + + Stop Sign Detection - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Stop Sign Detection

+

This part utilizes a Google Coral accelerator and a pre-trained object detection model from the Coral project to perform stop sign detection. If the donkey car sees a stop sign, it will override the pilot/throttle to 0. In addition, a bounding box will be annotated on the cam/image_array.

+ + +
+

Requirements

+

To use this part, you must have:

+ +

How to use

+

Put the following lines in myconfig.py

+
STOP_SIGN_DETECTOR = True
+STOP_SIGN_MIN_SCORE = 0.2
+STOP_SIGN_SHOW_BOUNDING_BOX = True
+
+
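Conceptually, the override is simple: zero the throttle whenever a sufficiently confident stop sign detection appears. A sketch of that logic (the (class_id, score) interface and the class id of 12 for "stop sign" in the 0-indexed COCO label map are assumptions for illustration, not the part's actual API):

```python
STOP_SIGN_CLASS_ID = 12  # assumption: 'stop sign' in the 0-indexed COCO label map

def override_throttle(throttle, detections, min_score=0.2):
    """Return 0.0 if any detection is a confident stop sign, else pass through.

    detections: iterable of (class_id, score) pairs from the detector;
    min_score mirrors STOP_SIGN_MIN_SCORE.  Sketch of the idea only.
    """
    for class_id, score in detections:
        if class_id == STOP_SIGN_CLASS_ID and score >= min_score:
            return 0.0
    return throttle
```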

Install Edge TPU dependencies

+

Follow the Coral Edge TPU get started instructions to install the necessary software. For the RaspberryPi follow the Linux instructions.

+

The stop sign detector uses a pre-compiled model, so we only need the inference runtime to make this work. However, if you are creating your own model then you will need the Edge TPU Compiler on your RaspberryPi (or Linux laptop if you are training on that). Note that the compiler only runs on Linux.

+

Detecting other objects

+

Since the pre-trained model is trained on COCO, there are 80 object classes that the model is able to detect. You can simply change the STOP_SIGN_CLASS_ID in stop_sign_detector.py to try detecting other objects.

+

Accuracy

+

Since SSD is not good at detecting small objects, the accuracy of detecting the stop sign from far away may not be good. There are ways to improve this, but that is out of scope for this part.

+

Getting this to work without the Coral Edge TPU

+

There is an open issue on GitHub about making this work without the Coral Edge TPU. If you get this working please submit a pull request.

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + « Previous + + + Next » + + +
+ + + + + + + + diff --git a/parts/stores/index.html b/parts/stores/index.html new file mode 100644 index 00000000..bd90df59 --- /dev/null +++ b/parts/stores/index.html @@ -0,0 +1,231 @@ + + + + + + + + Stores - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Tub

+

This is the standard donkey data store. The "data" folder is what we call a "tub".

+

Accepted Types

+

The following datatypes are supported.

+
    +
  • str
  • +
  • int
  • +
  • float / np.float
  • +
  • image_arrays and arrays (np.ndarray)
  • +
  • image (jpeg / png)
  • +
+

The Tub is an append-only format that is optimized for reads (to speed up training models). It maintains indexes for records and uses memory-mapped files.

+

The Tub exposes an Iterator that can be used to read records. These iterators can be further used by Pipelines to do arbitrary transformations of data prior to training (for data augmentation).

+

Example

+
from donkeycar.parts.tub_v2 import Tub
+
+# Here we define records that have a single `input` of type `int`.
+inputs = ['input']
+types = ['int']
+tub = Tub(path, inputs, types)
+
+
+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + « Previous + + + Next » + + +
+ + + + + + + + diff --git a/parts/voice_control/index.html b/parts/voice_control/index.html new file mode 100644 index 00000000..557683e0 --- /dev/null +++ b/parts/voice_control/index.html @@ -0,0 +1,297 @@ + + + + + + + + Voice Control - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Alexa Support

+

Overview

+

This part works together with a public Alexa skill that we have released. When you say a command, the Alexa skill forwards this command to a server hosted by us, which temporarily stores it. Your donkey car, with this part installed and properly configured, polls our server for any new command from Alexa.

+

Overview

+

Demo

+

Click the image below to open the video on YouTube.

+

Demo

+

Supported Commands

+
    +
  • Report device code
  • +
  • autopilot
  • +
  • slowdown
  • +
  • speedup
  • +
  • stop/manual
  • +
+

Get Started

+
    +
  1. Use your Alexa app, navigate to Skills and Games
  2. +
  3. Search for "Donkey Car Control"
  4. +
  5. Enable the Skill
  6. +
  7. Say "Open car control and report device code". Use a pencil to write down the + device code.
  8. +
  9. Follow the instructions below to install the part in donkey car software + running on Pi
  10. +
+

Installation

+

To install this part, add the following lines to manage.py, right after the controller setup:

+

+if cfg.USE_ALEXA_CONTROL:
+  from donkeycar.parts.voice_control.alexa import AlexaController
+  V.add(AlexaController(ctr, cfg), threaded=True)
+
+

In myconfig.py, add the following parameters:

+
USE_ALEXA_CONTROL = True
+ALEXA_DEVICE_CODE = "123456"
+
+

Commands

+

Autopilot

+

Phrases: autopilot, start autopilot

+

If you use this command, it is expected that the donkey car is started with a +model. This command will set the variable mode of the controller to local.

+

Slowdown / Speedup

+

Phrases: slow down, speed up, go faster, go slower

+

This command alters the cfg.AI_THROTTLE_MULT variable passed from the +constructor. Each time this command is received, the AI_THROTTLE_MULT is +increased/decreased by 0.05.

+

Note: Since this command alters AI_THROTTLE_MULT, it won't speed up when you +are running in user or local_angle mode.

+

Stop/Manual

+

Phrases: human control, user mode, stop autopilot, manual

+

This command will set the variable mode of the controller to user.

+

Report device code

+

Phrases: report device code, what is your device code, device code

+

The device code is a 6-digit numeric string derived by a hash function from your Alexa device ID. In order to distinguish commands from multiple Alexa devices, commands sent to our server require an identifier, which is the device code. When the donkey car polls for new commands, the part uses this device code.

+
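The skill's exact hash function is not documented here, but the shape of the scheme can be illustrated: any stable hash of the device ID, reduced to six digits, gives a short code that is the same on every call. An illustrative (not the actual) implementation:

```python
import hashlib

def device_code(device_id):
    """Illustrative only: derive a 6-digit code from a device ID string.

    The real Alexa skill's hash function is not documented here; this
    just shows the mapping shape (stable long ID -> short numeric code).
    """
    digest = hashlib.sha256(device_id.encode("utf-8")).hexdigest()
    return f"{int(digest, 16) % 1_000_000:06d}"
```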

Backend

+

Check here for our web service source code; it is open source too.

+

https://github.com/robocarstore/donkeycar-alexa-backend

+ +

Copyright (c) 2020 Robocar Ltd

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + « Previous + + + Next » + + +
+ + + + + + + + diff --git a/sitemap.xml b/sitemap.xml new file mode 100644 index 00000000..0f8724ef --- /dev/null +++ b/sitemap.xml @@ -0,0 +1,3 @@ + + + \ No newline at end of file diff --git a/sitemap.xml.gz b/sitemap.xml.gz new file mode 100644 index 00000000..a0d3ab27 Binary files /dev/null and b/sitemap.xml.gz differ diff --git a/support/faq/index.html b/support/faq/index.html new file mode 100644 index 00000000..c0dd8412 --- /dev/null +++ b/support/faq/index.html @@ -0,0 +1,270 @@ + + + + + + + + FAQ - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

FAQ

+

What types of RC cars work with the donkey platform?

+

Most hobby grade RC cars will work fine with the electronics, but you'll need to make your own base-plate and camera holder. To make sure the car will work with Donkey, check these things:

+
    +
  • It has a separate ESC and receiver. Some cheaper cars have these combined, which would require soldering to connect the Donkey motor controller to the ESC.
  • +
  • The ESC uses three-wire connectors. This will make it easy to just plug into the Donkey hardware.
  • +
  • Brushed motors are easier because they can go slower, but sensored brushless motors (with a sensored ESC) can work as well.
  • +
+

For more information, see Roll Your Own.

+

What car can I use if I'm not in the USA?

+

The easiest thing to do would be to take your parts down to your local RC / hobby shop and check that the car you want +works with the parts. Here are some parts people have said work in other countries.

+
    +
  • Australia: KAOS (functionally equivalent to the Exceed Magnet)
  • +
  • China: HSP 94186 (functionally equivalent to the Exceed Magnet)
  • +
  • Add your country to this list (click edit this in top left corner)
  • +
+

How can I make my own track?

+

You can use tape, ribbon or even rope. The most popular tracks are 4ft wide and have 2in white borders with a dashed yellow center line. The Oakland track is about 70 feet around the center line. Key race characteristics include:

+
    +
  • straightaways.
  • +
  • left and right turns
  • +
  • hairpin turn
  • +
  • start/finish line.
  • +
+

Will Donkey Work on different hardware?

+

Yes. It's all python so you can run it on any system. Usually the hard part of porting Donkey will be getting the hardware working. +Here are a couple systems that people have tried or talked about.

+
    +
  • +

    NVIDIA TX2 - This was implemented with a webcam and used a Teensy to control the motor/servos. I2C control of the PCA9685 works as well.

    +
  • +
  • +

    Pi-Zero - Yes, try following the steps for the PiB/B+. They should work for the PiZero.

    +
  • +
+

After a reboot, I don't see the (donkey) in front of the prompt, and I get python errors when I run

+
    +
  1. If you used the disc setup guide above, you used conda to manage your virtual environment. You need to activate the donkey conda environment with:
  2. +
+
conda activate donkey
+
+
    +
  1. Optionally, you can add that line to the end of your ~/.bashrc to have it activated each time you log in.
  2. +
+

How to get latest Donkey source

+
    +
  1. When donkeycar has changed, you can get the latest source. Since you installed it directly from the GitHub repo, getting the latest is easy:
  2. +
+
cd donkeycar
+git pull origin main
+donkey createcar --path ~/mycar --overwrite
+
+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + « Previous + + + Next » + + +
+ + + + + + + + diff --git a/support/legacy/index.html b/support/legacy/index.html new file mode 100644 index 00000000..4ebafbfc --- /dev/null +++ b/support/legacy/index.html @@ -0,0 +1,282 @@ + + + + + + + + Legacy - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Legacy

+

This part of the documentation is kept as a reference to the old original classic design, because it may still be valuable for some do-it-yourself users.

+

In the future this should be moved to another section in the docs.

+

Hardware

+

If you purchased parts from the Donkey Car Store, skip to step 3.

+

Step 1: Print Parts

+

thingiverse

+

I printed parts in black PLA, with a .3mm layer height, a .5mm nozzle and no supports. The top roll bar is designed to be printed upside down.

+

Step 2: Clean up parts

+

Almost all 3D Printed parts will need clean up. Re-drill holes, and clean up excess plastic.

+

donkey

+

In particular, clean up the slots in the side of the roll bar, as shown in the picture below:

+

donkey

+

Step 3: Assemble Top plate and Roll Cage

+

If you have an Exceed Short Course Truck, Blaze or Desert Monster watch this video

+

Slide the nut into the slot in the side of the roll cage. This is not particularly easy. You may need to clean out the hole again and use a small screwdriver to push the screw in such that it lines up with the hole in the bottom of the roll cage.

+

donkey

+

Once you have slid the nut in, you can attach the bottom plate. Once again, this may be tricky. I use the small screwdriver to push against the nut to keep it from spinning in the slot. Good news: you should never have to do this again.

+

donkey

+

Step 4: Connect Servo Shield to Raspberry Pi

+

You could do this after attaching the Raspberry Pi to the bottom plate, I just think it is easier to see the parts when they are laying on the workbench. Connect the parts as you see below:

+

donkey

+

For reference, below is the Raspberry Pi pinout. You will notice we connect to 3.3v, the two I2C pins (SDA and SCL) and ground:

+

donkey

+

Step 5: Attach Raspberry Pi to 3D Printed bottom plate

+

Before you start, now is a good time to insert the already flashed SD card and bench test the electronics. Once that is done, attaching the Raspberry Pi and Servo is as simple as running screws through the board into the screw bosses on the top plate. The M2.5x12mm screws should be the perfect length to go through the board, the plastic and still have room for a washer. The “cap” part of the screw should be facing up and the nut should be on the bottom of the top plate. The ethernet and USB ports should face forward. This is important as it gives you access to the SD card and makes the camera ribbon cable line up properly.

+

Attach the USB battery to the underside of the printed bottom plate using cable ties or velcro.

+

donkey

+

Step 6: Attach Camera

+

There are two versions of the donkey chassis: the newer one does not have screws, the older one does. Instructions for both are included:

+

Screwless Design +The newer design is pretty simple, just slip the camera into the slot, cable end first. However, be careful not to push on the camera lens and instead press the board. +donkey

+

If you need to remove the camera the temptation is to push on the lens, instead push on the connector as is shown in these pictures.
+donkey +donkey

+

Design with Screws

+

Attaching the camera is a little tricky, the M2 screws can be screwed into the plastic but it is a little hard. I recommend drilling the holes out with a 1.5mm bit (1/16th bit in Imperial land) then pre threading them with the screws before putting the camera on. It is only necessary to put two screws in.

+
+

Sometimes using the two top screw holes can result in a short. Put screws in the bottom two holes.

+
+

Before using the car, remove the plastic film from the camera lens.

+

donkey

+

It is easy to put the camera cable in the wrong way so look at these photos and make sure the cable is put in properly. There are loads of tutorials on youtube if you are not used to this.

+

donkey

+

Step 7: Put it all together

+

*** Note if you have a Desert Monster Chassis see 7B section below ***

+

The final steps are straightforward. First attach the roll bar assembly to the car. This is done using the same pins that came with the vehicle.

+

donkey

+

Second, run the servo cables up to the car. The throttle cable runs to channel 0 on the servo controller and steering is channel 1.

+

donkey

+

Now you are done with the hardware!!

+

Step 7b: Attach Adapters (Desert Monster only)

+

The Desert monster does not have the same set up for holding the body on the car and needs two adapters mentioned above. To attach the adapters you must first remove the existing adapter from the chassis and screw on the custom adapter with the same screws as is shown in this photo:

+

adapter

+

Once this is done, go back to step 7

+

Software

+

Congrats! Now, to get your car moving, see the software instructions section.

+

donkey

+ +
+
+ +
+
+ +
+ +
+ +
+ + + + GitHub + + + + « Previous + + + +
+ + + + + + + + diff --git a/utility/donkey/index.html b/utility/donkey/index.html new file mode 100644 index 00000000..2fa98082 --- /dev/null +++ b/utility/donkey/index.html @@ -0,0 +1,401 @@ + + + + + + + + donkey - Donkey Car + + + + + + + + + + + + + + +
+ + +
+ +
+
+ +
+
+
+
+ +

Donkey Command-line Utilities

+

The donkey command is created when you install the donkeycar Python package. This is a Python script that adds some important functionality. The operations here are vehicle independent, and should work on any hardware configuration.

+

Create Car

+

This command creates a new dir which will contain the files needed to run and train your robot.

+

Usage:

+
donkey createcar --path <dir> [--overwrite] [--template <donkey2>]
+
+
    +
  • This command may be run from any dir
  • +
  • Run on the host computer or the robot
  • +
  • It uses the --path as the destination dir to create. If .py files exist there, it will not overwrite them, unless the optional --overwrite is used.
  • +
  • --overwrite will update the files in the destination directory except the myconfig.py. This is useful if you have updated donkeycar and you want those changes reflected in your mycar folder but you don't want to have to recreate your myconfig.py.
  • +
  • The optional --template will specify the template file to start from. For a list of templates, see the donkeycar/templates dir. This source template will be copied over the manage.py for the user. Common templates are: +
  • +
+

Find Car

+

This command attempts to locate your car on the local network using nmap.

+

Usage:

+
donkey findcar
+
+
    +
  • Run on the host computer
  • +
  • Prints the host computer IP address and the car IP address if found
  • +
  • Requires the nmap utility:
  • +
+
sudo apt install nmap
+
+

Calibrate Car

+

This command allows you to manually enter values to interactively set the PWM values and experiment with how your robot responds. +See also more information.

+

Usage:

+
donkey calibrate --channel <0-15 channel id>
+
+
    +
  • Run on the host computer
  • +
  • Opens the PWM channel specified by --channel
  • +
  • Type integer values to specify PWM values and hit enter
  • +
  • Hit Ctrl + C to exit
  • +
+

Clean data in Tub

+

Opens a web server to delete bad data from a tub.

+

Usage:

+
donkey tubclean <folder containing tubs>
+
+
    +
  • Run on pi or host computer.
  • +
  • Opens the web server to delete bad data.
  • +
  • Hit Ctrl + C to exit
  • +
+

Train the model

+

Note: This section only applies to version >= 4.1 +This command trains the model. There is more detail in Deep Learning Autopilot.

+
donkey train --tub=<tub_path> [--config=<config.py>] [--model=<model path>] [--type=(linear|categorical|inferred)] [--transfer=<transfer model path>]
+
+
    +
  • Uses the data from the --tub datastore. You may specify more than one tub using a comma-separated list --tub=foo/data,bar/data or separated by spaces like --tub foo/data bar/data.
  • +
  • Uses the config file from the --config path (optionally)
  • +
  • Saves the model into path provided by --model. Auto-generates a model name if omitted. Note: There was a regression in version 4.2 where you only had to provide the model name in the model argument, like --model mypilot.h5. This got resolved in version 4.2.1. Please update to that version.
  • +
  • Uses the model type --type
  • +
  • Allows continuing the training of a model given by --transfer
  • +
  • Supports filtering of records using a function defined in the variable + TRAIN_FILTER in the myconfig.py file. For example:
  • +
+
def filter_record(record):
+    return record.underlying['user/throttle'] > 0
+
+TRAIN_FILTER = filter_record
+
+

only uses records with positive throttle in training.

+
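The same mechanism can combine conditions. For example, assuming the tub also records user/angle (as standard tubs do), this keeps only records that are moving forward and not at full steering lock:

```python
def filter_record(record):
    # keep records that are moving forward and not at full steering lock
    u = record.underlying
    return u['user/throttle'] > 0 and abs(u['user/angle']) < 0.99

TRAIN_FILTER = filter_record
```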
    +
  • In version 4.3.0 and later all 3.x models are supported again:
  • +
+
donkey train --tub=<tub_path> [--config=<config.py>] [--model=<model path>] [--type=(linear|categorical|inferred|rnn|imu|behavior|localizer|3d)] [--transfer=<transfer model path>]
+
+

In addition, a TFLite model is automatically generated in training. This can be suppressed by setting CREATE_TF_LITE = False in your config. TensorRT models can now be generated as well; to do so, set CREATE_TENSOR_RT = True.

+
    +
  • Note: The createcar command still creates a train.py file for backward compatibility, but it's not required for training.
  • +
+

Make Movie from Tub

+

This command allows you to create a movie file from the images in a Tub.

+

Usage:

+
donkey makemovie --tub=<tub_path> [--out=<tub_movie.mp4>] [--config=<config.py>] [--model=<model path>] [--model_type=(linear|categorical|inferred|rnn|imu|behavior|localizer|3d)] [--start=0] [--end=-1] [--scale=2] [--salient]
+
+
    +
  • Run on the host computer or the robot
  • +
  • Uses the image records from --tub dir path given
  • +
  • Creates a movie given by --out. Codec is inferred from file extension. Default: tub_movie.mp4
  • +
  • Optional argument to specify a different config.py other than default: config.py
  • +
  • Optional model argument will load the keras model and display prediction as lines on the movie
  • +
  • model_type may optionally give a hint about what model type we are loading. Categorical is default.
  • +
  • optional --salient will overlay a visualization of which pixels excited the NN the most
  • +
  • optional --start and/or --end can specify a range of frame numbers to use.
  • +
  • --scale will cause the output image to be scaled by this amount
  • +
+

Plot Predictions

+

This command allows you to plot steering and throttle against predictions coming from a trained model.

+

Usage:

+
donkey tubplot --tub=<tub_path> --model=<model_path> [--limit=<end_index>] [--type=<model_type>] 
+
+
    +
  • This command may be run from ~/mycar dir
  • +
  • Run on the host computer
  • +
  • Will show a pop-up window showing the plot of steering values in a given tub compared to NN predictions from the trained model
  • +
  • Optional --limit=<end_index> will use all records up to that index, defaults to 1000.
  • +
  • Optional --type=<model_type> will use a different model type than the DEFAULT_MODEL_TYPE
  • +
+

Tub Histogram

+

Note: Requires version >= 4.3

+

This command allows you to plot tub data (usually steering and throttle) as a histogram.

+

Usage:

+
donkey tubhist --tub=<tub_path> --record=<record_name> --out=<output_filename>
+
+
    +
  • This command may be run from ~/mycar dir
  • +
  • Run on the host computer
  • +
  • Will show a pop-up window showing the histogram plot of tub values in a given tub
  • +
  • Optional --record=<record_name> will only show the histogram of a certain data series, for example "user/throttle"
  • +
  • Optional --out=<output_filename> saves histogram under that name, otherwise the name is auto-generated from the tub path
  • +
+

Joystick Wizard

+

This command line wizard will walk you through the steps to create a custom/customized controller.

+

Usage:

+
donkey createjs
+
+
    +
  • Run the command from your ~/mycar dir
  • +
  • First make sure the OS can access your device. The utility jstest can be useful here; install it via sudo apt install joystick. You must pass this utility the path to your controller's device. Typically this is /dev/input/js0. However, if it is not, you must find the correct device path and provide it to the utility. You will need this for the createjs command as well.
  • +
  • Run the command donkey createjs and it will create a file named my_joystick.py in your ~/mycar folder, next to your manage.py
  • +
  • Modify myconfig.py to set CONTROLLER_TYPE="custom" to use your my_joystick.py controller
  • +
+

Visualize CNN filter activations

+

Shows feature maps of the provided image for each filter in each of the convolutional layers in the model provided. Debugging tool to visualize how well feature extraction is performing.

+

Usage:

+
donkey cnnactivations [--tub=<data_path>] [--model=<path to model>]
+
+

This will open a figure for each Conv2d layer in the model.

+

Example:

+
donkey cnnactivations --model models/model.h5 --image data/tub/1_cam-image_array_.jpg
+
+

Show Models database

+

Note: This is only available in donkeycar >= 4.3.1.

+

This lists the models that are stored in models/database.json. Displays information +like model type, model name, tubs used in training, transfer model and a comment if +--comment was used in training or the model was trained in the UI.

+

Usage:

+
donkey models [--group] 
+
+
    +
  • Run from your ~/mycar directory
  • +
  • If the optional --group flag is given, then the tub path info is combined into groups if different models used different tubs. This is useful if you use multiple tubs and the models have used different tub combinations, because it compresses the output information.
  • +
  • You need to install pandas first if you want to run it on the car
  • +
+
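Under the hood the command just reads models/database.json. A rough sketch of doing the same by hand; the JSON layout and field names shown here are assumptions for illustration, not the exact schema:

```python
import json
from pathlib import Path

db_path = Path('models/database.json')
if db_path.exists():
    entries = json.loads(db_path.read_text())
else:
    # Illustrative fallback entry when no database is present.
    entries = [{'Name': 'pilot_21-04-09_0', 'Type': 'linear',
                'Tubs': 'data/tub_1', 'Comment': 'first try'}]

for e in entries:
    print(f"{e['Name']:<22} {e['Type']:<10} {e['Tubs']:<16} {e.get('Comment', '')}")
```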

Donkey UI

+

Note: This section only applies to version >= 4.2.0

+

Usage:

+
donkey ui
+
+

This opens a UI to analyse tub data, supporting the following features:

+
  • show selected data fields live as values and graphical bars
  • delete or un-delete records
  • try filters for data selection
  • plot data of selected data fields

The UI is an alternative to the web based donkey tubclean.

+

Tub UI

+

A full documentation of the UI is here.


Donkey UI

+

Launch the Donkey graphical training interface by entering donkey ui in the command line. This works on Linux, Mac, and Windows, although if you're on Windows it's recommended that you use WSL (Windows Subsystem for Linux) running Ubuntu 20 instead to get full functionality.

+

The Donkey UI currently contains four screens supporting the following workflows:

+
  1. The tub manager - a replacement for the web-based application launched through donkey tubclean

  2. The trainer - a UI-based alternative to train the pilot. Note: for longer trainings involving larger tubs or batches it is recommended to perform these in the shell using the donkey train command. The UI-based training is geared towards an experimental and rapid analysis cycle consisting of:

     • data manipulation / selection
     • training
     • pilot benchmarking

  3. The pilot arena - here you can test two pilots' performance against each other.

  4. The car connector - can pull tub data from the car and push back trained pilots, or even start/stop the car. This screen does not work under Windows.

Note: Under Linux the app depends on xclip. If it is not installed, please run:

+
sudo apt-get install xclip
+
+

The tub manager

+

Tub_manager UI

+

In the tub manager screen you first have to select the car directory that contains the config file myconfig.py, using the Load car directory button. Then select the tub you want to work with using Load tub; the tub needs to be inside the car directory. The application remembers the last loaded config and tub.

+

The drop-down menu Add/remove in the data panel to the left of the image allows selecting the record fields, like user/angle, user/throttle, etc.

+

Note: if your tub contains more data than the standard user/angle, user/throttle and you want the progress bars to correctly show the values of these fields, you need to add an entry into the .donkeyrc file in your home directory. This file is automatically created by the Donkey UI app. Here is an example:

+
field_mapping:
+- centered: true
+  field: car/accel
+  max_value_id: IMU_ACCEL_NORM
+
+

This entry in the field_mapping list contains the name of the tub field, a switch indicating whether the data is centered around 0, and the name of the maximum value of that data field, which has to be provided in the myconfig.py file. For example, the data above represents the acceleration of the MPU6050 IMU, which ranges between +/- 2g, i.e. ~ +/- 20 m/s2. With an IMU_ACCEL_NORM of 20 the progress bar can display these values. Therefore, the myconfig.py should contain:

+
IMU_ACCEL_NORM = 20
+
+
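The normalisation the progress bar performs can be sketched as follows. This is a minimal illustration of the centered / max_value_id idea, not the UI's actual code:

```python
def bar_fraction(value, max_value, centered):
    """Map a raw field value to a 0..1 progress-bar fill fraction.
    For a centered field, 0 sits at the bar's midpoint and +/-max_value
    at the ends; otherwise 0..max_value spans the whole bar."""
    frac = 0.5 + 0.5 * value / max_value if centered else value / max_value
    return max(0.0, min(1.0, frac))

# car/accel with IMU_ACCEL_NORM = 20, centered: 0 m/s2 -> half-full bar
half = bar_fraction(0.0, 20.0, True)    # 0.5
full = bar_fraction(20.0, 20.0, True)   # 1.0
```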

Note: Vectors, i.e. lists / arrays, are automatically decomposed by the UI into their components.

+

Here is an example of a tub that has car/accel and car/gyro arrays that hold IMU data, as well as car/distance and car/m_in_lap. The first two show a progress bar because there is a corresponding entry in the field_mapping list as explained above. +Tub_manager UI_more_data

+

The control panel allows moving forward and backward in single steps using <, > and scrolling continuously with <<, >>. These buttons are also linked to the keyboard keys < left >, < right >, < space >.

+

To delete unwanted records, press the Set left / Set right buttons to determine the range and hit Delete to remove those records from training. To see the impact on the current tub press Reload tub. If you want to resurrect accidentally deleted records, just choose a left/right value outside the deleted range and press Restore.

+

Note: The left/right values are invertible, i.e. left > right operates on all records outside [left, right).
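The range semantics can be sketched like this; the function and the inverted-case behaviour are an interpretation of the note above, for illustration only:

```python
def in_deletion_range(index, left, right):
    """Sketch of the tub manager's left/right selection: with left <= right
    the half-open interval [left, right) is selected; with left > right the
    selection inverts to everything outside the interval between them."""
    if left <= right:
        return left <= index < right
    return not (right <= index < left)

normal = [i for i in range(8) if in_deletion_range(i, 2, 5)]     # [2, 3, 4]
inverted = [i for i in range(8) if in_deletion_range(i, 5, 2)]   # [0, 1, 5, 6, 7]
```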

+

In the filter section you can suppress records. Say you want to restrict the next training to right curves only; then add user/angle > 0 to select those records. Note: the filter applies only to the display in the tub manager. If you want to apply it in training you have to write the predicate as explained in utility.
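For illustration, the same right-curves-only selection expressed as a record predicate in Python. This is a hedged sketch of the idea; consult the utility docs for the exact predicate format donkeycar expects in training:

```python
def right_curves_only(record):
    # keep only records that steer to the right, mirroring user/angle > 0
    return record['user/angle'] > 0

records = [{'user/angle': a} for a in (-0.5, 0.1, 0.8, -0.2)]
kept = [r for r in records if right_curves_only(r)]
print(len(kept), 'of', len(records), 'records selected')
```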

+

The lower panel contains a graph with the currently selected data from the data panel. If nothing is selected, all fields from the record are displayed. The display scales the data between the minimum and maximum value of each record field, hence there are no absolute measurements possible. For more advanced data graphing capabilities, press Browser Graph which opens a plotly history graph in the web browser.
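The per-field scaling described above is plain min-max normalisation; a quick sketch:

```python
def minmax_scale(values):
    """Rescale a field's values to 0..1 using its own min and max, as the
    tub-manager graph does - absolute magnitudes are lost by design."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

throttle = [0.2, 0.5, 0.8]
scaled = minmax_scale(throttle)
```

Because each field is scaled independently, two curves that overlap in the graph may differ greatly in absolute value.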

+

The trainer

+

Trainer UI

+

The trainer screen allows you to train a model on the tub data. In the Overwrite config section you can set any config parameter by typing an updated value into the text field on the right and hitting return.

+

To train a pilot, use the model type dropdown, enter a comment and hit Train. After training, the pilot will appear in the pilot database shown underneath. You can also choose a transfer model if you don't want to train from scratch. Note: tensorflow saves the model with the optimiser state, so training will resume from where it stopped in the saved state.

+

Pilots might be trained on multiple tubs; this is currently not supported in the trainer. However, if multiple tubs are passed to donkey train, these will show in the database too. To avoid cluttering the view and to group different tubs, you can use the Group multiple tubs button to group all tub groups of two or more and show a group alias instead. The group alias mapping is then shown in the lower area of the window.

+

The pilot arena

+

Pilot Arena UI

+

Here you can benchmark two pilots against each other. Use this panel to test if changes in the training through optimiser parameters or model types, or through deletion of certain records, or augmentations to images have made the pilot better or worse. The last selected pilots will be remembered in the app.

+

Choose a pilot by selecting the Model type and loading the keras model using the file chooser, by pressing Choose pilot. The control panel is the same as in the tub manager. The lower right data panel shows the tub record's data. You can select the throttle field, as some folks train on car speed instead of throttle values. In that case, the corresponding field name must be added to the user_pilot_map section of the .donkeyrc file, for example:

+
user_pilot_map:
+  car/speed: pilot/speed
+  user/angle: pilot/angle
+  user/throttle: pilot/throttle
+
+

The user/angle and user/throttle mappings are automatically loaded by the app. In order to show the variable car/speed and compare it to the AI-produced pilot/speed, the map has to contain the corresponding entry.
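Conceptually the map just pairs each recorded field with the pilot output to compare it against. A minimal sketch of that lookup, illustrative rather than actual UI code:

```python
# The user_pilot_map from .donkeyrc; car/speed is the custom addition.
user_pilot_map = {
    'car/speed': 'pilot/speed',
    'user/angle': 'pilot/angle',
    'user/throttle': 'pilot/throttle',
}

record = {'car/speed': 3.2, 'user/angle': 0.1}   # fields present in this tub
pairs = {user: pilot for user, pilot in user_pilot_map.items() if user in record}
```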

+

Under the two pilots there are sliders with pre-defined image augmentations, here Brightness and Blur. You can mix brightness and blur into the images and compare how well the pilots react to such modifications of the testing data. Press the buttons to activate the sliders and enable this feature.
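As a rough illustration of what the Brightness slider does to the pixels fed to each pilot (the real UI uses an image-augmentation library; this is a toy grayscale version):

```python
def adjust_brightness(pixels, factor):
    """Scale grayscale pixel intensities by factor, clamped to 0..255."""
    return [[min(255, max(0, round(p * factor))) for p in row] for row in pixels]

row = [[40, 100, 200]]
brighter = adjust_brightness(row, 1.5)   # [[60, 150, 255]]
```

A robust pilot should produce similar steering for the original and the brightened image; large disagreement suggests overfitting to lighting conditions.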

+

The application will remember the last two selected pilots.

+

The car connection

+

Car_Connector_UI

+

Note: This screen will only work on Linux / macOS, as it makes use of ssh and rsync in the background. SSH needs to be configured to allow login from the PC to the car without a password. This can be done by running the following on the PC:

+
ssh-keygen
+
+

When asked for a passphrase, just hit <return>. This creates a public and private key in your ~/.ssh directory. Then copy the public key over to the car with the command below. Here we assume your car's hostname is donkeypi - otherwise replace it with the corresponding hostname.

+
ssh-copy-id -i ~/.ssh/id_rsa.pub pi@donkeypi.local
+
+

Log into your car using:

+
ssh pi@donkeypi.local
+
+

If SSH asks you if that host should be added to the list of known hosts, hit <return> and you are done. From now on, you can ssh into the car without being prompted for the password again. The login-free setup is required for the screen to work.

+
  • You also need to edit your myconfig.py and make sure the fields PI_USERNAME and PI_HOSTNAME are set to your car user's username and the hostname of the car.

With the car connector you can transfer the tub data from the car to your PC and transfer the pilots back to the car.

+
  • Under Car directory enter the car folder and hit return. This should populate the Select tub drop-down. Most likely you want to select the data/ directory, but you might have tubs in subfolders. In that case use ~/mycar/data in the Car directory, select the tub you want to pull and enable the Create new folder button. This will copy a tub on the car, like ~/mycar/data/tub_21-04-09_11, into the same location on your PC. Without Create new folder it would copy the content of the car's tub folder into ~/mycar/data on your PC, possibly overwriting other tub data that might be there.
  • Press Pull tub data/ to copy the tub from the car.
  • Press Send pilots to sync your local models/ folder with the models/ folder on the car. This command syncs all locally stored pilots.
  • In the Drive car section you can start the car and also select a model for autonomous driving. After starting, you have to use the controller, either web or joystick, as usual.

Future plans

+
  1. Moving the car connector screen to a web interface, supported on all operating systems.
  2. Handling of multiple tubs.
  3. The ability to also use the filter in training without the need to edit the myconfig.py file.
  4. Migration of the ~/.donkeyrc file to the kivy internal settings.
  5. Support using only a single pilot (or more than two) in the pilot arena.

Video tutorial

+

You can find a video tutorial for the UI below.

+

Video tutorial
