A self-driving car in a simulated environment. Explore various state-of-the-art methods for autonomous driving in a fun, visual format.
- Built in Unity3D (free game making engine).
- Add new tracks, change prebuilt scripts like gravity acceleration easily.
Download Links: Linux, Mac, Windows
All required dependencies are neatly packed in the `requirements.txt` file.
**NOTE:** This project was developed with Python 3.6.5, so use the appropriate Python interpreter (i.e. `python3` instead of `python`, which could most likely be Python 2.7, and `pip3` instead of `pip`).
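To confirm which interpreter a given command actually invokes, a quick sanity check (any Python 3.x passes the check itself):

```python
import sys

# The project targets Python 3.6.5; make sure `python3` (and not a
# Python 2.7 interpreter) is what actually runs your scripts.
assert sys.version_info.major == 3, "use python3 / pip3"
print(sys.version_info.major)  # -> 3
```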
> pip install --upgrade pip
Next, you'll probably want to work from your local PC. Start by cloning the project from GitHub:
> git clone https://github.com/victor-iyiola/self-driving-simulation.git
> cd self-driving-simulation
Or download the `.zip` project files here and extract them:
> cd /path/to/self-driving-simulation
Install these requirements:
> pip install --upgrade -r requirements.txt
- Records images from center, left, and right cameras w/ associated steering angle, speed, throttle and brake.
- Saves to `driving_log.csv`
- Ideally you have a joystick, but keyboard works too.
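The recorded log can be inspected with pandas. The column layout below is an assumption based on the fields listed above (verify it against your own `driving_log.csv`):

```python
import io

import pandas as pd

# Assumed column order: three camera image paths, then the control signals.
cols = ["center", "left", "right", "steering", "throttle", "brake", "speed"]

# Stand-in for one recorded row; real logs live in driving_log.csv.
sample = io.StringIO(
    "IMG/center_01.jpg,IMG/left_01.jpg,IMG/right_01.jpg,0.05,0.8,0.0,21.3\n"
)
df = pd.read_csv(sample, header=None, names=cols)
print(df.loc[0, "steering"])  # -> 0.05
```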
To drive, run the simulator in Autonomous Mode (click Autonomous Mode in the main menu), then run `drive.py` as follows:
> python drive.py model-005.h5
To train, generate training data (press `R` while in Training Mode) with the Simulator and save recordings to `path/to/self_driving_car/data/`.
> python3 model.py
This will generate a file `model-{epoch}.h5` whenever the performance in the epoch is better than the previous best. For example, the first epoch will generate a file called `model-000.h5`.
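This save-on-improvement behaviour is what Keras's `ModelCheckpoint(..., save_best_only=True)` callback provides. As a plain-Python sketch of the logic (the loss values are purely illustrative):

```python
# Save a checkpoint only when validation loss beats the previous best.
best_loss = float("inf")
saved = []

epoch_losses = [0.9, 0.7, 0.8, 0.5]  # illustrative validation losses
for epoch, loss in enumerate(epoch_losses):
    if loss < best_loss:  # better than the previous best
        best_loss = loss
        saved.append("model-{:03d}.h5".format(epoch))

print(saved)  # -> ['model-000.h5', 'model-001.h5', 'model-003.h5']
```

Epoch 2 is skipped because its loss (0.8) is worse than the running best (0.7).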
A 9-layer convolutional network, based on Nvidia's "End to End Learning for Self-Driving Cars" paper. 72 hours of driving data were collected in all sorts of conditions from human drivers:
- 3 cameras
- The steering command is obtained by tapping into the vehicle’s Controller Area Network (CAN) bus.
- Nvidia's Drive PX onboard computer with GPUs
In order to make the system independent of the car geometry, the steering command is 1/r, where r is the turning radius in meters. 1/r was used instead of r to prevent a singularity when driving straight (the turning radius for driving straight is infinity). 1/r smoothly transitions through zero from left turns (negative values) to right turns (positive values).
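As a sketch of that representation (the function name is illustrative, not from the project's code):

```python
import math

def steering_label(turning_radius_m):
    """Inverse turning radius 1/r, used as the training target.

    Driving straight corresponds to r -> infinity, i.e. a label of 0,
    so the representation has no singularity there.
    """
    if math.isinf(turning_radius_m):
        return 0.0
    return 1.0 / turning_radius_m

print(steering_label(-20.0))     # left turn  -> -0.05
print(steering_label(math.inf))  # straight   ->  0.0
print(steering_label(20.0))      # right turn ->  0.05
```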
Images are fed into a CNN that then computes a proposed steering command. The proposed command is compared to the desired command for that image, and the weights of the CNN are adjusted to bring the CNN output closer to the desired output. The weight adjustment is accomplished using backpropagation.
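A toy stand-in for that loop, with a single scalar weight in place of the CNN (illustrative only, not the project's training code):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)  # stand-in for image features
w_true = 0.3              # the "desired" input-to-steering mapping
y = w_true * x            # desired steering commands

w = 0.0   # network weight, adjusted toward the desired output
lr = 0.1  # learning rate
for _ in range(200):
    pred = w * x                        # proposed steering commands
    grad = 2 * np.mean((pred - y) * x)  # gradient of MSE w.r.t. w
    w -= lr * grad                      # backpropagation step

print(round(w, 3))  # -> 0.3
```

After 200 gradient steps the weight has converged to the mapping that produces the desired commands.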
Eventually, the trained network generates steering commands using just a single center camera.
This project is open-sourced under the MIT-2.0 license.