Setup Unity
Installation
- To add the public signing key, run the following command:
$ wget -qO - https://hub.unity3d.com/linux/keys/public | gpg --dearmor | sudo tee /usr/share/keyrings/Unity_Technologies_ApS.gpg > /dev/null
- To add the Unity Hub repository, create an entry in /etc/apt/sources.list.d by running the following command:
$ sudo sh -c 'echo "deb [signed-by=/usr/share/keyrings/Unity_Technologies_ApS.gpg] https://hub.unity3d.com/linux/repos/deb stable main" > /etc/apt/sources.list.d/unityhub.list'
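If the command succeeds, /etc/apt/sources.list.d/unityhub.list should contain exactly this one-line entry (the same string the command writes):

```
deb [signed-by=/usr/share/keyrings/Unity_Technologies_ApS.gpg] https://hub.unity3d.com/linux/repos/deb stable main
```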
- Update the package cache and install the package:
$ sudo apt update
$ sudo apt install unityhub
Installation is now complete.
- Start Unity Hub from the terminal:
$ unityhub
If the installation finished without any problems, you should see a window like this.
Create Account
If you don't have a Unity account, you need to create one.
Install the Unity Editor
After creating an account or logging in, you can install the Unity Editor.
License (Optional)
For personal use you no longer need to activate a license manually; the free Personal license is applied by default.
Download the ML-Agents Toolkit
- Clone the toolkit by running the command below:
$ git clone --branch release_12 https://github.com/Unity-Technologies/ml-agents.git
- Install the toolkit's Python package:
$ pip3 install mlagents
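If the mlagents-learn command is not found after the install, pip's script directory may not be on your PATH. The small Python sketch below (an illustration, not part of the toolkit) checks whether the command is visible:

```python
# Hedged sketch: check whether the mlagents-learn CLI installed by pip
# is visible on PATH. shutil.which returns the full path, or None.
import shutil

def cli_available(name: str) -> bool:
    """Return True if an executable called `name` is on PATH."""
    return shutil.which(name) is not None

if __name__ == "__main__":
    # After `pip3 install mlagents`, this should print True.
    print(cli_available("mlagents-learn"))
```

If this prints False, add pip's user script directory (often ~/.local/bin) to your PATH.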
After that, run this command:
$ mlagents-learn --help
If you get a response like
usage: mlagents-learn [-h] [--env ENV_PATH] [--resume] [--deterministic]
[--force] [--run-id RUN_ID] [--initialize-from RUN_ID]
[--seed SEED] [--inference] [--base-port BASE_PORT]
[--num-envs NUM_ENVS] [--num-areas NUM_AREAS] [--debug]
[--env-args ...]
[--max-lifetime-restarts MAX_LIFETIME_RESTARTS]
[--restarts-rate-limit-n RESTARTS_RATE_LIMIT_N]
[--restarts-rate-limit-period-s RESTARTS_RATE_LIMIT_PERIOD_S]
...
then you have successfully installed mlagents for Python!
Tutorial
- Start Unity Hub:
$ unityhub
- Click [Add] -> [Add project from disk].
In this tutorial, we use the cloned directory.
- Choose the Project directory inside the repository you cloned earlier:
- .../ml-agents/Project
- ml-agents is the cloned directory
If a warning appears, follow the instructions it shows.
If you can see this window, it is working.
Demo
You can see the Assets panel at the bottom of the previous window.
Click "ML-Agents" -> "Examples" -> "3DBall" -> "Scenes" -> "3DBall".
If you can see these objects, the scene loaded successfully.
- Run the following commands:
$ cd /path/to/Unity/ml-agents
$ mlagents-learn config/ppo/3DBall.yaml --run-id=first3DBallRun
Then press the Play button in the Unity Editor.
Reinforcement learning starts!!
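Once training is running, mlagents-learn writes its output under results/<run-id>, so you can monitor progress by pointing TensorBoard at that directory (tensorboard --logdir results). The short Python sketch below (an illustration of the default layout, not part of the toolkit) shows where the run above lands:

```python
# Hedged sketch: compute where mlagents-learn stores a run's output.
# By default the trainer writes statistics and the trained model
# under ./results/<run-id> (here: results/first3DBallRun).
from pathlib import Path

def run_output_dir(run_id: str, results_root: str = "results") -> Path:
    """Return the default output directory for a given run id."""
    return Path(results_root) / run_id

if __name__ == "__main__":
    print(run_output_dir("first3DBallRun"))  # results/first3DBallRun
```

When training finishes (or you stop it with Ctrl+C), the trained model is saved in that directory (as an .onnx file in recent releases), ready to drag back into the Unity project.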