sibotackm_bringup

https://github.com/inaciose/sibotackm_bringup

Launch dependencies

sibotackm_gazebo

sibotackm_tools

Launch

Run the world with the robot and the Ackermann publisher node

roslaunch sibotackm_bringup world_bot_twist.launch

roslaunch sibotackm_bringup world_bot_twist_mix.launch

 

Run all nodes for autonomous lane driving with panel signals

roslaunch sibotackm_bringup world_bot_mldrive_signal.launch mlmodel:=cleantrack1.h5

Automec-AD contributions and docs

Usage instructions and commands

Automec-AD usage v01

 

Ackermann steering simulation

Adaptation of two packages, adding a robot with a camera and a node that converts between twist and ackermann messages, for use in Gazebo simulations.

Packages:

ackermann_vehicle_description

  • robot01.gazebo.macro.xacro
  • robot01.urdf.xacro

ackermann_vehicle_gazebo

  • config/em_3905_ackermann_ctrlr_params.yaml
  • config/em_3905_joint_ctrlr_params.yaml
  • launch/ackermann_robot.launch
  • launch/ackermann_robot_with_arena.launch
  • launch/ackermann_robot_with_arena_conversion.launch
  • launch/twist_to_ackermann.launch
  • nodes/ackermann_controller.py
  • nodes/twist_to_ackermann.py
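The nodes/twist_to_ackermann.py node converts geometry_msgs/Twist commands into Ackermann steering commands. The core of that conversion can be sketched with the bicycle model; the function name and the wheelbase value below are illustrative assumptions, not taken from the package:

```python
import math

def twist_to_ackermann(linear_x, angular_z, wheelbase=0.335):
    """Convert a Twist command (linear_x, angular_z) into an
    Ackermann command (speed, steering_angle) using the bicycle
    model: steering_angle = atan(wheelbase * angular_z / linear_x).
    """
    if linear_x == 0.0:
        # No forward motion: the steering angle is undefined,
        # so keep the wheels straight.
        return 0.0, 0.0
    steering_angle = math.atan(wheelbase * angular_z / linear_x)
    return linear_x, steering_angle
```

In the real node the inputs come from a Twist subscriber and the outputs are published as an AckermannDrive message; only the math is shown here.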

Based on the following repository:

https://github.com/csorvagep/ackermann_vehicle

Machine learning node improvements

package: robot_core

New node training_node.py, already prepared to accept parameters, replacing TrainingSimulation.py

Changes to the write_data node to accept parameters, so that several sample sets can be used

Changes to the ml_driving-dave.py node to accept parameters, so that several sample sets can be used

Unified and simplified simulation_space

package: simspace

roslaunch simspace arena1.launch

Vehicle second camera for signal recognition

robot02 is an upgrade of robot01 with an additional camera (camera2)

Panel signal recognition

New ROS node in robot_core for signal recognition and behavior detection

rosrun robot_core signal_recognition.py
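The signal recognition node maps a recognized panel signal to a driving behavior. A minimal sketch of that mapping, with hypothetical signal and behavior labels (the actual labels used by signal_recognition.py may differ):

```python
# Hypothetical mapping from a recognized panel signal to a driving
# behavior; the real node's labels and behaviors may differ.
SIGNAL_BEHAVIORS = {
    "go": "drive",    # green panel: keep driving in the lane
    "stop": "halt",   # red panel: stop the car
    "park": "parking" # parking panel: start the parking maneuver
}

def behavior_for(signal, default="drive"):
    """Return the behavior associated with a recognized signal,
    falling back to the default when the signal is unknown."""
    return SIGNAL_BEHAVIORS.get(signal, default)
```

A table-driven lookup like this keeps the recognition (which signal is on the panel) separate from the decision (what the car should do about it).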

 

Automec-ad user commands v01

Instructions and commands for running simulation

updated

Get sample data for ml training

roslaunch ackermann_vehicle_gazebo ackermann_robot_with_arena_conversion_mltrain.launch folder:=/cleantrack1

The folder:=/name argument is mandatory.

The folder must be created inside the "data" directory.

This launches everything:

  • The Gazebo world
  • The car
  • The twist-to-ackermann converter
  • The data capture node
  • The rqt node used to drive the car for the training laps

Training ml model with sample data

roslaunch robot_core training.launch folder:=/cleantrack1 model:=cleantrack1.h5

The following arguments are mandatory:

  • folder:=/name (do not forget the leading slash)
  • model:=name.h5 (do not forget the .h5 extension)

Driving with ml model

Just drive, no signals.

roslaunch ackermann_vehicle_gazebo ackermann_robot_with_arena_conversion_mlsolo.launch model:=cleantrack1.h5

Drive with two signals

roslaunch ackermann_vehicle_gazebo ackermann_robot_with_arena_conversion_mlsignal.launch model:=cleantrack1.h5

Do not forget the .h5 extension.

old

Get sample data for ml training

Run the commands (on each terminal)

roslaunch ackermann_vehicle_gazebo ackermann_robot_with_arena_conversion.launch

rqt

rqt configuration

Plugins > Robot Tools > Robot Steering (select the namespace/topic cmd_vel)

Plugins > Visualization > Image Viewer (select the namespace/topic image_raw)

Don't forget to set the velocity to a fixed value (each sample set needs the same linear velocity; write it down so you can reuse it when driving manually later).

Run the ML sample collection node

roslaunch robot_core write.launch

It may be run with arguments:

roslaunch robot_core write.launch arg:=value

args

image_raw_topic  (default=/ackermann_vehicle/camera/rgb/image_raw)
twist_cmd_topic (default=/cmd_vel)
base_folder (default=/set1)

If base_folder does not exist, it must be created inside the data folder of the robot_core package

Drive the car manually for two laps.

Press Ctrl+C to stop the nodes.

Training ml model with sample data

roslaunch robot_core training.launch

It may be run with arguments:

roslaunch robot_core training.launch arg:=value

args

epochs (default=20)
steps_per_epoch (default=100)
base_folder (default=/set1)
modelname (default=model_default.h5)

You may load an existing model or create a new one. The model is stored in the model_files folder.
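The training launch arguments above can be mirrored in the script with a standard argument parser. A sketch using the same names and defaults (the real training node may read these as ROS parameters instead):

```python
import argparse

def build_parser():
    """Parser mirroring the training launch arguments and their
    defaults, as listed above."""
    parser = argparse.ArgumentParser(description="Train the driving model")
    parser.add_argument("--epochs", type=int, default=20)
    parser.add_argument("--steps_per_epoch", type=int, default=100)
    parser.add_argument("--base_folder", default="/set1")
    parser.add_argument("--modelname", default="model_default.h5")
    return parser
```

Keeping the script's defaults identical to the launch file's defaults means running it standalone or via roslaunch gives the same behavior.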

Driving with ml model

Run the commands (on each terminal)

roslaunch ackermann_vehicle_gazebo ackermann_robot_with_arena_conversion.launch

roslaunch robot_core driving.launch

It may be run with arguments:

roslaunch robot_core driving.launch arg:=value

args

image_raw_topic  (default=/ackermann_vehicle/camera/rgb/image_raw)
twist_cmd_topic (default=/cmd_vel)
twist_linear_x (default=0.5; use the same velocity as during manual driving)
modelname (default=model_default.h5)

Press Ctrl+C to stop the nodes.
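The driving node publishes a Twist with a fixed linear.x (twist_linear_x) and the model's predicted steering as angular.z. A ROS-free sketch of that step, with the loaded .h5 model stubbed out as a plain `predict` callable (an assumption for illustration):

```python
def make_twist_command(predict, image, linear_x=0.5):
    """Build the (linear_x, angular_z) pair the driving node publishes:
    constant forward speed, steering predicted from the camera image.
    `predict` stands in for the loaded .h5 model."""
    angular_z = predict(image)
    return {"linear_x": linear_x, "angular_z": angular_z}
```

This is why the recorded sample sets must all use the same linear velocity: the model only predicts steering, while the forward speed is a fixed constant at drive time.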

OLD INFO

RUN ML TASKS

A) Spawn the robot inside the AutoMec contest world, with the twist-to-ackermann converter node

roslaunch ackermann_vehicle_gazebo ackermann_robot_with_arena_conversion.launch

 

Run the ML agent to store training data

Run A) first

cd /catkin_ws/src/AutoMec-AD/robot_core/

python3 write_data.py

 

teleop with rqt

RUN: rqt

Plugins > Robot Tools > Robot Steering (select the namespace/topic cmd_vel)

Plugins > Visualization > Image Viewer (select the namespace/topic image_raw)

 

Run the ML training simulation

cd /catkin_ws/src/AutoMec-AD-dev/robot_core/src/cnn/models_python/

python3 TrainingSimulation.py

 

Run ML driving

Run A) first

cd /catkin_ws/src/AutoMec-AD-dev/robot_core/src/cnn/driving/

python3 ml_driving-dave.py

 

Known warning

Node::Advertise(): Error advertising topic [/ackermann_vehicle/joint_cmd]. Did you forget to start the discovery service?