# Lab 3: Task Execution

The goal of this lab is to create an executor for Task-Specification-Trees (TSTs). The TSTs are defined using the JSON format.

It is *important* to have studied **Lecture 07: Deliberative Architecture** before starting this lab.

## TST
A Task-Specification-Tree (TST) is a hierarchical tree structure which represents a complex mission. For the purpose of this lab, six types of TST nodes are used.

You can find the specification of the TST nodes in `air_tst/configs`. On the lab computers the directory is `/courses/TDDE05/software/ros2_ws/src/air_tst/configs`, and you can quickly access it with:
```bash
tdde05-cd air_tst configs
```
Here is some documentation:

* `seq` executes a sequence of TST nodes.
* `conc` executes a set of TST nodes concurrently. It has a parameter `stop-on-first` which controls whether the TST node should wait for all of its children to finish or stop when the first child is done. A minimal example is sketched after this list.
* `dock` and `undock` dock or undock the robot.
* `drive-to` sends a ground robot to a specific position. It has the following parameters:
  * `p`: the destination
  * `heading`: the heading the robot must be facing at the end
  * `maximum-speed`: the maximum speed at which the robot may drive
* `explore` drives the robot to random locations near the current location of the robot, according to the following parameters:
  * `radius`: the radius around the start location used to generate new locations. The start location is the position of the robot when the explore starts.
  * `count`: the number of random locations to visit

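The parameters of a node appear in its `params` object in the JSON file (see the *TST File format* section below). As a rough illustration only — the exact parameter name and placement should be checked against the `conc` specification in `air_tst/configs` — a concurrent node that stops as soon as its first child finishes could look like:

```json
{
  "name": "conc",
  "params": {
    "stop-on-first": true
  },
  "children": [
    { "name": "undock" },
    { "name": "explore", "params": { "radius": 5, "count": 3 } }
  ]
}
```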
## TST Editor

An editor is available in the `air_tst` module, and can be started with:

```bash
ros2 run air_tst tst_editor
```

You can open a specific file by giving it as an argument. For instance, the following will open the `explore_record_semantic.json` TST:

```bash
tdde05-cd air_tst tsts
ros2 run air_tst tst_editor explore_record_semantic.json
```

|
|

|
|
|
|
|
|
## TST File format

TSTs are defined using the JSON file format.

Below is an example of an exploration mission, in which the robot first moves to the location $`(10,2)`$ and then explores 8 locations within a radius of $`10`$ while recording some data:
```json
{
  "children": [
    {
      "name": "drive-to",
      "params": {
        "p": {
          "rostype": "Point",
          "x": 10,
          "y": 2,
          "z": 0
        },
        "use-motion-planner": false
      }
    },
    {
      "children": [
        {
          "name": "explore",
          "params": {
            "radius": 10,
            "count": 8
          }
        },
        {
          "name": "record",
          "params": {
            "topics": "/husky0/lidar"
          }
        }
      ],
      "name": "conc"
    }
  ],
  "name": "seq"
}
```
The `name` field indicates the type of TST node, `children` is used by container TST nodes and contains the definitions of the child TST nodes, and `params` contains the parameters of each TST node.
## TstML Library

The TstML library parses the specification files and the TST files, and provides basic functionality for manipulating TSTs and executing them. Examples of use of the TstML library are presented in these lab instructions. If you have doubts about the behavior of a specific function, you can refer to the [API documentation for C++](https://www.ida.liu.se/~TDDE05/info/labs/2021/tstml_doc.pdf); the Python API is similar.

List of relevant classes for defining TSTs:

* `TSTNode`: represents an instance of a node in a TST.
* `TSTNodeModel`: represents the model (or specification) of a type of TST node.
* `TSTNodeModelsRegistry`: a registry of `TSTNodeModel` instances used to define TSTs.

List of relevant classes for executing TSTs:

* `AbstractNodeExecutor`: base class to define the execution of a `TSTNode`.
* `ExecutionStatus`: holds the execution status.
* `Executor`: handles the execution of a tree.
* `NodeExecutorRegistry`: contains the association between `TSTNodeModel` and `AbstractNodeExecutor` classes. It is used during the execution of a TST to instantiate the appropriate `AbstractNodeExecutor` for each node.

The `NodeExecutorRegistry` and `TSTNodeModelsRegistry` are common to all TSTs and should be permanent, while `TSTNode` and `Executor` are specific to the execution of a given tree.
## TST executor service definition

We will need to create a new service definition. You should check and understand the [official tutorial](https://docs.ros.org/en/galactic/Tutorials/Beginner-Client-Libraries/Custom-ROS2-Interfaces.html) first. It is good practice to create a standalone package for defining messages, so that it is independent of Python and C++. We will create a package specifically for storing our definitions:

```bash
ros2 pkg create --build-type ament_cmake air_lab_interfaces --dependencies std_msgs rosidl_default_generators
```

Then you should create an `ExecuteTst.srv` file in `air_lab_interfaces/srv`, with the following content:
```
string tst_file # Name of the TST file to execute, it should be a full path
---
bool success # Whether the execution was successful
string error_message # Error message if the execution was unsuccessful
```
Then we need to add this file to the build system. In `air_lab_interfaces`, open the `CMakeLists.txt` file and add the following before `ament_package()`:

```cmake
rosidl_generate_interfaces(${PROJECT_NAME}
  "srv/ExecuteTst.srv"
)
```

Then we need to add the following to the `package.xml` file:

```xml
<member_of_group>rosidl_interface_packages</member_of_group>
```

You can now build with `tdde05-build` and reload the environment with `tdde05-start`. You should then be able to see the new service with:

```bash
ros2 interface show air_lab_interfaces/srv/ExecuteTst
```
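If the interface was generated correctly, this command should print back the definition you just wrote, along the lines of the sketch below (the exact formatting of the output may differ slightly):

```
string tst_file # Name of the TST file to execute, it should be a full path
---
bool success # Whether the execution was successful
string error_message # Error message if the execution was unsuccessful
```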
## Implementation

This lab can be completed using **Python** or **C++**. Follow the instructions corresponding to your choice of programming language.

You will need to create a package for `air_lab3`, in your `ros2_ws/src/labs` directory, with dependencies on a few packages. For Python:

```bash
ros2 pkg create air_lab3 --build-type ament_python --dependencies rclpy air_lab_interfaces --node-name tst_executor
```

For C++:

```bash
ros2 pkg create air_lab3 --build-type ament_cmake --dependencies rclcpp rclcpp_action air_lab_interfaces irobot_create_msgs --node-name tst_executor
```
## TST executor service server

We will create a service server whose goal is to load a TST description from a file (one of the JSON files) and execute it. In the remainder of the lab we will help you implement that service call; in this section, we start by just setting up an empty service.

You should start by reading and understanding the official tutorials for [Python](https://docs.ros.org/en/galactic/Tutorials/Beginner-Client-Libraries/Writing-A-Simple-Py-Service-And-Client.html) or [C++](https://docs.ros.org/en/galactic/Tutorials/Beginner-Client-Libraries/Writing-A-Simple-Cpp-Service-And-Client.html).

You should follow an incremental approach:

* Copy/paste the service server example from the respective tutorial into your `tst_executor.py` or `tst_executor.cpp`. *We do not need a client; in this lab, we will use the command line to call the service*.
* Make sure you understand what is happening; if you have doubts, ask your lab assistant.
* You won't be able to run the tutorial code as-is, so start modifying it: rename the service to `execute_tst` and use the type `air_lab_interfaces/srv/ExecuteTst` (a minimal sketch is shown after this list).
* Once you have modified it, you can test calling your service with:

  ```bash
  ros2 service call /execute_tst air_lab_interfaces/srv/ExecuteTst "tst_file: '`ros2 pkg prefix air_tst`/share/air_tst/tsts/undock.json'"
  ```

  Note that we use *\`ros2 pkg prefix air_tst\`*, which will be replaced by the full path to the directory where `air_tst` is installed on the system.
* Don't forget to run `tst_executor` and to start it from your screen file.

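As a starting point, a minimal sketch of the Python service server could look like this (the class name `TstExecutor` matches the one used later in the C++ example; the callback body is a placeholder that the following sections will replace with actual TST loading and execution):

```python
import rclpy
from rclpy.node import Node

from air_lab_interfaces.srv import ExecuteTst


class TstExecutor(Node):
    def __init__(self):
        super().__init__('tst_executor')
        # Service server answering execute_tst requests
        self.execute_tst_service = self.create_service(
            ExecuteTst, 'execute_tst', self.execute_tst_callback)

    def execute_tst_callback(self, request, response):
        self.get_logger().info(f'Requested to execute: {request.tst_file}')
        # Placeholder: the next sections load and execute the TST here
        response.success = True
        response.error_message = ''
        return response


def main():
    rclpy.init()
    node = TstExecutor()
    rclpy.spin(node)
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```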
## Initialise TstML and load a TST file

First we need to load the TST definitions into a `TSTNodeModelsRegistry`. `TSTNodeModelsRegistry` is a class from the TstML library, which we are using to implement this lab.

### Python

You will need to create a `TstML.TSTNodeModelsRegistry` and use `loadDirectory` to load all the configuration files. It is best if this `tst_registry` is kept as a class member of your ROS node.

```python
import TstML
import ament_index_python

tst_registry = TstML.TSTNodeModelsRegistry()
tst_registry.loadDirectory(ament_index_python.get_package_prefix("air_tst") + "/share/air_tst/configs")
```

When you want to load a tree for execution from a file, for instance from within your service callback, you can do the following:

```python
tst_node = TstML.TSTNode.load(filename, tst_registry)
```
### C++

You will need to create a `TstML::TSTNodeModelsRegistry` and use `loadDirectory` to load all the configuration files. It is best if this `tst_registry` is kept as a class member of your ROS node.

```c++
#include <TstML/TSTNodeModelsRegistry.h>
#include <ament_index_cpp/get_package_prefix.hpp>

TstML::TSTNodeModelsRegistry* tst_registry
    = new TstML::TSTNodeModelsRegistry();
tst_registry->loadDirectory(
    QString::fromStdString(ament_index_cpp::get_package_prefix("air_tst")
                           + "/share/air_tst/configs"));
```

When you want to load a tree for execution from a file, for instance from within your service callback, you can do the following:

```c++
TstML::TSTNode* tst_node = TstML::TSTNode::load(
    QUrl::fromLocalFile(QString::fromStdString(filename)),
    tst_registry);
```

To be able to compile, you need to add the following to `CMakeLists.txt`:

```cmake
find_package(TstML REQUIRED)
...
target_link_libraries(tst_executor TstMLExecutor)
```
## Initialise TstML::Executor and start execution

### Python

First you need to create a `NodeExecutorRegistry`, and use the `registerNodeExecutor` function to associate each `TSTNodeModel` with an `AbstractNodeExecutor`. We provide you with default executors for `sequence` and `concurrent`.

```python
import TstML.Executor

# Create a registry with node executors
tst_executor_registry = TstML.Executor.NodeExecutorRegistry()

# Setup the executors for sequence and concurrent
tst_executor_registry.registerNodeExecutor(
    tst_registry.model("seq"),
    TstML.Executor.DefaultNodeExecutor.Sequence)
tst_executor_registry.registerNodeExecutor(
    tst_registry.model("conc"),
    TstML.Executor.DefaultNodeExecutor.Concurrent)
```

When you want to execute a tree, for instance from within your service callback, you can do the following:

```python
# Create an executor using the executors defined
# in tst_executor_registry
tst_executor = TstML.Executor.Executor(tst_node,
                                       tst_executor_registry)

# Start execution
tst_executor.start()

# Block until the execution has finished
status = tst_executor.waitForFinished()

# Display the result of execution
if status.type() == TstML.Executor.ExecutionStatus.Type.Finished:
    print("Execution successful")
elif status.type() == TstML.Executor.ExecutionStatus.Type.Failed:
    print("Execution failed: {}".format(status.message()))
else:
    print("Execution failed!")
```

Instead of displaying the results, you should set the result in the response of your service call.

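In the Python service callback, that could look roughly like this sketch (where `response` is the `ExecuteTst` response object and `status` comes from `waitForFinished()` as above):

```python
if status.type() == TstML.Executor.ExecutionStatus.Type.Finished:
    response.success = True
    response.error_message = ''
else:
    response.success = False
    response.error_message = status.message()
return response
```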
### C++

First you need to create a `NodeExecutorRegistry`, and use the `registerNodeExecutor` function to associate each `TSTNodeModel` with an `AbstractNodeExecutor`. We provide you with default executors for `sequence` and `concurrent`.

```c++
#include <TstML/Executor/NodeExecutorRegistry.h>
#include <TstML/Executor/DefaultNodeExecutor/Sequence.h>
#include <TstML/Executor/DefaultNodeExecutor/Concurrent.h>

// Create a registry with TST node executors
TstML::Executor::NodeExecutorRegistry* tst_executor_registry
    = new TstML::Executor::NodeExecutorRegistry();

// Setup the executors for sequence and concurrent
tst_executor_registry->registerNodeExecutor
    <TstML::Executor::DefaultNodeExecutor::Sequence>
    (tst_registry->model("seq"));
tst_executor_registry->registerNodeExecutor
    <TstML::Executor::DefaultNodeExecutor::Concurrent>
    (tst_registry->model("conc"));
```

When you want to execute a tree, for instance from within your service callback, you can do the following:

```c++
// Create an executor using the executors defined
// in tst_executor_registry
TstML::Executor::Executor* tst_executor
    = new TstML::Executor::Executor(tst_node, tst_executor_registry);

// Start execution
tst_executor->start();

// Block until the execution has finished
TstML::Executor::ExecutionStatus status = tst_executor->waitForFinished();

// Display the result of execution
if(status == TstML::Executor::ExecutionStatus::Finished())
{
  RCLCPP_INFO(get_logger(), "Execution successful");
}
else
{
  std::string message = status.message().toStdString();
  RCLCPP_INFO(get_logger(), "Execution failed '%s'", message.c_str());
}

delete tst_executor;
```

Instead of displaying the results, you should set the result in the response of your service call.
## Implement a subclass of TstML::AbstractNodeExecutor

The main goal of the lab is to implement executors that subclass `AbstractNodeExecutor` for the terminal TST nodes. In this section, I will give you the implementation of the `undock` executor. **It is not enough to copy the file, you need to understand what every line does. Once you do, call the lab assistant and explain what you think every line does.**

### Python

Executors should inherit from the `TstML.Executor.AbstractNodeExecutor` class; they need to implement the `start`, `pause`, `resume`, `stop` and `abort` functions. They can access parameters of the TST node in this way:

```python
self.node().getParameter(TstML.TSTNode.ParameterType.Specific, "nameoftheparameter")
```

The following shows an example implementation for `undock`:
```python
import threading
import traceback
import sys

import rclpy
from rclpy.node import Node
from rclpy.action import ActionClient
import rclpy.executors

import TstML
import TstML.Executor

from irobot_create_msgs.action import Undock

import ament_index_python


# Work around for ROS not showing errors with a MultiThreadedExecutor
def display_exceptions(func):
    def wrapper(*args):
        try:
            return func(*args)
        except Exception as ex:
            traceback.print_exception(*sys.exc_info())
            print(f"Unhandled exception was caught: '{ex}'")
    return wrapper


# Ugly hack to get a new name for each ROS node
ros_name_counter = 0
def gen_name(name):
    global ros_name_counter
    ros_name_counter += 1
    return name + str(ros_name_counter)


class UndockExecutor(TstML.Executor.AbstractNodeExecutor):
    def __init__(self, node, context):
        super(TstML.Executor.AbstractNodeExecutor, self).__init__(node,
                                                                  context)

        self.ros_node = Node(gen_name("undock_node"))
        self._action_client = ActionClient(self.ros_node, Undock, 'undock')
        self.executor = rclpy.executors.MultiThreadedExecutor()
        self.executor.add_node(self.ros_node)
        self.thread = threading.Thread(target=self.executor.spin)
        self.thread.start()

    def finalise(self):
        self.executor.shutdown()

    def start(self):
        goal_msg = Undock.Goal()
        self._send_goal_future = self._action_client.send_goal_async(goal_msg, self.feedback_callback)
        self._send_goal_future.add_done_callback(self.goal_response_callback)
        return TstML.Executor.ExecutionStatus.Started()

    def feedback_callback(self, feedback_msg):
        print(feedback_msg.feedback)

    @display_exceptions
    def goal_response_callback(self, future):
        self._goal_handle = future.result()
        if not self._goal_handle.accepted:
            self.executionFinished(TstML.Executor.ExecutionStatus.Aborted())
            self.ros_node.get_logger().error('Goal rejected :(')
        else:
            self.ros_node.get_logger().info('Goal accepted :)')

            self._get_result_future = self._goal_handle.get_result_async()
            self._get_result_future.add_done_callback(self.handle_result_callback)

    @display_exceptions
    def handle_result_callback(self, future):
        print("Finished!")
        self.executionFinished(TstML.Executor.ExecutionStatus.Finished())

    def pause(self):
        self.ros_node.get_logger().info('Pause is not possible.')
        return TstML.Executor.ExecutionStatus.Running()

    def resume(self):
        return TstML.Executor.ExecutionStatus.Running()

    def stop(self):
        self._goal_handle.cancel_goal()
        return TstML.Executor.ExecutionStatus.Finished()

    def abort(self):
        self._goal_handle.cancel_goal()
        return TstML.Executor.ExecutionStatus.Aborted()
```
Do not forget to add the `UndockExecutor` to the executor registry, like it was done for `seq` and `conc`. If you want to import a class `NameOfClass` from a Python file called `name_of_file.py`, you can do the following:

```python
from .name_of_file import NameOfClass
```
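For instance, assuming the `UndockExecutor` class above lives in a file called `undock_executor.py` in your package (the file name is only an example), registering it next to the `seq` and `conc` executors could look like:

```python
from .undock_executor import UndockExecutor

tst_executor_registry.registerNodeExecutor(
    tst_registry.model("undock"),
    UndockExecutor)
```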
### C++

Executors should inherit from the `AbstractNodeExecutor` class; they need to implement the `start`, `pause`, `resume`, `stop` and `abort` functions. They can access parameters of the TST node with `getParameter`, as shown in the *Implementation of Executors* section below.

The following shows an example implementation for `undock`:
```c++
#include <functional>
#include <future>
#include <memory>
#include <string>
#include <sstream>
#include <thread>

#include "rclcpp/rclcpp.hpp"
#include "rclcpp_action/rclcpp_action.hpp"

#include "irobot_create_msgs/action/undock.hpp"

#include <TstML/Executor/AbstractNodeExecutor.h>
#include <TstML/TSTNode.h>

// Ugly hack to get a new name for each ROS node
std::string gen_name(const std::string& _name)
{
  static int counter = 0;
  return _name + std::to_string(++counter);
}

class UndockExecutor : public TstML::Executor::AbstractNodeExecutor
{
public:
  using Undock = irobot_create_msgs::action::Undock;
  using GoalHandleUndock = rclcpp_action::ClientGoalHandle<Undock>;

  UndockExecutor(const TstML::TSTNode* _node,
                 TstML::Executor::AbstractExecutionContext* _context) :
      TstML::Executor::AbstractNodeExecutor(_node, _context)
  {
    m_node = rclcpp::Node::make_shared(gen_name("undock_node"));
    m_executor.add_node(m_node);
    m_executor_thread = std::thread([this]() { m_executor.spin(); });

    m_client_ptr = rclcpp_action::create_client<Undock>(m_node, "undock");
  }
  ~UndockExecutor()
  {
    m_executor.cancel();
    m_executor_thread.join();
  }
  TstML::Executor::ExecutionStatus start() override
  {
    using namespace std::placeholders;

    auto goal_msg = Undock::Goal();

    auto send_goal_options = rclcpp_action::Client<Undock>::SendGoalOptions();
    send_goal_options.goal_response_callback =
        std::bind(&UndockExecutor::goal_response_callback, this, _1);
    send_goal_options.feedback_callback =
        std::bind(&UndockExecutor::feedback_callback, this, _1, _2);
    send_goal_options.result_callback =
        std::bind(&UndockExecutor::result_callback, this, _1);
    m_client_ptr->async_send_goal(goal_msg, send_goal_options);

    return TstML::Executor::ExecutionStatus::Started();
  }

  void goal_response_callback(std::shared_future<GoalHandleUndock::SharedPtr> future)
  {
    m_goal_handle = future.get();
    if (!m_goal_handle) {
      executionFinished(TstML::Executor::ExecutionStatus::Aborted());
      RCLCPP_ERROR(m_node->get_logger(), "Goal was rejected by server");
    } else {
      RCLCPP_INFO(m_node->get_logger(), "Goal accepted by server, waiting for result");
    }
  }

  void feedback_callback(
      GoalHandleUndock::SharedPtr goal_handle,
      const std::shared_ptr<const Undock::Feedback> feedback)
  {
  }

  void result_callback(const GoalHandleUndock::WrappedResult & result)
  {
    switch (result.code) {
    case rclcpp_action::ResultCode::SUCCEEDED:
      RCLCPP_INFO(m_node->get_logger(), "Goal was succeeded");
      executionFinished(TstML::Executor::ExecutionStatus::Finished());
      break;
    case rclcpp_action::ResultCode::ABORTED:
      RCLCPP_ERROR(m_node->get_logger(), "Goal was aborted");
      executionFinished(TstML::Executor::ExecutionStatus::Aborted());
      return;
    case rclcpp_action::ResultCode::CANCELED:
      RCLCPP_ERROR(m_node->get_logger(), "Goal was canceled");
      executionFinished(TstML::Executor::ExecutionStatus::Aborted());
      return;
    default:
      RCLCPP_ERROR(m_node->get_logger(), "Unknown result code");
      executionFinished(TstML::Executor::ExecutionStatus::Aborted());
      return;
    }
  }

  TstML::Executor::ExecutionStatus pause() override
  {
    return TstML::Executor::ExecutionStatus::Running();
  }
  TstML::Executor::ExecutionStatus resume() override
  {
    return TstML::Executor::ExecutionStatus::Running();
  }
  TstML::Executor::ExecutionStatus stop() override
  {
    m_client_ptr->async_cancel_goal(m_goal_handle);
    return TstML::Executor::ExecutionStatus::Finished();
  }
  TstML::Executor::ExecutionStatus abort() override
  {
    m_client_ptr->async_cancel_goal(m_goal_handle);
    return TstML::Executor::ExecutionStatus::Aborted();
  }
private:
  std::shared_ptr<rclcpp::Node> m_node;
  rclcpp::executors::MultiThreadedExecutor m_executor;
  std::thread m_executor_thread;
  rclcpp_action::Client<Undock>::SharedPtr m_client_ptr;
  GoalHandleUndock::SharedPtr m_goal_handle;
};
```
Do not forget to add the `UndockExecutor` to the executor registry, like it was done for `seq` and `conc`.

## Test execution

You should now start your `tst_executor` ROS node, and then you can use a service call to execute your first TST:

```bash
ros2 service call /execute_tst air_lab_interfaces/srv/ExecuteTst "tst_file: '`ros2 pkg prefix air_tst`/share/air_tst/tsts/undock.json'"
```

You should see your robot undocking.
## Implement abort/stop/pause/resume

You should create services for `abort`, `stop`, `pause` and `resume`. They should be created in a similar way to the `execute_tst` service server. You can use `std_srvs/Empty` as the service type for those services.

For `abort`, `stop`, `pause` and `resume`, follow these steps (a sketch is shown after this list):

* Create a callback function, i.e., a new function in your class with a unique name (for instance `callback_abort`); it should take a `request` and a `response` as arguments.
* In that callback function you can call `abort`, `stop`, `pause` and `resume` on the `tst_executor` object you created when starting execution of the tree.
* In the constructor of your ROS node, use the `create_service` function to create a new service. Use the `create_service` call for `execute_tst` as an example.

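For Python, a minimal sketch of the `abort` service could look like this (the service name `abort` and the attribute names are just examples; `stop`, `pause` and `resume` follow the same pattern):

```python
import rclpy
from rclpy.node import Node
from std_srvs.srv import Empty


class TstExecutor(Node):
    def __init__(self):
        super().__init__('tst_executor')
        # self.tst_executor is set by execute_tst when a tree starts executing
        self.tst_executor = None
        self.abort_service = self.create_service(Empty, 'abort', self.callback_abort)

    def callback_abort(self, request, response):
        # Forward the request to the running TstML executor, if any
        if self.tst_executor is not None:
            self.tst_executor.abort()
        return response
```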
You should now try to call your services with `ros2 service call`. However, you will notice that the call hangs, and the execution is not aborted, stopped, paused or resumed. This is because, by default, ROS is single threaded and can only handle one request at a time. We need to use a `MultiThreadedExecutor` with a reentrant group.
### Python

Modify your main function to look like this, instead of using `rclpy.spin(some_node)`:

```python
executor = rclpy.executors.MultiThreadedExecutor()
executor.add_node(some_node)
executor.spin()
```

Create a reentrant group:

```python
self.group = rclpy.callback_groups.ReentrantCallbackGroup()
```

And when creating a service use:

```python
service = self.create_service(..., callback_group=self.group)
```

Note that the `execute_tst` call is not reentrant, and should not be added to the group.

Also, due to a bug in `rclpy` for the version of ROS installed on the lab computers, Python exceptions are not displayed when using a `MultiThreadedExecutor`. To work around that problem, you can use the `display_exceptions` decorator. At the top of the file for your ROS node, add the following:

```python
import traceback
import sys

def display_exceptions(func):
    def wrapper(*args):
        try:
            func(*args)
        except Exception as ex:
            traceback.print_exception(*sys.exc_info())
            print(f"Unhandled exception was caught: '{ex}'")
    return wrapper
```

Then just before the function used to define your service callback, add `@display_exceptions`, so that it looks like this:

```python
@display_exceptions
def execute_tst_callback(self, request, response):
```

You can also check the `UndockExecutor` for an example of use of `@display_exceptions`.
### C++

Modify your main function to look like this, instead of using `rclcpp::spin(some_node)`:

```c++
rclcpp::executors::MultiThreadedExecutor executor;
auto node = std::make_shared<TstExecutor>();
executor.add_node(node);
executor.spin();
```

Create a reentrant group:

```c++
rclcpp::CallbackGroup::SharedPtr group = this->create_callback_group(rclcpp::CallbackGroupType::Reentrant);
```

And when creating a service use:

```c++
service = create_service<...>("pause_tst", ..., rmw_qos_profile_services_default, group);
```

Note that the `execute_tst` call is not reentrant, and should not be added to the group.
## Test cases

We provide some test cases:

```bash
tdde05-cd air_tst tsts
```

* `undock.json`: undock
* `dock.json`: dock
* `drive_to.json`: go to three positions in sequence
* `drive_to_repeat.json`: go to three positions in sequence and repeat once
* `drive_home.json`: go to near the home position and dock
* `explore.json`: execute a random exploration motion
* `drive_to_explore.json`: go to a start position and execute the random exploration

## Implementation of Executors

You need to implement the `dock`, `drive-to` and `explore` executors.

**Dock** Take inspiration from the `UndockExecutor` to create this executor. Essentially, duplicating `UndockExecutor` and replacing Undock with Dock is *almost* all you have to do. You can get the type of an action with `ros2 action info -t [action]`.

**Drive to** Drive to a given location and set the maximum velocity. To drive to a given location, you should send a goal to the navigate action server, like you did in *Lab 2*. In this lab you will also need to set the maximum speed by publishing on the `/speed_limit` topic. You can use `ros2 topic info` to get information about a topic, including its type, and `ros2 interface show` to show the definition of a message type. `drive-to` has three parameters that you need to get: the destination `p`, the `heading` and the `maximum-speed`.

To get parameters for a TST node, in Python:
```python
self.node().getParameter(TstML.TSTNode.ParameterType.Specific, "nameoftheparameter")
```

and in C++:

```c++
QVariant p = node()->getParameter(TstML::TSTNode::ParameterType::Specific, "nameoftheparameter");
p.toString(); // convert it to a string
p.toInt(); // convert it to an integer
p.toDouble(); // convert it to a double
```

And to test whether a parameter is available, in Python:

```python
self.node().hasParameter(TstML.TSTNode.ParameterType.Specific, "nameoftheparameter")
```

and in C++:

```c++
node()->hasParameter(TstML::TSTNode::ParameterType::Specific, "nameoftheparameter");
```

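As an illustration, collecting the `drive-to` parameters in the Python executor could look like the sketch below. The helper function is hypothetical; only `getParameter` and `hasParameter` are part of the TstML API, and inside your executor you would call it as `read_drive_to_parameters(self.node())`:

```python
import TstML


def read_drive_to_parameters(tst_node):
    """Extract the drive-to parameters from a TstML.TSTNode (sketch)."""
    specific = TstML.TSTNode.ParameterType.Specific
    p = tst_node.getParameter(specific, "p")
    heading = None
    if tst_node.hasParameter(specific, "heading"):
        heading = tst_node.getParameter(specific, "heading")
    maximum_speed = None
    if tst_node.hasParameter(specific, "maximum-speed"):
        maximum_speed = tst_node.getParameter(specific, "maximum-speed")
    return p, heading, maximum_speed
```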
Heading is equivalent to yaw, but it needs to be specified as a quaternion. You can use the following formula to compute a quaternion from a yaw angle:

```math
(x = 0,\; y = 0,\; z = \sin(\mathit{yaw}/2),\; w = \cos(\mathit{yaw}/2))
```
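For example, in Python (assuming `heading` is given in radians and the goal pose expects a `geometry_msgs/Quaternion`):

```python
import math
from geometry_msgs.msg import Quaternion


def quaternion_from_heading(heading):
    # Pure yaw rotation: roll = pitch = 0
    q = Quaternion()
    q.x = 0.0
    q.y = 0.0
    q.z = math.sin(heading / 2.0)
    q.w = math.cos(heading / 2.0)
    return q
```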
**Explore** Random exploration. This should be very similar to the `RandomExploration` from *Lab 2*.

## Demonstration

* Show the execution of `drive_to_repeat.json`, `drive_to_explore.json` and `drive_home.json`