|
|
# Lab 3: Task Execution
|
|
|
|
|
|
The goal of this lab is to create an executor for Task-Specification-Trees. The TSTs are defined using the JSON format.
|
|
|
|
|
|
It is *important* to have studied **Lecture 07: Deliberative Architecture** before starting this lab.
|
|
|
|
|
|
## TST
|
|
|
|
|
|
A Task-Specification-Tree (TST) is a hierarchical tree structure which represents a complex mission. For the purpose of this lab, six types of TST nodes are used:
|
|
|
|
|
|
You can find the specifications of the TST nodes in `air_tst/configs`. On the lab computers the directory is `/courses/TDDE05/software/ros2_ws/src/air_tst/configs`, and you can quickly access it with:
|
|
|
|
|
|
```bash
|
|
|
tdde05-cd air_tst configs
|
|
|
```
|
|
|
|
|
|
Here is some documentation:
|
|
|
|
|
|
* `seq` executes a sequence of TST nodes
* `conc` executes a set of TST nodes concurrently; its `stop-on-first` parameter controls whether the node waits for all children to finish or stops as soon as the first child is done
* `dock` and `undock` dock or undock the robot
* `drive-to` sends a ground robot to a specific position; it has the following parameters:
  * `p`: the destination
  * `heading`: the heading the robot must face at the end
  * `maximum-speed`: the maximum speed at which the robot may drive
* `explore` drives the robot to random locations near its current location, according to the following parameters:
  * `radius`: the radius around the start location used to generate new locations. The start location is the position of the robot when the explore starts.
  * `count`: the number of random locations to visit
|
|
|
|
|
|
## TST Editor
|
|
|
|
|
|
An editor is available in the `air_tst` module, and can be started with:
|
|
|
|
|
|
```bash
|
|
|
ros2 run air_tst tst_editor
|
|
|
```
|
|
|
|
|
|
You can open a specific file by giving it as an argument; for instance, the following will open the `explore_record_semantic.json` TST:
|
|
|
|
|
|
```bash
|
|
|
tdde05-cd air_tst tsts
|
|
|
ros2 run air_tst tst_editor explore_record_semantic.json
|
|
|
```
|
|
|
|
|
|

|
|
|
|
|
|
## TST File format
|
|
|
|
|
|
TSTs are defined using the JSON file format.
|
|
|
|
|
|
Below is an example of an exploration mission, in which the robot first moves to the location $`(10,2)`$ and then explores 8 locations within a radius of $`10`$ while recording some data:
|
|
|
|
|
|
```json
|
|
|
{
|
|
|
"children": [
|
|
|
{
|
|
|
"name": "drive-to",
|
|
|
"params": {
|
|
|
"p": {
|
|
|
"rostype": "Point",
|
|
|
"x": 10,
|
|
|
"y": 2,
|
|
|
"z": 0
|
|
|
},
|
|
|
"use-motion-planner": false
|
|
|
}
|
|
|
},
|
|
|
{
|
|
|
"children": [
|
|
|
{
|
|
|
"name": "explore",
|
|
|
"params": {
|
|
|
"radius": 10,
|
|
|
"count": 8
|
|
|
}
|
|
|
},
|
|
|
{
|
|
|
"name": "record",
|
|
|
"params": {
|
|
|
"topics": "/husky0/lidar"
|
|
|
}
|
|
|
}
|
|
|
],
|
|
|
"name": "conc"
|
|
|
}
|
|
|
],
|
|
|
"name": "seq"
|
|
|
}
|
|
|
```
|
|
|
|
|
|
The `name` field indicates the type of TST node, `children` is used by the container TST nodes and contains other TST node definitions, and `params` contains the parameters of each TST node.
|
|
|
|
|
|
## TstML Library
|
|
|
|
|
|
The TstML library parses the specification files and the TST files, and provides basic functionality for manipulating TSTs and executing them. Examples of how to use the TstML library are presented in the lab instructions. If you have doubts about the behavior of a specific function, you can refer to the [API documentation for C++](https://www.ida.liu.se/~TDDE05/info/labs/2021/tstml_doc.pdf); the Python API is similar.
|
|
|
|
|
|
List of relevant classes for defining TST:
|
|
|
|
|
|
* `TSTNode`: represents an instance of a node in a TST.
|
|
|
* `TSTNodeModel`: represents the model (or specification) of a type of TST Node.
|
|
|
* `TSTNodeModelsRegistry`: a registry of `TSTNodeModel` objects, used to define TSTs.
|
|
|
|
|
|
List of relevant classes for executing TST:
|
|
|
|
|
|
* `AbstractNodeExecutor`: base class to define the execution of a `TSTNode`.
|
|
|
* `ExecutionStatus`: holds the execution status
|
|
|
* `Executor`: handles the execution of a tree.
|
|
|
* `NodeExecutorRegistry`: contains an association between the `TSTNodeModel` and `AbstractNodeExecutor`. It is used during the execution of a TST to instantiate `AbstractNodeExecutor`.
|
|
|
|
|
|
The `NodeExecutorRegistry` and `TSTNodeModelsRegistry` are common to all TSTs and should be kept for the lifetime of the node, while `TSTNode` and `Executor` are specific to the execution of a given tree.
|
|
|
|
|
|
## TST executor service definition
|
|
|
|
|
|
We will need to create a new service definition. You should check and understand the [official tutorial](https://docs.ros.org/en/galactic/Tutorials/Beginner-Client-Libraries/Custom-ROS2-Interfaces.html) first. It is good practice to create a standalone package for defining messages, so that it is independent of Python and C++. We will therefore create a package specifically for storing our definitions:
|
|
|
|
|
|
```bash
|
|
|
ros2 pkg create --build-type ament_cmake air_lab_interfaces --dependencies std_msgs rosidl_default_generators
|
|
|
```
|
|
|
|
|
|
Then you should create an `ExecuteTst.srv` file in `air_lab_interfaces/srv`, with the following content:
|
|
|
|
|
|
```
|
|
|
string tst_file # Name of the TST file to execute, it should be a full path
|
|
|
---
|
|
|
bool success # Whether the execution was successful
|
|
|
string error_message # Error message if the execution was unsuccessful
|
|
|
```
|
|
|
|
|
|
Then we need to add this file to the build system. In `air_lab_interfaces` open the `CMakeLists.txt` file, and add the following before `ament_package()`:
|
|
|
|
|
|
```cmake
|
|
|
rosidl_generate_interfaces(${PROJECT_NAME}
|
|
|
"srv/ExecuteTst.srv"
|
|
|
)
|
|
|
```
|
|
|
|
|
|
Then we need to add the following to the `package.xml` file:
|
|
|
|
|
|
```xml
|
|
|
<member_of_group>rosidl_interface_packages</member_of_group>
|
|
|
```
|
|
|
|
|
|
You can now build with `tdde05-build` and then reload the environment with `tdde05-start`. You should then be able to see the new service definition with:
|
|
|
|
|
|
```bash
|
|
|
ros2 interface show air_lab_interfaces/srv/ExecuteTst
|
|
|
```
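If the package was built and the environment reloaded correctly, the output should essentially echo the contents of `ExecuteTst.srv` defined above (exact formatting may vary between ROS versions):

```
string tst_file # Name of the TST file to execute, it should be a full path
---
bool success # Whether the execution was successful
string error_message # Error message if the execution was unsuccessful
```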
|
|
|
|
|
|
## Implementation
|
|
|
|
|
|
This lab can be completed using **Python** or **C++**. Follow the instructions corresponding to your choice of programming language.
|
|
|
|
|
|
You will need to create a package for `air_lab3`, in your `ros2_ws/src/labs` directory, with dependencies on a few packages. For Python:
|
|
|
|
|
|
```bash
|
|
|
ros2 pkg create air_lab3 --build-type ament_python --dependencies rclpy air_lab_interfaces --node-name tst_executor
|
|
|
```
|
|
|
|
|
|
For C++:
|
|
|
|
|
|
```bash
|
|
|
ros2 pkg create air_lab3 --build-type ament_cmake --dependencies rclcpp rclcpp_action air_lab_interfaces irobot_create_msgs --node-name tst_executor
|
|
|
```
|
|
|
|
|
|
## TST executor service server
|
|
|
|
|
|
We will create a service server whose goal is to load a TST description from a file (one of the JSON files) and execute it. In the remainder of the lab we will help you implement that service call; in this section, we will start by just setting up an empty service.
|
|
|
|
|
|
You should start by reading and understanding the official tutorials for [Python](https://docs.ros.org/en/galactic/Tutorials/Beginner-Client-Libraries/Writing-A-Simple-Py-Service-And-Client.html) or [C++](https://docs.ros.org/en/galactic/Tutorials/Beginner-Client-Libraries/Writing-A-Simple-Cpp-Service-And-Client.html).
|
|
|
|
|
|
You should follow an incremental approach:
|
|
|
|
|
|
* Copy/paste the service server example from the respective tutorials into your `tst_executor.py` or `tst_executor.cpp`. *We do not need a client; in this lab, we will use the command line to call the service*.
|
|
|
* Make sure you understand what is happening; if you have doubts, ask your lab assistant.
|
|
|
* You won't be able to run the tutorial example as-is, so start modifying it: rename the service to `execute_tst` and change its type to `air_lab_interfaces/srv/ExecuteTst` (a minimal Python skeleton for this step is sketched after this list).
|
|
|
* Once you have modified it, you can test calling your service with:
|
|
|
|
|
|
```bash
|
|
|
ros2 service call /execute_tst air_lab_interfaces/srv/ExecuteTst "tst_file: '`ros2 pkg prefix air_tst`/share/air_tst/tsts/undock.json'"
|
|
|
```
|
|
|
Note that we use *\`ros2 pkg prefix air_tst\`*, which expands to the full path of the prefix where `air_tst` is installed; the TST files are located under `share/air_tst/tsts` inside it.
|
|
|
* Don't forget to run `tst_executor` and to start it from your screen file.
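If you chose Python, a minimal skeleton for this step might look like the sketch below. It follows the pattern of the official tutorial adapted to our service; the class name and the placeholder callback body are our own choices.

```python
import rclpy
from rclpy.node import Node

from air_lab_interfaces.srv import ExecuteTst


class TstExecutor(Node):
    def __init__(self):
        super().__init__('tst_executor')
        # Service name and type as described above
        self.execute_tst_service = self.create_service(
            ExecuteTst, 'execute_tst', self.execute_tst_callback)

    def execute_tst_callback(self, request, response):
        # For now just acknowledge the request; loading and executing
        # the TST is added in the following sections
        self.get_logger().info(f'Requested TST file: {request.tst_file}')
        response.success = True
        response.error_message = ''
        return response


def main(args=None):
    rclpy.init(args=args)
    node = TstExecutor()
    rclpy.spin(node)
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```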
|
|
|
|
|
|
## Initialise TstML and load a TST file
|
|
|
|
|
|
First we need to load the TST definitions into a `TSTNodeModelsRegistry`, a class from the TstML library, which we are using to implement this lab.
|
|
|
|
|
|
### Python
|
|
|
|
|
|
You will need to create a `TstML.TSTNodeModelsRegistry` and use `loadDirectory` to load all the configuration files. It is best if this `tst_registry` is kept as a class member of your ROS node.
|
|
|
|
|
|
```python
|
|
|
import TstML
|
|
|
import ament_index_python
|
|
|
|
|
|
tst_registry = TstML.TSTNodeModelsRegistry()
|
|
|
tst_registry.loadDirectory(ament_index_python.get_package_prefix("air_tst") + "/share/air_tst/configs")
|
|
|
|
|
|
```
|
|
|
|
|
|
When you want to load a tree for execution from a file, for instance, from within your service callback, you can do the following:
|
|
|
|
|
|
```python
|
|
|
tst_node = TstML.TSTNode.load(filename, tst_registry)
|
|
|
```
|
|
|
|
|
|
### C++
|
|
|
|
|
|
You will need to create a `TstML::TSTNodeModelsRegistry` and use `loadDirectory` to load all the configuration files. It is best if this `tst_registry` is kept as a class member of your ROS node.
|
|
|
|
|
|
```c++
|
|
|
#include <TstML/TSTNodeModelsRegistry.h>
|
|
|
#include <ament_index_cpp/get_package_prefix.hpp>
|
|
|
|
|
|
TstML::TSTNodeModelsRegistry* tst_registry
|
|
|
= new TstML::TSTNodeModelsRegistry();
|
|
|
tst_registry->loadDirectory(
|
|
|
QString::fromStdString(ament_index_cpp::get_package_prefix("air_tst")
|
|
|
+ "/share/air_tst/configs"));
|
|
|
```
|
|
|
|
|
|
When you want to load a tree for execution from a file, for instance, from within your service callback, you can do the following:
|
|
|
```c++
|
|
|
TstML::TSTNode* tst_node = TstML::TSTNode::load(
|
|
|
QUrl::fromLocalFile(QString::fromStdString(filename)),
|
|
|
tst_registry);
|
|
|
```
|
|
|
|
|
|
To be able to compile, you need to add the following to `CMakeLists.txt`:
|
|
|
|
|
|
```cmake
|
|
|
find_package(TstML REQUIRED)
|
|
|
...
|
|
|
target_link_libraries(tst_executor TstMLExecutor)
|
|
|
```
|
|
|
|
|
|
## Initialise TstML::Executor and start execution
|
|
|
|
|
|
### Python
|
|
|
|
|
|
First you need to create a `NodeExecutorRegistry`, and use the `registerNodeExecutor` function to associate `TSTNodeModel` with `AbstractNodeExecutor`. We provide you with a default executor for `sequence` and `concurrent`.
|
|
|
|
|
|
```python
|
|
|
import TstML.Executor
|
|
|
|
|
|
# Create a registry with node executors
|
|
|
tst_executor_registry = TstML.Executor.NodeExecutorRegistry()
|
|
|
|
|
|
# Setup the executors for sequence and concurrent
|
|
|
tst_executor_registry.registerNodeExecutor(
|
|
|
tst_registry.model("seq"),
|
|
|
TstML.Executor.DefaultNodeExecutor.Sequence)
|
|
|
tst_executor_registry.registerNodeExecutor(
|
|
|
tst_registry.model("conc"),
|
|
|
TstML.Executor.DefaultNodeExecutor.Concurrent)
|
|
|
```
|
|
|
|
|
|
When you want to execute a tree, for instance, from within your service callback, you can do the following:
|
|
|
|
|
|
|
|
|
```python
|
|
|
# Create an executor using the executors defined
|
|
|
# in tst_executor_registry
|
|
|
tst_executor = TstML.Executor.Executor(tst_node,
|
|
|
tst_executor_registry)
|
|
|
|
|
|
# Start execution
|
|
|
tst_executor.start()
|
|
|
|
|
|
# Block until the execution has finished
|
|
|
status = tst_executor.waitForFinished()
|
|
|
|
|
|
# Display the result of execution
|
|
|
if status.type() == TstML.Executor.ExecutionStatus.Type.Finished:
|
|
|
print("Execution successful")
|
|
|
elif status.type() == TstML.Executor.ExecutionStatus.Type.Failed:
|
|
|
print("Execution failed: {}".format(status.message()))
|
|
|
else:
|
|
|
print("Execution failed!")
|
|
|
```
|
|
|
|
|
|
Instead of displaying the results, you should set the result in the response of your service call.
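For example, in Python the end of your `execute_tst_callback` might look like this sketch; the field names `success` and `error_message` come from the `ExecuteTst.srv` defined earlier.

```python
# End of execute_tst_callback: translate the ExecutionStatus into the response
if status.type() == TstML.Executor.ExecutionStatus.Type.Finished:
    response.success = True
    response.error_message = ""
else:
    response.success = False
    response.error_message = str(status.message())
return response
```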
|
|
|
|
|
|
### C++
|
|
|
|
|
|
First you need to create a `NodeExecutorRegistry`, and use the `registerNodeExecutor` function to associate `TSTNodeModel` with `AbstractNodeExecutor`. We provide you with a default executor for `sequence` and `concurrent`.
|
|
|
|
|
|
```c++
|
|
|
#include <TstML/Executor/NodeExecutorRegistry.h>
|
|
|
#include <TstML/Executor/DefaultNodeExecutor/Sequence.h>
|
|
|
#include <TstML/Executor/DefaultNodeExecutor/Concurrent.h>
|
|
|
|
|
|
// Create a registry with TST node executors
|
|
|
TstML::Executor::NodeExecutorRegistry* tst_executor_registry
|
|
|
= new TstML::Executor::NodeExecutorRegistry();
|
|
|
|
|
|
// Setup the executors for sequence and concurrent
|
|
|
tst_executor_registry->registerNodeExecutor
|
|
|
<TstML::Executor::DefaultNodeExecutor::Sequence>
|
|
|
(tst_registry->model("seq"));
|
|
|
tst_executor_registry->registerNodeExecutor
|
|
|
<TstML::Executor::DefaultNodeExecutor::Concurrent>
|
|
|
(tst_registry->model("conc"));
|
|
|
```
|
|
|
|
|
|
When you want to execute a tree, for instance, from within your service callback, you can do the following:
|
|
|
|
|
|
|
|
|
```c++
|
|
|
// Create an executor using the executors defined
|
|
|
// in tst_executor_registry
|
|
|
TstML::Executor::Executor* tst_executor
|
|
|
= new TstML::Executor::Executor(tst_node, tst_executor_registry);
|
|
|
|
|
|
// Start execution
|
|
|
tst_executor->start();
|
|
|
|
|
|
// Block until the execution has finished
|
|
|
TstML::Executor::ExecutionStatus status = tst_executor->waitForFinished();
|
|
|
|
|
|
// Display the result of execution
|
|
|
if(status == TstML::Executor::ExecutionStatus::Finished())
|
|
|
{
|
|
|
ROS_INFO("Execution successful");
|
|
|
}
|
|
|
else
|
|
|
{
|
|
|
std::string message = status.message().toStdString();
|
|
|
ROS_INFO("Execution successful '%s'", message.c_str());
|
|
|
}
|
|
|
|
|
|
delete tst_executor;
|
|
|
```
|
|
|
|
|
|
Instead of displaying the results, you should set the result in the response of your service call.
|
|
|
|
|
|
## Implement a subclass of a TstML::AbstractNodeExecutor
|
|
|
|
|
|
The main goal of the lab is to implement executors that are subclasses of `AbstractNodeExecutor` for the terminal TST nodes. In this section, we give you the implementation of the `undock` executor. **It is not enough to copy the file, you need to understand what every line does. Once you do, call the lab assistant and explain what you think every line does.**
|
|
|
|
|
|
### Python
|
|
|
|
|
|
Executors should inherit from the `TstML.Executor.AbstractNodeExecutor` class and need to implement the `start`, `pause`, `resume`, `stop` and `abort` functions. They can access parameters of the TST node in this way:
|
|
|
|
|
|
```python
|
|
|
self.node().getParameter(TstML.TSTNode.ParameterType.Specific, "nameoftheparameter")
|
|
|
```
|
|
|
|
|
|
The following shows an example implementation for `undock`:
|
|
|
|
|
|
```python
|
|
|
import sys
import traceback
import threading
|
|
|
import rclpy
|
|
|
from rclpy.node import Node
|
|
|
from rclpy.action import ActionClient
|
|
|
import rclpy.executors
|
|
|
|
|
|
import TstML
|
|
|
import TstML.Executor
|
|
|
|
|
|
from irobot_create_msgs.action import Undock
|
|
|
|
|
|
import ament_index_python
|
|
|
|
|
|
# Work around for ROS not showing error with a MultiThreadedExecutor
|
|
|
def display_exceptions(func):
|
|
|
def wrapper(*args):
|
|
|
try:
|
|
|
return func(*args)
|
|
|
except Exception as ex:
|
|
|
traceback.print_exception(*sys.exc_info())
|
|
|
print(f"Unhandled exception was caught: '{ex}'")
|
|
|
return wrapper
|
|
|
|
|
|
# Ugly hack to get a new name for each ROS node
|
|
|
ros_name_counter = 0
|
|
|
def gen_name(name):
|
|
|
global ros_name_counter
|
|
|
ros_name_counter += 1
|
|
|
return name + str(ros_name_counter)
|
|
|
|
|
|
class UndockExecutor(TstML.Executor.AbstractNodeExecutor):
|
|
|
def __init__(self, node, context):
|
|
|
super(TstML.Executor.AbstractNodeExecutor, self).__init__(node,
|
|
|
context)
|
|
|
|
|
|
self.ros_node = Node(gen_name("undock_node"))
|
|
|
self._action_client = ActionClient(self.ros_node, Undock, 'undock')
|
|
|
self.executor = rclpy.executors.MultiThreadedExecutor()
|
|
|
self.executor.add_node(self.ros_node)
|
|
|
self.thread = threading.Thread(target=self.executor.spin)
|
|
|
self.thread.start()
|
|
|
|
|
|
def finalise(self):
|
|
|
self.executor.shutdown()
|
|
|
|
|
|
def start(self):
|
|
|
goal_msg = Undock.Goal()
|
|
|
self._send_goal_future = self._action_client.send_goal_async(goal_msg, self.feedback_callback)
|
|
|
self._send_goal_future.add_done_callback(self.goal_response_callback)
|
|
|
return TstML.Executor.ExecutionStatus.Started()
|
|
|
|
|
|
def feedback_callback(self, feedback_msg):
|
|
|
print(feedback_msg.feedback)
|
|
|
|
|
|
@display_exceptions
|
|
|
def goal_response_callback(self, future):
|
|
|
self._goal_handle = future.result()
|
|
|
if not self._goal_handle.accepted:
|
|
|
self.executionFinished(TstML.Executor.ExecutionStatus.Aborted())
|
|
|
self.ros_node.get_logger().error('Goal rejected :(')
|
|
|
else:
|
|
|
self.ros_node.get_logger().error('Goal accepted :)')
|
|
|
|
|
|
self._get_result_future = self._goal_handle.get_result_async()
|
|
|
self._get_result_future.add_done_callback(self.handle_result_callback)
|
|
|
|
|
|
@display_exceptions
|
|
|
def handle_result_callback(self, future):
|
|
|
print("Finished!")
|
|
|
self.executionFinished(TstML.Executor.ExecutionStatus.Finished())
|
|
|
|
|
|
def pause(self):
|
|
|
self.ros_node.get_logger().info('Pause is not possible.')
|
|
|
return TstML.Executor.ExecutionStatus.Running()
|
|
|
def resume(self):
|
|
|
return TstML.Executor.ExecutionStatus.Running()
|
|
|
def stop(self):
|
|
|
self._goal_handle.cancel_goal()
|
|
|
return TstML.Executor.ExecutionStatus.Finished()
|
|
|
def abort(self):
|
|
|
self._goal_handle.cancel_goal()
|
|
|
return TstML.Executor.ExecutionStatus.Aborted()
|
|
|
```
|
|
|
|
|
|
Do not forget to add the `UndockExecutor` to the executor registry, like it was done for `seq` and `conc`. If you want to import the class `NameOfClass` from a Python file called `name_of_file.py`, you can do the following:
|
|
|
|
|
|
```python
|
|
|
from .name_of_file import NameOfClass
|
|
|
```
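The registration follows the same pattern as for `seq` and `conc`; assuming the corresponding node model is called `undock` in the configuration files, it might look like this:

```python
tst_executor_registry.registerNodeExecutor(
    tst_registry.model("undock"),
    UndockExecutor)
```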
|
|
|
|
|
|
### C++
|
|
|
|
|
|
Executors should inherit from the `AbstractNodeExecutor` class and need to implement the `start`, `pause`, `resume`, `stop` and `abort` functions. They can access parameters of the TST node in this way:
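```c++
QVariant p = node()->getParameter(TstML::TSTNode::ParameterType::Specific, "nameoftheparameter");
```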
|
|
|
|
|
|
|
|
|
The following shows an example implementation for `undock`:
|
|
|
|
|
|
```c++
|
|
|
#include <functional>
|
|
|
#include <future>
|
|
|
#include <memory>
|
|
|
#include <string>
|
|
|
#include <sstream>
|
|
|
#include <thread>
|
|
|
|
|
|
#include "rclcpp/rclcpp.hpp"
|
|
|
#include "rclcpp_action/rclcpp_action.hpp"
|
|
|
|
|
|
#include "irobot_create_msgs/action/undock.hpp"
|
|
|
|
|
|
#include <TstML/Executor/AbstractNodeExecutor.h>
|
|
|
|
|
|
#include <TstML/TSTNode.h>
|
|
|
|
|
|
// Ugly hack to get a new name for each ROS node
|
|
|
std::string gen_name(const std::string& _name)
|
|
|
{
|
|
|
static int counter = 0;
|
|
|
return _name + std::to_string(++counter);
|
|
|
}
|
|
|
|
|
|
class UndockExecutor : public TstML::Executor::AbstractNodeExecutor
|
|
|
{
|
|
|
public:
|
|
|
using Undock = irobot_create_msgs::action::Undock;
|
|
|
using GoalHandleUndock = rclcpp_action::ClientGoalHandle<Undock>;
|
|
|
|
|
|
UndockExecutor(const TstML::TSTNode* _node,
|
|
|
TstML::Executor::AbstractExecutionContext* _context) :
|
|
|
TstML::Executor::AbstractNodeExecutor(_node, _context)
|
|
|
{
|
|
|
m_node = rclcpp::Node::make_shared(gen_name("undock_node"));
|
|
|
m_executor.add_node(m_node);
|
|
|
m_executor_thread = std::thread([this]() { m_executor.spin(); });
|
|
|
|
|
|
m_client_ptr = rclcpp_action::create_client<Undock>(m_node, "undock");
|
|
|
}
|
|
|
~UndockExecutor()
|
|
|
{
|
|
|
m_executor.cancel();
|
|
|
m_executor_thread.join();
|
|
|
}
|
|
|
TstML::Executor::ExecutionStatus start() override
|
|
|
{
|
|
|
using namespace std::placeholders;
|
|
|
|
|
|
auto goal_msg = Undock::Goal();
|
|
|
|
|
|
auto send_goal_options = rclcpp_action::Client<Undock>::SendGoalOptions();
|
|
|
send_goal_options.goal_response_callback =
|
|
|
std::bind(&UndockExecutor::goal_response_callback, this, _1);
|
|
|
send_goal_options.feedback_callback =
|
|
|
std::bind(&UndockExecutor::feedback_callback, this, _1, _2);
|
|
|
send_goal_options.result_callback =
|
|
|
std::bind(&UndockExecutor::result_callback, this, _1);
|
|
|
m_client_ptr->async_send_goal(goal_msg, send_goal_options);
|
|
|
|
|
|
return TstML::Executor::ExecutionStatus::Started();
|
|
|
}
|
|
|
|
|
|
void goal_response_callback(std::shared_future<GoalHandleUndock::SharedPtr> future)
|
|
|
{
|
|
|
m_goal_handle = future.get();
|
|
|
if (!m_goal_handle) {
|
|
|
executionFinished(TstML::Executor::ExecutionStatus::Aborted());
|
|
|
RCLCPP_ERROR(m_node->get_logger(), "Goal was rejected by server");
|
|
|
} else {
|
|
|
RCLCPP_INFO(m_node->get_logger(), "Goal accepted by server, waiting for result");
|
|
|
}
|
|
|
}
|
|
|
|
|
|
void feedback_callback(
|
|
|
GoalHandleUndock::SharedPtr goal_handle,
|
|
|
const std::shared_ptr<const Undock::Feedback> feedback)
|
|
|
{
|
|
|
}
|
|
|
|
|
|
void result_callback(const GoalHandleUndock::WrappedResult & result)
|
|
|
{
|
|
|
switch (result.code) {
|
|
|
case rclcpp_action::ResultCode::SUCCEEDED:
|
|
|
RCLCPP_INFO(m_node->get_logger(), "Goal was succeeded");
|
|
|
executionFinished(TstML::Executor::ExecutionStatus::Finished());
|
|
|
break;
|
|
|
case rclcpp_action::ResultCode::ABORTED:
|
|
|
RCLCPP_ERROR(m_node->get_logger(), "Goal was aborted");
|
|
|
executionFinished(TstML::Executor::ExecutionStatus::Aborted());
|
|
|
return;
|
|
|
case rclcpp_action::ResultCode::CANCELED:
|
|
|
RCLCPP_ERROR(m_node->get_logger(), "Goal was canceled");
|
|
|
executionFinished(TstML::Executor::ExecutionStatus::Aborted());
|
|
|
return;
|
|
|
default:
|
|
|
RCLCPP_ERROR(m_node->get_logger(), "Unknown result code");
|
|
|
executionFinished(TstML::Executor::ExecutionStatus::Aborted());
|
|
|
return;
|
|
|
}
|
|
|
}
|
|
|
|
|
|
TstML::Executor::ExecutionStatus pause() override
|
|
|
{
|
|
|
return TstML::Executor::ExecutionStatus::Running();
|
|
|
}
|
|
|
TstML::Executor::ExecutionStatus resume() override
|
|
|
{
|
|
|
return TstML::Executor::ExecutionStatus::Running();
|
|
|
}
|
|
|
TstML::Executor::ExecutionStatus stop() override
|
|
|
{
|
|
|
m_client_ptr->async_cancel_goal(m_goal_handle);
|
|
|
return TstML::Executor::ExecutionStatus::Finished();
|
|
|
}
|
|
|
TstML::Executor::ExecutionStatus abort() override
|
|
|
{
|
|
|
m_client_ptr->async_cancel_goal(m_goal_handle);
|
|
|
return TstML::Executor::ExecutionStatus::Aborted();
|
|
|
}
|
|
|
private:
|
|
|
std::shared_ptr<rclcpp::Node> m_node;
|
|
|
rclcpp::executors::MultiThreadedExecutor m_executor;
|
|
|
std::thread m_executor_thread;
|
|
|
rclcpp_action::Client<Undock>::SharedPtr m_client_ptr;
|
|
|
GoalHandleUndock::SharedPtr m_goal_handle;
|
|
|
};
|
|
|
```
|
|
|
|
|
|
Do not forget to add the `UndockExecutor` to the executor registry, like it was done for `seq` and `conc`.
|
|
|
|
|
|
## Test execution
|
|
|
|
|
|
You should now start your `tst_executor` ROS node, and we can use a service call to execute our first TST:
|
|
|
|
|
|
```bash
|
|
|
ros2 service call /execute_tst air_lab_interfaces/srv/ExecuteTst "tst_file: '`ros2 pkg prefix air_tst`/share/air_tst/tsts/undock.json'"
|
|
|
```
|
|
|
|
|
|
You should see your robot undocking.
|
|
|
|
|
|
## Implement abort/stop/pause/resume
|
|
|
|
|
|
You should create services for `abort`, `stop`, `pause` and `resume`. They should be created in a similar way as you created the `execute_tst` service server. We can use `std_srvs/Empty` as a service type for those services.
|
|
|
|
|
|
For `abort`, `stop`, `pause` and `resume`, follow those steps:
|
|
|
|
|
|
* Create a callback function, i.e., a new function in your class with a unique name (for instance `callback_abort`); it should take a `request` and a `response` as arguments (a minimal Python sketch follows this list)
|
|
|
* In that callback function you can use `abort`, `stop`, `pause` and `resume` on the `tst_executor` object you created when starting execution of the tree.
|
|
|
* In the constructor of your ROS Node, you can use the `create_service` function to create a new service. Use the `create_service` of `execute_tst` as an example.
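For example, in Python the `pause` service could look like the sketch below. It assumes the `Executor` created in `execute_tst_callback` was stored as `self.tst_executor` (that attribute name is our own choice); the other callbacks are analogous.

```python
from std_srvs.srv import Empty  # suggested service type

# In the constructor of your ROS node (the callback group is explained below):
#   self.pause_service = self.create_service(
#       Empty, 'pause_tst', self.pause_callback, callback_group=self.group)

# Method of your ROS node class:
def pause_callback(self, request, response):
    # Pause the currently running TST, if any
    if self.tst_executor is not None:
        self.tst_executor.pause()
    return response
```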
|
|
|
|
|
|
You should now try to call your services with `ros2 service call`. However, you will notice that the call hangs and the execution is not aborted, stopped, paused or resumed. This is because, by default, ROS is single-threaded and only handles one request at a time. We need to use a `MultiThreadedExecutor` with a reentrant callback group.
|
|
|
|
|
|
### Python
|
|
|
|
|
|
Modify your main function to look like this, instead of using `rclpy.spin(some_node)`:
|
|
|
|
|
|
```python
|
|
|
executor = rclpy.executors.MultiThreadedExecutor()
|
|
|
executor.add_node(some_node)
|
|
|
executor.spin()
|
|
|
```
|
|
|
|
|
|
Create a reentrant group:
|
|
|
|
|
|
|
|
|
```python
|
|
|
group = rclpy.callback_groups.ReentrantCallbackGroup()
|
|
|
```
|
|
|
|
|
|
And when creating a service use:
|
|
|
|
|
|
```python
|
|
|
service = self.create_service(..., callback_group=self.group)
|
|
|
```
|
|
|
|
|
|
Note that the `execute_tst` callback is not reentrant, and should not be added to the group.
|
|
|
|
|
|
Also, due to a bug in `rclpy` in the version of ROS installed on the lab computers, Python exceptions are not displayed when using a `MultiThreadedExecutor`. To work around that problem, it is possible to use the `display_exceptions` decorator; at the top of the file for your ROS node, add the following:
|
|
|
|
|
|
```python
|
|
|
import traceback
|
|
|
import sys
|
|
|
|
|
|
def display_exceptions(func):
|
|
|
def wrapper(*args):
|
|
|
try:
|
|
|
func(*args)
|
|
|
except Exception as ex:
|
|
|
traceback.print_exception(*sys.exc_info())
|
|
|
print(f"Unhandled exception was caught: '{ex}'")
|
|
|
return wrapper
|
|
|
```
|
|
|
|
|
|
Then just before the function used to define your service callback, add `@display_exceptions`, so that it looks like this:
|
|
|
|
|
|
```python
|
|
|
@display_exceptions
|
|
|
def execute_tst_callback(self, request, response):
|
|
|
```
|
|
|
|
|
|
You can also check the `UndockExecutor` for an example of use of `@display_exceptions`.
|
|
|
|
|
|
### C++
|
|
|
|
|
|
Modify your main function to look like this, instead of using `rclcpp::spin(some_node)`:
|
|
|
|
|
|
```c++
|
|
|
rclcpp::executors::MultiThreadedExecutor executor;
|
|
|
auto node = std::make_shared<TstExecutor>();
|
|
|
executor.add_node(node);
|
|
|
executor.spin();
|
|
|
```
|
|
|
|
|
|
Create a reentrant group:
|
|
|
|
|
|
```c++
|
|
|
rclcpp::CallbackGroup::SharedPtr group = this->create_callback_group(rclcpp::CallbackGroupType::Reentrant);
|
|
|
```
|
|
|
|
|
|
And when creating a service use:
|
|
|
|
|
|
```c++
|
|
|
service = create_service<...>("pause_tst", ..., rmw_qos_profile_services_default, group);
|
|
|
```
|
|
|
|
|
|
Note that the `execute_tst` callback is not reentrant, and should not be added to the group.
|
|
|
|
|
|
## Test cases
|
|
|
|
|
|
We provide some test cases:
|
|
|
|
|
|
```bash
|
|
|
tdde05-cd air_tst tsts
|
|
|
```
|
|
|
|
|
|
* `undock.json`: undock
|
|
|
* `dock.json`: dock
|
|
|
* `drive_to.json`: go to three positions in sequence
|
|
|
* `drive_to_repeat.json`: go to three positions in sequence and repeat once
|
|
|
* `drive_home.json`: go to near the home position and dock
|
|
|
* `explore.json`: execute random exploration motion
|
|
|
* `drive_to_explore.json`: go to a start position and execute the random exploration
|
|
|
|
|
|
## Implementation of Executors
|
|
|
|
|
|
You need to implement the `dock`, `drive-to` and `explore` executors.
|
|
|
|
|
|
**Dock** Take inspiration from the `UndockExecutor` to create this executor. Essentially, duplicating `UndockExecutor` and replacing Undock with Dock is *almost* all you have to do. You can get the type of an action with `ros2 action info -t [action]`.
|
|
|
|
|
|
**Drive to** Drive to a given location and set the maximum velocity. To drive to a given location, you should send a goal to the navigate action server, like you did in *Lab 2*. In this lab you will also need to set the maximum speed by publishing on the `/speed_limit` topic. You can use `ros2 topic info` to get information about a topic, including its type, and `ros2 interface show` to display the definition of that type. `drive-to` has three parameters that you need to read: the destination `p`, the `heading` and the `maximum-speed`.
|
|
|
|
|
|
To get parameters of a TST node, in Python:
|
|
|
|
|
|
```python
|
|
|
self.node().getParameter(TstML.TSTNode.ParameterType.Specific, "nameoftheparameter")
|
|
|
```
|
|
|
|
|
|
and in C++:
|
|
|
|
|
|
```c++
|
|
|
QVariant p = node()->getParameter(TstML::TSTNode::ParameterType::Specific, "nameoftheparameter");
|
|
|
p.toString(); // convert it to a string
|
|
|
p.toInt(); // Convert it to an integer
|
|
|
p.toDouble(); // Convert it to a double
|
|
|
```
|
|
|
|
|
|
And to test if a parameter is available:
|
|
|
|
|
|
```python
|
|
|
self.node().hasParameter(TstML.TSTNode.ParameterType.Specific, "nameoftheparameter")
|
|
|
```
|
|
|
|
|
|
and in C++:
|
|
|
|
|
|
```c++
|
|
|
node()->hasParameter(TstML::TSTNode::ParameterType::Specific, "nameoftheparameter");
|
|
|
```
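Putting `hasParameter` and `getParameter` together, reading the optional `maximum-speed` parameter in Python might look like this sketch:

```python
# Read maximum-speed only if it was set in the TST file
maximum_speed = None
if self.node().hasParameter(TstML.TSTNode.ParameterType.Specific, "maximum-speed"):
    maximum_speed = self.node().getParameter(TstML.TSTNode.ParameterType.Specific, "maximum-speed")
```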
|
|
|
|
|
|
The heading is equivalent to the yaw. However, it needs to be specified as a quaternion; you can use the following formula to compute a quaternion from a yaw angle:
|
|
|
|
|
|
```math
|
|
|
(x=0,\; y=0,\; z=\sin(\mathrm{yaw}/2),\; w=\cos(\mathrm{yaw}/2))
|
|
|
```
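For instance, in Python the heading could be turned into the orientation of your navigation goal as in the sketch below (assuming the goal pose uses `geometry_msgs/Quaternion`):

```python
import math
from geometry_msgs.msg import Quaternion

def heading_to_quaternion(yaw):
    """Quaternion for a pure yaw rotation, using the formula above."""
    q = Quaternion()
    q.x = 0.0
    q.y = 0.0
    q.z = math.sin(yaw / 2.0)
    q.w = math.cos(yaw / 2.0)
    return q
```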
|
|
|
|
|
|
**Explore** Random exploration. This should be very similar to the `RandomExploration` from *Lab 2*.
|
|
|
|
|
|
## Demonstration
|
|
|
|
|
|
* Show the execution of `drive_to_repeat.json`, `drive_to_explore.json` and `drive_home.json`