This repository was archived by the owner on Apr 24, 2023. It is now read-only.
1 change: 1 addition & 0 deletions documents/README.md
Original file line number Diff line number Diff line change
@@ -3,6 +3,7 @@ This folder contains research and design documents for the repo.

## Research Docs
* [Barrel detection](research/barrel_detection.md)
* [IOP](research/iop.md)

## Design Docs
* [Mapping refactor](design/mapping.md)
43 changes: 43 additions & 0 deletions documents/design/NM_ray_casting.md
@@ -0,0 +1,43 @@
# Stop raycasting at first obstacle

*Issue #543*

**Author:**
- Vikram Tholakapalli

## The Problem

The current raycasting continues out to the max range. It should instead stop at the first obstacle in the map that lies within the raycasting radius, since an obstacle inside the minimum range also produces a "miss".
Essentially, this extends our LiDAR handling into the minimum range: if there is an object inside the minimum range, the scan should show that the arc behind that object is blocked. This matters because right now, if an object is closer than the LiDAR's minimum range, the LiDAR reports no object at all, so the map shows open space in that region, which misleads the path planning.

## Proposed Solution

1. Before we scan from the endpoint to the startpoint, we need to make sure there is no object within the minimum range.
2. This code belongs in the LiDAR layer, specifically in `insertFreeSpace`.
3. Get the robot's current position.
4. Iterate along the line from the robot's origin out to `minRange`. If an occupied node exists in this range, mark `minRange` as occupied.
5. If nothing in this range is occupied, then scan from `minRange` to the endpoint.
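The blocking check in steps 4–5 can be sketched roughly as follows. This is a minimal illustration using a made-up stand-in grid type; the real implementation would use the map types already in `lidar_layer.cpp`:

```cpp
#include <cmath>
#include <vector>

// Hypothetical minimal occupancy grid: true = occupied.
// Stand-in for the project's real map type.
struct Grid {
  int width, height;
  double resolution;  // meters per cell
  std::vector<bool> occupied;
  bool at(int x, int y) const { return occupied[y * width + x]; }
};

// Walk cells from the robot's origin (ox, oy) along `angle`, stopping at
// min_range. Returns true if any cell in [0, min_range) is occupied, in
// which case the caller should mark minRange as occupied and skip the
// free-space scan behind it.
bool blockedWithinMinRange(const Grid& grid, double ox, double oy,
                           double angle, double min_range) {
  const double step = grid.resolution / 2.0;  // sample at half-cell spacing
  for (double r = 0.0; r < min_range; r += step) {
    int cx = static_cast<int>((ox + r * std::cos(angle)) / grid.resolution);
    int cy = static_cast<int>((oy + r * std::sin(angle)) / grid.resolution);
    if (cx < 0 || cy < 0 || cx >= grid.width || cy >= grid.height) break;
    if (grid.at(cx, cy)) return true;
  }
  return false;
}
```

Here the step-size and bounds handling are guesses; the real ray-tracing helper in the LiDAR layer should be reused rather than rewritten.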


## Questions & Research
- I need a bag file I can test my changes on to see whether my solution works.

### Affected Packages

- What parts of the software will you have to change (if any)?
- I'll have to change the `LidarLayer::insertFreeSpace` method within `lidar_layer.cpp`
- Which packages are relevant to the success of the project?
- the `igvc_navigation` package


### Schedule

1. Subtask 1: Figure out how to get bag files to work and fully understand the code base. (By Nov 27th)
2. Subtask 2: Get it to iterate from the robot's origin to the mapping point and figure out if this range is blocked. (By Nov 27th)
3. Subtask 3: If the range is not blocked scan from startpoint to endpoint. (By Dec 4th)
4. Subtask 4: Fix bugs + extra time for unexpected problems (By Dec 4th)

Code Review (Date): Dec 11th

*(School work will change over the semester and software often takes longer than you expect to write,
so don't worry if you fall behind. Just be sure to update this document with your progress.)*
60 changes: 60 additions & 0 deletions documents/design/cone_of_shame.md
@@ -0,0 +1,60 @@
# Cone of Shame

*Issue #571*

**Author:**
- Jiajun Mao

## The Problem

In last year's IGVC competition, Jessie turned 180 degrees before starting to travel along the course. This was presumably caused by the current navigation model ignoring everything behind Jessie and defaulting to the assumption that nothing exists there, which led the path planning algorithm to conclude that the shortest path was to drive backward to the place where "no line exists".

## Proposed Solution

- The problem could potentially be solved by creating a semicircle behind the robot, with a diameter equal to the distance between the left and right lines of the course, and making the robot treat it as part of the line defining the course. The robot would then not travel backwards.
- How to measure the location of the cone
- Obtain the `layered_costmap_` from `costmap_2d::Layer`, which provides a `getPlugins` method that returns a vector of `boost::shared_ptr<Layer>`, with each shared pointer pointing to a layer
- From the vector of layers we can obtain the LineLayer representing the lines the robot is seeing around itself
- search within a small radius along that line for points that still belong to the same line, until reaching the line's endpoint
- when both the left and right line endpoints are reached, use a grid-based search algorithm to draw a line (or a cone) between the two points
- Create a new layer in the 2D cost map and artificially insert points on the line/cone created above to act as an obstacle behind the robot
- Create a new layer in igvc_navigation/mapper
- Incorporate the new layer into the cost map
- Create a ROS service that generates the new costmap layer
- call the service when the robot reaches a waypoint and at the start of the course
- the service will also contain a remove method that removes the cone created at the course start once the robot has reached the first waypoint [or after some other specific period of time]
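The semicircle construction described above might look roughly like the sketch below. All names here are made up for illustration (`coneBehindRobot` is not an existing function), and the actual costmap insertion is left out:

```cpp
#include <cmath>
#include <vector>

struct Point { double x, y; };

// Sample points on a semicircle "behind" the robot, assuming the robot sits
// at (cx, cy) facing `heading` (radians). `diameter` would be the measured
// distance between the left and right course lines. The returned points
// would then be inserted into the new costmap layer as artificial obstacles.
std::vector<Point> coneBehindRobot(double cx, double cy, double heading,
                                   double diameter, int num_points) {
  std::vector<Point> pts;
  const double radius = diameter / 2.0;
  // Sweep from heading + 90 deg to heading + 270 deg: the half-circle behind.
  for (int i = 0; i < num_points; ++i) {
    double t = heading + M_PI / 2.0 + M_PI * i / (num_points - 1);
    pts.push_back({cx + radius * std::cos(t), cy + radius * std::sin(t)});
  }
  return pts;
}
```

The sampling density (`num_points`) would need to match the costmap resolution so the arc forms a closed barrier with no gaps between cells.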


## Questions & Research

- ROS Service
- Similar to a subscriber/publisher pair, but a one-time call
- defined in a `.srv` file; a structure with request/response classes is generated from it
- A `ServiceClient` calls a service with:
- `ros::ServiceClient srvClient = nh.serviceClient<ServiceType>("service_name");`
- `srvClient.call(serviceInstantiation);` [returns true if the service call succeeds and `serviceInstantiation` is filled in, false otherwise]
- The service contains a callback method used to process each request as it is received. The server is created with:
- `ros::ServiceServer srvServer = nh.advertiseService("service_name", callback);`

- Costmap2d layer
- A map holding discretized information about the world the robot is in
- consists of a flattened array representing a two-dimensional map
- contains useful functions such as:
- `void GridLayer::updateCosts` - updates the costs in the master grid map according to the values in this layer
- `void GridLayer::updateBounds` - expands the cost map layer to include new cells the robot occupies
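The flattened-array layout mentioned above maps each 2D cell to a 1D index in row-major order; a small standalone sketch (if I'm reading the source right, `costmap_2d` exposes essentially this via `Costmap2D::getIndex`):

```cpp
#include <cstdint>
#include <vector>

// Row-major flattening used by occupancy grids: cell (x, y) lives at
// index y * width + x. Shown standalone here for illustration only.
struct FlatGrid {
  unsigned width, height;
  std::vector<uint8_t> cost;  // one byte of cost per cell

  FlatGrid(unsigned w, unsigned h) : width(w), height(h), cost(w * h, 0) {}
  unsigned index(unsigned x, unsigned y) const { return y * width + x; }
  uint8_t& at(unsigned x, unsigned y) { return cost[index(x, y)]; }
};
```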

## Overall Scope

### Affected Packages

- igvc_navigation/src/mapper


### Schedule

Subtask 1 (11/24/2019): Finish generating the line/cone

Subtask 2 (11/28/2019): Modify mapper.cpp to insert the new layer

Code Review (12/2/2019): Most likely I would have everything figured out by then and ready for review. Hopefully it could be sooner.
3 changes: 2 additions & 1 deletion documents/research/README.md
@@ -1,2 +1,3 @@
# Research Docs
* [Barrel detection](barrel_detection.md)
* [Barrel detection](barrel_detection.md)
* [IOP](iop.md)
227 changes: 227 additions & 0 deletions documents/research/iop.md
@@ -0,0 +1,227 @@
# IOP Research

tl;dr: There's a repo `fkie/iop_core` that implements all the JAUS services we need to implement
for the IOP challenge, so all we need to do (software-wise) is create a "Conformance Verification
Tool" (CVT) to make sure that `fkie/iop_core` implements all the specs outlined in the rules.
Because most of the work has been done for us, I think that it is worthwhile for us to do IOP this
year.

**Author: Oswin So**

## Table of Contents
1. [Overview of IOP](#overview-of-iop)
2. [Tooling](#tooling)
3. [fkie/iop_core](#fkieiop_core)
4. [Tasks for other subteams](#tasks-for-other-subteams)
5. [My thoughts on IOP](#my-thoughts-on-iop)

## Overview of IOP
**IOP**, which stands for **I**nter**o**perability **P**rofiles, is a challenge for the IGVC competition.
Essentially, the challenge is in conforming to the **J**oint **A**rchitecture for **U**nmanned **S**ystems (JAUS),
which is an architecture for Unmanned Ground Systems started by the DOD in 1998.

JAUS defines a bunch of concepts that map very closely onto ROS.

The main concepts (IMO) in JAUS:
- JAUS is separated into **Subsystems**, **Nodes**, and **Components**
- A **subsystem** is a vehicle, robot, or operator control unit (OCU).
- A **node** is usually a single computing resource such as a PC or embedded processor
- A **component** is a software element that can communicate with other JAUS components.
So, like ROS nodes.
- **Server**: Provide one or more **services**
- **Client**: Uses one or more **services**
- There's a DSL (**J**AUS **S**ervice **I**nterface **D**efinition **L**anguage) for defining the layout of each **message**.
So like ROS messages.

## Tooling
AFAIK there are two main tools for **JAUS**:
1. [OpenJAUS](http://openjaus.com/)
- They don't list prices publicly
- [Manipal](http://www.igvc.org/design/2019/6.pdf) claims that this is better than JAUS Toolset
2. [JAUS Toolset (JTS)](http://jaustoolset.org/)
- It's actually open source.
- [Some German dudes](https://github.com/fkie/) made this [really nice framework that easily allows ROS nodes to
communicate with IOP services](https://github.com/fkie/iop_core)
- It generates boilerplate code for a component / service given the JSIDL (C++, C#, Java)

I don't know much about OpenJAUS. Given that their website looks more modern and they charge for it, their product is presumably
pretty good. JAUS Toolset, on the other hand, seems less modern and more clunky, and requires users to define services
and components by drawing a state machine.

![](http://jaustoolset.org/wp-content/uploads/2012/12/standards-990x397.png)


## `fkie/iop_core`
[fkie/iop_core](https://github.com/fkie/iop_core) is a framework that wraps around JTS.

> This repository lets your ROS software communicate with IOP services

I've tried it out, and it looks like quite a lot of work was put in to this project. You can read
their README (which is pretty good) for an overview of how the framework works, but essentially:

- The JSIDL definitions are modified from the default ones (still not sure what the difference is)
- JTS is used to generate boilerplate C++ from the JSIDLs
- Each _JAUS service_ is written as plugin for [pluginlib](http://wiki.ros.org/pluginlib),
extending `iop::PluginInterface`
- These _pluginlib JAUS services_ are then specified as a YAML parameter for the `iop_component` ROS node, ie
```yaml
[
fkie_iop_discovery: "DiscoveryClient",
# added client for primitive driver and also his subservices
fkie_iop_client_primitive_driver: "PrimitiveDriverClient",
]
```
with parameters for each service also specified in YAML, ie.
```yaml
EventsClient:
use_queries: false
DiscoveryClient:
register_own_services: false
enable_ros_interface: true
# configuration for primitive client
# see https://github.com/fkie/iop_jaus_mobility_clients#fkie_iop_client_primitive_driver-primitivedriverclient
PrimitiveDriverClient:
# do not use the stamped twist messages
use_stamped: false
# new parameter added to handle velocities greater than 1.0
# you should update the https://github.com/fkie/iop_jaus_mobility_clients repository
max_linear: 2
max_angular: 2
# added remap to catch commands from the right topic or use 'remap' of launch file
topic_sub_joy_cmd_vel: cmd_vel
```
- There's a `nm.cfg` file that defines network configs, ie.
```xml
<UDP_Configuration
  UDP_Port = "3794"
  MulticastTTL = "16"
  MulticastAddr = "239.255.0.1"
  MaxBufferSize = "70000"
/>
```
- You run a `JTSNodeManager` that handles all the communications between the components
and the outside world (hmm `roscore`?)
- fkie has their [own version](https://github.com/fkie/iop_node_manager)

## IOP challenge requirements for IGVC
Coming back to the IOP challenge for IGVC:

> Each entry will interface with the **Judges Testing Client (JTC)** as specified in the sections that follow.
> The **JTC** will be running testing software called the **Conformance Verification Tool (CVT)**, which evaluates
> JAUS services interfaces, and will also be running Common Operating Picture (COP) software that will display
> information coming from the entrant’s platform while it is executing the tasks defined by the Interoperability
> Profiles Challenge.

Also, on obtaining the **CVT**
(taken from [OpenJAUS](https://support.openjaus.com/support/solutions/articles/35000112742-accessing-the-conformance-verification-tool-cvt-)):
> OpenJAUS is aware of 3 ways to obtain a copy of the **CVT**:
> - If you are a member of the **National Advanced Mobility Consortium (NAMC)**, you can request a copy through NAMC
> - If you are **under contract with the Government** and that contract contains the proper clauses and scope for access to the CVT, you can request a copy through the relevant contracting office
> - If you are a **sub-contractor to a company that meets the requirements for #2 above**, the primary contractor can get written permission to provide the CVT to you

So, since we're not military or government, we probably won't have access to a CVT.

> A team will provide **two JAUS Components** - a **Navigation and Reporting JAUS Component** that contains all
> the services defined by the Navigation and Reporting capability, and **platform management JAUS Component**
> specified by the Platform Management attribute basic value.

So we need two JAUS components. The services that are required of each component are below:
- Platform Management
- Transport
- Events
- Access Control
- Liveness
- Discovery
- Navigation and Reporting
- Transport
- Events
- Access Control
- Management
- Liveness
- Waypoint Driver
- Waypoint List Driver
- Velocity State Sensor
- Local Pose Sensor
- Primitive Driver

All of which are standard services defined in JAUS.

There are 2 main parts to the scoring:
1. IOP Interfaces Task
- Basically we get points for implementing the required services, and get penalized if there's some error
2. Performance Tasks
- At the end, the judges use the interfaces to drive around a few waypoints, after making sure that the system is
safe.

The time for the performance task waypoint run is used to break ties.

Now, back to the services.

Thankfully, `fkie/iop_core` has implementations of some services:
- `fkie_iop_transport`
- `fkie_iop_events`
- `fkie_iop_accesscontrol`
- `fkie_iop_management`
- `fkie_iop_liveness`
- `fkie_iop_discovery`
- `fkie_iop_local_waypoint_driver`
- `fkie_iop_local_waypoint_list_driver`
- `fkie_iop_velocity_state_sensor`
- `fkie_iop_local_pose_sensor`
- `fkie_iop_primitive_driver`

These cover all the services we need, so we shouldn't need to write any extra services.

So, the list of thing(s) that we would need to do would be:
- Implement our own version of the CVT
- Thankfully, some folks from Manipal asked about this last year in an issue on the `fkie/iop_core` repo,
and the German dudes gave [a really good response](https://github.com/fkie/iop_core/issues/2).
- Basically, we can use [Wireshark](https://www.wireshark.org/) and a
[Lua plugin from another repo that attempted ROS+JAUS integration](https://github.com/udmamrl/ROSJAUS/blob/master/Wireshark-dissector/Wireshark_JAUS_dissector.lua)
- Then it's unit-test writing time, to check that all the specs mentioned in the IGVC competition manual are
fulfilled

## Tasks for other subteams
The other important tasks that need to be done:

> The teams will implement a wireless 802.11 b/g or **hardwired Ethernet data link**

We didn't include a router in our budget, so a hardwired Ethernet data link it is.

> For the Wired Network, the judges will provide a **Gigabit Ethernet switch** that a team may plug their subsystem
> into using a standard RJ-45 Ethernet connector

> The team shall provide one **Connector Type A connector as specified in the Payloads IOP**.
> This connector shall be provided at a location that is easily accessible to the judges.
> At some point in the future of this competition, this connector may be used to add a judges’ or team payload to the
> platform. For the purposes of this IOP Challenge, the team **does not need to provide Gigabit Ethernet** at this
> connector – **Fast Ethernet** (**Translator's note**: Fast Ethernet means that we can
> "_carry traffic at the nominal rate of 100 Mbit/s_") is acceptable.
> The team **shall NOT connect power to this connector** at this time – **only the data lines shall be populated**

Regarding the "Connector Type A connector" (lmao weird wording), I have no idea what this is. But then also from
a previous section:

> #### 1.5.2 Payloads Requirements
> There are currently no payloads requirements

Googling "Connector Type A" yields only the USB Type A connector, while "Payloads IOP" leads to
[this document](https://apps.dtic.mil/dtic/tr/fulltext/u2/a558824.pdf), which contains

> #### 5.1.2 Connectors
> This section defines requirements associated with the **physical/electrical connectors** employed to integrate
> subsystems and payload(s) to the UGV platform. This is defined in the **UGV IOP Payloads Profile**

From Wikipedia:

> The National Advanced Mobility Consortium (NAMC) makes the IOPs available at the
> https://namcgroups.org website for registered users.

So..... for now I think it's a safe bet that this "Connector Type A connector" refers to a normal RJ45 connector, since
that seems to be what the payloads have on them.

## My thoughts on IOP
Since all the hard work has been done for us with `fkie/iop_core`, and we only need to write a CVT to verify that
everything works properly, I think that **it is worth doing this year**, though it probably **won't be that high of a
priority**.
3 changes: 2 additions & 1 deletion igvc_gazebo/launch/autonav.launch
@@ -18,7 +18,8 @@

<include file="$(find igvc_gazebo)/launch/simulation.launch">
<arg name="world_name" value="$(find igvc_description)/urdf/worlds/autonav.world"/>
<arg name="waypoints" value="$(find igvc_gazebo)/config/waypoints_autonav_$(arg track).csv" />
<arg name="waypoint_folder_path" value="$(find igvc_gazebo)/config"/>
<arg name="waypoint_file_name" value="waypoints_autonav_$(arg track).csv"/>

<arg name="x" value="$(arg x)"/>
<arg name="y" value="$(arg y)"/>
3 changes: 2 additions & 1 deletion igvc_gazebo/launch/qualification.launch
@@ -27,7 +27,8 @@

<include file="$(find igvc_gazebo)/launch/simulation.launch">
<arg name="world_name" value="$(find igvc_description)/urdf/worlds/qualification.world"/>
<arg name="waypoints" value="$(find igvc_gazebo)/config/waypoints_qual_$(arg track).csv" />
<arg name="waypoint_folder_path" value="$(find igvc_gazebo)/config"/>
<arg name="waypoint_file_name" value="waypoints_qual_$(arg track).csv"/>

<arg name="x" value="$(arg x)"/>
<arg name="y" value="$(arg y)"/>
13 changes: 6 additions & 7 deletions igvc_gazebo/launch/simulation.launch
@@ -3,7 +3,8 @@
<launch>
<env name="GAZEBO_MODEL_PATH" value="$(find igvc_description)"/>
<arg name="world_name" default="$(find igvc_description)/urdf/worlds/qualification.world"/>
<arg name="waypoints" default="$(find igvc_gazebo)/config/waypoints_qual_0.csv" />
<arg name="waypoint_folder_path" default="$(find igvc_gazebo)/config"/>
<arg name="waypoint_file_name" default="waypoints_qual_0.csv"/>
<arg name="cmd_timeout" default="0.5"/>
<!-- accelerate lidar with GPU -->
<arg name="gpu" default="false"/>
@@ -18,12 +19,10 @@

<param name="/use_sim_time" value="true"/>

<!-- WARNING: this waypoint instance will always use ground truth -->
<node name="waypoint_source" pkg="igvc_navigation" type="waypoint_source" output="screen" >
<param name="file" value="$(arg waypoints)" />
<param name="localization_topic" value="/ground_truth" />
<param name="gps_topic" value="/fix" />
</node>
<include file="$(find igvc_navigation)/launch/set_waypoint_file_path.launch">
<arg name="folder_path" value="$(arg waypoint_folder_path)"/>
<arg name="file_name" value="$(arg waypoint_file_name)"/>
</include>

<!-- Create the world. -->
<include file="$(find gazebo_ros)/launch/empty_world.launch">