Software Overview
Basic info on how to use git and GitHub can be found in the club's general resources document here. A good tutorial if you're just getting started is this one. We do all of our stuff for IARC on branches, so even if you know how to use git but haven't used branches much, it would be good to take a look at the part of that tutorial that goes through making a branch and a pull request.
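As a quick reference, the branch-and-pull-request flow mentioned above looks roughly like this (the repository and branch names here are just examples, shown in a scratch repository so the commands are safe to try):

```shell
# Work in a throwaway directory so nothing real is touched
cd "$(mktemp -d)"
git init -q demo && cd demo
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "initial commit"

# Create and switch to a feature branch (name is just an example)
git checkout -q -b altimeter-fixes
git rev-parse --abbrev-ref HEAD    # prints the current branch name

# After committing your work, you would push the branch and then
# open a pull request from the GitHub web UI:
#   git push -u origin altimeter-fixes
```

The key habit is that work happens on the feature branch, and it only reaches master through a reviewed pull request.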
All of the repositories in the Pitt-RAS organization that start with iarc7_ belong to the IARC team, along with the cleanflight repository. If you're a member of the Pitt-RAS organization, our repositories are all owned by the IARC7-2017 team.
To control the drone, we will be utilizing ROS (Robot Operating System). ROS provides a framework to abstract different tasks for robot control and allow for easy code reuse. It accomplishes this by organizing code into Nodes and grouping them into packages. Each node performs a specific task and each package groups related nodes.
Since having a lot of isolated nodes will not do us much good, nodes need to communicate, and they do not necessarily need to be running on the same machine to do so. ROS allows nodes to communicate through topics. A topic is a named channel along which nodes can send messages. To receive messages from a topic, a node subscribes to it; to send messages to a topic, a node publishes to it. As long as the name is known, any node can publish or subscribe to any topic, multiple nodes can publish and subscribe to one topic, and a single node can publish and subscribe to multiple topics. Every topic carries messages of a specific type.
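To make the publish/subscribe idea concrete, here is a toy sketch in plain Python. This is not the ROS API, just an illustration of the pattern: named topics, any number of publishers and subscribers, and every subscriber receiving every message.

```python
from collections import defaultdict

class ToyTopicBus:
    """Minimal illustration of ROS-style pub/sub (not real ROS)."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # Any number of nodes can subscribe to the same topic name
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Every subscriber on the topic receives every message
        for callback in self._subscribers[topic]:
            callback(message)

bus = ToyTopicBus()
received = []
bus.subscribe("/altimeter_reading", received.append)
bus.subscribe("/altimeter_reading", lambda m: None)  # a second subscriber
bus.publish("/altimeter_reading", 1.37)
print(received)  # [1.37]
```

In real ROS the roscore master handles the name lookup and the messages travel over the network, but the mental model is the same.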
For more information on these concepts you can click the ROS line in the first link to visit the ROS wiki, where they have explanations and tutorials. Additionally, I found the tutorial here to be very helpful. The next videos in the series introduce the basics for C++ and Python.
ROS has a transform framework which lets you publish and use data about where different parts of your robot are in relation to each other. For more information, look here. Here's our TF tree (i.e. all the coordinate frames we have, who's publishing them, and some stats about how fast they get published):
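As a flavor of what the transform framework does (2D for simplicity, and with hypothetical frame names): composing a map→quad transform with a quad→camera transform gives the camera's pose in the map frame, which is exactly the kind of chaining TF does for you.

```python
import math

def compose(t_ab, t_bc):
    """Compose two 2D rigid transforms (x, y, theta): frame A->B then B->C."""
    xa, ya, tha = t_ab
    xb, yb, thb = t_bc
    # Rotate the child's offset into the parent frame, then translate
    x = xa + xb * math.cos(tha) - yb * math.sin(tha)
    y = ya + xb * math.sin(tha) + yb * math.cos(tha)
    return (x, y, tha + thb)

# Hypothetical frames: map -> quad (quad at (2, 3), facing +y),
# quad -> camera (camera mounted 0.1 m ahead of the quad's center)
map_to_quad = (2.0, 3.0, math.pi / 2)
quad_to_camera = (0.1, 0.0, 0.0)
map_to_camera = compose(map_to_quad, quad_to_camera)
print(map_to_camera)  # approximately (2.0, 3.1, 1.5708)
```

TF generalizes this to full 3D transforms with timestamps, so any node can ask "where is frame X relative to frame Y right now" without doing this math by hand.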

This section is an explanation of the Nodes found in the Nodes Document.
Here's what our nodes currently look like (ellipses are nodes, arrows are topics):

We are keeping the same format as the document found in Google Drive for now, but please do not simply reiterate it here. We'd like this section to be an explanation. Explain the package, its nodes, the files that belong to nodes, what each file does, etc. Link as many resources as you feel are necessary.
Format:

- Package
** Node - description
*** Files: relevant source code
*** Sub: Subscribe to a ROS topic (with this type)
*** Pub: Publish to a ROS topic (with this type)
*** Service client/server: Use or create a ROS service
*** Action client/server: Use or create a ROS action
*** TF: A transform from one coordinate frame to another
*** Bond: Establish a bond with the specified node (used to detect node crashing)

- iarc7_abstract - This package contains very high-level programs (or "abstracts") that define what the quad should do. The general idea is that these are very hardware-agnostic: they act like the drone is a single object that can execute high-level commands such as "go here" or "push this button."
** iarc7_mission_control - abstract for IARC Mission 7
** iarc7_ai - takes in data on our location, locations of roombas, and locations of obstacles, and decides the best action to take to solve IARC Mission 7 (e.g. track a roomba, explore an area of the arena, press a roomba's top button). This node is intimately linked with the iarc7_mission_control node, and they probably would never exist separately.
- iarc7_sensors - Contains all the sensor nodes
** Altimeter - Reads data from a Lidar-Lite v2 laser rangefinder, does a transformation based on the quad's orientation, and spits out the raw (untransformed) distance on /altimeter_reading and an estimated pose with covariance on /altimeter_pose (only the z axis is relevant)
*** Files: All the cpp files in this package are for the altimeter
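The orientation transform the altimeter node performs amounts to projecting the laser's slant range onto the vertical axis. A rough sketch of the idea (this is not the node's actual code, and it assumes a flat floor and a downward-facing beam):

```python
import math

def altitude_from_range(slant_range, roll, pitch):
    """Estimate altitude from a downward-facing rangefinder on a tilted quad.

    When the quad rolls or pitches, the beam leaves the vertical and the raw
    reading overestimates the height; the vertical component of the beam is
    cos(roll) * cos(pitch) times the measured range.
    """
    return slant_range * math.cos(roll) * math.cos(pitch)

# Level flight: the raw reading is the altitude
print(altitude_from_range(1.00, 0.0, 0.0))  # 1.0
# 10 degrees of pitch: the raw 1.00 m reading corresponds to ~0.985 m of altitude
print(round(altitude_from_range(1.00, 0.0, math.radians(10)), 3))  # 0.985
```

The real node gets roll and pitch from the quad's orientation estimate via TF rather than taking them as arguments.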
** Battery - Monitors the voltage coming into the Jetson (and current usage) from its battery and publishes it on /jetson_battery
*** Files: src/bat_tester_01.py
- iarc7_fc_comms - communication with the flight controller
** fc_comms_msp - node that communicates with a flight controller using the MultiWii Serial Protocol
*** Files: literally everything in this package
- iarc7_motion - Nodes related to https://github.com/Pitt-RAS/iarc7_common/wiki/Motion-Control[Motion Control]
** Motion Planning - This node takes in high-level actions (like waypoints or objects to track) and outputs velocities so that the quad executes those actions. This node also takes obstacles into account and guarantees that we don't hit anything, even if the action we're executing would naively take us through an obstacle. The one subscription listed here is for the obstacle avoidance algorithm; individual tasks will have their own subscriptions depending on what they're trying to do, and those aren't listed here.
*** Files: almost everything in the iarc7_motion/src/iarc7_motion directory
**** Entry point: scripts/motion_planner.py
**** Obstacle avoidance: scripts/obstacle_avoider.py - will use some version of a https://en.wikipedia.org/wiki/Velocity_obstacle[velocity obstacle algorithm] to make sure our velocity is safe
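As a heavily simplified sketch of the velocity-obstacle idea (one static circular obstacle, 2D, finite time horizon; the function and names are illustrative, not the planner's real interface): a candidate velocity is unsafe if following it would bring us within the obstacle's radius.

```python
import math

def velocity_is_safe(vel, obstacle_pos, obstacle_radius, horizon, dt=0.05):
    """Return True if flying at `vel` (vx, vy) from the origin stays outside
    a static circular obstacle for `horizon` seconds.

    Simplified sketch of a velocity-obstacle check: sample our position along
    the candidate velocity and reject it if any sample is inside the obstacle.
    """
    steps = int(horizon / dt)
    for i in range(steps + 1):
        t = i * dt
        x, y = vel[0] * t, vel[1] * t
        if math.hypot(x - obstacle_pos[0], y - obstacle_pos[1]) < obstacle_radius:
            return False
    return True

# Obstacle 2 m ahead with a 0.5 m radius: flying straight at it is unsafe,
# flying sideways is fine
print(velocity_is_safe((1.0, 0.0), (2.0, 0.0), 0.5, horizon=3.0))  # False
print(velocity_is_safe((0.0, 1.0), (2.0, 0.0), 0.5, horizon=3.0))  # True
```

The full algorithm handles moving obstacles by working in velocity space directly (excluding the cone of velocities that lead to collision) instead of sampling trajectories.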
** Low Level Motion - This is a velocity controller, meaning that it reads in the desired velocity and outputs commands for the flight controller (pitch, roll, yaw, and throttle) so that the quad hits the desired velocity
*** Files: everything in the src and include directories
**** Entry point: src/LowLevelMotionController.cpp
**** Motion control code: src/QuadVelocityController.cpp
**** Basic PID controller: src/PidController.cpp
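A basic PID controller like the one in src/PidController.cpp can be sketched as follows (plain Python here for brevity; the class name, gains, and signature are illustrative, not the actual interface):

```python
class Pid:
    """Minimal PID controller: output = kp*e + ki*integral(e) + kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        # No derivative on the first call, since there's no previous error yet
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a toy 1D velocity plant (v += u * dt) toward 1.0 m/s
pid = Pid(kp=2.0, ki=0.5, kd=0.1)
velocity, dt = 0.0, 0.02
for _ in range(1000):
    velocity += pid.update(1.0, velocity, dt) * dt
print(round(velocity, 2))  # converges near 1.0
```

In the real node there is one such loop per axis, with the PID outputs mapped to the pitch, roll, yaw, and throttle commands sent to the flight controller.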
- iarc7_vision - Nodes related to reading and processing images, see https://github.com/Pitt-RAS/iarc7_common/wiki/Computer-Vision[Computer Vision]. This also contains code for localizing the quad using the cameras.
** Obstacle Tracking - finds and tracks obstacles, publishes their locations on /obstacles
** Roomba Tracking - finds and tracks roombas in the arena, publishes their locations on /roombas
** Localization - uses the bottom camera (and possibly the side cameras in the future) to determine our location in the arena, publishes its best guess on /camera_localized_pose
** Camera reader - publishes images from all of our cameras (already part of ROS - see http://wiki.ros.org/libuvc_camera)
- iarc7_simulator
** morse - node launched by MORSE that publishes data from simulated sensors into ROS and takes in commands for simulated actuators
** simulator_adapter - node that does some translation of our datatypes into the standard ROS datatypes that MORSE uses and vice versa
- robot_localization - not written by us; uses an extended Kalman filter (read: magic position estimator) to figure out where we actually are based on the estimates from our altimeter, cameras, et cetera. (For more info, take a look at http://docs.ros.org/kinetic/api/robot_localization/html/index.html[the docs] or read about https://en.wikipedia.org/wiki/Kalman_filter[how Kalman filters work])
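To give a flavor of what robot_localization does internally, here is a heavily simplified 1D Kalman filter fusing noisy scalar measurements (e.g. altitude readings) into a single estimate. The noise values are toy numbers; the real package runs a full EKF over position, orientation, velocity, and acceleration states.

```python
def kalman_1d(measurements, process_var=0.01, meas_var=0.25):
    """Fuse a stream of noisy scalar measurements into one estimate.

    State: estimate x and its variance p. Each step: predict (inflate p by
    process noise), then update (blend in the measurement by the Kalman gain).
    """
    x, p = 0.0, 1.0  # initial guess and its uncertainty
    for z in measurements:
        p += process_var        # predict: uncertainty grows over time
        k = p / (p + meas_var)  # gain: how much to trust the measurement
        x += k * (z - x)        # update: pull the estimate toward the measurement
        p *= (1 - k)            # update: uncertainty shrinks after fusing
    return x, p

# Noisy readings scattered around a true value of ~1.0 m
readings = [1.2, 0.9, 1.1, 1.0, 1.05, 0.95]
estimate, variance = kalman_1d(readings)
print(round(estimate, 2))  # close to 1.0, with variance well below the initial 1.0
```

The key property is visible even in this toy version: the estimate is smoother than any single reading, and the filter's own variance tells downstream nodes how much to trust it.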