IEEE CEMRA
Latest revision as of 17:44, 3 February 2018
Autonomous mobile robot programming using ROS on a remote web-based real robot lab
Summary
Autonomous mobile robotics is an exciting engineering field, providing fertile ground for teaching a broad range of important concepts, including dynamical systems, control theory, sensor fusion, decision making, and motion and action planning, among others. In addition, the cyber-physical nature of robotic platforms makes them quite appealing and motivating for students. This project aims at developing learning material for teaching basic concepts of mobile robots, encompassing both single and multiple robots, through remote hands-on experimentation. As output, we will provide (1) the learning material, based on the widely used Robot Operating System (ROS) as middleware and development environment, (2) an open specification of a remote physical robotics lab where students can execute practical exercises on real robots, and (3) simulator configuration files modeling the physical lab. The learning material covers a wide range of topics on mobile robots, from sensors and actuators up to autonomous navigation and formation control.
Program
This course comprises the following modules:
M1: Sensors
The main focus will be on dead-reckoning sensors, particularly encoders and their use for odometry, and on range sensors (laser range finders, sonars, RGB-D cameras). Probabilistic motion models and measurement models that reflect the impact of uncertainty will be introduced for later use in the Localization module.
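As an illustration of the kind of odometry computation this module covers, the following is a minimal sketch (not taken from the course material) of dead-reckoning for a differential-drive robot, integrating encoder-derived wheel displacements into a pose estimate:

```python
import math

def odometry_update(x, y, theta, d_left, d_right, wheel_base):
    """Differential-drive dead-reckoning: integrate wheel encoder
    displacements (metres) into an updated pose estimate (x, y, theta)."""
    d_center = (d_left + d_right) / 2.0        # distance travelled by robot centre
    d_theta = (d_right - d_left) / wheel_base  # change in heading
    # Use the mid-point heading for a slightly better integration.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
    return x, y, theta
```

Because small per-step errors accumulate without bound, odometry alone drifts over time, which is exactly why the probabilistic models introduced here feed into the Localization module.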
M2: Actuators
Wheel (mostly DC) motors will be explained and their dynamic model introduced, for later use in the Guidance module.
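A common simplification of the DC motor dynamics taught in modules like this one is a first-order model, in which the angular velocity approaches a value proportional to the applied voltage with some time constant. The sketch below (an assumption for illustration, with hypothetical gain K and time constant tau, not the course's actual model) integrates that model with forward Euler:

```python
def simulate_dc_motor(voltage, steps, dt=0.01, K=1.0, tau=0.1):
    """First-order DC motor approximation: angular velocity w follows
    dw/dt = (K * voltage - w) / tau, integrated with forward Euler."""
    w = 0.0
    for _ in range(steps):
        w += dt * (K * voltage - w) / tau
    return w
```

Under this model, a step input of 1 V drives the velocity exponentially toward K volts-worth of speed, reaching about 63% of the final value after one time constant tau.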
M3: Motion Planning
A diversity of available methods will be covered, including path planning (e.g., RRT, A*), trajectory planning and vehicle tracking (adding time, velocity, and acceleration specifications), and state-feedback maneuvering.
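To make the path-planning side concrete, here is a minimal A* sketch on a 4-connected occupancy grid with a Manhattan-distance heuristic; this is an illustrative implementation, not the one used in the course exercises:

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (0 = free, 1 = obstacle).
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), start)]      # priority queue ordered by f = g + h
    came_from, g = {}, {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:                 # reconstruct path by walking back
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0
                    and g[cur] + 1 < g.get(nxt, float("inf"))):
                g[nxt] = g[cur] + 1
                came_from[nxt] = cur
                heapq.heappush(open_set, (g[nxt] + h(nxt), nxt))
    return None
```

Because the Manhattan heuristic never overestimates the true cost on a 4-connected grid, the returned path is optimal in number of cells traversed.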
M4: Localization
The Bayes filter will be introduced as a general framework for estimating the state of a dynamic system by interleaving prediction and update steps. Localization will be presented as the estimation of the vehicle pose by a Bayes filter fusing odometry and range sensing (map- or landmark-based), and then particularized to the (Extended) Kalman Filter and Monte Carlo Localization (particle filter) variants.
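The predict/update structure of the Bayes filter can be sketched with a toy 1-D Monte Carlo localization step; the robot measures its distance to a single known landmark. This is an illustrative sketch under simplifying assumptions (1-D state, a single Gaussian noise parameter), not the course implementation:

```python
import random, math

def particle_filter_step(particles, motion, measurement, landmark, noise=0.2):
    """One predict/update/resample cycle of 1-D Monte Carlo localization.
    particles: position hypotheses; motion: commanded displacement;
    measurement: observed distance to a landmark at a known position."""
    # Prediction: propagate each particle through a noisy motion model.
    particles = [p + motion + random.gauss(0, noise) for p in particles]
    # Update: weight particles by the measurement likelihood (Gaussian model).
    weights = [math.exp(-((abs(landmark - p) - measurement) ** 2)
                        / (2 * noise ** 2)) for p in particles]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resampling: draw a new particle set proportional to the weights.
    return random.choices(particles, weights=weights, k=len(particles))
```

Starting from a uniform prior, repeated cycles concentrate the particle set around poses consistent with both the motion commands and the range measurements, which is the essence of MCL.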
M5: Guidance
Guidance will be explained as a control problem in which the vehicle body frame is controlled to follow a path, track a trajectory, or perform maneuvers (using M3 concepts), while the vehicle pose is periodically estimated along the path or trajectory (M4).
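A minimal example of such a control law is a proportional go-to-goal controller for a unicycle-type robot, which commands forward speed from the distance to the goal and angular speed from the heading error. The gains k_v and k_w below are illustrative assumptions, not values from the course:

```python
import math

def go_to_goal(pose, goal, k_v=0.5, k_w=1.5):
    """Proportional guidance for a unicycle robot: returns (v, w) commands
    driving the pose (x, y, theta) toward a goal point (gx, gy)."""
    x, y, theta = pose
    dx, dy = goal[0] - x, goal[1] - y
    rho = math.hypot(dx, dy)                        # distance to goal
    alpha = math.atan2(dy, dx) - theta              # heading error
    alpha = (alpha + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
    return k_v * rho, k_w * alpha
```

Closing the loop with the pose estimate from M4 instead of the true pose turns this into the estimation-in-the-loop guidance problem described above.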
M6: Formation Control
Multi-robot systems will be addressed through a module on formation control, highlighting control strategies (e.g., rigid vs. deformable formations, consensus-based methods), relative localization methods, and the impact of communications.
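The consensus-based methods mentioned here can be illustrated, in their simplest scalar form, by a discrete-time consensus iteration in which each robot moves toward its neighbors' states; with a connected communication graph and a small enough gain, all states converge to the initial average. This is a generic sketch, not the course's formation controller:

```python
def consensus_step(positions, neighbors, gain=0.2):
    """One discrete-time consensus iteration over a fixed communication
    graph: robot i steps toward each neighbor j by gain * (x_j - x_i)."""
    return [p + gain * sum(positions[j] - p for j in neighbors[i])
            for i, p in enumerate(positions)]
```

Formation control is then obtained by running consensus on positions offset by the desired inter-robot displacements rather than on the raw positions.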
Team
- Rodrigo Ventura (Faculty, Coordinator)
- Pedro U. Lima (Faculty)
- José Seoane (Grantee)
Funding
This project is funded by the IEEE Robotics and Automation Society (RAS), under the program Creation of Educational Material in Robotics and Automation 2016 (CEMRA).