Vision-Based Autonomous Navigation of a Mobile Robot

This module is really a blend of robotics and computer vision. We will be exploring different reactive algorithms for performing collision-free navigation through the world, and maybe even vision-based tracking of a specifically colored target. The baseline set of activities will rely on the Turtlebot learning modules (AKA Turtlebot Adventures). This adventure is for people who are open to learning the Python programming language and the Robot Operating System (ROS), or who already have some knowledge of one or both.

Module #1: Wandering


This module pretty much follows the standard pipeline from the Turtlebot Adventures.

Week #1: Basic Operation

This module assumes no prior experience, so the first week is about getting the basics covered in terms of simply running the Turtlebot mobile robot: connecting to it and basic tele-operation. To answer some of the questions and to see what's going on from the command line, you should get familiar with the basic command-line ROS commands.

  1. Demonstrate that you can connect to the Turtlebot, launch the core services, and tele-operate it (a minimal drive sketch follows this list).
  2. Answer the questions in the First Run adventure.
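
For a preview of commanding the robot from code, here is a minimal rospy sketch that drives the robot forward at a slow creep. The topic name is an assumption based on the usual Turtlebot bringup; verify against rostopic list on your setup.

  #!/usr/bin/env python
  # Minimal drive sketch: publish a steady stream of velocity commands.
  # The topic name is an assumption (common on Turtlebot 2 setups);
  # check `rostopic list` to confirm yours.
  import rospy
  from geometry_msgs.msg import Twist

  rospy.init_node('drive_preview')
  pub = rospy.Publisher('/cmd_vel_mux/input/teleop', Twist, queue_size=1)
  rate = rospy.Rate(10)      # the base expects a steady command stream
  cmd = Twist()
  cmd.linear.x = 0.1         # m/s, slow forward creep
  while not rospy.is_shutdown():
      pub.publish(cmd)
      rate.sleep()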

Week #2: Drive Commands and Visual Sensing

Now that we have some grasp of the basics of the Turtlebot, we want to understand both how to actuate and how to sense: the latter because that's the purpose of the class, and the former because this module is about deciding how to actuate based on the sensed information.

  1. Complete the Basic Movements module and answer the questions. Also, demo the robot movement during office hours.
  2. Complete the first two bullets of the Sensing the World adventure. Demo that you can subscribe to the visual sensor and to the depth sensor, and properly display both of their messages (a subscriber sketch follows this list).
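
A minimal subscriber sketch, assuming the usual Turtlebot camera topics (verify with rostopic list) and that OpenCV and cv_bridge are installed:

  #!/usr/bin/env python
  # Subscribe to the RGB and depth streams and display both with OpenCV.
  # Topic names are assumptions typical of the Turtlebot camera launch
  # files; adjust to match your setup.
  import rospy
  import cv2
  from sensor_msgs.msg import Image
  from cv_bridge import CvBridge

  bridge = CvBridge()

  def on_rgb(msg):
      frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
      cv2.imshow('rgb', frame)
      cv2.waitKey(1)

  def on_depth(msg):
      # Depth may arrive as 32-bit floats in meters or 16-bit integers
      # in millimeters depending on the driver; 'passthrough' keeps it raw.
      depth = bridge.imgmsg_to_cv2(msg, desired_encoding='passthrough')
      cv2.imshow('depth', depth / 4.0)   # crude scaling for display
      cv2.waitKey(1)

  rospy.init_node('sensor_preview')
  rospy.Subscriber('/camera/rgb/image_raw', Image, on_rgb)
  rospy.Subscriber('/camera/depth/image_raw', Image, on_depth)
  rospy.spin()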

Week #3: A Safety Finite State Machine

Here, we will explore how to implement some safety checks on the robot en route to creating visual navigation algorithms for the Turtlebot. Importantly, this involves creating a finite state machine (FSM) for the operating mode of the Turtlebot, then implementing that same FSM in Python.

  1. Complete the first two bullets of the Sensing the Turtlebot adventure.
  2. Demo the robot blindly navigating around based on its bump sensors (an FSM sketch follows this list).
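
One way to structure the demo is sketched below: a three-state FSM (FORWARD, BACKUP, TURN) driven by bumper events. The bumper topic and message names assume a Turtlebot 2 (Kobuki) base; adjust for your robot.

  #!/usr/bin/env python
  # Safety FSM sketch: drive forward until a bumper fires, back up
  # briefly, turn away, then resume. Topic and message names assume a
  # Turtlebot 2 (Kobuki) base.
  import rospy
  from geometry_msgs.msg import Twist
  from kobuki_msgs.msg import BumperEvent

  FORWARD, BACKUP, TURN = range(3)
  state = FORWARD
  state_until = 0.0          # time at which a timed state expires

  def on_bump(msg):
      global state, state_until
      if msg.state == BumperEvent.PRESSED and state == FORWARD:
          state = BACKUP
          state_until = rospy.get_time() + 1.0

  rospy.init_node('safety_fsm')
  rospy.Subscriber('/mobile_base/events/bumper', BumperEvent, on_bump)
  pub = rospy.Publisher('/cmd_vel_mux/input/navi', Twist, queue_size=1)
  rate = rospy.Rate(10)
  while not rospy.is_shutdown():
      cmd = Twist()
      if state == FORWARD:
          cmd.linear.x = 0.15
      elif state == BACKUP:
          cmd.linear.x = -0.1
          if rospy.get_time() > state_until:
              state, state_until = TURN, rospy.get_time() + 1.5
      else:                   # TURN
          cmd.angular.z = 0.8
          if rospy.get_time() > state_until:
              state = FORWARD
      pub.publish(cmd)
      rate.sleep()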

Week #4: Vision-Based Wandering

Here, we will explore the most basic form of navigation: wandering around aimlessly without hitting things (hopefully). Limitations in the sensor field of view mean that some collisions are inevitable under the right obstacle geometries.

  1. Complete the Wandering adventure, identify and turn in the coding mistake regarding the sectors, and demo the robot wandering around (a sketch of the sector logic follows).
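
A minimal sketch of the sector logic, assuming the depth image has already been converted to a numpy array of meters: split the image into vertical sectors, score each by its closest reading, and steer toward the most open one.

  import numpy as np

  def pick_heading(depth, n_sectors=5, fov=1.0):
      """Return a steering angle (rad) toward the most open sector.

      depth: HxW float array in meters with NaN where the sensor got no
      return (an assumption about how the image was converted)."""
      # NaN is optimistically treated as free space here; a more
      # cautious version would treat missing returns as obstacles.
      valid = np.where(np.isnan(depth), np.inf, depth)
      sectors = np.array_split(valid, n_sectors, axis=1)
      clearance = np.array([s.min() for s in sectors])
      best = int(np.argmax(clearance))
      # Left edge of the image is the robot's left, i.e. positive angle.
      centers = np.linspace(fov / 2.0, -fov / 2.0, n_sectors)
      return centers[best]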

Module #2: Follow the Gap


This module will explore an early sensor-based navigation method known as Follow the Gap. It was designed to work with laser scan data, and works by identifying gaps in the local polar space around the robot. As far as obstacle-avoiding navigation strategies go, it is one of the more basic algorithms.

Week #1: Gap Analysis

Read the paper to get a sense of what is involved in calculating the gap array and finding the maximum gap. Implement the procedure for doing so and, using select depth images from obstacle-avoidance scenarios, turn in the gap array and maximum gap outputs.
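
A sketch of the gap analysis on a 1D array of ranges (one reading per bearing, whether from a laser scan or a depth-image row): mark bearings whose range exceeds a safety threshold as free, then find the longest contiguous free run. The threshold value is an illustrative guess.

  import numpy as np

  def max_gap(ranges, threshold=1.5):
      """Return (start, end) index bounds of the widest free run.

      ranges: 1D array of distances in meters; readings beyond
      `threshold` count as free space."""
      free = np.asarray(ranges) > threshold
      best, best_span, cur_start = (0, 0), 0, None
      for i, f in enumerate(free):
          if f and cur_start is None:
              cur_start = i
          elif not f and cur_start is not None:
              if i - cur_start > best_span:
                  best, best_span = (cur_start, i), i - cur_start
              cur_start = None
      if cur_start is not None and len(free) - cur_start > best_span:
          best = (cur_start, len(free))
      return best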

Week #2: Gap Selection and Control

Given that the gap array and maximum gap have been computed, finish things off with the gap center angle computation, followed by the final heading angle computation. Use these to implement the algorithm.
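
For reference, a sketch of the two angle computations. The gap center below is just the midpoint of the gap's angular extent (the paper derives a more careful trigonometric center from the two edge obstacles), and the final heading uses the weighted blend from the paper; verify the exact form against your copy. The gains alpha and beta are tuning parameters.

  def gap_center_angle(start, end, angle_min, angle_inc):
      # Midpoint of the gap spanning indices [start, end), given the
      # bearing of index 0 and the angular spacing between readings.
      return angle_min + angle_inc * (start + end - 1) / 2.0

  def final_heading(theta_gap, theta_goal, d_min, alpha=1.0, beta=1.0):
      # The nearer the closest obstacle (d_min), the more the gap
      # direction dominates over the goal direction.
      w = alpha / max(d_min, 1e-6)
      return (w * theta_gap + beta * theta_goal) / (w + beta)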

Week #3: Hallway Navigation

Up to now, you have not really been contemplating some end goal or objective for traveling. Not that this problem is going to be any better, but let's say the robot's goal is to move 15 units of distance down the hallway from its start position (I don't know what units the odometry and mapping system uses; I think it is meters). Incorporate this end goal by subscribing to the Turtlebot's internal frame estimation and using where it thinks it is to identify the vector or angle to the goal. Fold that goal angle into the follow-the-gap method as published.

Exploration & Deliverable: Demo the robot moving down the hallway towards its goal, reacting to static obstacles. Do the same for slightly dynamic obstacles. Comment on how robust the algorithm appears to be. What would you want to fix about it if you could? How would you go about doing that?
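
A sketch of the goal-angle computation from odometry; it assumes the goal is expressed in the same odom frame the Turtlebot reports its pose in, with the hallway running along that frame's x axis.

  #!/usr/bin/env python
  # Compute the bearing to a fixed goal from the odometry estimate.
  import math
  import rospy
  from nav_msgs.msg import Odometry
  from tf.transformations import euler_from_quaternion

  GOAL = (15.0, 0.0)      # 15 units straight down the hallway (assumed)

  def on_odom(msg):
      p = msg.pose.pose.position
      q = msg.pose.pose.orientation
      yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])[2]
      bearing = math.atan2(GOAL[1] - p.y, GOAL[0] - p.x)
      # Wrap the robot-relative goal angle to [-pi, pi].
      theta_goal = math.atan2(math.sin(bearing - yaw),
                              math.cos(bearing - yaw))
      rospy.loginfo('angle to goal: %.2f rad', theta_goal)

  rospy.init_node('goal_angle')
  rospy.Subscriber('/odom', Odometry, on_odom)
  rospy.spin()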

Module #3: Dynamic Window Approach


This module explores the Dynamic Window Approach (DWA), which is an older algorithm but still in use today.
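
As orientation for the coming weeks: the classic formulation (Fox, Burgard, and Thrun, 1997) scores every velocity pair reachable within one control step with an objective of the form G(v, w) = alpha*heading + beta*dist + gamma*velocity, where the first two terms are externally derived (goal alignment and obstacle clearance) and the last comes from the velocity itself. A sketch of the selection loop, with the scoring functions left as caller-supplied stand-ins:

  import numpy as np

  def dwa_select(v_now, w_now, heading_fn, dist_fn,
                 alpha=0.8, beta=0.2, gamma=0.1,
                 v_max=0.5, w_max=2.0, a_max=0.5, aw_max=2.0, dt=0.25):
      """Pick (v, w) maximizing the DWA objective over the dynamic
      window. heading_fn(v, w) and dist_fn(v, w) are stand-ins: goal
      alignment and clearance along the predicted arc, respectively.
      A full DWA also discards inadmissible velocities (those that
      cannot stop before the nearest obstacle)."""
      best, best_score = (0.0, 0.0), -np.inf
      # Dynamic window: velocities reachable within one time step.
      for v in np.linspace(max(0.0, v_now - a_max * dt),
                           min(v_max, v_now + a_max * dt), 11):
          for w in np.linspace(max(-w_max, w_now - aw_max * dt),
                               min(w_max, w_now + aw_max * dt), 21):
              score = (alpha * heading_fn(v, w)
                       + beta * dist_fn(v, w)
                       + gamma * v / v_max)
              if score > best_score:
                  best, best_score = (v, w), score
      return best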

Week #1: Externally Derived Objective Functions

Week #2: Velocity Derived Objective Function

Week #3: Integration and Selection

Week #4: Execution

Module #4: Vector Polar Histogram


Another recent addition to the solution landscape is the Vector Polar Histogram (VPH) method.

Week #1: TBD

Week #2: TBD

Week #3: TBD If At All
