Rovio mobile webcam, Pioneer P3-AT mobile robot, OctoMap by K. Wurm et al., MS Kinect

 

MSc dissertation proposal 2012/2013

 

Robotic Home Sentinel

 

 

Introduction:

 

Surveillance robots must have a high level of autonomy. This implies sensing and moving within the environment, working for long periods of time without human intervention, and avoiding situations that could harm people, pets, property, or the robot itself.

 

Recent vision sensors, namely the Microsoft Kinect [MS-Kinect], allow effectively measuring the empty space in front of a robot. This MSc project proposal is therefore concerned with using Kinect-like sensors to navigate freely in a cluttered home environment while reporting pre-specified scene views or any anomalous situations found.
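As a simple illustration of what "measuring the empty space in front of a robot" can mean in practice, the sketch below estimates the forward clearance directly from a Kinect depth image. It is only a minimal, hypothetical example (the function name and the window size are ours); a real system would reason on the full 3D point cloud.

    import numpy as np

    def forward_clearance(depth_m, window=0.25):
        """Rough estimate of the free space straight ahead: the smallest valid
        depth inside a central window of the depth image (a fraction `window`
        of the image width/height). Minimal illustration only."""
        h, w = depth_m.shape
        du, dv = int(w * window / 2), int(h * window / 2)
        centre = depth_m[h//2 - dv:h//2 + dv, w//2 - du:w//2 + du]
        valid = centre[centre > 0]          # the Kinect reports 0 where depth is unknown
        return float(valid.min()) if valid.size else float('inf')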

 

Note: this project is proposed in the framework of the European project RoboSoM (see http://www.robosom.eu/), and therefore offers opportunities for international cooperation and exchange of experiences.

 

 

Objectives:

 

The objectives of this work are twofold: (i) navigating within a cluttered environment using a Kinect-like sensor, and (ii) reporting pre-specified scene views or any anomalous situations found.

 

 

Detailed description:

 

Sensing and moving in an unknown, cluttered environment has recently become easier with the introduction of the MS Kinect camera [MS-Kinect] to the market. The Kinect provides not only visual information about the scene, but also depth (3D) information.
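A minimal sketch of how such a depth image can be turned into metric 3D points, assuming the standard pinhole camera model. The intrinsic parameters below (fx, fy, cx, cy) are illustrative defaults, not calibrated Kinect values.

    import numpy as np

    def depth_to_point_cloud(depth_m, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
        """Back-project a depth image (metres, H x W) into an N x 3 point cloud
        in the camera frame using the pinhole model: X = (u - cx) * Z / fx, etc.
        The intrinsics are illustrative, not calibrated values."""
        h, w = depth_m.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth_m
        valid = z > 0                       # the Kinect reports 0 where depth is unknown
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        return np.stack([x[valid], y[valid], z[valid]], axis=-1)   # N x 3 metric points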

 

Some recent research works show that, by choosing proper scene representations, data acquisition, processing, and integration can be done on a standard PC. One such reference is the work by Wurm et al. [Wurm10], where a mobile robot maps a number of campuses. Another reference is the work by Shen et al. [Shen11], where a flying robot (quadrotor) maps a multi-floor scenario using a Kinect camera (a video is linked in [Shen11]).
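To give an idea of what such a scene representation involves, the sketch below implements a bare-bones probabilistic voxel map with the standard log-odds occupancy update. It is not the OctoMap API: OctoMap [Wurm10] stores the same kind of information far more compactly in an octree and traces rays with a proper 3D line algorithm, whereas this toy version uses a flat dictionary and coarse ray sampling.

    import numpy as np

    class SimpleVoxelMap:
        """Toy occupancy map: a dictionary from integer voxel indices to log-odds.
        Illustrates the probabilistic update behind OctoMap-style maps only."""

        def __init__(self, resolution=0.05, l_occ=0.85, l_free=-0.4):
            self.res = resolution           # voxel edge length in metres
            self.l_occ = l_occ              # log-odds increment for a hit
            self.l_free = l_free            # log-odds decrement for traversed cells
            self.cells = {}                 # (i, j, k) -> accumulated log-odds

        def _key(self, p):
            return tuple(np.floor(p / self.res).astype(int))

        def insert_scan(self, origin, points):
            """Integrate one scan: free space along each ray, occupied at the endpoint."""
            for p in points:
                n = int(np.linalg.norm(p - origin) / self.res) + 1
                for s in np.linspace(0.0, 1.0, n)[:-1]:
                    k = self._key(origin + s * (p - origin))
                    self.cells[k] = self.cells.get(k, 0.0) + self.l_free
                k = self._key(p)
                self.cells[k] = self.cells.get(k, 0.0) + self.l_occ

        def is_occupied(self, p, threshold=0.0):
            return self.cells.get(self._key(p), 0.0) > threshold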

 

Earlier research work, done with wheeled robots equipped with simpler 1D sensors, can now be upgraded to Kinect-like (2D) sensors. For example, the well-known dynamic-window approach to navigation [Fox97] can now be extended to represent volumes instead of just areas, meaning that robots can better negotiate narrow or low passages.
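A minimal sketch of the dynamic-window idea of [Fox97] for a differential-drive robot. The collision test occupied(x, y) is a placeholder supplied by the caller; in the volumetric extension discussed above it would check the robot's full 3D volume (height included) against a map such as the voxel map sketched earlier. All parameter values are illustrative.

    import numpy as np

    def dynamic_window_step(pose, v, w, goal, occupied,
                            v_lim=(0.0, 0.5), w_lim=(-1.0, 1.0),
                            accel=(0.5, 1.5), dt=0.25, horizon=2.0):
        """Pick (v, w) for the next control cycle, in the spirit of [Fox97].
        `pose` is (x, y, theta); `occupied(x, y)` is any collision test."""
        best, best_score = (0.0, 0.0), -np.inf
        # Dynamic window: velocities reachable within one cycle given the acceleration limits.
        for vc in np.linspace(max(v_lim[0], v - accel[0]*dt), min(v_lim[1], v + accel[0]*dt), 7):
            for wc in np.linspace(max(w_lim[0], w - accel[1]*dt), min(w_lim[1], w + accel[1]*dt), 15):
                x, y, th = pose
                collided = False
                # Forward-simulate a short arc and reject it if it hits an obstacle.
                for _ in range(int(horizon / dt)):
                    x += vc * np.cos(th) * dt
                    y += vc * np.sin(th) * dt
                    th += wc * dt
                    if occupied(x, y):
                        collided = True
                        break
                if collided:
                    continue
                heading = -np.hypot(goal[0] - x, goal[1] - y)   # closer to the goal is better
                score = heading + 0.1 * vc                      # favour forward progress
                if score > best_score:
                    best, best_score = (vc, wc), score
        return best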

 

The works referred to above provide conceptual starting points and software libraries that make experimenting easier, and they therefore constitute the starting point of this MSc project. The libraries allow acquiring data and displaying the resulting scenarios. An important point to explore is using the acquired data for navigation experiments.

 

In summary, the work is organized into the following main steps:

1) data acquisition: moving the robot within the scene and capturing scene information;

2) data integration into a single scene representation (see the sketch after this list);

3) demonstration of an autonomous tour inside the scene.
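For step 2, a minimal sketch of how individual scans could be fused into a single representation, assuming each scan comes with a robot pose estimate (e.g. from odometry) and reusing the hypothetical depth_to_point_cloud and SimpleVoxelMap sketches above. The fixed camera-to-robot transform (mounting height, tilt, axis convention) is deliberately omitted for brevity.

    import numpy as np

    def pose_to_matrix(x, y, theta):
        """Homogeneous transform of a planar robot pose; the fixed camera mounting
        transform would be composed with this one and is omitted here."""
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c,  -s,  0.0, x],
                         [s,   c,  0.0, y],
                         [0.0, 0.0, 1.0, 0.0],
                         [0.0, 0.0, 0.0, 1.0]])

    def integrate_scans(scans_with_poses, voxel_map):
        """Each element is (pose, points): pose = (x, y, theta) from odometry,
        points = N x 3 cloud in the robot frame (e.g. from depth_to_point_cloud)."""
        for (x, y, theta), points in scans_with_poses:
            T = pose_to_matrix(x, y, theta)
            world = (T[:3, :3] @ points.T).T + T[:3, 3]   # rotate and translate into the world frame
            voxel_map.insert_scan(origin=T[:3, 3], points=world)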

 

 

References:

 

[HRI2007] 2nd ACM/IEEE International Conference on Human-Robot Interaction, http://hri2007.org/

 

[Gaspar00] José Gaspar, Niall Winters, José Santos-Victor, "Vision-based Navigation and Environmental Representations with an Omnidirectional Camera", IEEE Transactions on Robotics and Automation, Vol. 16, No. 6, December 2000

 

[ISR-galery] Some images of unicycle-type robots at ISR: see labmate, scouts, pioneers, ... in "Mini galeria de fotografias de projectos @ IST/ISR" (mini photo gallery of projects @ IST/ISR), http://users.isr.ist.utl.pt/~jag/infoforum/isr_galeria/

 

[MS-Kinect] http://en.wikipedia.org/wiki/Kinect

 

[Wurm10] Kai M. Wurm, Armin Hornung, Maren Bennewitz, Cyrill Stachniss, Wolfram Burgard, "OctoMap: A Probabilistic, Flexible, and Compact 3D Map Representation for Robotic Systems", in Proceedings of the ICRA 2010 Workshop on Best Practice in 3D Perception and Modeling for Mobile Manipulation, 2010, http://www.informatik.uni-freiburg.de/~wurm/papers/wurm10octomap.pdf

 

[Shen11] Shaojie Shen, Nathan Michael, and Vijay Kumar, "3D Indoor Exploration with a Computationally Constrained MAV", ICRA 2011. See also the video at http://www.youtube.com/watch?v=cOeCZDBHrJs&feature=channel_video_title

 

[Fox97] Dieter Fox, Wolfram Burgard, Sebastian Thrun, "The Dynamic Window Approach to Collision Avoidance", IEEE Robotics & Automation Magazine, Vol. 4, No. 1, pp. 23-33, March 1997, http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.2.502

 

 

 

Expected results:

 

At the end of the work, the students will have enriched their knowledge of:

* Computer vision

* Designing navigation modalities for mobile robots

 

 

Observations:

--

 

 

More MSc dissertation proposals on Computer and Robot Vision at:

 

http://omni.isr.ist.utl.pt/~jag