Computer Vision & Artificial Intelligence
TUM School of Computation, Information and Technology
Technical University of Munich

Informatik IX
Chair of Computer Vision & Artificial Intelligence

Boltzmannstrasse 3
85748 Garching info@vision.in.tum.de

Follow us on:
CVG Group DVL Group SRL Group


====== Tactile Sensing ======

Tactile sensing provides an additional sensor modality that is
particularly valuable for robotic manipulation tasks: it can reveal
relevant properties of the manipulated object, such as its identity,
its pose, and its internal state.

==== Tactile object recognition using the bag-of-features approach ====

Our approach applies the bag-of-features model to object
classification based on tactile images. First, the robot generates a
suitable tactile feature vocabulary from real data using unsupervised
clustering. Second, it learns a set of feature models (a so-called
codebook) that encodes the appearance of objects in the form of
feature histograms. After training, the robot can use this codebook to
identify a grasped object. Since the objects that we consider are
typically larger than the sensor and consist of similar parts, the
robot may need multiple grasps at different positions to uniquely
identify an object. To reduce the number of required grasps, we apply
a decision-theoretic framework that chooses the grasping location such
that the expected entropy of the belief distribution over object
identities is minimized. In experiments on a large set of industrial
and household objects, we demonstrate that our approach enables a
manipulation robot to discriminate various objects by touch alone.
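The recognition pipeline can be sketched in a few lines. This is a minimal illustration under simplifying assumptions, not our actual implementation: it treats tactile feature vectors as given, uses plain k-means with a deterministic initialization for vocabulary generation, and classifies with a nearest-histogram (L1) rule; all function names are hypothetical.

```python
import numpy as np

def build_codebook(patches, k, iters=20):
    """Cluster tactile feature vectors into k 'tactile words' (plain k-means)."""
    idx = np.linspace(0, len(patches) - 1, k).astype(int)  # deterministic init
    centers = patches[idx].astype(float)
    for _ in range(iters):
        # assign every patch to its nearest cluster center
        dist = np.linalg.norm(patches[:, None] - centers[None], axis=2)
        labels = dist.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = patches[labels == j].mean(axis=0)
    return centers

def feature_histogram(patches, centers):
    """Describe one object by a normalized histogram over the tactile words."""
    dist = np.linalg.norm(patches[:, None] - centers[None], axis=2)
    h = np.bincount(dist.argmin(axis=1), minlength=len(centers)).astype(float)
    return h / h.sum()

def classify(patches, centers, codebook):
    """Predict the object whose stored histogram is closest (L1 distance)."""
    h = feature_histogram(patches, centers)
    return min(codebook, key=lambda name: np.abs(codebook[name] - h).sum())
```

In the actual system, the vocabulary would be clustered from local descriptors of many tactile images rather than raw vectors, and a probabilistic classifier would replace the nearest-neighbor rule.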


<html>
<iframe width="320" height="240" src="http://www.youtube.com/embed/E3J1E3Q1kcs?hl=en&fs=1" frameborder="0" allowfullscreen></iframe>
<iframe width="320" height="240" src="http://www.youtube.com/embed/QtgbWV9qmbc?hl=en&fs=1" frameborder="0" allowfullscreen></iframe>
</html>
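The decision-theoretic grasp selection can be illustrated as follows: for each candidate location, average the entropy of the Bayes posterior over all possible tactile observations, weighted by their predictive probability, and pick the location where this expected entropy is lowest. The observation model and names below are hypothetical placeholders, not the quantities used in our system.

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a discrete distribution."""
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def select_grasp(belief, obs_model):
    """Pick the grasp location with the lowest expected posterior entropy.

    belief:    prior over object identities, shape (n_objects,)
    obs_model: dict mapping location -> array of shape (n_objects, n_obs),
               where obs_model[loc][i, z] = p(observation z | object i, loc)
    """
    best_loc, best_h = None, float("inf")
    for loc, p_z_given_obj in obs_model.items():
        expected_h = 0.0
        for z in range(p_z_given_obj.shape[1]):
            joint = p_z_given_obj[:, z] * belief   # p(z | i) * b(i)
            p_z = joint.sum()                      # predictive p(z)
            if p_z > 0:
                expected_h += p_z * entropy(joint / p_z)  # Bayes posterior
        if expected_h < best_h:
            best_loc, best_h = loc, expected_h
    return best_loc
```

A location whose tactile response looks the same for all object hypotheses leaves the belief, and hence its entropy, unchanged, so the rule naturally prefers the most discriminative grasp.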

==== Estimating the internal state of containers ====
We investigated features that describe the dynamics of the tactile
response while the robot is grasping or manipulating an object. As we
showed, the dynamic components of the tactile signal can be used to
infer several aspects of the internal state of an object. For example,
these features allow a robot to detect whether a grasped bottle
contains liquid and whether its cap has been closed properly. This
information is highly relevant for domestic robots that carry out
service tasks such as tidying up.
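As an illustration of the idea (not the features used in the original work), one simple dynamic feature is the energy of the frame-to-frame changes of a tactile pressure trace: a rigid, empty container settles to a nearly constant signal once grasped, whereas sloshing liquid keeps perturbing the contact pressures. The function names and the threshold below are hypothetical.

```python
import numpy as np

def dynamic_energy(signal):
    """Mean squared frame-to-frame change of a tactile pressure trace."""
    return float(np.mean(np.diff(signal) ** 2))

def contains_liquid(signal, threshold=1e-3):
    """Illustrative decision rule: sloshing liquid keeps the tactile
    signal fluctuating, so its temporal-difference energy stays high."""
    return dynamic_energy(signal) > threshold
```

A practical system would combine several such dynamic features and learn the decision boundary from labeled grasps instead of a fixed threshold.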



====== Related Publications ======

<bibtex>
<author>J. Sturm</author>
<topic>tactile-sensing</topic>
</bibtex>
  
