Publication
Ascertaining relevant changes in visual data by interfacing AI reasoning and low-level visual information via temporally stable image segments data
Conference Article
Conference
International Conference on Cognitive Systems
Edition
2008
Pages
153-160
Doc link
http://www.cogsys2008.org/program/oral_sessions.php#id0016
Abstract
Action planning and robot control require logical operations to be performed on sensory information, i.e. images of the world as seen by a camera, consisting of continuous pixel values. Artificial intelligence (AI) planning algorithms, however, use symbolic descriptors such as objects and actions to define logic rules and future actions. The representational differences between these distinct processing levels have to be bridged to allow communication between them. In this paper, we suggest a novel framework for interfacing AI planning with low-level visual processing by transferring the visual data into a discrete symbolic representation of temporally stable image segments. At the AI planning level, action-relevant changes in the configuration of image segments are inferred from a set of experiments using the Group Method of Data Handling. We apply the method to a data set obtained by repeating an action in an abstract scenario for varying initial conditions and determining the success or failure of the action. From the set of experiments, joint representations of actions and objects are extracted, which capture the rules of the given scenario.
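The Group Method of Data Handling (GMDH) mentioned in the abstract is an inductive modelling technique that builds layers of simple pairwise polynomial models and keeps only those that generalize best to held-out data. The following is a minimal sketch of a single GMDH layer; the function names, the quadratic partial-model form, and the toy data are illustrative assumptions, not taken from the paper itself.

```python
# Minimal single-layer GMDH sketch (illustrative; not the authors' implementation).
import numpy as np

def fit_quadratic_pair(x1, x2, y):
    # Partial model: y ~ a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2,
    # fitted by ordinary least squares.
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_pair(coef, x1, x2):
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    return A @ coef

def gmdh_layer(X_train, y_train, X_val, y_val, keep=4):
    # Fit one partial model per feature pair, rank by validation error,
    # and keep the best `keep` models (their outputs would feed the next layer).
    models = []
    n_features = X_train.shape[1]
    for i in range(n_features):
        for j in range(i + 1, n_features):
            coef = fit_quadratic_pair(X_train[:, i], X_train[:, j], y_train)
            pred = predict_pair(coef, X_val[:, i], X_val[:, j])
            err = np.mean((pred - y_val) ** 2)
            models.append((err, i, j, coef))
    models.sort(key=lambda m: m[0])
    return models[:keep]

# Toy demonstration: the target depends only on features 0 and 1, so the
# top-ranked partial model should select exactly that pair.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 4))
y = 1.0 + 2.0 * X[:, 0] * X[:, 1] - X[:, 0] ** 2
best_models = gmdh_layer(X[:100], y[:100], X[100:], y[100:])
best_err, best_i, best_j, _ = best_models[0]
```

In the paper's setting, the input features would encode configurations of temporally stable image segments before and after an action, and the selection step is what singles out the action-relevant changes.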
Categories
Pattern recognition
Scientific reference
N. Shylo, F. Wörgötter and B. Dellen. Ascertaining relevant changes in visual data by interfacing AI reasoning and low-level visual information via temporally stable image segments data, 2008 International Conference on Cognitive Systems, 2008, Karlsruhe, Germany, pp. 153-160, University of Karlsruhe.