Affordance prediction of hand tools using interactive perception [Etkileşimli görsellik kullanılarak el aletlerinin sağlarlık tahmini]
2012 20th Signal Processing and Communications Applications Conference, SIU 2012, Proceedings
Please cite this item using this persistent URL: http://hdl.handle.net/11693/28199
In daily life, the selection of a hand tool for a job depends on the appearance of the tool and its effect on objects; that effect determines the affordance of the chosen tool. The aim of this work is to determine the affordances of hand tools based only on their appearance and to build a basis for simple tool usage by humanoid robots. To this end, sharpness, bluntness, distance-between-two-tips, and grayscale histogram features are extracted from the functional regions of hand tools manipulated by humans, and specific affordance models are trained. The features of a hand tool whose affordances are to be learned are given to the trained models, which determine the affordances the tool offers, such as can-cut, can-push, can-squeeze, and can-pierce. During testing, the model predicted affordances with 93.1% accuracy. These results suggest that the model provides a basis for simple tool usage by humanoid robots. © 2012 IEEE.
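The abstract's pipeline (extract per-tool features, train one model per affordance, then query the models for a new tool) can be sketched in miniature. Everything below is an illustrative assumption, not the paper's implementation: the paper does not specify its classifier, so this sketch uses a simple nearest-centroid rule, and for brevity it uses only three of the four feature types (sharpness, bluntness, tip distance), with made-up training values.

```python
# Hypothetical sketch of per-affordance models; feature values, the
# nearest-centroid classifier, and all names are assumptions, not the
# paper's actual method or data.

AFFORDANCES = ["can_cut", "can_push", "can_squeeze", "can_pierce"]

# Each feature vector: (sharpness, bluntness, tip_distance) -- toy values.
TRAIN = [
    ((0.9, 0.1, 0.02), {"can_cut", "can_pierce"}),   # knife-like tool
    ((0.1, 0.9, 0.00), {"can_push"}),                # hammer-like tool
    ((0.2, 0.5, 0.15), {"can_squeeze"}),             # pliers-like tool
    ((0.8, 0.2, 0.01), {"can_pierce"}),              # awl-like tool
]

def centroid(vectors):
    """Mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def train():
    """For each affordance, store the positive- and negative-class centroids."""
    models = {}
    for a in AFFORDANCES:
        pos = [f for f, labels in TRAIN if a in labels]
        neg = [f for f, labels in TRAIN if a not in labels]
        models[a] = (centroid(pos), centroid(neg))
    return models

def predict_affordances(models, features):
    """Return the affordances whose positive centroid is nearer to the features."""
    return [a for a, (p, n) in models.items()
            if dist2(features, p) < dist2(features, n)]
```

Querying the trained models with a sharp, thin-tipped feature vector such as `(0.85, 0.15, 0.015)` returns the cut and pierce affordances under this toy data, mirroring how the paper feeds a new tool's features to all trained affordance models at once.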
Conference Paper