
Full metadata record

dc.contributor.author: 김영준 (Kim, Young J.)
dc.contributor.author: 오유란 (Oh, Uran)
dc.date.accessioned: 2021-06-07T16:31:39Z
dc.date.available: 2021-06-07T16:31:39Z
dc.date.issued: 2021
dc.identifier.issn: 1939-1412
dc.identifier.issn: 2329-4051
dc.identifier.other: OAK-29332
dc.identifier.uri: https://dspace.ewha.ac.kr/handle/2015.oak/257658
dc.description.abstract: Encountered-type haptic rendering provides a realistic, free-to-touch, move-and-collide haptic sensation to a user. However, inducing haptic-texture sensation without complicated tactile actuators is challenging for encountered-type haptic rendering. In this article, we propose a novel texture-synthesizing method for an encountered-type haptic display using spatial and temporal encoding of roughness, which provides both active and passive touch sensation while requiring no complicated tactile actuation. Focusing on macro-scale roughness perception, we geometrically model the textured surface with a grid of hemiellipsoidal bumps, which can provide a variety of perceived roughness as the user explores the surface with a bare hand. Our texture-synthesis method rests on two hypotheses. First, we assume that perceptual roughness can be spatially encoded along the radial direction of a textured surface with hemiellipsoidal bumps. Second, perceptual roughness varies temporally with the relative velocity of the scanning hand with respect to the surface. To validate these hypotheses about our spatiotemporal encoding method, we implemented an encountered-type haptic texture rendering system using an off-the-shelf collaborative robot that also tracks the user's hand with IR sensors. We performed psychophysical user tests with 25 participants and verified the main effects of spatiotemporal encoding of a textured model on the user's roughness perception. Our empirical experiments suggest that users perceive a rougher texture as the surface orientation or the relative hand motion increases. Based on these findings, we show that our visuo-haptic system can synthesize an appropriate level of roughness corresponding to diverse visual textures by suitably choosing encoding values.
dc.language: English
dc.publisher: IEEE COMPUTER SOC
dc.subject: Haptic interfaces
dc.subject: Rough surfaces
dc.subject: Surface roughness
dc.subject: Surface texture
dc.subject: Encoding
dc.subject: Manipulators
dc.subject: Encountered-type haptic
dc.subject: haptic texture
dc.subject: human robot interaction
dc.subject: texture roughness
dc.title: Synthesizing the Roughness of Textured Surfaces for an Encountered-Type Haptic Display Using Spatiotemporal Encoding
dc.type: Article
dc.relation.issue: 1
dc.relation.volume: 14
dc.relation.index: SCIE
dc.relation.index: SCOPUS
dc.relation.startpage: 32
dc.relation.lastpage: 43
dc.relation.journaltitle: IEEE TRANSACTIONS ON HAPTICS
dc.identifier.doi: 10.1109/TOH.2020.3004637
dc.identifier.wosid: WOS:000633391700004
dc.author.google: Kim, Yaesol
dc.author.google: Kim, Siyeon
dc.author.google: Oh, Uran
dc.author.google: Kim, Young J.
dc.contributor.scopusid: 김영준 (56223507100)
dc.contributor.scopusid: 오유란 (55569327700)
dc.date.modifydate: 2024-03-22T13:34:40
Appears in Collections:
College of Artificial Intelligence (인공지능대학) > Department of Computer Science & Engineering (컴퓨터공학과) > Journal papers
Files in This Item:
There are no files associated with this item.