
UnfoldIR : tactile robotic unfolding of cloth

Remko Proesmans (UGent), Andreas Verleysen (UGent) and Francis wyffels (UGent)
Abstract
Robotic unfolding of cloth is challenging due to the wide range of textile materials and their ability to deform in unpredictable ways. Previous work has focused almost exclusively on visual feedback to solve this task. We present UnfoldIR ("unfolder"), a dual-arm robotic system relying on infrared (IR) tactile sensing and cloth manipulation heuristics to achieve in-air unfolding of randomly crumpled rectangular textiles by means of edge tracing. The system achieves >85% coverage on multiple textiles of different sizes and textures. After unfolding, at least three corners are visible in 83.3% to 94.7% of cases. Given these strong "tactile-only" results, we argue that the fusion of both tactile and visual sensing can bring cloth unfolding to a new level of performance.
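The abstract describes edge tracing driven by tactile feedback: the grippers grasp the cloth, slide along a boundary until a corner is reached, and rely on heuristics to keep a single edge between the fingers. The Python sketch below illustrates what such a tactile edge-tracing loop could look like in general terms; the sensor interface, thresholds, and helper names are hypothetical stand-ins and do not reproduce the paper's IR sensor design or its actual heuristics.

# Minimal sketch of a tactile edge-tracing loop. All interfaces are
# hypothetical; this is NOT the authors' implementation.

from dataclasses import dataclass
import random


@dataclass
class TactileReading:
    """Hypothetical per-finger contact values (0 = no cloth, 1 = firm contact)."""
    left: float
    right: float


def read_sensor() -> TactileReading:
    # Stand-in for querying the IR tactile sensor; returns dummy data here.
    return TactileReading(random.random(), random.random())


def holds_single_edge(reading: TactileReading, lo: float = 0.2, hi: float = 0.8) -> bool:
    # Heuristic: a single cloth edge sits between the fingers when both sides
    # report moderate, balanced contact rather than fully bunched-up fabric.
    return lo < reading.left < hi and lo < reading.right < hi


def trace_edge(max_steps: int = 50, step_mm: float = 20.0) -> float:
    """Slide along a grasped edge in small increments until contact is lost
    (a corner is reached). Returns the traced length in mm."""
    traced = 0.0
    for _ in range(max_steps):
        reading = read_sensor()
        if reading.left < 0.05 and reading.right < 0.05:
            break  # cloth left the gripper: corner reached
        if not holds_single_edge(reading):
            # Too much fabric between the fingers: regrasp closer to the edge
            # (placeholder for the paper's regrasping heuristics).
            continue
        traced += step_mm  # command the arm to slide one increment (omitted)
    return traced


if __name__ == "__main__":
    print(f"Traced approximately {trace_edge():.0f} mm of edge before the corner.")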
Keywords
Robot sensing systems, Robots, Grippers, Fingers, Sensors, Grasping, Textiles, Dual arm manipulation, force and tactile sensing, sensor-based control

Downloads

  • (...).pdf: full text (Published version) | UGent only | PDF | 2.96 MB
  • Ds641 acc.pdf: full text (Accepted manuscript) | open access | PDF | 9.31 MB

Citation

Please use this url to cite or link to this publication:

MLA
Proesmans, Remko, et al. “UnfoldIR : Tactile Robotic Unfolding of Cloth.” IEEE ROBOTICS AND AUTOMATION LETTERS, vol. 8, no. 8, 2023, pp. 4426–32, doi:10.1109/LRA.2023.3284382.
APA
Proesmans, R., Verleysen, A., & wyffels, F. (2023). UnfoldIR : tactile robotic unfolding of cloth. IEEE ROBOTICS AND AUTOMATION LETTERS, 8(8), 4426–4432. https://doi.org/10.1109/LRA.2023.3284382
Chicago author-date
Proesmans, Remko, Andreas Verleysen, and Francis wyffels. 2023. “UnfoldIR : Tactile Robotic Unfolding of Cloth.” IEEE ROBOTICS AND AUTOMATION LETTERS 8 (8): 4426–32. https://doi.org/10.1109/LRA.2023.3284382.
Chicago author-date (all authors)
Proesmans, Remko, Andreas Verleysen, and Francis wyffels. 2023. “UnfoldIR : Tactile Robotic Unfolding of Cloth.” IEEE ROBOTICS AND AUTOMATION LETTERS 8 (8): 4426–4432. doi:10.1109/LRA.2023.3284382.
Vancouver
1. Proesmans R, Verleysen A, wyffels F. UnfoldIR : tactile robotic unfolding of cloth. IEEE ROBOTICS AND AUTOMATION LETTERS. 2023;8(8):4426–32.
IEEE
[1] R. Proesmans, A. Verleysen, and F. wyffels, “UnfoldIR : tactile robotic unfolding of cloth,” IEEE ROBOTICS AND AUTOMATION LETTERS, vol. 8, no. 8, pp. 4426–4432, 2023.
@article{01H57AAY80906VYXS49V9F7ZSD,
  abstract     = {{Robotic unfolding of cloth is challenging due to the wide range of textile materials and their ability to deform in unpredictable ways. Previous work has focused almost exclusively on visual feedback to solve this task. We present UnfoldIR ("unfolder"), a dual-arm robotic system relying on infrared (IR) tactile sensing and cloth manipulation heuristics to achieve in-air unfolding of randomly crumpled rectangular textiles by means of edge tracing. The system achieves >85% coverage on multiple textiles of different sizes and textures. After unfolding, at least three corners are visible in 83.3% to 94.7% of cases. Given these strong "tactile-only" results, we argue that the fusion of both tactile and visual sensing can bring cloth unfolding to a new level of performance.}},
  author       = {{Proesmans, Remko and Verleysen, Andreas and wyffels, Francis}},
  issn         = {{2377-3766}},
  journal      = {{IEEE ROBOTICS AND AUTOMATION LETTERS}},
  keywords     = {{Robot sensing systems,Robots,Grippers,Fingers,Sensors,Grasping,Textiles,Dual arm manipulation,force and tactile sensing,Index Terms,sensor-based control}},
  language     = {{eng}},
  number       = {{8}},
  pages        = {{4426--4432}},
  title        = {{UnfoldIR : tactile robotic unfolding of cloth}},
  url          = {{http://doi.org/10.1109/LRA.2023.3284382}},
  volume       = {{8}},
  year         = {{2023}},
}
