Customizing Textile and Tactile Skins for Interactive Industrial Robots

Carnegie Mellon University


Abstract

Tactile skins made from textiles enhance robot-human interaction by localizing contact points and measuring contact forces. This paper presents a solution for rapidly fabricating, calibrating, and deploying these skins on industrial robot arms.

A novel automated skin calibration procedure maps skin locations onto the robot geometry and calibrates contact forces. Through experiments on a FANUC LR Mate 200iD/7L industrial robot, we demonstrate that tactile skins made from textiles can be used effectively for human-robot interaction in industrial environments and provide unique opportunities in robot control and learning, making them a promising technology for enhancing robot perception and interaction.

Skin Fabrication and Calibration

The textile and tactile skin consists of three layers of fabric in a sandwich structure. The top and bottom layers contain alternating conductive and non-conductive stripes, with the stripes of the two layers oriented orthogonally to each other, while the middle layer is a mesh that separates them.

When pressure is applied to the skin, the top and bottom layers are stretched, causing the conductive stripes to touch each other and complete the circuit. The resistance, which is inversely correlated with the contact area between the top and bottom layers, is then recorded by an Arduino microcontroller.
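As a concrete illustration, the sketch below shows one way a host computer might read such a row-column sensing grid from the Arduino and convert raw ADC counts to resistance. The serial protocol, grid size, and pull-up value are assumptions for illustration, not the readout used in the paper.

# Minimal sketch of reading one pressure frame from the skin's Arduino readout.
# Assumptions (not from the paper): the Arduino streams comma-separated ADC
# values, one row of the stripe grid per line.
import numpy as np
import serial

N_ROWS, N_COLS = 8, 8                    # hypothetical grid size
PORT, BAUD = "/dev/ttyACM0", 115200      # hypothetical serial settings

def read_frame(ser: serial.Serial) -> np.ndarray:
    """Read one N_ROWS x N_COLS frame of raw ADC readings."""
    rows = []
    while len(rows) < N_ROWS:
        line = ser.readline().decode(errors="ignore").strip()
        if not line:
            continue
        values = [int(v) for v in line.split(",")]
        if len(values) == N_COLS:
            rows.append(values)
    return np.array(rows)

def adc_to_resistance(adc: np.ndarray, adc_max=1023, r_pullup=10_000) -> np.ndarray:
    """Voltage-divider conversion: higher pressure -> larger contact area -> lower resistance."""
    adc = np.clip(adc, 1, adc_max - 1)
    return r_pullup * adc / (adc_max - adc)

if __name__ == "__main__":
    with serial.Serial(PORT, BAUD, timeout=1.0) as ser:
        frame = read_frame(ser)
        print(adc_to_resistance(frame))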

To localize the skin on the robot and calibrate its pressure measurements, we use a force-torque sensor mounted on a fixed tripod. During data collection, the robot approaches the sensor and actively touches it with the skin, following a skin-traversal trajectory generated by Poisson-disc sampling.
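The sketch below illustrates the touch-point generation idea under simplifying assumptions: a flat rectangular patch, a rejection-based ("dart throwing") Poisson-disc sampler, and a nearest-neighbour ordering to form the traversal. The patch dimensions and minimum spacing are placeholders; the actual procedure operates on the skin mounted on the curved robot surface and may differ.

# Minimal sketch of generating calibration touch points on a flat skin patch via
# Poisson-disc (dart-throwing) sampling, then ordering them into a traversal.
import numpy as np

def poisson_disc_points(width, height, r_min, n_candidates=3000, seed=0):
    """Rejection-based Poisson-disc sampling: keep a candidate only if it lies at
    least r_min away from every accepted point."""
    rng = np.random.default_rng(seed)
    points = []
    for _ in range(n_candidates):
        p = rng.uniform([0.0, 0.0], [width, height])
        if all(np.linalg.norm(p - q) >= r_min for q in points):
            points.append(p)
    return np.array(points)

def greedy_traversal(points):
    """Order the touch points into a short skin-traversal trajectory
    (nearest-neighbour heuristic)."""
    order = [0]
    remaining = set(range(1, len(points)))
    while remaining:
        last = points[order[-1]]
        nxt = min(remaining, key=lambda i: np.linalg.norm(points[i] - last))
        order.append(nxt)
        remaining.remove(nxt)
    return points[order]

if __name__ == "__main__":
    pts = poisson_disc_points(width=0.20, height=0.10, r_min=0.015)  # metres
    trajectory = greedy_traversal(pts)
    print(f"{len(trajectory)} touch points")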

Using this method, we achieved a contact localization root mean square error (RMSE) of 3.00 cm and a force prediction RMSE of 1.36 N, which is a significant improvement over naive scaling methods.

Skin Fabrication and Calibration Process

Control Applications

Skin Control Framework

Trajectory Modification from Human Feedback

Skin-enabled controllers can efficiently detect the contact position and generate new trajectories that minimize disruption to the robot's original motion. We formulate trajectory modification as a quadratic program (QP) that minimizes the velocity difference between the original and modified trajectories while ensuring that the robot follows the human's commands.
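A minimal sketch of such a QP follows, assuming a single contact with a known contact-point Jacobian J_c and contact normal n_hat from the skin; the variable names, joint-velocity bounds, and the cvxpy formulation are illustrative, not the exact program used on the robot.

# Minimal sketch of the trajectory-modification QP: keep the joint velocity as
# close as possible to the nominal trajectory while driving the Cartesian
# velocity at the contact point along the contact normal to zero.
import cvxpy as cp
import numpy as np

def modify_velocity(qdot_ref, J_c, n_hat, qdot_max=1.0):
    """Solve min ||qdot - qdot_ref||^2  s.t.  n_hat^T J_c qdot = 0, |qdot| <= qdot_max."""
    n_joints = qdot_ref.shape[0]
    qdot = cp.Variable(n_joints)
    objective = cp.Minimize(cp.sum_squares(qdot - qdot_ref))
    constraints = [
        n_hat @ J_c @ qdot == 0,      # no motion into the contact
        cp.abs(qdot) <= qdot_max,     # joint velocity limits
    ]
    cp.Problem(objective, constraints).solve()
    return qdot.value

if __name__ == "__main__":
    qdot_ref = np.array([0.2, -0.1, 0.3, 0.0, 0.1, -0.2])  # nominal joint velocities
    J_c = np.random.default_rng(0).normal(size=(3, 6))     # contact-point Jacobian (placeholder)
    n_hat = np.array([0.0, 0.0, 1.0])                       # contact normal from the skin
    print(modify_velocity(qdot_ref, J_c, n_hat))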

Our experiments show that when a human touches the robot, the skin successfully localizes the point of contact, and the robot's trajectory is immediately modified to reduce the Cartesian velocity in the direction of the contact to zero.

Trajectory Modification Results

Skin-Enabled Admittance Control

Skin-enabled admittance control addresses a limitation of traditional joint torque sensors by dynamically regulating the force applied at the contact point. Unlike joint torque sensors, which cannot distinguish forces applied to different joints in the multi-joint, multi-contact case, skin-enabled admittance control measures contact forces and locations on the robot surface and allows each joint to respond independently to contacts.
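The sketch below illustrates one possible discrete-time admittance update under these assumptions: a single contact whose force and location (and hence contact Jacobian) come from the skin, and simple mass-damper admittance dynamics in joint space. The gains, class structure, and single-contact restriction are illustrative only.

# Minimal sketch of a skin-enabled admittance loop: the skin supplies both the
# contact force and the contact location, so the force can be mapped through the
# Jacobian of the touched link rather than lumped into joint torques.
import numpy as np

class SkinAdmittance:
    def __init__(self, n_joints, mass=2.0, damping=8.0, dt=0.01):
        self.qdot_adm = np.zeros(n_joints)   # admittance velocity offset (joint space)
        self.M, self.D, self.dt = mass, damping, dt

    def step(self, f_contact, J_contact):
        """f_contact: 3D force measured by the skin; J_contact: 3 x n Jacobian of
        the contact point on the touched link (columns for distal joints are zero)."""
        tau_ext = J_contact.T @ f_contact                    # map contact force to joint torques
        qddot = (tau_ext - self.D * self.qdot_adm) / self.M  # M qddot + D qdot = tau_ext
        self.qdot_adm += qddot * self.dt
        return self.qdot_adm                                 # add to the nominal joint velocity

if __name__ == "__main__":
    adm = SkinAdmittance(n_joints=6)
    J = np.zeros((3, 6)); J[:, :3] = np.eye(3)  # contact on link 3: distal columns are zero
    f = np.array([0.0, 5.0, 0.0])               # 5 N push measured by the skin
    for _ in range(100):
        qdot_offset = adm.step(f, J)
    print(qdot_offset)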

With skin-enabled admittance control, the interaction force profiles between human and robot are smoother. The human operator can interact with the robot more intuitively, and the skin provides rich contact information for learning from human feedback.

Admittance Control Results

Related Work

Robot control in interactive tasks heavily relies on available sensors:

Force Torque Sensors (FTS) can measure contact force and torque but cannot localize the contact position.

Robot joint motor measurements can estimate external force and detect collisions, but do not localize the contact positions.

Vision-based systems can localize contact but do not detect force and suffer from occlusion.

Light curtains do not suffer from occlusion but cannot be easily retrofitted onto complex geometries.

Commercial tactile skins like T-skin are available to enhance robot safety, but they are costly and may not be suitable for intricate human-robot interaction tasks.

BibTeX

@article{su2023customizing,
  author    = {Su, Bo Ying and Wei, Zhongqi and McCann, James and Yuan, Wenzhen and Liu, Changliu},
  title     = {Customizing Textile and Tactile Skins for Interactive Industrial Robots},
  journal   = {Modeling, Estimation, and Control Conference (MECC)},
  year      = {2023},
}