RoboTac 2021
International Workshop, September 27, 2021 (LIVE)
New Advances in Tactile Sensation, Interactive Perception, Control, and Learning:
A Soft Robotic Perspective on Grasp, Manipulation, & HRI


 

The video of each keynote talk is available now on the RoboTac YouTube channel.

Best Paper Presentation Award:

Towards a soft robotic, haptic feedback seat for autonomy level transitions in highly-automated vehicles

       Jan Peters, Bani Anvari, Annika Raatz and Helge A. Wurdemann.       download paper.pdf

Authors of the accepted papers will be invited to submit extended versions of their articles to an upcoming special issue journal.

Opening             09:00-09:10


Mohsen Kaboli

Etienne Burdet


Vincent Hayward

09:10-09:35  (20 min + 5 min Q/A)

 

Title: The physical basis of haptic perception     Video

Abstract: In this discussion I propose the idea that the sense of touch, supported by the somatosensory system, has developed to take advantage of the ambient physics. While this is certainly not an original idea, it has proven to be rich in surprises and practical consequences which will be discussed by examples. 


Tansu Celikel

09:35-10:00  (20 min + 5 min Q/A)

 

Title:  Touch in silico              Video

Abstract: Neuromimetic algorithms have the potential to provide a sense of touch for human-made devices in the near future. Here we discuss recent work on information processing, communication and recovery along the somatosensory axis, and introduce a biologically realistic and computationally efficient multilayer model of the sensory cortex that can learn from the experience of the agent.


Henrik Jörntell

09:55-10:20   (20 min + 5 min Q/A)

 

Title: The Neural Basis of Haptic Perception.    Video

Abstract: Here I will discuss recent developments in the view of haptic information and information processing in the nervous system. A central tenet is that haptics is a highly dynamic process that engages large parts of the skin, and consequently a large number of sensors, but this is not well represented in the neuroscience literature. A consequence is that large parts of the nervous system also become engaged in haptics processing, which is also what we have found in a recent series of experiments. The implications of these changing views for the design of artificially intelligent systems, or robots, which rely on haptics to ‘understand’, are also discussed.


Katherine Kuchenbecker

10:20-10:45   (20 min + 5 min Q/A)

 

Title: Sensing Tactile Contact Over Large, Soft Surfaces.     Video


Abstract: Robots should be able to feel contacts that occur across all of their body surfaces, not just at their fingertips. Furthermore, tactile sensors should be soft in order to cushion contact and support the transmission of tangential force and torque. Today's robotic systems rarely have such sensing capabilities because artificial skin tends to be complex, bulky, rigid, delicate, and/or expensive. Taking inspiration from other successful sensor designs, my collaborators and I have created two families of soft sensors that can feel a distribution of contact forces across their large surfaces. First, Hyosang Lee has led a long-term project on large fabric-based tactile sensors that use piezoresistive laminated structures and electrical resistance tomography (ERT). These ERTac sensors estimate the distribution of normal force at fast frame rates, enabling capable perception of complex contacts and total normal force. The sensing hardware is relatively simple and robust, with point electrodes distributed across the surface, while the more complex sampling electronics and reconstruction algorithms provide interesting opportunities for system improvement. Second, Huanbo Sun, Georg Martius, and I have recently invented a tactile sensor that uses vision and deep learning to deliver all-over 3D tactile sensing in a package the size and shape of an extended human thumb. Called Insight, our sensor has a soft single-layer silicone skin that is over-molded on a stiff skeleton, lit by internal LEDs, and viewed from within by a camera. Extensive contact data was collected by an automatic testbed that applies both normal and shear forces at points across the surface. After training, the network estimates the distribution of 3D forces across Insight's skin from each camera image, capably capturing multiple complex contacts.
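As context for the ERT approach mentioned above, tomographic reconstruction can be posed as a regularized linear inverse problem. The sketch below is a generic illustration with a made-up sensitivity matrix `J` and random data, not the actual ERTac pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 32 boundary-electrode voltage measurements,
# 100 taxel locations where normal force is to be estimated.
n_meas, n_taxels = 32, 100

# J is a (pre-calibrated) sensitivity matrix mapping a change in the
# normal-force distribution to a change in the measured voltages.
J = rng.normal(size=(n_meas, n_taxels))

# Simulate a sparse contact: force applied at two taxels, plus noise.
f_true = np.zeros(n_taxels)
f_true[[10, 55]] = [1.5, 0.8]
v = J @ f_true + 0.01 * rng.normal(size=n_meas)

# Tikhonov-regularized least squares: the problem is underdetermined
# (32 equations, 100 unknowns), so regularization selects a stable solution.
lam = 0.1
f_est = np.linalg.solve(J.T @ J + lam * np.eye(n_taxels), J.T @ v)

# The total normal force is then the sum of the estimate over all taxels.
total_force = f_est.sum()
```

In a real ERT skin the sensitivity matrix comes from an electrical forward model of the conductive fabric rather than random numbers, and the reconstruction runs at frame rate.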


Benjamin Tee

10:45-11:10   (20 min + 5 min Q/A)

 

Title: Materials and Skins for Intelligent Machines.     Video 

Abstract: We live in an increasingly hyper-connected environment where humans, smart devices and robots live in synergy together. Flexible, wearable sensors and systems are accelerating this trend by generating ever greater amounts of data for AI algorithms to process and understand. Exciting new understanding and developments in somatosensory sciences will further augment human abilities and aid in applications such as health diagnostics, surgery and predictive analytics. We believe a multi-disciplinary approach, especially in materials design and processing, is essential to achieve near or even superhuman capabilities in robotics. In the area of manipulation tasks, robots have yet to match human abilities despite progress in various sensing and actuator systems. We apply a neuromorphic approach for sensory systems as a potential pathway towards greater tactile and machine intelligence. I will discuss our approach and recent progress in developing new soft materials systems and neuromorphic approaches for robotic intelligence. Fusion of sensing modalities such as neuromorphic vision and touch will also facilitate robotic learning towards greater autonomy, especially as remote work and “digital twins” gain critical importance in pandemics and the future of work.


Giorgio Cannata

 

11:10-11:35   (20 min + 5 min Q/A)

Title: Robots touching and touching robots.         Video

 

Abstract: Tactile sensors enable robots to properly react to contacts; they also allow robots to sense the tactile features of touched objects and, ultimately, to recognize them.

In this talk I will present some experiments involving robots sensorized with large area capacitive sensors, based on the CySkin technology developed at the University of Genova, and in particular I will focus on the problem of the recognition of human hand touch.

As a matter of fact, the coexistence of robots and humans has gained great relevance over the past few years, and the capability of recognizing tactile gestures is a key element to trigger safe human-robot interaction and to drive cooperative tasks or robot motions. Tactile data are acquired by sparse transducers, non-uniformly distributed over non-planar manifolds. These aspects, together with the complexity of the arising contacts, make the processing of tactile information a difficult task. The approach that we propose is based on geometric transformations of the tactile data from 3D maps, formed by pressure measurements associated with taxels spread over the robot body, into tactile images representing the contact pressure distribution in two dimensions. Deep learning algorithms are then applied to recognize human hands and to compute the pressure distribution applied on the robot by the various hand segments: the palm and the individual fingers.
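The 3D-map-to-tactile-image step described above can be illustrated with a minimal sketch. The taxel coordinates, image resolution, and accumulation scheme here are hypothetical placeholders, not the actual CySkin processing:

```python
import numpy as np

# Hypothetical data: each taxel has precomputed 2D surface coordinates
# (u, v) in [0, 1), obtained by flattening the robot's 3D skin manifold,
# plus one pressure reading.
rng = np.random.default_rng(1)
n_taxels = 200
uv = rng.random((n_taxels, 2))        # flattened taxel positions
pressure = rng.random(n_taxels)       # pressure value per taxel

# Rasterize the sparse, non-uniform taxel readings into a 16x16 tactile
# image; taxels that fall in the same pixel are accumulated.
H = W = 16
img = np.zeros((H, W))
rows = np.minimum((uv[:, 1] * H).astype(int), H - 1)
cols = np.minimum((uv[:, 0] * W).astype(int), W - 1)
np.add.at(img, (rows, cols), pressure)

# 'img' is now a regular 2D pressure image that a standard CNN can consume.
```

The design choice is that the irregular 3D sensor layout is handled once, in the geometric mapping, so that off-the-shelf 2D deep learning tools apply downstream.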


Salvatore Pirozzi

11:35 -12:00   (20 min + 5 min Q/A)

 

Title: Force/Tactile Sensor Technology for Robotic Manipulation.       Video

Abstract: Nowadays, robotic systems use tactile sensing as a key enabling technology to implement complex tasks. For example, manipulation and grasping problems strongly depend on the physical and geometrical characteristics of the objects; in fact, objects may be deformable or change their shape when in contact with the robot or the environment. For this reason, robot end effectors are often equipped with sensorized fingers which can estimate the objects' features, forces, and contact locations. The idea of designing and developing, in our laboratories, a tactile sensor based on optoelectronic technology dates back about a decade, to the FP7 European project DEXMART. During these years, the evolution of optoelectronic devices and our experience in the field allowed us to optimize our prototypes, reaching with the latest versions a high measurement performance and a high level of mechatronic integration. The working principle is based on the idea of designing a deformable layer to be suitably assembled with a discrete number of optoelectronic sensing devices, with the objective of transducing external contacts into deformations measured by the optically sensitive points (typically called “taxels” in the literature). The sensing points, positioned below the deformable layer, provide a “tactile map” corresponding to spatially distributed information about the contact. Depending on the application task, the tactile map can be used to reconstruct contact properties, e.g., contact force, contact torque, object shape. This contribution will present the technology behind the latest solution, in particular the one developed during the last two years within the H2020 European projects REFILLS and REMODEL. Different application scenarios will be presented in order to demonstrate the manipulation abilities based on the reconstructed forces and torques or the direct use of the tactile map.
All these abilities can be performed by simple parallel-jaw grippers equipped with the sensors. The slipping avoidance ability consists of firmly grasping an object by applying the “lowest” grasp force that avoids slippage. The pivoting maneuver can be executed in two different modalities, called gripper and object pivoting, respectively. The first consists of keeping the object fixed in space while the gripper rotates about the grasp axis so as to change the relative orientation between the gripper and the object. The second pivoting modality is the dual one and consists of keeping the gripper fixed in space while the object rotates in a pendulum-like motion. Additionally, the tactile map can be directly used to estimate the shape of grasped Deformable Linear Objects (DLOs) and to recognize object features (e.g., wire diameters) by means of machine learning techniques.


Firat Guder

12:00-12:25   (20 min + 5 min Q/A)

Title: Stretchable soft conductive composites for sensing force and touch  Video

 

Abstract: Although stretchable soft conductive composites, consisting of silicone polymers and conductive fillers, have been used for electrical sensing of force in academic laboratories, these materials have not been able to move into real world applications. This is largely due to the lack of robust, low-cost and geometrically scalable technologies to reliably connect chemically inert silicone composites with solid-state electronics. In this talk, I will present our recent work [1] on the nanoporous Si-Cu based electrical contact technology and how it enables a range of applications for silicone-based force sensors especially for sensing touch for medical applications. 

 

1. Michael Kasimatis, Estefania Nunez-Bajo, Max Grell, Yasin Cotur, Giandrin Barandun, Ji-Seon Kim, and Firat Güder, “Monolithic Solder-On Nanoporous Si-Cu Contacts for Stretchable Silicone Composite Sensors”, ACS Applied Materials & Interfaces 2019 11 (50), 47577-47586 

12:30-13:00

Paper presentation 

     12:30-12:40

  • Low-pass filter effects in biological neurons as a feature to facilitate representation of tactile information

      Udaya B. Rongala and Henrik Jörntell.         download paper.pdf

     12:40-12:50

  • A Local Filtering Technique for Robot Skin Systems

       Alessandro Albini, Giorgio Cannata and Perla Maiolino         download paper.pdf

     12:50-13:00

  • Sensor Fusion and Multimodal Learning for Robotic Grasp Verification

       Priteshkumar Gohil, Santosh Thoduka and Paul Plöger.    download paper.pdf


Antonio Bicchi

13:00-13:25   (20 min + 5 min Q/A)

 

Title: Grasping with a Sense of Touch.     Video

Abstract: I will report on the new problems that arise in robotic grasping and manipulation with soft, adaptable hands, and approaches that can be used in conjunction with tactile sensing capabilities. We will consider reactive grasping procedures that progressively refine an initial approximate grasp into a full form-closure one. I will also consider interaction with tight environment constraints, and how manipulation can be planned in cases where classical randomized methods have difficulties.


Domenico Prattichizzo

13:25-13:50   (20 min + 5 min Q/A)

Title: Soft Manipulation with Rigid and Magnetic Constraints    Video 

Abstract: Soft robotic hands are powerful end-effectors allowing compliant interactions with the environment and objects. The softness of the fingers largely increases the robustness of the physical interaction, making manipulation very robust with respect to uncertainties. Soft robotic hands are typically underactuated, and this, together with unpredictable deformations, affects the overall accuracy of manipulation tasks. Moreover, soft robotic hands are usually not dexterous, since underactuation is adopted to avoid complex hand designs.

In this talk I will present how rigid and magnetic constraints can be exploited to improve soft manipulation without impacting the simplicity of the design of the soft manipulation systems. I will present some ideas on how to exploit softness, rigid and magnetic constraints to improve soft manipulation.


Carmel Majidi

13:50-14:15   (20 min + 5 min Q/A)

Title: Soft Robots that Feel – Multimodal Sensing Skins for Soft Robot Grasping.   Video

Abstract: By eliminating rigid materials and hard contacts, soft robot end effectors have the potential to revolutionize robot grasping and manipulation. However, their ability to sense and map objects is highly limited by the bulk and stiffness of existing sensor electronics. In this talk, I will present progress in creating soft electronic sensing skins that can be incorporated into soft robot grippers and enable a wide range of sensing modalities. These sensing skins utilize a variety of material architectures, from highly stretchable liquid metal circuits to soft magnetized elastomers. When combined with methods in machine learning, these skins can be used to enable soft robot grippers to perform a variety of closed-loop grasping tasks that were not previously possible with open-loop techniques. Moreover, they can also be used as wearable electronic stickers for monitoring health vitals. In addition to describing their material architecture and sensing properties, I will discuss the utilization of these sensors in a variety of applications, from humanoid robotics to healthcare.


Chris Atkeson

14:15-14:40   (20 min + 5 min Q/A)

Title: Superhuman tactile sensing.   Video

 

Abstract: Robots should go beyond biological models for sensing. Human and animal tactile sensing is limited to when skin, hairs, or whiskers are in contact with an object. Thermal sensing can detect infrared radiation, sensing heat at a distance. Sensing of local wind currents and their temperature can indicate the movement of nearby objects. For robots, we can do better than this by extending tactile sensing with proximity sensing of nearby objects. Proximity sensing makes it possible to predict contact time and location and to generate priors for contacted object pose, as well as nearby object locations and poses. In work on a camera-based tactile sensor with transparent skin, we found that vision of nearby objects was useful for centering grasps, grasping with little force, measuring small forces, and letting go of an object without knocking it over. In recent work we have explored cameras collocated with tactile sensors, rather than using the same camera and optical path for proximity and tactile sensing. This avoids some of the drawbacks of FingerVision. We are also exploring the use of radar for proximity sensing, including proximity sensing of occluded objects.

 

“I don’t want to be human. I want to see gamma rays, I want to hear X-rays, and I want to smell dark matter.”

—John Cavil, Cylon Model Number One, “No Exit”, Episode 15, Season 4, Battlestar Galactica (for this meeting a better quote would be: "I want to touch dark matter.")


Robert Howe

14:40-15:05   (20 min + 5 min Q/A)

Title: Using Tactile Signals and Grasp Analysis for Real-time Stability Prediction.  Video

Abstract: Grasp analysis is a well-developed framework for predicting grasp stability, based on force and torque equilibrium including friction. While it has been widely used for grasp planning, it has not been exploited for real-time control for robot hands. We are developing a highly instrumented robot hand with contact sensors to estimate the quantities needed for stability prediction, namely finger-object contact locations, surface normal vectors at the contact locations, and contact force vectors. Initial results suggest that force signals are intrinsically noisy with the relatively stiff polymer materials typically used for robot fingertips. In addition, the observed frictional behavior does not follow the Coulomb models typically used in grasp analysis. This limits the ability to accurately predict when objects will slip within a grasp. This has implications for the design of effective robot hands, as well as reliable grasp control methods.
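For reference, the Coulomb model that the abstract reports real fingertips deviating from predicts slip with a simple friction-cone threshold. A minimal sketch (values below are illustrative, not from the talk):

```python
def coulomb_slip(f_normal: float, f_tangential: float, mu: float) -> bool:
    """Classical Coulomb prediction: the contact slips when the
    tangential force magnitude exceeds mu times the normal force."""
    return abs(f_tangential) > mu * f_normal

# A finger supporting ~4.9 N of load tangentially with mu = 0.5:
# at 8 N grip the friction limit is 4.0 N, so the model predicts slip;
# at 10 N grip the limit is 5.0 N, so the grasp holds.
assert coulomb_slip(f_normal=8.0, f_tangential=4.905, mu=0.5) is True
assert coulomb_slip(f_normal=10.0, f_tangential=4.905, mu=0.5) is False
```

Grasp analysis stacks one such constraint per contact into the force/torque equilibrium; the talk's point is that noisy force signals and non-Coulomb friction make this threshold unreliable in practice.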


Peter Allen

15:05-15:30  (20 min + 5 min Q/A)

Title: MAT: Multi-Fingered Adaptive Tactile Grasping via Deep Reinforcement Learning.   Video

Abstract: Vision-based grasping systems typically adopt an open-loop execution of a planned grasp. This policy can fail due to many reasons, including ubiquitous calibration error. Recovery from a failed grasp is further complicated by visual occlusion, as the hand is usually occluding the vision sensor as it attempts another open-loop regrasp. This talk presents MAT, a tactile closed-loop method capable of realizing grasps provided by a coarse initial positioning of the hand above an object. Our algorithm is a deep reinforcement learning (RL) policy optimized through the clipped surrogate objective within a maximum entropy RL framework to balance exploitation and exploration. The method utilizes tactile and proprioceptive information to act through both fine finger motions and larger regrasp movements to execute stable grasps. A novel curriculum of action motion magnitude makes learning more tractable and helps turn common failure cases into successes. Careful selection of features that exhibit small sim-to-real gaps enables this tactile grasping policy, trained purely in simulation, to transfer well to real world environments without the need for additional learning. Experimentally, this methodology improves over a vision-only grasp success rate substantially on a multi-fingered robot hand. When this methodology is used to realize grasps from coarse initial positions provided by a vision-only planner, the system is made dramatically more robust to calibration errors in the camera-robot transform.
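The "clipped surrogate objective" mentioned above is the PPO-style policy loss. A minimal NumPy sketch of just that objective (the entropy bonus of the maximum-entropy framework and all network code are omitted; `eps` is the usual clipping hyperparameter):

```python
import numpy as np

def clipped_surrogate(ratio, advantage, eps=0.2):
    """PPO-style clipped surrogate objective (to be maximized):
    mean over samples of min(r * A, clip(r, 1-eps, 1+eps) * A),
    where r is the new/old policy probability ratio and A the advantage."""
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps) * advantage
    return np.minimum(unclipped, clipped).mean()

# If the new policy makes an action 1.5x more likely and the advantage is
# positive, the clip caps the objective at (1 + eps) * A:
obj = clipped_surrogate(np.array([1.5]), np.array([2.0]))
# min(1.5 * 2.0, 1.2 * 2.0) = 2.4
```

The pessimistic `min` keeps each policy update within a trust region, which is what makes the exploration/exploitation balance in the abstract tractable.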

15:30-16:00

Paper presentation 

     15:30-15:40

  • Active Tapping via Gaussian Process for Efficient Unknown Object Surface Reconstruction

      Su Sun and Byung-Cheol Min.   download paper.pdf

     15:40-15:50

  • TIAGo RL: Simulated Reinforcement Learning Environments with Tactile Data for Mobile Robots

      Luca Lach, Robert Haschke, Francesco Ferro, Helge Ritter      download paper.pdf

     15:50-16:00

  •  Towards a soft robotic, haptic feedback seat for autonomy level transitions in highly-automated vehicles

       Jan Peters, Bani Anvari, Annika Raatz and Helge A. Wurdemann.       download paper.pdf


16:00 -16:25  (20 min + 5 min Q/A)

Title: Thermal and Tactile Sensing and the Development of Multi-sensory Cutaneous Displays.  Video

Abstract: When the hand makes contact with an object its geometric and material properties are readily encoded by cutaneous mechanoreceptors that signal features such as the object’s shape, surface texture and compliance. Changes in skin temperature can also occur as the object is manipulated, with the thermal properties of the object and skin, as well as their initial temperatures, determining whether the heat flux is conducted out of the skin or object on contact. These changes in temperature provide information about the object’s thermal properties which assists in identifying its material composition. Although the thermal cues are subtle and changes in temperature are strictly localized to the area of contact, we have demonstrated in a number of experiments that such signals not only enable the composition of objects to be identified and discriminated but also provide information about contact force and area. Over the range of forces typically used during manual exploration (0.1-6 N), skin temperature decreases by an average of 5-6 °C after 10 s, reflecting changes in blood flow to the finger pad as it is compressed. Thermal models developed that incorporate such contact conditions and material properties have been shown to capture these changes in skin temperature. When implemented in thermal displays they enable users to identify and discriminate between simulated materials. Thermal feedback has also been combined with vibrotactile feedback in multisensory cutaneous displays to enhance user experience during object manipulation in virtual environments or when working with teleoperated robotic systems. Given the very different temporal and spatial processing properties of the tactile and thermal sensory systems, it is critical to determine the optimal temporal profiles for presenting such cues so that the signals are not masked. 
Our work has demonstrated that the perception of tactile cues can be enhanced or impeded depending on whether the skin is warmed or cooled and that these effects are specific to particular features of the vibrotactile signals presented.
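The dependence of contact heat flux on material properties described above is commonly modeled with thermal effusivity: for two semi-infinite bodies brought into contact, the interface temperature is the effusivity-weighted mean of their initial temperatures. A sketch with illustrative, approximate material values (not figures from the talk):

```python
import math

def effusivity(k, rho, c):
    """Thermal effusivity e = sqrt(k * rho * c), in W*s^0.5/(m^2*K)."""
    return math.sqrt(k * rho * c)

def contact_temperature(T_skin, e_skin, T_mat, e_mat):
    """Semi-infinite-body model: interface temperature on first contact
    is the effusivity-weighted mean of the two initial temperatures."""
    return (e_skin * T_skin + e_mat * T_mat) / (e_skin + e_mat)

# Rough illustrative property values (k in W/m/K, rho in kg/m^3, c in J/kg/K):
e_skin = effusivity(k=0.37, rho=1000, c=3600)   # human skin, ~1150
e_cu   = effusivity(k=400,  rho=8960, c=385)    # copper, ~37000
e_wood = effusivity(k=0.15, rho=600,  c=1700)   # wood, ~390

# Skin at 33 C touching objects at 22 C: copper pulls the interface close
# to its own temperature (feels cold), wood leaves it near skin temperature.
T_cu   = contact_temperature(33.0, e_skin, 22.0, e_cu)    # ~22.3 C
T_wood = contact_temperature(33.0, e_skin, 22.0, e_wood)  # ~30.2 C
```

This is why a subtle temperature change can reveal material composition: the sign and size of the heat flux depend on the object's effusivity relative to skin.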

Lynette Jones


Allison Okamura

16:25-16:50   (20 min + 5 min Q/A)

Title: Towards Proprioception and Exteroception for Soft Growing Robots.  Video

Abstract: Due to their ability to move without sliding relative to their environment, soft growing robots are attractive for exploring unknown environments and deploying distributed sensor networks in confined spaces. Sensing of the state of such robots and their environment would add to their capabilities as human-safe, adaptable manipulators. However, incorporation of sensors into soft growing robots is challenging because it requires an interface between stiff and soft materials, and the sensors need to undergo significant strain. In this work, we present two methods for adding distributed sensors to soft growing robots that use (1) bundled optical fibers for strain sensing using Optical Frequency Domain Reflectometry (OFDR) and (2) flexible printed circuit boards with self-contained units of microcontrollers and sensors encased in a laminate armor that protects them from unsafe curvatures. We demonstrate several capabilities of these sensing systems, including proprioception to measure growing robot shape and exteroception in the form of directional temperature and humidity information. This work advances the capabilities of soft growing robots, as well as the field of soft robot sensing.


16:50-17:15  (20 min + 5 min Q/A)

Title: The Visiflex tactile and wrench-sensing fingertip.  Video

Abstract: Robot manipulation of, and contact with, rigid and nearly rigid objects requires compliance at the manipulator. In this talk I will describe the Visiflex, a tactile fingertip that uses a camera and a passive six-dof flexure to achieve the desired compliance and to simultaneously sense applied wrenches (forces and moments) and contact locations. I will also describe desirable symmetry properties of the flexures. These passively-compliant sensing fingertips enable the application of a theoretical framework for planning and controlling dexterous tasks such as in-hand sliding regrasps.
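Under a small-deflection assumption, a passive flexure like the one described above maps camera-measured deflection to the applied wrench through a linear stiffness. The 6x6 stiffness values below are hypothetical, chosen only to make the idea concrete:

```python
import numpy as np

# Hypothetical diagonal stiffness for a passive 6-dof flexure:
# translational terms in N/m, rotational terms in N*m/rad.
K = np.diag([2000.0, 2000.0, 3000.0, 5.0, 5.0, 8.0])

# The camera tracks features on the flexure and reports its 6-dof
# deflection (dx, dy, dz in m; rx, ry, rz in rad); linearity then gives
# the applied wrench directly, with no separate force transducer.
deflection = np.array([0.001, 0.0, -0.002, 0.01, 0.0, 0.0])
wrench = K @ deflection   # (Fx, Fy, Fz, Mx, My, Mz)
# e.g. Fx = 2000 * 0.001 = 2.0 N
```

A real fingertip would use a full (generally non-diagonal, calibrated) stiffness matrix, but the same linear map is what lets one camera measure both contact location and the full wrench.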

Kevin Lynch


Yon Visell

17:15-17:40  (20 min + 5 min Q/A)

Title: Haptic Sensing, Perception and Soft Mechanics.  Video

 

Abstract: The sense of touch is essential for skilled manipulation and object perception.  Tactile sensing by humans and other animals is supported by biomechanical coupling in soft tissues, which transforms mechanical signals that are elicited by even localized touch contacts and distributes these signals to widespread tactile sensory neurons.  In this talk I will discuss how these processes are revising our understanding of haptic perception and how they furnish new ideas and strategies for haptic and robotic engineering.  

 

17:40-18:10

Paper presentation 

     17:40-17:50

  • An Active Extrinsic Contact Sensing for Generalizable Insertion Strategy

      Sangwoon Kim and Alberto Rodriguez     download paper.pdf

     17:50-18:00

     

  • Active Visuo-Tactile Object Pose Estimation

      Prajval Kumar Murali and Mohsen Kaboli     download paper.pdf

    18:00-18:10

  • Under Pressure: Learning to Detect Slip with Barometric Tactile Sensors

      Abhinav Grover, Christopher Grebe, Philippe Nadeau and Jonathan Kelly.      download paper.pdf

   

 18:20 -19:20

Panel Discussion

Best Paper and Presentation Award

Keynote Speakers

(Confirmed)

Lynette Jones, Massachusetts Institute of Technology (MIT), USA
Allison Okamura, Stanford University, USA
Katherine Kuchenbecker, Max Planck Institute for Intelligent Systems, Germany
Robert Howe, Harvard University, USA
Antonio Bicchi, University of Pisa and Italian Institute of Technology, Italy
Kevin Lynch, Northwestern University, USA
Peter Allen, Columbia University, USA
Vincent Hayward, Sorbonne University, France
Henrik Jörntell, Lund University, Sweden
Chris Atkeson, Carnegie Mellon University (CMU), USA
Yon Visell, University of California, Santa Barbara, USA
Domenico Prattichizzo, University of Siena, Italy
Tansu Celikel, Radboud University, Netherlands
Carmel Majidi, Carnegie Mellon University (CMU), USA
Giorgio Cannata, University of Genova, Italy
Benjamin Tee, National University of Singapore (NUS), Singapore
Firat Guder, Imperial College London, UK
Salvatore Pirozzi, University of Campania Luigi Vanvitelli, Italy
Mohsen Kaboli, BMW Group, Germany and Radboud University, Netherlands

Organizers

Mohsen Kaboli *, BMW Group, Germany and Radboud University, Netherlands
Tapo Bhattacharjee, Cornell University, USA
Etienne Burdet, Imperial College London, UK
Vincent Hayward, Sorbonne University, France
Henrik Jörntell, Lund University, Sweden

Accepted Papers

  • An Active Extrinsic Contact Sensing for Generalizable Insertion Strategy

       Sangwoon Kim and Alberto Rodriguez     download paper.pdf

  • A Local Filtering Technique for Robot Skin Systems

       Alessandro Albini, Giorgio Cannata and Perla Maiolino         download paper.pdf

  • Active Tapping via Gaussian Process for Efficient Unknown Object Surface Reconstruction

      Su Sun and Byung-Cheol Min.   download paper.pdf

  • Sensor Fusion and Multimodal Learning for Robotic Grasp Verification

       Priteshkumar Gohil, Santosh Thoduka and Paul Plöger.    download paper.pdf

  • Towards a soft robotic, haptic feedback seat for autonomy level transitions in highly-automated vehicles

       Jan Peters, Bani Anvari, Annika Raatz and Helge A. Wurdemann.       download paper.pdf

  • Low-pass filter effects in biological neurons as a feature to facilitate representation of tactile information

      Udaya B. Rongala and Henrik Jörntell.         download paper.pdf

  • Robust Grasping under Uncertainty Employing Tactile Sensors

      Luca Lach, Robert Haschke, Francesco Ferro, Helge Ritter      download paper.pdf

  • Under Pressure: Learning to Detect Slip with Barometric Tactile Sensors

      Abhinav Grover, Christopher Grebe, Philippe Nadeau and Jonathan Kelly.      download paper.pdf

   Topics of interest (including but not limited to)

 

   Human Sense of Touch  

  • Touch physiology from skin to brain

  • Haptic Perception 

  • Action and Perception Loop

  • Perception for Learning

 

   Tactile Sensing Technologies

  • Conformable and compliant materials

  • Features enabled by conformable sensors

  • Biomimetics

  • Sensor effects

  • Integration and read-out strategies

  • Self-healing properties and strategies

  • Sensor skins: design, fabrication and integration strategies

  • Integration strategies for sensors in robotics

  • Enabling technologies for fully integrated robotic systems

   

    Tactile Interactive Perception and Predictive Coding

  • Exploitation of contact constraints

  • Novel contact models

  • Object perception for the exploitation of contact

  • Tactile information processing

  • Tactile feature extraction / feature learning 

  • Tactile-based object modelling 

  • Tactile object localization

  • Tactile shape reconstruction and recognition

  • Tactile object classification

  • Tactile exploration

  • Trends in combining of vision and touch sensing

  • Roles of vision and touch sensing in different object perception tasks

  • Modelling and representation of sensing modalities

  • Integration of visio-tactile sensing modalities


 

    Tactile Grasp, Manipulation, & HRI (Soft and conventional Robotic Systems)

  • Linear/rotational slip detection 

  • Grasping planning

  • Grasp stability assessment

  • Soft manipulation

  • In-hand/whole body manipulation

  • Tactile planning interplay between touch sensing and vision

  • Tactile knowledge/skill transfer  

  • Tactile transfer learning

  • The meaning and function of different sensing modalities in object manipulation

  • Sensing and planning in object manipulation

  • Multi-robot manipulation and coordination

  • Control strategy for object manipulation and collaborative assembly

  • Learning object manipulation skills from human demonstration

  • Novel approaches to grasp and manipulation planning

  • Whole-body, multi-contact planning and control

  • Design and characterisations of contact-exploiting, compliant hands

Intended audience 

 

In our proposed workshop we will discuss the most recent approaches in the area of tactile perception and learning, especially in soft robotic systems. This is a topic that is not widely represented in the general IROS-2021 conference. The goal of this workshop is to disseminate the results and benefits of novel approaches to a wider audience looking for emerging new technologies. We intend to invite well-known experts in these areas to attract a larger set of IROS-2021 attendees and simultaneously provide a platform for more junior researchers. This workshop is intended for roboticists working in the areas of tactile sensation, perception, manipulation, and learning in the field of robotics. It is especially aimed at roboticists interested in improving the reliability and autonomy of robotic systems. We hope to bring together outstanding senior and young researchers as well as graduate students to discuss current trends, problems, and opportunities in tactile perception and learning in robotics.

Sponsored by

European Union
BMW Group
MDPI Robotics
Radboud University
NeurotechEU
Donders Institute