
Program

Local Time in Detroit, USA (GMT-4)
October 1, 2023

 

08:30-09:00 Welcome and opening remarks

Talks

09:00-09:25 (20 min + 5 min Q&A)

Title: Embedding optics-based force and stiffness sensors

Abstract: With an ever-growing interest in creating intelligent manipulation devices capable of expertly interacting with their environments comes the necessity to create new sensing technologies that can accurately estimate relevant interaction parameters. Researchers face the challenge of creating sensor devices that satisfy the requirements for integration with the increasingly sophisticated robot hands and grippers being developed and, at the same time, allow a truthful tactile perception of the objects being handled. Our research focuses on creating miniature sensors suitable for embedding in the fingers and palms of robot hands to measure interaction forces and tactile information, as well as the stiffness of handled objects.

09:25-09:50 (20 min + 5 min Q&A)

Title: Tactile dexterity - Control of object pose and contact with tactile feedback

Abstract: 

09:50-10:15 (20 min + 5 min Q&A)

Title: What we’ve been up to with tactile sensing lately: from hardware to applications

Abstract: This talk reports on the last six months of work in the area of tactile sensing for manipulation from our lab. From a sensor design perspective, we’ve been evaluating vibration transducers (microphones), piezoelectric materials, and off-the-shelf capacitive transducers, looking to decide which of these to include in our next generation tactile fingers. From a low-level signal processing perspective, we’ve been experimenting with Transformer learning architectures that infer contact properties from a short history of raw readings, as opposed to using only individual snapshots. Finally, from a high level application perspective, we recently demonstrated dexterous in-hand manipulation and, in a separate project, object recognition, all based on tactile sensing; we’re currently working on tactile-based object retrieval from granular media. We will discuss all these aspects, as well as the interplay between them.

 

10:15 - 10:30 Break

Student Pitch - I

     10:30 - 10:35

  • Active Acoustic Sensing – Augmenting and Complementing Visuo-Tactile Sensing for Robot Manipulation - S. Lu and H. Culbertson

     10:35 - 10:40

  • GelSight Svelte Hand: A Three-finger, Two-DoF, Tactile-rich, Low-cost Robot Hand for Dexterous Manipulation - J. Zhao and E. Adelson

     10:40 - 10:45

  • Robotic Assembly Using a Tactile Sensor and a Soft Wrist - J. R-Miquel, M. Hamaya, C. C. B. Hernandez, K. Tanaka

     10:45 - 10:50

  • VisTac: Towards a Unified Multi-Modal Sensing Finger for Robotic Manipulation - S. Athar, G. Patel, Z. Xu, Q. Qiu, Y. She

     10:50 - 10:55

  • Estimation of Extrinsic Contact Patch for Stable Placement - K. Ota, D. K. Jha, K. M. Jatavallabhula, A. Kanezaki, J. B. Tenenbaum

     10:55 - 11:00

  • General In-Hand Object Rotation with Vision and Touch - H. Qi, B. Yi, S. Suresh, M. Lambeta, Y. Ma, R. Calandra, J. Malik

     11:00 - 11:05

  • Seamless Integration of Tactile Sensors for Cobots - R. Proesmans and F. Wyffels

     11:05 - 11:10

  • Avocado Firmness Assessment Using Vision-Based Tactile Sensing: A Non-Destructive Approach - M. M. Mohsan, B. B. Hasanen, T. Hassan, M. U. Din, N. Werghi, L. Seneviratne and I. Hussain

11:10-11:35 (20 min + 5 min Q&A)

Title: Deciphering Physical Cues and Dimensions that Underlie Our Tactile Sense of Compliance

Abstract: Mobile phones, tablets, and watches have become a normal part of everyday life. The next generation of devices will enable touch feedback, become flexible, and extend increasingly rich and immersive interactions into virtual reality. Remote touch interactions will convey an object’s compliance, or softness, as with fruits such as plums and tissues such as skin. In addition to applications in entertainment and personal productivity, such interactions will enable surgeons to distinguish gallbladder and prostate tissue and ducts from fat and bone, small children to feel a parent’s hand, and consumers to inspect and compare products, clothing, and work pieces. While promising new possibilities, the displays under development do not yet feel natural. They also face severe limitations in terms of weight, power, and actuation range. To inform devices that replicate naturalistic interactions of this sort, we have been working to define how the finger pad must deform in space and time to adequately convey a natural sense of compliance, in conjunction with volitional control strategies. In general, our sense of ‘softness’ or compliance is thought to be encoded by relationships of force, displacement, and contact area at the finger pad. I will present a series of studies that seek to define how time-dependent cues, that is, information in the rate of change of the skin over a spatial field, govern the encoding of compliance.

11:35-12:00 (20 min + 5 min Q&A)

Title: Soft Robotic Architectures and Sensing Skins for Manipulation

Abstract: Robot end effectors capable of dexterous manipulation typically rely on a bank of motors for high degree-of-freedom actuation and an external camera for vision-based feedback. While adequate for many robot tasks, these systems are often too bulky for use with lightweight, mobile, and wearable robotic platforms. Progress in the use of robotic manipulation for these applications depends on new designs that allow for high-DOF articulation and expanded sensing capabilities with limited dependency on bulky supporting hardware. In this talk, I will discuss efforts to create lightweight, compact manipulators using emerging paradigms in soft robotics and soft-matter electronics. In particular, I will focus on new classes of sensing skins that utilize a variety of material architectures, from highly stretchable liquid metal circuits to soft magnetized elastomers. When combined with methods in machine learning and soft robot actuation, these skins can enable soft robot grippers to perform a variety of closed-loop grasping tasks that were not previously possible with open-loop techniques. Moreover, soft electronics and actuators can be combined with synthetic biology to create biohybrid soft robot grippers capable of unique chemical sensing and response functionalities. Applications range from in-hand manipulation and “universal” pick-and-place grasping capabilities to augmentation of the NASA Robonaut 2 for space exploration and teleoperation.

 

12:00 - 12:30 Panel Discussion I

12:30 - 13:30 Lunch Break

13:30 - 13:55 (20 min + 5 min Q&A)

Title: Building tactile sensors for robotics – a commercial perspective

Abstract: There has been a lot of media coverage surrounding “general purpose robots” that can help to address labour shortages. This means robots that replicate what people can do with their hands – their dexterity. There is overwhelming evidence and general agreement from roboticists that tactile sensing is crucial for achieving dexterity in robots. A shift in design thinking is required to ensure that these robots are designed with sensing as a core requirement, rather than as an afterthought. However, there are practical impediments to the uptake of existing tactile sensing technologies in robotics: issues of usefulness, reliability and robustness, and affordability. Of course, none of these can be considered in isolation, and at the center, one must always be thinking of the objective: dexterity.

13:55 - 14:20 (20 min + 5 min Q&A)

Title: Dexterous and Forceful Tool-Use with High-resolution and Highly Compliant Tactile Sensors

Abstract: Dexterous tool manipulation is a dance between tool motion and force transmission choreographed by the robot's end-effector. Take, for example, the use of a spatula. How should the robot reason jointly over the tool’s geometry and the forces imparted to the environment through tactile feedback? In this talk, I will present our recent progress on tactile control for tool-use with high-resolution and highly deformable tactile sensors. We will discuss how tactile sensor compliance is both a blessing and a curse, and how it can not only be accounted for but even exploited for contact-rich manipulation with grasped objects. We will show the application of our techniques for robust tool-use behavior (e.g., in-hand pivoting, drawing, and dense insertion) in the presence of complex dynamics induced by the sensor's mechanical substrate. We’ll conclude the talk by discussing future directions for dexterous tool-use.

14:20 - 14:45 (20 min + 5 min Q&A)

Title: Tactile Robot Dexterity

Abstract: In this talk, I summarize some goals for robot dexterity and the critical dependence on an artificial sense of touch. I cover recent progress on 3D-printed high-resolution tactile sensing based on the human sense of touch, the integration into 3D-printed robot hands, and the control of tactile robots for tasks such as dexterous object following, tracking, and pushing. Methods include tactile servo control with ConvNet models to predict sensor-object pose/shear from high-resolution tactile images, and also sim-to-real deep reinforcement learning for learning policies with zero-shot transfer to real environments. I conclude with some comments about how improved accessibility for tactile robots is needed for widespread adoption of robot dexterity.

 

Student Pitch - II

     14:45 - 14:50

  • Simultaneous Tactile Estimation and Control for Object Manipulation - S. Kim, A Bronars, P. Patre, A. Rodriguez

     14:50 - 14:55

  • Visuo-Tactile based Active Object Parameter Inference - a Bayesian approach - A. Dutta, E. Burdet, M. Kaboli

     14:55 - 15:00

  • Toward Dexterous Robot Manipulation of Deformable Linear Objects: Estimating Cable Pose during a Fingertip Grasp using Tactile Sensors - K. Mathenia, T. Armstrong, N. T. Fitter, J. R. Davidson


     15:00 - 15:05

  • Sim2Real Learning of Vision-based Tactile Sensing at Large-scale - Q. K. Luu and V. A. Ho

     15:05 - 15:10

  • AcousTac: Tactile sensing with acoustic resonance for electronics-free soft surfaces - M. S. Li and H. S. Stuart

     15:10 - 15:15

  • Placing by Touching: An empirical study on the importance of tactile sensing for precise object placing - L. Lach, N. Funk, R. Haschke, H. J. Ritter, J. Peters, G. Chalvatzaki

     15:15 - 15:20

  • Visuo-Tactile-based Articulated Object Tracking with Manifold Filters - P. Murali, B. Porr, M. Kaboli


     15:20 - 15:25

  • Model-based Tactile Regrasping with the Smart Suction Cup - J. Lee, S. D. Lee, T. M. Huh, H. S. Stuart

15:30 - 15:45 Coffee Break

15:45 - 16:10 (20 min + 5 min Q&A)

Title: Object pose estimation with multiple vision-based tactile sensors

Abstract: Object pose estimation is a fundamental ability for robots to interact with objects. This problem is conventionally solved using vision; however, tactile feedback can provide useful cues when vision is unavailable or disturbed by occlusions. In this talk, I will provide an overview of our work on using tactile feedback to estimate the pose of objects. I will focus on recent work in which we used multiple vision-based tactile sensors (i.e., DIGITs) to solve this task, filtering contact hypotheses using geometric constraints imposed by the relative position of the sensors and local features extracted from a CNN trained in simulation. In the last part of the talk, I will show preliminary work toward reducing the sim-to-real gap, with experimental validation in simulation and with real objects.

16:10 - 16:35 (20 min + 5 min Q&A)

Title: Approaches to Rendering Tactile Information in Robotic Prostheses

Abstract: Grasping an object is one of the most common and complex actions performed by humans. The human brain can adapt and update the grasp dynamics through information received from sensory feedback. Prosthetic hands can assist with the mechanical performance of grasping; however, currently available commercial prostheses do not address the disruption of the sensory feedback loop. Providing feedback about a prosthetic hand’s grasp force magnitude is a top priority for those with limb loss. This talk presents approaches to rendering tactile information for robotic prostheses.

16:35 - 17:00 (20 min + 5 min Q&A)

Title: Multimodal Sensing at Different Resolutions for Robust Interactions

Abstract: Different sensor modalities provide complementary information about interactions with the environment: haptic and tactile feedback often provide more detailed local information, while vision provides more global context. In this talk, I will present recent work on how to efficiently train neural network policies to combine sensory feedback at different spatial and temporal resolutions. I will also discuss how tactile sensing can be used to learn more general concepts about food through interactions.

17:00 - 17:30 Panel Discussion II (20 minutes)
Best Paper Award & Concluding Remarks

17:30 - 21:00 Mingling & Workshop Dinner
