2 Workshops at CHI’26
We are pleased to share that Prof. Dr. Jürgen Steimle and Marie Muehlhaus from our team are involved in organizing the following workshops at CHI’26:
AI for Haptics and Haptics for AI: Challenges and Opportunities
Organizers:
- Easa AliAbbasi, Sensorimotor Interaction Group, Max Planck Institute for Informatics, Saarland Informatics Campus, Saarbrücken, Germany
- Dennis Wittchen, Sensorimotor Interaction, Max Planck Institute for Informatics, Saarland Informatics Campus, Saarbrücken, Germany; Faculty of Informatics / Mathematics, Dresden University of Applied Sciences, Dresden, Saxony, Germany
- Yinan Li, School of Computing and Augmented Intelligence, Arizona State University, Tempe, Arizona, United States
- Shihan Lu, Northwestern University, Evanston, Illinois, United States
- Thomas Müller, TU Dresden, Dresden, Saxony, Germany
- Donald Degraen, HIT Lab NZ, University of Canterbury, Christchurch, New Zealand
- Thomas Leimkühler, Max Planck Institute for Informatics, Saarbrücken, Germany
- Sang Ho Yoon, Graduate School of Culture Technology, KAIST, Daejeon, Republic of Korea
- Hasti Seifi, Arizona State University, Tempe, Arizona, United States
- Oliver Schneider, University of Waterloo, Waterloo, Ontario, Canada
- Heather Culbertson, Computer Science, University of Southern California, Los Angeles, California, United States
- Jürgen Steimle, Saarland University, Saarland Informatics Campus, Saarbrücken, Germany
- Paul Strohmeier, Sensorimotor Interaction, Max Planck Institute for Informatics, Saarland Informatics Campus, Saarbrücken, Germany
Description: AI has transformed methods and knowledge across many domains, yet the intersection of AI and haptics remains underexplored. Modern AI techniques, fueled by machine learning and methods such as generative modeling and reinforcement learning, offer powerful opportunities for advancing haptic design. Conversely, insights from haptics research, such as perception modeling and adaptive interaction, grounded in human touch, embodiment, and multisensory integration, can play a critical role in shaping more human-centered AI systems. This workshop will bring together an interdisciplinary community of researchers from HCI, haptics, AI, robotics, and design to (1) identify pressing questions in haptics that could benefit from AI approaches and (2) highlight ways in which haptic knowledge can support the development of embodied and context-aware AI. Through position papers and paper presentations, we will map key challenges, exchange methods, and explore new research directions that connect the two fields. By framing haptics and AI as mutually reinforcing, the workshop aims to build a shared research agenda and foster collaborations that advance both the science of touch and the design of intelligent interactive systems.
More information and how to participate: https://derikon.github.io/HapticsAI_Workshop/
_____
Augmented Body Parts: Bridging VR Embodiment and Wearable Robotics
Organizers:
- HyeonBeom Yi, Electronics and Telecommunications Research Institute, Daejeon, Republic of Korea
- Myung Jin (MJ) Kim, Electronics and Telecommunications Research Institute, Daejeon, Republic of Korea, mj@etri.re.kr
- Seungwoo Je, Southern University of Science and Technology, Shenzhen, China, seungwoo@sustech.edu.cn
- Seungjae Oh, Department of Software Convergence, Kyung Hee University, Yongin, Republic of Korea, oreo329@khu.ac.kr
- Shuto Takashita, Information Somatics Lab, The University of Tokyo, Tokyo, Japan
- Hongyu Zhou, School of Computer Science, The University of Sydney, Sydney, NSW, Australia
- Marie Muehlhaus, Saarland University, Saarland Informatics Campus, Saarbrücken, Germany
- Eyal Ofek, School of Computer Science, University of Birmingham, Birmingham, United Kingdom, e.ofek@bham.ac.uk
- Andrea Bianchi, KAIST, Daejeon, Republic of Korea, andrea@kaist.ac.kr
Description: Recent work across HCI/HRI and wearable robotics has investigated how people control and perceive extra body parts in both virtual and physical settings. Virtual embodiment in XR has shown that users can experience ownership and agency with non-anthropomorphic avatars, while wearable robotics has introduced supernumerary limbs such as third arms and robotic tails. Despite these shared goals, connections between findings remain limited because VR and hardware studies rely on different assumptions about sensory feedback, human perception, and physical constraints, making insights difficult to transfer across contexts. This workshop brings together researchers in XR, wearable robotics, haptics, and neuroscience to explore how to foster embodiment and adaptation with augmented body parts, and how to bridge virtual embodiment to effective use with wearable devices. Through a keynote, brief position shares, and two hands-on group activities, participants will examine control mappings and sensory-feedback strategies and identify which aspects of VR-based embodiment realistically transfer when accounting for hardware limits, sensor variability, and cognitive load. Ultimately, the workshop aims to articulate a focused research agenda connecting VR-based insights to feasible wearable robotics implementations, enabling future work on augmenting the human body with new parts and capabilities.
More information and how to participate: https://sites.google.com/view/augmented-body-parts
