Our research mission is to contribute to a next generation of user interfaces that seamlessly merge with the physical world and the human body. These interfaces create more effective, expressive, and engaging interactions with interactive systems and devices. Moreover, they are compatible with challenging mobility contexts and integrate well with real-world activities. This opens up a wide range of applications in various fields, including mobile and wearable computing, robotics, smart homes, and health and fitness devices.
We develop user interface technologies for advanced sensing and displays, invent new concepts for interaction, and empirically study user behavior.
Our focus areas include:
- Soft, flexible and stretchable user interfaces
- Touch and multi-touch interactions
- Body-based user interfaces
- Wearable devices, interactive skin and electronic textiles
- Mobile computing
- Augmented Reality
- Digital fabrication and rapid prototyping
- New materials for interaction
We are developing the foundations for a new generation of user interfaces that seamlessly blend with human skin. Interactive skin is ultra-thin, stretchable, and has a customized geometry. It contains multimodal sensors to capture user input and context. Embedded visual displays and advanced haptic components convey rich and expressive output. This enables its use as highly conformal interactive patches on various body sites, for interaction in challenging mobility tasks, for new types of tactile augmented reality, and for ergonomic physiological interfaces.
Next Steps in Epidermal Computing
Skin is a promising interaction medium and has been widely explored for mobile and expressive interaction. Recent research in HCI has seen the development of Epidermal Computing Devices: ultra-thin, non-invasive devices that reside on the user’s skin. They offer intimate integration with the curved surfaces of the body while having physical and mechanical properties akin to skin, expanding the horizon of on-body interaction.
Computational Design And Optimization Of Physiological Sensors
Our computational design tool enables users to quickly design custom physiological sensing devices with a few clicks. This offers a direct, fast, and user-friendly way of setting body dimensions, selecting the modalities the sensor will be able to capture (EMG, ECG, and/or EDA), and selecting specific muscles for EMG sensing. The sensing quality of one or multiple modalities can be easily prioritized by moving a slider. Similarly, the priority of a compact sensor vs. the highest possible sensing quality can be continuously adjusted.
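The slider-based trade-off described above can be modeled as a weighted-sum objective. The following is a minimal illustrative sketch, not the tool's actual algorithm; the candidate designs, their quality and area values, and the function names are hypothetical:

```python
# Hypothetical sketch: a slider weight w in [0, 1] trades off sensing
# quality against sensor compactness via a weighted-sum score.

def score(candidate, w):
    """Higher is better: w favors quality, (1 - w) favors compactness."""
    quality = candidate["quality"]         # normalized 0..1
    compactness = 1.0 - candidate["area"]  # normalized footprint, 0..1
    return w * quality + (1.0 - w) * compactness

def best_design(candidates, w):
    """Pick the candidate design that maximizes the weighted score."""
    return max(candidates, key=lambda c: score(c, w))

# Two made-up candidate designs for illustration.
candidates = [
    {"name": "large EMG+ECG patch", "quality": 0.95, "area": 0.80},
    {"name": "compact EMG patch",   "quality": 0.70, "area": 0.25},
]

print(best_design(candidates, w=0.9)["name"])  # quality-first: large patch
print(best_design(candidates, w=0.2)["name"])  # compactness-first: compact patch
```

Moving the slider corresponds to changing `w`; the optimizer then re-ranks candidate designs under the new priority.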
BodyStylus presents the first computer-assisted approach for on-body design and fabrication of epidermal interfaces. BodyStylus proposes a hand-held tool that augments freehand inking with digital support.
Like a Second Skin
Our work presents psycho-physical findings about how epidermal devices of various elasticity and thickness affect human tactile perception. Findings can guide the design and material selection of future epidermal devices.
Tacttoo is a feel-through interface for electro-tactile output on the user’s skin. At less than 35μm in thickness, it is the thinnest tactile interface for wearable computing to date.
SkinMarks are thin and conformal skin electronics for on-body interaction. They enable precisely localized input and visual output on strongly curved and elastic body landmarks.
More Than Touch
An elicitation study on how people interact on skin for controlling mobile devices. Investigates skin-specific input modalities, gestures and their associated mental models, and preferred input locations.
We contribute new methods for designing and fabricating sensors, displays, and haptic devices that are soft, flexible, or stretchable. This comprises concepts, models, and algorithms for digital design tools that automatically generate low-level fabricable designs based on the designer’s high-level specification. Moreover, we develop accessible rapid manufacturing techniques to fabricate functional devices with custom geometry and material properties. These build on digital printing with new materials, printable electronics, and laser manufacturing.
A DIY approach for composing soft interactive devices from bio-based and biodegradable materials.
Multi-Touch Kit enables electronics novices to rapidly prototype customized capacitive multi-touch sensors with a commodity microcontroller and open-source software; it does not require any specialized hardware.
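Capacitive multi-touch matrices of this kind are typically read by scanning a grid of electrodes and flagging cells whose reading deviates from an untouched baseline. The sketch below illustrates that generic scanning principle in simulation; it is not Multi-Touch Kit's actual firmware, and all names and values are assumptions:

```python
# Generic illustration of matrix scanning for capacitive multi-touch:
# read every (row, column) cell and report cells whose capacitance
# reading dropped below the untouched baseline by more than a threshold.

BASELINE = 100.0   # untouched reading (arbitrary units)
THRESHOLD = 20.0   # drop considered a touch

def scan(read_cell, rows, cols):
    """Return (row, col) cells registering a touch."""
    touches = []
    for r in range(rows):
        for c in range(cols):
            if BASELINE - read_cell(r, c) > THRESHOLD:
                touches.append((r, c))
    return touches

# Simulated sensor: a finger at (1, 2) lowers the local reading.
def fake_read(r, c):
    return 60.0 if (r, c) == (1, 2) else 100.0

print(scan(fake_read, rows=4, cols=4))  # [(1, 2)]
```

On real hardware, `read_cell` would be replaced by driving a transmit line and measuring the receive line with the microcontroller's ADC or touch peripheral.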
Multi-Touch Skin are thin and flexible multi-touch sensors for on-skin input. They enable high-resolution multi-touch input on the body and can be customized in size and shape to fit various locations on the body.
Foldio is a new design and fabrication approach for custom interactive objects. The user defines a 3D model and assigns interactive controls; a fold layout containing printable electronics is auto-generated.
A Cuttable Multi-touch Sensor
In this project, we propose cutting as a novel paradigm for ad-hoc customization of printed electronic components. We contribute a printed capacitive multi-touch sensor, which can be cut by the end-user to modify its size and shape.
We contribute design tools and fabrication approaches for realizing interactive 3D objects. Moving beyond conventional 3D printing, these objects contain functional electronic components that are printed using multiple functional materials — in many cases even in a single pass along with the object itself. This realizes interactive objects of organic geometries that can sense touch and deformation, provide visual and haptic output, or change their shape.
We present a novel digital fabrication approach for printing custom, high-resolution controls for electro-tactile output with integrated touch sensing on interactive objects. We call these controls Tactlets.
HotFlex leverages printed embedded elements, capable of computer-controlled state change, to enable hands-on remodeling, personalization, and customization of a 3D-printed object after it is printed.
We investigate electronic textiles as a platform for wearable human-computer interaction. We develop interaction techniques that leverage intrinsic textile properties. Our work further contributes textile sensors and novel approaches for the rapid design and fabrication of customized e-textiles.
ClothTiles presents a prototyping platform for fabricating actuated clothing. ClothTiles leverages flexible 3D printing and Shape-Memory Alloys (SMAs) alongside new parametric actuation designs.
Rapid Iron-On User Interfaces
Rapid Iron-On User Interfaces support rapid fabrication in a sketching-like fashion, through a handheld dispenser tool for directly applying continuous functional tapes of a desired length.
We contribute to the field of haptics with lightweight and ergonomic haptic interfaces and with interfaces that extend the expressive range of haptic interaction. To this end, we develop novel haptic hardware using new materials and digital fabrication, work on haptic rendering, and investigate haptic perception in empirical studies with users.
At UIST’21 we present a paper on rapid prototyping of vibrotactile feedback in virtual reality by leveraging the user’s voice.
We introduce Springlets, expressive, non-vibrating mechanotactile interfaces on the skin. Embedding shape-memory alloy springs, we implement Springlets as thin and flexible stickers to be worn on various body locations.
We present a first systematic evaluation of the effects of compliance on force input. Results of a visual targeting task for three levels of softness indicate that high force levels appear more demanding for soft surfaces.
Leveraging the high dexterity of the human hands and fingers, we develop microgestures that allow users to control interactive systems in mobile or hands-busy settings, in a direct, rapid and subtle manner.
A computational design approach that recognizes a desired set of freehand and/or grasping microgestures with minimal hand instrumentation. We also performed a series of analyses, including an evaluation of the entire combinatorial space of 393K sensor layouts, and quantified performance across different layout choices.
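Evaluating a combinatorial space of sensor layouts amounts to enumerating candidate sensor placements and scoring each layout for a given sensor budget. The sketch below shows that idea in miniature; the positions, the scoring function, and its numbers are hypothetical stand-ins, not the paper's actual data or method:

```python
# Illustrative exhaustive layout search: enumerate all subsets of candidate
# sensor positions of a given size and keep the layout with the best
# (stand-in) recognition score.

from itertools import combinations

positions = ["thumb", "index", "middle", "palm", "wrist"]

def recognition_score(layout):
    # Stand-in for evaluating a microgesture recognizer on this layout;
    # the per-position gains below are made up for illustration.
    gain = {"thumb": 0.4, "index": 0.35, "middle": 0.2, "palm": 0.15, "wrist": 0.1}
    return min(1.0, sum(gain[p] for p in layout))

def best_layout(budget):
    """Exhaustively search all layouts with `budget` sensors."""
    return max(combinations(positions, budget), key=recognition_score)

print(best_layout(budget=2))  # ('thumb', 'index')
```

With 393K layouts, such a brute-force pass remains tractable on commodity hardware, which is what makes evaluating the entire space feasible.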
Our work informs gestural interaction with computer systems while hands are busy holding everyday objects. We present results from the first empirical analysis of over 2,400 microgestures performed by end-users.
We investigate new types of displays that are deformable or even actively change shape. We contribute new interactive technologies for flexible and shape-changing displays using visual computing and soft robotics, and we develop novel tangible interaction techniques that leverage the geometry of the display.
Eyecam is an anthropomorphic webcam mimicking a human eye. Through it, we challenge conventional relationships with ubiquitous sensing devices and call to rethink how sensing devices might appear and behave.
Future generations of displays will be thin, lightweight, and flexible. In this project, we develop novel interaction techniques for displays that can be dynamically expanded and collapsed.
Collaborative Use of Rollable Displays
Rollable displays allow for creating tabletop-sized displays on the go. In this project, we investigate their collaborative use in the context of mobile face-to-face encounters.
Established practices of interacting with physical media have proven to be very powerful, as they make intelligent use of space and draw on expressive tangible manipulation. Many of these advantages are lost in current graphical user interfaces. Our goal is to develop novel embodied interfaces for digital media that seamlessly combine the power of physical interaction with the benefits of computer technology. We do so by leveraging new types of physical and spatial displays and introducing novel interaction techniques, both for individual and collaborative work.
CloudDrops is a pervasive awareness platform that integrates virtual information from the Web more closely with the contextually rich physical spaces in which we live and work.
In this project, we investigate how the unique affordances of pico projectors can be leveraged for novel, tangible interaction techniques with physical, real-world objects.
CoScribe is a collaborative platform for knowledge workers, which tightly integrates paper-based and digital documents. It offers novel interaction techniques for cross-media annotations, hyperlinks, and tags on both types of documents.
Our research projects receive funding from: European Research Council (ERC Starting Grant “InteractiveSkin”), Deutsche Forschungsgemeinschaft (DFG), Bosch Research, Google, and the State of Saarland.
Human-Computer Interaction Lab
Department of Computer Science
Campus E 1.7