Periodic Reporting for period 2 - TOUCHLESS (Touchless Haptic Experiences with Neurocognitive AI)
Reporting period: 2022-01-01 to 2023-06-30
More specifically, the work performed during the first 30 months of TOUCHLESS in the technical WPs is summarised below:
- WP1: Published instructions on how to build an ultrasonic phased array for haptic stimulation, with extensive work on the simulation, electronic design, and software, as well as on video editing for the step-by-step instructions. Designed, built and tested a large ultrasonic array capable of generating sensations on the back of the hand and the forearm. Introduced advanced simulation techniques for acoustic holography called PhantomFields. On electrostatics, the PiloNape paper provides a solid foundation for the hardware and the psychophysics of the sensation on different body parts; an initial attempt at affective modulation was made, and a more thorough study will be conducted with the help of the other project partners. Our Sparkitect demo (directing high-voltage sparks with ultrasound) may result in the first system capable of mid-air fine perception (e.g. Braille). On thermal and combined haptic modalities, we have developed an experience in which the technologies are used separately from each other but in combination with VR. Next, we are investigating a “Portal” experience that combines all the developed haptic technologies.
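The core principle behind an ultrasonic phased array of the kind described above is well known: each transducer is driven with a phase offset proportional to its distance from the desired focal point, so that all wavefronts arrive in phase and constructively interfere there. The following minimal sketch illustrates that calculation; the element layout, 40 kHz carrier, and function names are illustrative assumptions, not the project's published design.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C
CARRIER_FREQ = 40e3     # Hz, a typical carrier for ultrasonic haptic arrays

def focus_phases(transducer_positions, focal_point):
    """Per-transducer phase offsets (radians) so that waves emitted by
    every element arrive in phase at the focal point."""
    wavelength = SPEED_OF_SOUND / CARRIER_FREQ
    fx, fy, fz = focal_point
    phases = []
    for (x, y, z) in transducer_positions:
        dist = math.sqrt((x - fx) ** 2 + (y - fy) ** 2 + (z - fz) ** 2)
        # Advance each element's phase by the travel distance, modulo one wavelength.
        phases.append((2 * math.pi * dist / wavelength) % (2 * math.pi))
    return phases

# Example: four elements of a small flat array, focused 10 cm above its centre.
array = [(-0.01, -0.01, 0.0), (0.01, -0.01, 0.0),
         (-0.01, 0.01, 0.0), (0.01, 0.01, 0.0)]
phases = focus_phases(array, (0.0, 0.0, 0.1))
```

By symmetry, the four corner elements in this example receive identical phases; moving the focal point off-centre breaks that symmetry and steers the focus, which is how such arrays render sensations at different points on the hand.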
- WP2: Mid-air haptic stimulation was used to quantify the basic spatial and temporal capacity for tactile perception, providing quantitative information about how mid-air haptics can deliver functional touch experiences. The ability to stimulate different touch receptors, and their associated afferents, with different frequencies of ultrasonic vibration was confirmed, as were the low-level responses of the different receptor channels to mid-air haptic stimuli.
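In mid-air haptics, the inaudible ultrasonic carrier itself is too fast for the skin to follow; receptor-specific stimulation is achieved by amplitude-modulating the carrier with a low-frequency envelope that the mechanoreceptors can track. The sketch below illustrates the idea; the specific frequencies (a 40 kHz carrier, a 200 Hz envelope often associated with Pacinian-type afferents) are illustrative assumptions, not the values measured in the project.

```python
import math

def am_pressure_sample(t, carrier_hz=40e3, mod_hz=200.0, depth=1.0):
    """One sample of an amplitude-modulated ultrasound drive signal.
    The carrier delivers the acoustic radiation force; the low-frequency
    envelope is what the skin's receptor channels actually perceive.
    Sweeping mod_hz targets different receptor populations."""
    envelope = 0.5 * depth * (1.0 + math.sin(2 * math.pi * mod_hz * t))
    return envelope * math.sin(2 * math.pi * carrier_hz * t)

# Generate 1 ms of the drive signal at a 1 MHz sampling rate.
fs = 1_000_000
signal = [am_pressure_sample(n / fs) for n in range(1000)]
```

Varying `mod_hz` while holding the carrier fixed is the standard way to probe which frequency each receptor channel responds to, which is the kind of low-level characterisation this work package reports.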
- WP3: Developed an AI Operational Infrastructure for efficient collaboration and research code exchange between the consortium participants. We then developed an AI-driven haptic modulation engine built on an ambient sensing and inference platform that integrates touchless sensors, including visual and audio sensors, to continuously gather human-monitoring data and enable comprehensive understanding of the environment. Moreover, leveraging user research and neurocognitive models, we built a Cognitive-AI layer that quantifies and predicts user experiences in response to haptic stimuli and contextual factors, offering insights into individual interactions and engagement. In one implementation of this engine, we created a demonstrator of AI-based modulation for biosignal sharing.
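At a high level, an engine of this kind is a sense→infer→modulate loop: ambient sensors feed an inference stage that estimates user state, and that state drives the haptic output. The sketch below shows the shape of such a loop only; every name, the `Context` fields, and the toy modulation heuristic are hypothetical placeholders, not the project's actual models or API.

```python
from dataclasses import dataclass

@dataclass
class Context:
    """Hypothetical inferred user state, each value normalised to 0..1."""
    arousal: float
    attention: float

def infer_context(visual_frame, audio_frame):
    """Placeholder for the inference stage: in a real system this would run
    learned models on camera and microphone input; here it returns fixed
    illustrative values."""
    return Context(arousal=0.6, attention=0.8)

def modulate_haptics(ctx, base_intensity=0.5):
    """Map inferred user state to a haptic drive intensity in 0..1.
    The mapping is a toy heuristic, not the project's model: boost the
    intensity when attention is low, soften it when arousal is high."""
    intensity = base_intensity * (1.0 + (1.0 - ctx.attention)) * (1.0 - 0.3 * ctx.arousal)
    return max(0.0, min(1.0, intensity))

# One iteration of the loop with stubbed sensor input.
ctx = infer_context(visual_frame=None, audio_frame=None)
level = modulate_haptics(ctx)
```

The value of this structure is the clean seam between stages: the inference model can be retrained or swapped without touching the haptic rendering, and vice versa.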
- WP4: Aiming to create novel mid-air interactions, the work package created a touch library, explored interaction techniques, and built multiple demonstrators and publications. The library and the novel interactions allow us to create richer experiences than were previously possible with the technology and foster cross-package collaborations. The period resulted in many publications and dissemination at premier venues.
- WP5: Developed a novel training offering by combining generic RRI training with specific elements relevant to the haptic/body engineering research and technology sectors. Lessons from the RRI training were carried forward into the core research activities of the project and embedded into the practices of our industry partners. We developed a new taxonomy of tactile devices, reflecting the different ethical concerns they may raise, and considered how to address them. The main results include: highlighting the ethical importance for digital touch of a concept of sensory autonomy; drawing attention to ethical issues arising from potential asymmetries of power in interpersonal tactile interactions and social haptics; and underlining the crucial ethical importance of control over the OFF-switch in future digital touch technology.
- WP6: Carried out several dissemination and outreach activities related to the project results and vision, and actively engaged with sister projects. Live content is available on our project website. Launched and grew the Haptics Helix community and innovation platform. Organised workshops and symposia on haptics, AI, ethics and XR at premier academic events. Delivered engaging industrial demonstrations, raising awareness of our project activities and vision and engaging the scientific and industrial communities. Highlights included the demonstration of project results at flagship industry and academic events such as WebSummit 2022, CES 2023, and ACM CHI 2023.