Where our senses meet technology

We interact with technology on a daily basis, but almost entirely through our eyes and ears. The EU-funded SenseX project has unveiled a new approach, multisensory experience design, which offers citizens technology that engages all of their senses.

© SenseX project - M.Obrist, 2018


We live in a sensory world. In fact, almost any experience you can think of, from eating a meal to attending a concert, involves all our core senses – sight, sound, touch, taste and smell.

Consumer technology, however, often engages only sight and sound.

“Even though interactive technologies have become an essential, ubiquitous part of our everyday lives, the typical user experience only involves our visual and auditory senses,” says Marianna Obrist, a professor of Multisensory Interfaces at University College London (UCL).

Through the EU-funded SenseX project, supported by the European Research Council, Obrist is advancing the use of what she calls multisensory experience design. “Touch, taste and smell have a huge impact on health, safety, leisure, work and our overall well-being,” she adds. “As such, multisensory experiences, when embedded into interactive technologies in a user-friendly way, could open the door to entirely new product, technology and service opportunities.”

The playbook on multisensory experience design

Before tech companies can start leveraging the power of multisensory experiences, they first must understand what’s possible – which is where the SenseX project came in. “Our goal was to write the playbook on multisensory experience design, providing concrete examples of how designers can integrate touch, taste and smell into the user experience,” remarks Obrist.

To do this, researchers not only had to develop entirely new devices and interfaces to stimulate the senses of smell and taste, but also had to understand how effectively these devices produced the intended sensations and perceptions. These technical and perceptual parameters were then used to design multisensory experiences, many of which are highlighted in the groundbreaking book ‘Multisensory Experiences: Where the senses meet technology’.

According to Obrist, the book takes the reader from the fundamentals of multisensory experiences to the relationship between the senses and technology and, finally, to what the future of these experiences may look like. “This book describes the digital transformation and how we can start designing novel interfaces and interactions that integrate all our main senses,” she explains.

A tasty idea

Take, for example, TastyFloats, one of the multisensory experiences highlighted in the book. A taste-delivery system, TastyFloats aims to provide the user with an enjoyable and meaningful interaction. “In other words, we didn’t want users putting tubes in their mouths or up their noses, or subjecting their tongues to electrical stimulation,” says Obrist.

Instead of tubes and wires, the experience harnesses the principles of acoustic levitation. As Obrist explains, acoustic levitation uses high-intensity sound waves to suspend matter, in this case taste particles, in the air. “We developed a contactless delivery device that presents taste stimuli in mid-air – with literally no strings attached,” notes Obrist. “Thus, the user is free to interact with the stimulus using their tongue.”
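
As a rough illustration of the physics at play (not a detail taken from the project itself), a standing ultrasonic wave traps small particles near its pressure nodes, which sit half a wavelength apart. The short Python sketch below works out that spacing for an assumed 40 kHz transducer frequency, a common choice in acoustic levitation research:

    # Illustrative back-of-the-envelope numbers for a standing-wave levitator.
    # The 40 kHz frequency is an assumption for illustration only; the article
    # does not specify the hardware used in TastyFloats.

    SPEED_OF_SOUND = 343.0  # m/s, in air at roughly 20 °C
    FREQUENCY = 40_000.0    # Hz, assumed ultrasonic operating frequency

    wavelength = SPEED_OF_SOUND / FREQUENCY  # about 8.6 mm
    node_spacing = wavelength / 2.0          # pressure nodes (trapping points) every ~4.3 mm

    print(f"Wavelength:   {wavelength * 1000:.1f} mm")
    print(f"Node spacing: {node_spacing * 1000:.1f} mm")

    # Particles noticeably smaller than half a wavelength can be held at these
    # nodes, so anything levitated this way is limited to millimetre-scale drops.

At this kind of frequency the trapping points are only a few millimetres apart, which suggests why any levitated taste stimulus must be a very small droplet or morsel.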

Having successfully levitated food items, researchers then turned their attention to understanding what levitated stimuli taste like, how levitation affects one’s perception of taste, and what happens when other sensory stimuli are integrated. “The goal of this work is to lay the foundation for designing multisensory experiences that could eventually be integrated into our everyday devices,” says Obrist.

Researchers also collaborated with London’s Tate Britain art gallery to create the Tate Sensorium – a case study on how touch and audio technology can be used to design multisensory art experiences. “This initiative demonstrated how the human-computer interaction community, creative industries, and art curators can use mid-air technology to think beyond conventional art experiences and towards something more emotionally engaging and stimulating,” adds Obrist.

Game-changing potential

Although the SenseX project is now finished, its work is not. To further advance its sensory innovations, project researchers launched a spin-off company called OWidgets. The company recently closed an investment round of approximately EUR 1 million, which will be used to grow the team and foster further research and innovation initiatives across Europe.

Furthermore, Obrist herself is leveraging the many collaborations with academia and industry established during the project. As deputy director for Digital Health at the UCL Institute of Healthcare Engineering, she is exploring how SenseX’s research on immersive experiences can address healthcare and well-being challenges.

“I believe these latest developments and collaborations, many of which are being driven by the OWidgets spinoff, demonstrate the significant, game-changing potential that multisensory experiences can bring, not only to our everyday devices, but also to the advanced technology driving the future of healthcare, transportation and research,” concludes Obrist.


Project details

Project acronym: SenseX
Project number: 638605
Project coordinator: United Kingdom
Project participants: United Kingdom
Total cost: € 1 494 865
EU Contribution: € 1 494 865
Project duration: -

See also

More information about project SenseX
