Researchers create first-ever personalised sound projector with £10 webcam
Last updated: Tuesday, 6 August 2019
At a high-profile tech and media conference in LA, a University of Sussex research team have demonstrated the first sound projector that can track a moving individual and deliver an acoustic message as they move.
Dr Gianluca Memoli and his colleagues demonstrated what they believe to be the world’s first sound projector with an autozoom objective in a talk at the 46th International Conference and Exhibition on Computer Graphics & Interactive Techniques (SIGGRAPH 2019) this week.
Dr Memoli, Lecturer in Novel Interfaces and Interactions at the University of Sussex’s School of Engineering and Informatics, who led the research, said: “By designing acoustic materials at a scale smaller than the wavelength of the sound to create thin acoustic lenses, the sky is the limit in new potential acoustic applications.
“Centuries of optical design can now be applied to acoustics. We believe this technology can be harnessed for plenty of positive applications, including personalised alarm messages in a crowd, immersive experiences without headphones and the audio equivalent of special effects.”
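For a sense of scale (an illustrative figure rather than one from the study): a 4 kHz tone, near the top of the range used in speech, has a wavelength in air of roughly 343 m/s ÷ 4,000 Hz ≈ 8.6 cm, so a lens built from sub-wavelength elements must be patterned at the scale of millimetres to a few centimetres.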
The system uses in-house face-tracking software to pilot an Arduino-controlled acoustic telescope, focusing sound on a moving target.
The low-cost camera tracks a person and commands the distance between the two acoustic lenses, delivering a sphere of sound around 6 cm in diameter in front of the target that follows the individual as they move.
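The team’s own tracker and control code have not been released, but the pipeline described above can be sketched in a few lines of Python. The snippet below is an illustrative reconstruction, not the Sussex implementation: it stands in OpenCV’s stock face detector for the in-house software, estimates distance with a simple pinhole-camera formula, and assumes a made-up serial protocol and lens calibration for the Arduino.

import cv2      # OpenCV, for webcam capture and face detection
import serial   # pyserial, for talking to the Arduino

FOCAL_LENGTH_PX = 600.0  # assumed webcam focal length in pixels (needs calibration)
FACE_WIDTH_M = 0.16      # assumed average face width in metres

def lens_separation_mm(distance_m):
    # Hypothetical linear map from target distance to lens spacing;
    # a real system would use an acoustically measured calibration.
    return max(0.0, min(50.0, 10.0 * distance_m))

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
arduino = serial.Serial("/dev/ttyACM0", 9600)  # hypothetical port and baud rate
camera = cv2.VideoCapture(0)                   # the £10 webcam

while True:
    ok, frame = camera.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, 1.1, 5)
    if len(faces):
        # Take the largest detected face as the nearest target.
        x, y, w, h = max(faces, key=lambda f: f[2])
        distance_m = FOCAL_LENGTH_PX * FACE_WIDTH_M / w  # pinhole-camera estimate
        # Send the desired lens spacing to the Arduino driving the motor.
        arduino.write(f"{lens_separation_mm(distance_m):.1f}\n".encode())

camera.release()

In practice, the consent requirement Kybett describes below would sit in front of this loop, with tracking enabled only once the target has opted in.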
Joshua Kybett, the second-year Sussex undergraduate who designed the tracking system, added: “Since acoustic lenses can be 3D-printed for only £100, we wanted a tracking technique that worked on a similarly low budget. With a £10 webcam, this is one tenth of the cost of standard tracking systems.
“In addition, our method has been designed to require user consent in order to function. This requirement ensures the technology cannot be used intrusively, nor deliver sound to an unwilling audience.”
Thomas Graham, a research fellow in the School of Engineering and Informatics who ran the measurements and simulations, said: “In our study, we were inspired by autozoom cameras that extend their objectives to match the distance of a target. We used a very similar system, even down to the same mechanical sound of the motor. I believe our work is also the first step towards hand-held, low-cost acoustic cameras.”
The research team are now working to expand the system beyond tracking in a single direction and operating over a single octave, so that it can be scaled up to cover most of the speech range and basic melodies, and eventually deliver a full piece of music.
Arash Pouryazdan, who designed the electronics, said: “SIGGRAPH is a place where emerging and futuristic ideas are discussed. This is the conference where entertainment giants such as Disney, Marvel and Microsoft meet to share their visions: it was the perfect place for us to demonstrate how we think sound might be managed in the future.”
The research is funded by the Engineering and Physical Sciences Research Council (EPSRC-UKRI) through fellowship “AURORA” (grant EP/S001832/1). Joshua Kybett attended SIGGRAPH thanks to a bursary from the UK Acoustic Network.