Do humans really wear their hearts on their sleeve? An ambitious Australian neuroscience project aiming to translate emotional impulses directly into music is hoping to find out.

Canadian artist Erin Gee describes it as “human voices in electronic bodies”, and there is a definite futuristic feel to her collaboration with the University of Western Sydney’s medical school.

A fingerprint scan is required to gain entry to the labs where her first subject, Ben Schultz, 27, is strapped to a bed, connected via a complex maze of wires to monitors not unlike those seen in a hospital.

Neurophysiologist Vaughan Macefield plays with a needle attached to a wire feeding directly into Schultz’s leg, listening carefully for changes in the white noise crackling from a speaker in the corner.

“That’s the sound that’s being picked up from the nerve,” Ms Gee explains.

“That’s the translation of what’s happening electrically.”

Mr Schultz said the needle was uncomfortable when it was moved but was not painful.

Tapping into a very precise part of the nerve will allow Dr Macefield to eavesdrop directly on the brain’s signals to the body as Mr Schultz is shown a series of images designed to elicit emotion, such as mutilation and erotica.

And that is where the music begins.

“While we cannot read Ben’s mind and tell you why he’s feeling emotions, the technology exists today that we can actually definitively tell you that he is feeling emotions and we can tell you exactly how much emotion he’s feeling,” Ms Gee said.

“I can bottle Ben’s emotions and save them for later.”

Along with the nerve reading, Mr Schultz’s blood pressure, breathing speed, skin sweat and heart activity are being recorded and fed into Ms Gee’s computer, where custom-made software converts them into a chorus of chimes and bells.

The experiment will be repeated with several other subjects so Ms Gee and Dr Macefield can fine-tune their methods and sounds for a live “emotional symphony” performance that promises to be unlike any other attempted before.

Two actors attached to the various monitors will perform an “emotional score” – Ms Gee is not quite sure what it will look like yet, but it will require them to summon a series of emotions.

The music their feelings produce – “what happiness sounds like” for instance – will be performed by small robotic pianos that will also flash lights as different moods are detected. The team has chosen actors as subjects because they routinely need to manifest emotion on demand. “It will be like seeing someone expertly playing their emotions as they would play a cello,” said Ms Gee, whose first show is scheduled for Montreal next year.

Dr Macefield said the research would feed into the field of “affective computing”, which deals with machines that can recognise and respond to human emotions.

Computers that can connect directly to the brain, allowing users to search for information simply by thinking about it, are currently in development and Dr Macefield said he was interested in how machines could help people.

Many mental illnesses and disorders are associated with heightened or blunted emotional responses and Dr Macefield said technology could have therapeutic benefits.

Children with autism disorders, for example, struggled to understand the emotions of others or to express themselves, and Dr Macefield said Ms Gee’s robotic technology could be used to teach them how to identify feelings by externalising and exaggerating them into forms like music.

“It may well be that by amplifying people’s emotions they can read them better,” he said.
