I heard a recording the other day that gave me goosebumps. It was a 15 second clip of Pink Floyd’s “Another Brick in the Wall” but it sounded like it was coming from deep underwater, or through a long, bendy aluminum duct. Or both.
That was creepy enough, but when I read about the source of this recording, I was thrust right into the verbal and auditory equivalent of Uncanny Valley (whatever that may be). And believe me, I really do try to avoid that realm. But when I find myself there, I can’t seem to look away. I have to know more. It’s like that irresistible urge to slow down to look at the aftermath of a car accident, thus causing the traffic snarl behind me to get even worse.
Anyway… the reason for my serious case of the willies is that this Pink Floyd-adjacent rendition of their most popular song came from… people’s brain waves. Seriously. You can read all about it in an article entitled, “Listen to Pink Floyd’s ‘Another Brick in the Wall,’ as decoded from human brain waves”. The article includes the recording I first heard. (It’s the second recording in the article, not the first.)
My summary doesn’t do the article justice. I can’t even vouch for my accuracy here. But the upshot seems to be that researchers were interested in seeing how the brain processes music, so they analyzed people’s brain activity via electrodes that were placed on their actual cortices (a.k.a. cortexes) while they listened to the song. Those electrodes picked up the electrical activity of their neurons. Then, somehow, based on that data, they were able to decode the acoustics into that creepy recording. To grossly oversimplify, they were able to record the thoughts inside people’s heads.
They are hoping to apply this technique to BCIs, or brain-computer interfaces. This would allow people who are unable to speak, but who can mentally form words, to communicate with the wider world. That’s brilliant, and it could be life-changing for people. I’m all for that!
But I’m also me, so naturally my imagination took it in a much more nefarious direction. But before I send you crawling across the twisted, rusty transom of my mind, let me give you some back story, because scientists have apparently been doing similar studies for years.
I found so many articles about this on the internet that I could have lost myself in them for hours. (Who am I kidding? I did.) Some of those articles were so highly technical that they were beyond me. I’m hardly a neuroscientist, so I stopped reading those after the first paragraph. But there were other, more user-friendly articles scattered about, and they were frequent enough that I’m kind of surprised I had never heard of this avenue of inquiry before. This should be major news. Am I the only one who cares that it’s possible, to a rudimentary degree, to get inside your head?
Anyway, the first article I found was from 2012. Entitled, “How Eminem Invents Freestyle Rhymes on the Spot”, it describes a study that wanted to discover how the brain thinks creatively. So they got several freestyle rappers to agree to go into an fMRI machine. First, they were given a set of lyrics to memorize that had been written by someone else. Next, they were asked to freestyle something of their own. The researchers could map the blood flow in their brains to determine which sections turned on, and which turned off, at any given moment. When the artists were creating their own rap, “the parts of their brains linked to motivation, organization and integration get active, while portions responsible for self-monitoring and control get quiet.”
Naturally, the article goes into much more detail, but basically they concluded that creativity is “just simple rearrangements of brain activity and cognitive processes that are a normal part of everyday experiences.”
Fascinating. There was vague mention of the possibility (or lack thereof) of seeing if this improvising brain network could be trained to act more quickly. I don’t know if they ever got around to taking things to that next step. The important takeaway is that even in 2012, scientists were thinking about doing so.
Next, I came across an article entitled, “Scientists design algorithm that ‘reads’ people’s thoughts from brain scans”, and it said that using an fMRI (again), scientists were able to decode people’s thoughts. They did caution that they could only decode the semantic meaning of people’s thoughts, not the word-for-word translations. Whew, that’s a relief. Not.
Basically, they had people listen to 16 hours of different podcasts and radio shows over several sessions. They then fed the scan data into a computer, and that computer could compare the patterns in the audio to patterns in the brain activity. “The algorithm could then take an fMRI recording and generate a story based on its content, and that story would match the original plot of the podcast or radio show ‘pretty well.’”
Even more fascinating, “the algorithm could fairly accurately explain the plot of a silent movie that the participants watched in the scanner. It could even retell a story that the participants imagined telling in their heads.”
And yet again, brain-computer interfaces were mentioned, and a great deal was made of the fact that these scans were noninvasive, and how it would help people who cannot speak or type. Which, again, is good. But now they’re getting even further into our heads. This was in 2022.
Then, later that same year, an article was published that was entitled, “1st patient with new ‘mind-reading’ device uses brain signals to write”. This one came with a video of the poor guy with a port in his head that the scientists could plug into. He could then spell words in his head, and they’d show up on the computer screen.
They quickly discovered that they couldn’t just use the alphabet, because as anyone who has ever spelled out something knows, a lot of letters can be confused for one another. So they had the guy think the spelling in the NATO phonetic alphabet. (You know the one. “Alpha, bravo…”)
That spelling produces distinct brain waves that an algorithm can translate into the letter in question. It’s pretty slow going, but it’s a breakthrough. The scientists are hoping that they’ll soon be able to make the process wireless so no one will have to be plugged into anything. (Excellent idea.)
The last article I read came out last month, and was entitled, “Google’s ‘mind-reading’ AI can tell what music you listened to based on your brain signals”, and it takes things to yet another level.
In this one, people had their brains scanned while listening to music. Then that data was examined by an artificial intelligence, and with that information, the AI could “produce a song that matches the genre, rhythm, mood and instrumentation of music that the individual recently heard.”
Then the AI was customized for each individual, based on their unique brain patterns. You can hear some of the results here. They believe that the AI has about 60 percent accuracy, but I beg to differ. When you hear the samples the people heard, and then what the AI came up with from their brain waves, it’s similar, but the lyrics are garbled and it’s definitely not good enough to identify the song they heard. (For some reason it does better with classical music. Go figure.) Next, they want to see if AI can reproduce songs that people are only imagining in their heads.
Then, two weeks after that, the first article I read, the Pink Floyd one, came out. And the song is identifiable in that one, albeit twisted and warped, but the process was more invasive. So, what I’m seeing is increasingly rapid progress. Those who need BCI to communicate shouldn’t have much longer to wait.
But, like I said, my brain went to a completely different place. And when I talked to a friend about it, she had some ideas, too. So, consider the following if this technology keeps progressing:
- On an exciting note, can Star Trek’s universal translator be far behind?
- At some point, we might be able to hear what our dreams actually sound like, or how light sounds to a synesthete.
- We might not need cellphones anymore. We could transmit our brain scans back and forth remotely.
- On the other hand, as the scanning gets more and more remote, what if you can be scanned without your knowledge?
- When I’m typing in passwords, I’m thinking them in my head. Let’s hope this technology never gets to hacker level.
- And how strange is it to be listening to music that has already been listened to? What are the copyright implications of that?
- If someone were taken prisoner and tortured, and that person knew that the terrorists wanted some top secret info they had talked about after (for example) casting about for some reading material in Trump’s bathroom, they’d then be thinking of that conversation. If the torturers could scan their brain…
- Or maybe the police could get you to think about a crime you did or did not commit. I hope it would be just as inadmissible as a lie detector, but who knows? It wouldn’t be hearsay. Instead of he said, she said, it would be you said.
- What if they figure out a way to send sound/thoughts in as well as listen? Could they fill our mind with images? Tastes? Smells? Commercials? Propaganda? Pain? Emotions?
Can you see why this whole thing gives me goosebumps?
I think I need a hug and a pint of ice cream.

Like the way my neurodivergent mind works? Then you’ll enjoy my book! http://amzn.to/2mlPVh5

