Welcome back! Last week, we talked about how dangerous hippos are. If you missed that blog and would like to catch up, click HERE.
I recently had an experience that was a tad unnerving for me. A conversation I was having via text wasn’t going in the direction I expected, and the person I was texting with got very angry. I went back over our messages, trying to see where I had gone wrong, and I just couldn’t understand where the miscommunication had come from: what they said I had asked them to do wasn’t what I had asked them to do at all. In fact, I hadn’t asked them to DO anything.
This got me thinking, and I wonder how many times conversations end this way. Both parties are upset with each other for “misunderstanding” them. Let’s look at some reasons this might happen, and see if there’s a possible solution.
Frontiers for Young Minds was a good place to start. Semantics is a term that means “the meaning of a word, phrase, sentence, or text.” So imagine if you use the wrong word. For example, I say, “Can you pass the pepper, please?” when I really meant to say salt. Now, if you know me well, you’d know that I don’t use pepper, so you might catch the mistake yourself and pass the salt. But what if you don’t know me well, and you pass me the pepper? It’s what I said, after all, right? In this example, I have made a “semantic error,” and I may never know I made it.
If the word fits…
Frontiers for Young Minds reports: “Scientists have done experiments using a technique called EEG. An EEG measures the electrical activity that is always happening in every part of the brain. These experiments have shown that people’s brains respond differently to different kinds of semantic errors. In particular, there is a certain brain response based on how well the incorrect word fits in with the other words in the sentence. These experiments have shown that our brains often use knowledge about what kinds of words are expected in a sentence to construct meaning from that sentence.”
Did you just read what you think you read?
“What their studies found was that the brain uses background knowledge to process the meaning of a sentence, based on how expected a word is in that scenario. When a word is expected because it fits well, the brain might be a bit lazy when determining the meaning of that word.” Here’s another example that the article used:
Imagine this scenario: “An airplane has just crashed on the border of Spain and France. The plane debris is scattered throughout both countries. Importantly, none of the passengers are from either Spain or France. Where should authorities bury the survivors?”
“What do you think? Where should they be buried? If you have selected a burial location, you have made a mistake! Go back and reread the scenario. Do you see the problem? The question asks you where to bury the survivors! However, survivors are ALIVE, so you should not bury them!”
It happens to the best of us
“Do not worry, if you got tricked, you are in good company. Language scientists have given this same scenario to lots of volunteers in many experiments. They have found that most of the time, those volunteers also do not notice that you should not bury survivors.”
In short, our brains “make up” the words that make the most sense to us in that conversation.
Sometimes the written word won’t work
Sometimes changing up the delivery of the words makes a huge difference. For example, someone may not be able to comprehend your written meaning, but if you speak it to them, they instantly understand what you’re actually saying.
Why is this? Scientific American may have some answers.
Are we hearing written words in our heads?
They write that “Words are not encoded in the brain by their meaning but rather by simpler attributes such as sound and shape. As your eyes scan these words, your brain seems to derive their meaning instantaneously. How are we able to recognize and interpret marks on a page so rapidly? A small new study confirms that a specialized brain area recognizes printed words as pictures rather than by their meaning.”
We “hear” written words in our heads. While sound may have been the original vehicle for language, writing allows us to create and understand words without it.
Sometimes it’s the exact opposite, and speaking won’t work
This research shows that sound remains a critical element of reading, so it makes sense that speaking your words would resonate more than texting or emailing them.
But what if that person you’re talking to doesn’t understand your verbal words? Why would that be?
Hearing difficulties are very common. The person may simply not be able to hear you, or the range of tones in your voice. Sometimes, however, it’s a little more complicated than that.
Auditory processing disorder (APD) is a condition in which you have difficulty understanding sounds, including spoken words. There are things you can do that can help (start by consulting your PCP).
If you or your child have APD, you may find it difficult to understand:
- people speaking in noisy places
- people with strong accents or people who talk fast
- similar sounding words
- spoken instructions
APD is not a hearing problem. People with the condition usually have normal hearing.
What causes APD?
It’s not always clear what causes APD.
Possible causes include:
- regular ear infections
- a faulty gene
- head injury
- complications at birth
My dad may not have been wrong when he said, “People hear what they want to hear”.
Healthline gives us more insight into this statement:
Selective hearing is actually a thing. They tell us, “Selective hearing is the ability to listen to a single speaker while in a crowded or loud environment. You might also hear it referred to as ‘selective auditory attention’ or the ‘cocktail party effect.’”
Selective hearing involves many factors, including your goals, vision, and brain activity patterns.
They broke it down for us:
Your brain chooses what to listen to based on what you’re trying to do.
For example, imagine that someone started talking to you while you were trying to finish watching an episode of a TV show. Chances are good that you didn’t hear much of what they said to you. Your brain prioritized the sound of the TV over that person’s voice because your goal was to finish watching the show.
A 2008 study put this concept to the test by asking participants to pay attention to sounds in one ear but not in the other. The investigators then played different pitches in each ear at the same time and asked the participants to note any changes in pitch in the ear they were asked to focus on.
MRI scans of the participants’ brains showed that they heard the sounds in each ear. However, when they were detecting changes in the specified ear, they ignored the sound in the other ear.
Visual cues are also an important part of selective hearing.
For example, a 2013 study involved playing audio of a man and woman talking at the same time. Participants were asked to pay attention to either the female or the male speaker. They had a much easier time focusing on only the male or the female voice when watching a video of the speakers along with the audio.
Based on these results, being able to see someone while they’re talking might help you listen more effectively.
A 2012 study found that the presentation of sounds within your brain doesn’t reflect all of the sounds in your environment but, rather, what you want or need to hear. These results are similar to those of the 2008 study discussed above.
However, the investigators also found that they could use the patterns of brain activity they observed to predict which speaker or words someone was listening to.
Participants were asked to listen to two different samples of speech at the same time. Each sample contained a different speaker and phrase. They were then asked to pick out which words were said by one of the two speakers.
Using information about brain activity patterns from the electrodes as well as a decoding process, the investigators reconstructed what the participants heard. The brain activity patterns suggested that the participants only paid attention to the speaker they were asked to focus on.
The investigators were able to use these brain activity patterns to predict which speaker the participant listened to and determine whether they paid attention to the wrong speaker at any point.
In addition, people with hearing loss, ADHD, auditory processing deficits, and autism seem to have trouble with selective hearing. The decoding technology could help researchers understand what people with these conditions are actually hearing and processing.
Knowing this information could be crucial for developing new treatments.
What I’ll do differently next time
After reviewing all the research and information, I’ve decided that texting and emailing will not be “good enough” in the future if I am trying to get a point across to someone I care about. While it’s always a good idea to have a paper trail for important conversations, that paper trail is only as good as the understanding it brings. If speaking the words brings a better understanding, then that’s the most important thing.
The human touch, or in this case voice, is by far our most important gift when it comes to understanding. It may feel uncomfortable in a world where we have become used to simply typing to each other, but speaking is still the best way to be understood.
As always, this blog is not a replacement for sound medical advice. I am not a doctor. Please make an appointment to see your healthcare provider and put a good plan in place that works for you and the needs of your body.
That’s all I have for you this week, dear reader. I’ll see you back here next Wednesday to share another cup of coffee. Until then, be good to yourself and each other.
Mind, Body, Spirit…Osteopathic Doctors treat the whole person, not just the ailment. Is your PCP a DO? Would you like to learn more about Osteopathic Physicians? Click HERE!