VOA (Regular Speed) WORDMASTER: How a Conversation Can Really Become a Meeting of the Minds


AA: I'm Avi Arditti with Rosanne Skirble, and this week on WORDMASTER: When two people "click," that means they really understand each other. Well, that metaphorical clicking could be the sound of what researchers call "speaker-listener neural coupling."

RS: Studies to date have largely analyzed speech production and comprehension independently within individual brains. But new research at Princeton University examines their relationship in producing successful communication.

AA: We talked with lead authors Lauren Silbert and Greg Stephens. They began the study by recording Silbert as she lay with her head inside an fMRI -- a functional magnetic resonance imaging machine, essentially a giant magnet that scans the brain at work. She reminisced about her life.

LAUREN SILBERT: "I spent a good amount of time in the fMRI scanner telling stories, just as if I was talking to a friend, trying to communicate something about my life. We then played the recordings back to eleven different listeners while they were also individually in the scanner.

"After the scan we have a behavioral assessment of how much they actually did understand. So then we can sort of measure this coupling between the speaker and listener in correlation with how well the communication was."
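The coupling Silbert describes, correlating a speaker's brain activity with each listener's, can be sketched as a lagged Pearson correlation between two voxel time series. This is a hypothetical illustration, not the study's actual analysis pipeline; the function name and the simple correlation-with-lag approach are assumptions for demonstration only.

```python
import numpy as np

def coupling(speaker_ts, listener_ts, lag=0):
    """Pearson correlation between a speaker's and a listener's voxel
    time series. A positive `lag` shifts the listener later in time,
    i.e. the listener's activity trails the speaker's by `lag` samples."""
    s = np.asarray(speaker_ts, dtype=float)
    l = np.asarray(listener_ts, dtype=float)
    if lag > 0:
        s, l = s[:-lag], l[lag:]
    elif lag < 0:
        s, l = s[-lag:], l[:lag]
    # Mean-center both series, then compute the normalized dot product
    # (equivalent to the Pearson correlation coefficient).
    s = s - s.mean()
    l = l - l.mean()
    return float(np.dot(s, l) / (np.linalg.norm(s) * np.linalg.norm(l)))
```

For a genuinely coupled pair the value approaches 1; for a listener whose activity is unrelated to the speaker's it hovers near zero, which is the kind of contrast a behavioral comprehension score can then be correlated against.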


RS: "And, Greg, what did you find?"

GREG STEPHENS: "At the end of the experiment, we have a functional scan of Lauren's brain and we have a functional scan of these eleven listener brains. So then we're faced with the task of how do we assess how similar are these brain patterns, how coupled are they.

"And so the first result was that actually it turns out that there's extensive coupling between the two, which extends well beyond sort of low-level auditory areas and it goes all the way up into sort of central cortex. It involves a lot of the language areas that people have seen.

"So that was kind of the first finding, which is that there was sort of this extensive coupling between the two processes that you might have thought, naively, that they would be kind of different, the production side and the comprehension side."

AA: "So, in plain English, you're basically saying it's a real meeting of the minds."

LAUREN SILBERT: [Laughs]

GREG STEPHENS: "Basically we are saying that, actually. And, as a little bit of context, you know, what we think might be going on is simply that we're all human, we have similar brains and so that when you comprehend speech you might use similar machinery, similar algorithms as you do when you produce it, because we're all sort of using the same fundamental machine, the same brain."

RS: "What does this tell us about who we are and how we operate?"

LAUREN SILBERT: "First, I think it tells us that these two processes that we look at as opposite processes are actually not so opposite. Sort of an extrapolation from the mirror neuron hypotheses, where in order to understand what somebody is saying you have to also produce what they're saying, which, you know, could tell us a bit about empathy or whatever it is you want to think from that.

"Another thing that it tells us is that our brains don't exist in isolation. We grow up, we develop in relation to our surroundings, to people around us, and we communicate in relation to other people. We have interactive processes. We don't exist solely in isolation."

AA: "What did you find, actually, when you started looking at people speaking two different languages to each other? What did you find when you looked at their brain activity?"

RS: "And how were they any different from what your paper relates?"

LAUREN SILBERT: "Well, so in the paper we use one control where we have a Russian speaker and non-Russian speaking listeners. So in that case there's no understanding that's going on at all. And in that case we see no coupling between the brains."

AA: "Now do you think you would get the same results, for example, if -- I mean, you were just using some technical language, you were talking about mirror neurons and this and that. And now if someone maybe wasn't familiar with it who was just listening to you talk about that, do you think if you were to look at their brain and your brain you would see different patterns, showing that you weren't coupled?"

LAUREN SILBERT: "Yes, I do, actually. I think that their brains would start searching for something that I have already moved on to something else, and there would be a difference in processing."

AA: "So they can just blame their brain, right? It's not just lack of understanding."

LAUREN SILBERT: "Or, on my side, I have to bring whoever I'm speaking with into my world in order to make it as successful a communication as possible."

RS: Lauren Silbert and Greg Stephens are researchers at Princeton University in New Jersey. Their study appears in the Proceedings of the National Academy of Sciences.

AA: And that's WORDMASTER for this week. With Rosanne Skirble, I'm Avi Arditti.