We are highly attuned to those around us. As infants and young children, we observe the behavior of our parents and of other children, learning from them how to walk, talk, read, and even use smartphones; there seems to be almost no limit to the complexity of the behaviors we can acquire through observational learning. But social influence goes far beyond that: we imitate not only the behavior of those around us but also their ideas, and as we grow older we come to understand other people's thoughts, feelings, and needs, and adapt to them.
Our brains are very good at maintaining self-awareness while also replicating what other people's brains are computing. But how does the brain distinguish between its own thoughts and the thoughts it attributes to others? A recent study, published in the journal Nature Communications, brings us closer to the answer.
Our ability to simulate the minds of others matters, and when the process goes wrong it can contribute to a variety of mental-health problems. You may become unable to empathize with someone, or, at the other extreme, become so sensitive to other people's thoughts that your own sense of self feels unstable and fragile.
The ability to reason about other people's beliefs is one of the most sophisticated adaptations of the human brain, and experimental psychologists often assess it with a technique called the "false belief task."
In this task, an observer, known as the "subject," watches another person, the "partner," hide an object in a box. The partner then leaves the scene, and the subject sees the researcher remove the object from the box and hide it somewhere else. When the partner returns, they will falsely believe the object is still in the box; only the subject knows the truth.
This requires the subject to hold in mind not only their own true belief about where the object is, but also the partner's false belief. But how do we know that the experimental subject is really thinking about the partner?
For the past decade, neuroscientists have been exploring an account of mind-reading called "simulation theory," which holds that when I put myself in your position, my brain tries to replicate the computations happening in your brain.
Neuroscientists have found compelling evidence that the brain does simulate the computations of a social partner. Research shows that if you watch another person receive a reward, such as food or money, your brain activity looks much like the activity produced when you receive the reward yourself.
But there is a problem: if my brain replicates your brain's computations, how do I tell the difference between my own mind and the simulation of yours?
In our experiment, we recruited 40 participants to play a probabilistic version of the false belief task while we scanned their brains with functional magnetic resonance imaging (fMRI), which measures brain activity indirectly by tracking changes in blood flow.
In this game-like version, players do not simply believe that the object is in the box; they believe it is probably here or probably there, something like a "Schrödinger's box." Because the object keeps moving, both participants' beliefs are constantly changing, so the subject must not only track the object's whereabouts but also continually infer what the partner believes.
The experimental design allowed us to use a mathematical model to describe what players were thinking during the game. The model captures how a participant updates their own belief each time they receive information about the object's location, and also how they update their estimate of the partner's belief each time the partner sees relevant information.
The model works by computing "predictions" and "prediction errors." For example, if a participant predicts there is a 90 percent probability that the object is in the box, but then discovers it is nowhere near the box, they will be surprised. We say this person has experienced a large "prediction error," which is then used to improve the next prediction.
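The update scheme described above can be sketched in a few lines of code. This is only an illustrative delta-rule update, not the exact model fitted in the study; the learning rate and the specific numbers are assumptions chosen to mirror the 90 percent example.

```python
# Minimal sketch of prediction-error-driven belief updating.
# NOTE: an illustrative delta-rule update, not the study's actual model;
# the learning rate (0.3) and the scenario values are assumptions.

def update_belief(belief, outcome, learning_rate=0.3):
    """Update P(object in box) after an observation (1 = in box, 0 = not).

    prediction_error = outcome - belief; bigger surprises shift the belief more.
    """
    prediction_error = outcome - belief
    return belief + learning_rate * prediction_error, prediction_error

# The participant is 90% sure the object is in the box...
belief = 0.9
# ...but then observes that it is not there at all (outcome = 0).
new_belief, pe = update_belief(belief, 0)
print(f"prediction error: {pe:.2f}")      # -0.90, a large surprise
print(f"updated belief:   {new_belief:.2f}")  # 0.9 + 0.3 * (-0.9) = 0.63
```

The same function, applied to the partner's observations instead of the subject's own, would produce the "simulated" prediction errors the article discusses next.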
Many researchers believe that prediction errors are a basic unit of computation in the brain, and that each prediction error is associated with a specific pattern of neural activity. This means that when the subject thinks about the partner's prediction error, we can compare the resulting pattern of brain activity with the pattern produced when the subject experiences a prediction error of their own.
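The comparison of activity patterns can be illustrated with a simple correlation, the general idea behind pattern-similarity analyses. The vectors below are made-up stand-ins for voxel responses, not real data, and the analysis here is a simplification of whatever pipeline the study actually used.

```python
# Illustrative pattern-similarity sketch: correlate two hypothetical
# activity patterns. The synthetic "voxel" data are assumptions, not
# the study's measurements.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical voxel pattern evoked by one's own prediction error...
self_pattern = rng.normal(size=50)
# ...and by a simulated (partner's) prediction error: partly shared,
# partly distinct, mimicking overlapping but separable representations.
simulated_pattern = 0.5 * self_pattern + 0.5 * rng.normal(size=50)

# Pearson correlation: 1 would mean identical patterns (self and other
# indistinguishable), 0 would mean fully distinct patterns.
similarity = np.corrcoef(self_pattern, simulated_pattern)[0, 1]
print(f"pattern similarity: {similarity:.2f}")
```

An intermediate similarity, as constructed here, corresponds to the finding described next: the two kinds of prediction error share some structure but remain distinguishable.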
Our findings suggest that the brain uses partially distinct activity patterns for one's own prediction errors and for "simulated" prediction errors. This means that brain activity encodes not only what is happening in the world, but also whose view of the world is being represented, a combination that may give rise to the subjective sense of self.
At the same time, we found that these self and other activity patterns could be trained to become more distinct or more overlapping. By manipulating the task so that the subject and the partner saw the same information either rarely or frequently, we could shift the patterns: when their information differed often, participants were better able to distinguish their own beliefs from the partner's, and when their information overlapped more, they became less able to tell their own thoughts apart from what the partner was thinking.
This implies that the boundary between self and other in the brain is not fixed but flexible. The brain can learn to shift this boundary, which may explain why two people who spend a great deal of time together can come to feel like a single person sharing the same point of view. At a social level, it may also explain why we empathize more readily with people whose experiences resemble our own than with people from different backgrounds.
These findings could prove valuable: if the cognitive boundary between ourselves and others is malleable, we may be able to harness that flexibility to counter certain cognitive biases and, in turn, alleviate mental-health problems.