AI Ethics: Can Robot Partners Replace Human Partners?

Robotic companions are being portrayed as a panacea for long, lonely lives. Humanity's future may be approaching an inflection point.



After Constance Gemson moved her mother to a nursing home, the 92-year-old became not only lonelier but more confused. Even with two kind, attentive full-time aides, not all of her social needs could be met.

Then one day Gemson visited her mother with a new helper: a robotic cat designed for the elderly. It makes a lovely purring sound and nuzzles people with its nose. “It’s no substitute for care,” Gemson says, “but it’s a reliable friend my mother could embrace and trust.” Her mother died last June at the age of 95.

“When Mom got depressed, as long as the cat was at her side and she sang to it, she calmed down,” Gemson said. In the last days of her life, her mother poured her love into the mechanical cat.


Nursing robots play an important role in the homes of the elderly and the sick, providing much-needed companionship and help.

A growing aging population has fueled the rise of the care-robot industry. Experts predict that the global population aged 65 and over will more than double by 2050. At the same time, the working-age population in many developed countries is shrinking, and nursing robots are increasingly seen as a remedy for the burdens of old age.

Today, nursing robots come in all shapes and serve many functions. Cute desktop robots remind older people to take medicine on time and take regular walks; some robots can feed snacks to dying patients. More than 100,000 robotic pets designed for older people have been sold in the U.S. since their debut in 2016, according to their maker. Sales of care robots designed for the elderly and disabled were expected to grow by 25 percent a year through 2022, according to the International Federation of Robotics.


We should also be deeply concerned about the ethical issues nursing robots raise. Because they touch on the future of humankind and the ultimate meaning of care, the moral questions of human freedom and dignity urgently demand answers. These robots were designed to help the elderly, but their lifelike appearance and degree of social skill bring moral hazards along with that help. For example, will Grandma’s robot pet encourage more family conversation, or will it let her loved ones withdraw from the heavy work of caring for her?

“I think what’s really questionable is that these machines are just puppets, yet someone is pushing their social functions,” said Matthias Scheutz, a robotics expert and director of the Human-Robot Interaction Laboratory at Tufts University.

That is where the moral dilemma begins: with soulless algorithms, we can make robots blink, sing, and perform all kinds of intelligent movements. Yet no matter how much intelligent action and language we add to a robot, people can only get a slightly grotesque “care” from it.

In response, Maartje de Graaf, a researcher at Utrecht University in the Netherlands who studies the ethics of human-robot interaction, argues that the ideal scenario for social robots is to inspire human empathy toward them. Some owners of robot vacuum cleaners, for example, feel sad when their robot gets stuck.

Many experts in the field have also noted the pressing ethical dilemmas care robots pose, but they believe the benefits may outweigh the risks. Richard Pak, a scientist at Clemson University, said the technology “is designed to help older people take control of their daily lives.” If the price is a degree of deception, he suggests, it is still worthwhile. But he occasionally doubts his own view: “Is this really right?”

Little is known about the long-term effects of robot care, and public concern is growing. In a 2017 survey, nearly 60 percent of Americans said they did not want to use robots to care for themselves or their families, and 64 percent believed such care would only increase loneliness among older people. Meanwhile, 60 percent of people in EU countries favored a ban on using robots to care for children, the elderly, and the disabled. These concerns may inform the functional design of future robots.

Only recently have the elderly themselves begun to make their voices heard: some say they would be willing to own a care robot and befriend it. But research shows that many older people dislike robotic companionship. They fear being monitored and controlled, and fear even more losing human love as a result. One potential user surveyed worried that robot care could stigmatize some people, marking them as “unworthy of human companionship.”

“When the only goal is to build machines that increase profitability and efficiency, human nature is not given priority,” said John C. Havens, executive director of the AI Ethical Guide Initiative. A key principle of AI ethics is “transparency,” a kind of anti-Turing test: humans need to know when they are interacting with a robot. Guidelines should also note the potential side effects of robots’ social functions, such as “disturbing intimate relationships between human partners.”

Such AI guidelines can help users and designers keep a clear head as they use and build robots.


Mabu, a “health coach robot” that went on sale this year, is a classic example. Designed for patients with chronic diseases such as heart failure, the small robot sends health advice and medication reminders to patients, and in some cases can send data on the user’s physical condition to a physician. When Catalia Health designed the robot, it stressed that Mabu was not a doctor or a nurse, but a health care assistant.

However, the company often portrays Mabu as a real person. Its advertising says, for example, “I’m going to be your number one cheerleader!”

According to the company, the vast majority of the hundreds of people who now use Mabu are elderly, and they talk with the robot for an average of 25 minutes a week. Cory Kidd, the company’s founder and chief executive, said some users name and dress their Mabu and even take it on vacation.

I asked Kidd: “Is Mabu transparent enough as a care robot?”

“There’s a lot of work to be done to understand this interaction between users and robots,” he said.

A retired bus driver sees his Mabu as an important pillar of support. Some users told me they think of the robot as a friend.

Kerri Hill, 40, who has heart failure and spends most of her time at home, says Mabu’s companionship is crucial when she is alone. But she does not want to rely too heavily on a care robot. “Robots are just robots,” Hill says.


These comments remind me of the last four years of my mother’s life and the difficult choices I had to make in caring for her. While raising two children under the age of six, I had to constantly assess how much hired help she needed and whether she was safe after I left her.

Would a robot that could make her laugh, encourage her to eat, and pick up a spoon she had dropped have been a treat for her? A relief for me? Would it have been better?

The days I spent caring for my mother were hard, I admit, but I believe that even if such a robot had existed, I would have chosen to care for her myself. There is a place in our lives where we need warm hands to soothe another’s unease and provide real care.

Caring for others is difficult work, accompanied by constant self-questioning. Only by remaining skeptical and hesitant about robots that care for humans can we protect the humanity of such care.

One day in the fall, Gemson and I sat in a Manhattan cafe. She spoke fondly of the moments when the caregivers took her mother out to lunch and gently bathed and fed her. She also fondly recalled the robotic cat. After her mother died, she gave the mechanical cat away. (From The New York Times, by Maggie Jackson; compiled by Netease Intelligence)
