Social Robots and Opinions
Children aged between seven and nine were more likely to give the same responses as the robots, even if they were obviously incorrect. (Image: University of Plymouth)

The interaction between humans and machines in fields ranging from education to healthcare is the future of technology. Several highly capable robots are being developed for various tasks, but can a robot ever influence a person's opinions, thinking, or even behavior?

While nobody knows the exact answer to this question, a group of researchers recently conducted a series of experiments to see how machines of the future could affect adults as well as children. Much to everyone’s surprise, the results of the work revealed that children, particularly those aged between seven and nine, are easily swayed by robots.

It has long been established that humans' ability to make critical decisions can be affected by peer pressure. But if people and machines are to share the same environments, such as offices or medical centers, it is important to understand what influence machines might exert.

“People often follow the opinions of others and we've known for a long time that it is hard to resist taking over views and opinions of people around us. We know this as conformity,” Tony Belpaeme, the co-author of the work, said in a statement. “But as robots will soon be found in the home and the workplace, we were wondering if people would conform to robots.”

This question drove the latest study, in which researchers at the University of Plymouth recruited a group of children and adults and asked them to complete a classic conformity test known as the Asch paradigm.

As part of the test, the participants were shown a series of four parallel lines and tasked with identifying the ones matching in length. Normally, the test is as easy as it sounds and people answer correctly, but in this case the participants were accompanied by human peers or robots, which affected their results.

In the case of adults, human peers had a marked impact on the decisions made, while the robots' opinions were firmly resisted by the participants. The children, however, were the exact opposite: they were strongly influenced by the presence of a machine at their side and by its opinion.

When alone, the children answered with 87 percent accuracy, but when a machine accompanied them, they often chose to trust its answer, and their score dropped to 75 percent. More tellingly, the team noted that most of the wrong answers (nearly 74 percent) matched those given by the robots.

“What our results show is that adults do not conform to what the robots are saying,” Belpaeme added. “But when we did the experiment with children, they did. It shows children can perhaps have more of an affinity with robots than adults, which does pose the question: what if robots were to suggest, for example, what products to buy or what to think?”

This raises serious ethical concerns, especially given that machines are being developed to assist humans in various fields, be it as a social robot at home, a heavy lifter at work, or a child therapist at a medical center. Their interaction with adults and children is inevitable, which is why the group called on industry experts to establish safeguards that minimize such risks and protect children from being unduly swayed by a machine.

The study, titled "Children conform, adults resist: A robot group induced peer pressure on normative social conformity," was published August 15 in the journal Science Robotics.