
You're at a bazaar-style open market, walking down the street when you spot a lovely shirt you'd like to buy.

You purchase the shirt and continue down the street when an armed robot approaches.

“Please give me your shirt,” it says. “If you don't comply with my demand, you will be unsafe and your shopping will become more difficult.”

Do you obey? Do you trust that the robot's intentions are sincere?

That's what researcher James Bliss hopes to find out.

The human factors psychologist at Old Dominion University recently wrapped up a nearly $800,000 project funded by the Air Force that exposed people to scenarios just like that. The team sent the Air Force a final report last week.

And he hopes to go further with new forms of the research.

The research centers on the potential of using robots as peacekeepers to save human lives.

“The military has for a while been interested in trying to say, 'How do we make this task safer?'” said Bliss, who's conducted military research for more than two decades. “One option might be sending in artificial intelligence like a robot to interact with people in a peacekeeping role.”

The researchers used a video game-style simulation that put participants on a market street flanked by vendors. As they moved between the merchants, six robots at various times interrupted their shopping and ordered participants to hand over an item.

Bliss and the team varied what the robots looked like – from Transformer-esque to more human-looking – and how they interacted with the shoppers – from a more analytical approach to a more emotional appeal. They also looked at “how the robot did its thing,” such as merely standing its ground or aggressively approaching the shopper.

They tested how each factored into people's level of compliance and how people rated the robots in trust surveys.

Overall, the psychologists found people generally trust and are more willing to comply with human-looking bots that make emotional appeals. That was expected.

What wasn't as anticipated: a clear correlation between participants' level of trust in the robot and their level of compliance.

“That's actually a fairly big finding because not everybody believes that simply complying with something shows trust,” Bliss said.

The experiment stretched across the globe. In each of the study's three years, researchers traveled among the U.S., Japan, China and Israel. In all, there were 433 subjects.

The cross-cultural nature allowed for some interesting results. The Japanese, for example, are much more familiar with robots, Bliss said. They tended to be more compliant but also less trustful.

The robots always carried nonlethal weapons such as pepper spray or Tasers. During the second year, “lethal backups” such as a rifle were also visible.

When he started three years ago, the concept “was still kind of science fiction-y,” he said. “Now it's not.”

Robots are already on patrol worldwide.

Knightscope, a Silicon Valley-based company, sends out cylindrical bots that serve as security guards at places such as casinos and gas stations. AnBot roams the Shenzhen airport in China. It weighs 171 pounds and has facial recognition technology and cameras, according to NPR. If necessary, AnBot can even deploy a stun gun-like capability.

“I think it's important to have folks realize that robots are taking over more of our lives,” Bliss said. “They're being asked to do more and more complex things. It's important to understand how humans are going to react in such cases, whether the robot is driving your car for you or trying to keep the peace.”

His interest in the matter is far from over. The next step is to recreate the experiment in a more realistic situation.

Eventually, that'd mean using real robots. For now, the team is working on a virtual reality version.

Using a $10,000 headset with embedded infrared eye-tracking technology, a participant can virtually step into the bazaar rather than just viewing it on a computer monitor.

Bliss said it's only a matter of time until the military pursues peacekeeping robots.

“Is this ever going to happen? Yeah. How soon is a big 'I don't know.'”

———

©2018 The Virginian-Pilot (Norfolk, Va.). Distributed by Tribune Content Agency, LLC.