Do machines feel pain?

“Does it hurt when you get shot?” asks John Connor (Edward Furlong), looking at a leather jacket riddled with bullet holes. “I sense injuries. The data could be called pain,” replies Arnold Schwarzenegger’s Terminator. These tell-tale lines from the sci-fi blockbuster Terminator 2: Judgment Day anticipated “machines with a mind.”

Pain is relative, and whether it makes sense to wire machines to feel it is still a debatable point.

Dr Ben Seymour from the University of Cambridge says: “Pain is the pinnacle of consciousness, though of course not a pleasant one.” The university released a short documentary, “Pain in the Machine”, to explore the concept further.

Coding robots with “human feelings” has wider applications. Such robots can help people on the autism spectrum develop social skills. Social robots are already used to help veterans deal with PTSD (post-traumatic stress disorder) and to serve as companions for the elderly. In Japan, many people rely on robots to keep depression at bay.

Dr. Nikhil Agarwal, CEO of IIT Kanpur’s FIRST, AIIDE and C3i Hub, asserts that pain is not defined the same way for humans and machines. “In a machine, pain relates to its operation: for example, equipment that is used for a long time wears out and needs to be replaced. In software, a bug, a virus or malicious code causes pain that needs to be cured.”

Uncanny valley

Human-like appearances in robots can unsettle us. The concept is called the uncanny valley.

“‘Do machines feel pain?’ is a very philosophical question,” says Anuj Gupta, Head of AI, Vahan. “Some robots react when touched. Does that mean they ‘feel’ pain? No. Their reaction is a combination of sensors and software. It is like a toy that reacts to a user’s gestures. Currently, machines cannot feel anything. They can be programmed to trick humans by simulating human emotions, including pain.”

A few years ago, scientists from Nanyang Technological University, Singapore, developed “mini-brains” to help robots recognize pain and activate self-repair.

Left to right: Associate Professor Nripan Mathews, Dr Rohit Abraham John and Associate Professor Arindam Basu, who developed a way to give robots artificial intelligence (AI) that recognizes pain and triggers self-repair when they are damaged. Photo credit: NTU

The approach embeds AI into a network of sensor nodes connected to multiple small, less powerful processing units that act as “mini-brains” distributed across the robotic skin. Combined with a self-healing ionic gel material, the system lets damaged robots recover their mechanical functions without human intervention.
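As a rough illustration of that idea, here is a minimal Python sketch of distributed skin nodes that classify contact locally and raise a self-repair flag only when a patch is damaged. The class, thresholds and event format are invented for illustration and are not NTU’s actual implementation.

```python
# Hypothetical sketch of distributed "mini-brain" sensor nodes on robotic skin.
# Names and thresholds are illustrative, not NTU's actual design.

from dataclasses import dataclass

@dataclass
class SkinNode:
    """A small processing unit attached to one patch of robotic skin."""
    node_id: int
    pain_threshold: float = 0.7   # normalised pressure above which we report "pain"
    damage_threshold: float = 0.9 # above this, the patch is assumed damaged

    def process(self, pressure: float) -> dict:
        """Classify a local pressure reading; only events reach the main controller."""
        event = {"node": self.node_id, "pain": False, "self_repair": False}
        if pressure >= self.damage_threshold:
            event.update(pain=True, self_repair=True)  # trigger the self-healing material
        elif pressure >= self.pain_threshold:
            event["pain"] = True                        # harmful contact, but no damage
        return event

# Each node decides locally, so the central controller only sees sparse events.
skin = [SkinNode(node_id=i) for i in range(4)]
readings = [0.2, 0.75, 0.95, 0.1]
events = [node.process(p) for node, p in zip(skin, readings)]
print([e for e in events if e["pain"]])
```

The point of pushing the decision into each node is that the central controller handles only occasional “pain” events rather than a constant stream of raw sensor data.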

Explaining the “mini-brains”, study co-author Associate Professor Arindam Basu, of the university’s School of Electrical and Electronic Engineering, says: “If robots are to work with humans, one wonders whether they can interact safely. To ensure a safe environment, scientists around the world have found ways to make robots aware, including how to feel pain, react to it, and withstand harsh operating conditions. However, the complexity of assembling the multitude of sensors required, and the resulting fragility of such a system, is a major obstacle to widespread adoption.”

Mikhail Lebedev, Academic Supervisor at the Center for Bioelectrical Interfaces at HSE University, says: “Robots can even simulate pain sensations: certain forms of physical contact feel normal, while other contact causes pain. That contact radically changes the robot’s behavior. It begins to avoid pain and develops new behaviors; that is, it learns, like a child who has been burned by something hot for the first time.”
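That description maps loosely onto reinforcement learning, where painful contact acts as a negative reward that pushes the agent away from the action that caused it. The toy Python sketch below shows the general idea; the actions, rewards and update rule are illustrative and are not taken from the HSE group’s work.

```python
# Toy sketch of "learning from pain": painful contact gives a negative reward,
# and a simple action-value table is updated to avoid it. Generic Q-learning toy,
# not the HSE group's actual model.

import random

actions = ["touch_hot_object", "touch_cool_object"]
q_values = {a: 0.0 for a in actions}
learning_rate = 0.5

def reward(action: str) -> float:
    # Painful contact yields a strongly negative reward.
    return -1.0 if action == "touch_hot_object" else 0.1

for step in range(20):
    # Epsilon-greedy choice: mostly exploit, sometimes explore.
    if random.random() < 0.2:
        action = random.choice(actions)
    else:
        action = max(q_values, key=q_values.get)
    # Nudge the estimate toward the observed reward.
    q_values[action] += learning_rate * (reward(action) - q_values[action])

print(q_values)  # the "hot" action ends up with a clearly lower value
```

After a handful of painful encounters the value of touching the hot object drops well below the alternative, so the agent stops choosing it, much like the burned child in Lebedev’s analogy.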

Case study

Researchers from Osaka University, Japan, have developed a lifelike robot, Affetto, with synthetic skin and the ability to feel pain.

Affetto can distinguish between a light touch and a hard blow. The team behind the robot said it would help robots understand and empathize with humans.

Affetto is equipped with an AI-powered “pain nervous system” and specialized artificial-skin technology, allowing it to react to sensations with a variety of facial expressions.

Minoru Asada, the project’s principal investigator, said: “Engineers and materials scientists have developed a novel touch sensor and attached it to Affetto, which has a realistic face and body skeleton covered in artificial skin.”

Affetto can distinguish between soft and hard touches based on the detected signals, and the skin sensors attached to it help the robot avoid any touch that causes “pain”. Social robots can then be programmed to show empathetic reactions to pain in others through a mirror mechanism similar to the one humans experience.
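A highly simplified way to picture that pipeline is a classifier that buckets the tactile signal and maps each bucket to an expression. The Python sketch below is purely illustrative; the thresholds, categories and expressions are invented and are not Affetto’s actual control logic.

```python
# Illustrative mapping from touch intensity to a facial expression, in the spirit
# of how Affetto is described; thresholds and expression names are invented.

def classify_touch(signal: float) -> str:
    """Bucket a normalised tactile signal into soft, hard, or painful contact."""
    if signal < 0.3:
        return "soft"
    if signal < 0.7:
        return "hard"
    return "painful"

def facial_expression(touch_type: str) -> str:
    """Pick an expression for the detected touch; a real system would drive actuators."""
    return {"soft": "smile", "hard": "surprise", "painful": "grimace"}[touch_type]

for s in (0.1, 0.5, 0.9):
    print(s, facial_expression(classify_touch(s)))
```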
