WHEN JAPAN’S FUKUSHIMA NUCLEAR DISASTER struck in March 2011, following a devastating earthquake and tsunami, the robots that were deployed to support rescue workers were very basic and largely ineffective, says robotics expert Professor Sven Behnke of the University of Bonn in Germany.

Spurred by this problem, Behnke and his team developed a robust disaster-response system through a project called Centauro. The project, funded as part of the European Commission’s Horizon 2020 programme, ran from April 2015 until 30 September 2018.

The outcome is a robot controlled by a human from a safe distance, with various sensors that allow it to perceive its environment and relay information back to its operator.

“The main operator controls the robot through a telepresence suit which measures the motions of the operator’s arms, wrists and fingers and transfers them to the robot,” said Behnke, who was project co-ordinator. A head-mounted display worn by the operator allows them to see in 3D what the robot sees from its own perspective.

The 1.5m-tall Centauro robot weighs 93kg, is made of lightweight metals like aluminium and has 3D-printed plastic skin.


The robot has a centaur-like body with four articulated legs ending in steerable wheels, which Behnke says make it more stable than bipedal robots. Rotating joints at the hip, knee and ankle allow Centauro to take on numerous postures and navigate challenging environments. Its upper body has two arms with multi-fingered hands, which allow it to lift objects and manipulate tools and doors. Although tele-operated, the robot has some degree of autonomy: if told to move to a specific location or grasp an object, it will plan and execute the action itself.

Last year Centauro was tested in challenging real-world scenarios at the German nuclear disaster-response provider Kerntechnische Hilfsdienst GmbH. It successfully climbed stairs, navigated debris, overcame gaps, unlocked a door, and operated valves and power tools, says Behnke. “The Centauro disaster-response system provided the high degree of flexibility needed for realistic missions,” he said.

Behnke hopes the technology may one day play a role in disaster relief efforts, but it is not yet ready to face radiation. However, researchers at the University of Birmingham, UK, are developing robots that can handle high radiation levels in order to clean up the nuclear waste accumulated over the past half century across the European Union, where more than 90 nuclear reactors have been permanently shut down and more facilities are due to be decommissioned. Under its 2021-2027 budget, the European Commission has proposed allocating nearly €1.2 billion to nuclear safety.

“There’s nearly five million tonnes of legacy nuclear waste in the UK and cleaning that up is the biggest and most difficult environmental remediation challenge in the whole of Europe,” said robotics expert Professor Rustam Stolkin, who coordinates the RoMaNs project.

Stolkin and his colleagues are designing autonomous behaviours in robots so that they can sort radioactive waste according to contamination levels.

“This can only be done by robots because this waste is too radioactive for humans to go near, even wearing protective suits,” said Stolkin.


To date, robots operating in dangerous environments have been directly controlled by humans, but this manual approach would be too slow for grasping and moving huge amounts of material of unpredictable shape, size and consistency, says Stolkin. To overcome this problem, the team developed an autonomous, vision-guided robot that uses AI to assist the human operator.

Project partner France’s Alternative Energies and Atomic Energy Commission (CEA) created a radiation-resilient robot arm with a hand and fingers, controlled by a robotic glove, or haptic exoskeleton, worn by the operator.

“This now is a bit like a fancy joystick,” explained Stolkin. “So as you move your arm and your fingers, the slave arm in the radioactive zone moves its arm and its fingers.” The system uses AI-driven vision, so the robot knows how to detect, recognise and pick up objects.
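The “fancy joystick” idea can be pictured with a minimal sketch. Assuming (hypothetically) that the exoskeleton reports joint angles in radians and the slave arm has its own safe motion range, each operator angle is simply mirrored onto the slave, clamped to that range; the names and limits below are illustrative, not taken from the RoMaNs system.

```python
# Illustrative sketch of master-slave mirroring: operator joint angles are
# copied to the slave arm, clamped to the slave's (assumed) joint limits.
SLAVE_JOINT_LIMITS = [(-2.9, 2.9), (-1.8, 1.8), (-2.9, 2.9)]  # radians, assumed

def mirror_to_slave(operator_angles, limits=SLAVE_JOINT_LIMITS):
    """Return the commanded slave angles, clamped into each joint's range."""
    commanded = []
    for angle, (low, high) in zip(operator_angles, limits):
        commanded.append(min(max(angle, low), high))
    return commanded

# The operator bends two joints past the slave's limits; the commands are clamped.
print(mirror_to_slave([0.5, 2.5, -3.5]))  # [0.5, 1.8, -2.9]
```

In practice such a mapping runs many times a second, so the slave arm appears to move in lockstep with the operator.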

The human operators share control of the robotic arm with the robot through tele-operation and AI, says Stolkin. For example, the operator can move the arm around while the robot automatically controls the orientation of the hand to make grasping easier, or the robot can plan a grasp itself and display its intention to the human for confirmation.

“The robot [AI] is doing all the hard work, but the human still feels in charge at some level,” he said.
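One way to picture this division of labour is a sketch in which the human commands the arm’s position while the software nudges the hand’s orientation toward the grasp angle its vision system has chosen. This is a simplified illustration of the shared-control idea, not the project’s actual controller; the gain and angles are assumptions.

```python
# Illustrative shared-control step: position comes from the human,
# orientation (here a single yaw angle) is steered by the AI toward
# its preferred grasp angle. Gain and values are assumed.
def shared_control_step(operator_pos, current_yaw, grasp_yaw, gain=0.5):
    """Return the next commanded pose: human position, AI-corrected yaw."""
    new_yaw = current_yaw + gain * (grasp_yaw - current_yaw)
    return operator_pos, new_yaw

pos, yaw = (0.4, 0.1, 0.3), 0.0   # metres / radians, illustrative
for _ in range(5):
    pos, yaw = shared_control_step(pos, yaw, grasp_yaw=1.0)
print(yaw)  # 0.96875, converging toward the AI's grasp angle of 1.0
```

The human never has to think about wrist alignment, yet the arm still goes only where the human sends it, which is what keeps the operator “in charge at some level”.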

Such systems are usually very complex to control, says Stolkin, but this one allows operators to simply mouse-click on an object; the robot will then go to it and grasp it.

When the robot arm touches a surface or grasps an object, the operator feels the contact through the robotic glove. Giving remote operators situational awareness of what is going on inside the no-go zone through a virtual sense of touch is extremely useful, says Stolkin.

For the gloves to work, the robot arm must behave adaptively, responding to its environment. To achieve this, the CEA team developed adaptive mechanisms in the arm joints that behave mechanically like springs and are more resilient to radiation than delicate electronic parts.
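The spring-like joints also explain where the virtual sense of touch can come from: if a joint’s spring deflects under contact, the torque can be inferred from that deflection rather than measured by fragile electronics. A hedged sketch of this series-elastic principle, with an assumed stiffness value, looks like this:

```python
# Illustrative series-elastic idea: contact torque is inferred from how far
# the joint lags behind the motor, scaled by the spring's stiffness.
SPRING_STIFFNESS = 300.0  # N*m per radian, an assumed value

def contact_torque(motor_angle, joint_angle, k=SPRING_STIFFNESS):
    """Torque felt at the joint = stiffness x spring deflection."""
    return k * (motor_angle - joint_angle)

# The joint lags the motor by 0.02 rad while pressing on a surface:
print(contact_torque(1.02, 1.00))  # about 6 N*m of contact torque
```

That inferred torque is the kind of signal that could be relayed to the operator’s glove, so a push against a wall inside the no-go zone is felt as resistance outside it.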

Near future

In 2017 the RoMaNs team successfully tested a robot arm with an AI control system in a radioactive environment, under full nuclear safety and UK national security regulations, at a site in northern England operated by the National Nuclear Laboratory. This was the first time an AI-controlled robot was deployed in a real radioactive environment.

Stolkin had previously imagined that it may take at least another decade to transfer these technologies to the nuclear industry, but says plans are already being made to deploy them at decommissioning sites in the near future. “When we proposed this, the idea of AI-controlled robots, it was considered absurd by this industry,” he said.