The first rule of robotics is that humans must flourish

Science fiction writer Isaac Asimov famously devised three laws of robotics to govern intelligent machines:

  • A robot may not harm a human being, nor, through inaction, allow a human being to come to harm.

  • A robot must obey orders given to it by humans, except where such orders would conflict with the First Law.

  • A robot must protect its own existence, as long as doing so does not conflict with the First or Second Law (the strict precedence this implies is sketched below).

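What makes Asimov's laws interesting is that they are not three independent rules but a strict order of precedence: the First Law overrides the Second, which overrides the Third. The sketch below is purely illustrative (Python); the Action fields such as harms_human and ignores_order are hypothetical stand-ins for judgements no real system can currently make, not anything proposed by Asimov or by the report discussed next.

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_human: bool      # would injure a human, or allow one to come to harm
    ignores_order: bool    # would disobey an order given by a human
    endangers_self: bool   # would put the robot's own existence at risk

def permitted(action: Action) -> bool:
    """First Law: an action that harms a human is never allowed."""
    return not action.harms_human

def preference(action: Action) -> tuple:
    """Among permitted actions, obeying orders (Second Law) outranks
    self-preservation (Third Law); False sorts before True."""
    return (action.ignores_order, action.endangers_self)

def choose(actions: list[Action]) -> Action | None:
    """Return the most acceptable action under the precedence ordering."""
    candidates = [a for a in actions if permitted(a)]
    return min(candidates, key=preference) if candidates else None

# The robot carries out the order even at some risk to itself,
# but refuses the action that would harm a human.
best = choose([
    Action("shove a bystander out of the way", True, False, False),
    Action("ignore the order and stay put", False, True, False),
    Action("carry out the order at some risk to itself", False, False, True),
])
print(best.name)  # -> carry out the order at some risk to itself
```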
Now, as so often happens, science fiction is becoming scientific fact. A report from the Royal Society and the British Academy suggests that the intelligent machines we will soon live alongside should be governed not by three laws but by a single overarching principle: humans should flourish.

In robotic systems, putting human interests before the machine's is paramount. The development of such systems cannot be governed by technical standards alone; it is also bound up with ethical and democratic values. For the benefits to materialise, the public must be able to trust that these systems are being used and managed properly.

For example, the development of autonomous vehicles has raised questions about how human safety should be prioritized.

What happens when the machine must choose between the safety of the vehicle's occupants and that of pedestrians?

And if there is an accident, who bears responsibility: the owner or the machine?

Fundamentally, intelligent systems will only work if people trust them and if they are properly supervised.

Without that trust and oversight, the enormous potential of these systems for human flourishing will never be fully realized.