A self-driving taxi during its public trial in Singapore, Aug. 25, 2016. Reuters/Edgar Su

Even though self-driving legislation has yet to go to a vote on the U.S. Senate floor, other nations are moving at a faster pace. Germany has come out with a unique way of legislating self-driving rules, which it calls 'ethical' self-driving rules.

The legislation contains rules such as prioritizing human lives over animals and property, saving lives without discrimination, and requiring self-driving software to contain safeguards against malicious hacking.
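The legislation itself prescribes no implementation, but as a rough illustration, a priority rule of this kind might be encoded in self-driving software along the following lines. This is a minimal sketch; the names and structure are hypothetical, not drawn from the German guidelines or any real vehicle system.

```python
from enum import IntEnum

class Harm(IntEnum):
    """Hypothetical ranking reflecting the guidelines:
    human life outranks animals, which outrank property."""
    PROPERTY = 1
    ANIMAL = 2
    HUMAN = 3

def choose_path(paths):
    """Pick the path whose worst potential harm ranks lowest.

    `paths` is a list of (path_id, worst_harm_at_risk) tuples.
    Only the category of harm is compared, never attributes of the
    people involved, reflecting the no-discrimination rule.
    """
    return min(paths, key=lambda p: p[1])[0]

# Example: swerving risks property damage, braking hard risks an animal.
print(choose_path([("swerve", Harm.PROPERTY), ("brake", Harm.ANIMAL)]))
# -> "swerve"
```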

Germany has taken a novel approach: rather than concentrating on self-driving cars, it concentrates on self-driving software. Most self-driving legislation focuses on the cars themselves and the consequences of putting them on roads, literally in charge of human lives. The German legislation instead addresses the root of self-driving: the computers and software used to run self-driven cars.

Self-driving technology is expected to reduce the loss of human life, which makes government intervention in setting the rules governing it supremely important. The minutiae of such rules, such as prioritizing humans over animals and property, have been picked up by the legislation.

The guidelines were presented by Federal Transport Minister Alexander Dobrindt in a report on automated driving to the German Cabinet last month. The report was prepared by the Ethics Commission on Automated Driving, a panel of scientists, subject matter experts, and legal experts.

According to the report, “Nevertheless, at the level of what is technologically possible today… it will not be possible to prevent accidents completely. This makes it essential that decisions be taken when programming the software of conditionally and highly automated driving systems.”

It contains 20 guidelines for the motor industry to consider when creating self-driving software, and it acknowledges that some issues may be too ambiguous to solve. It does, however, address the most important question regarding self-driving: in the event of an accident causing loss of human life, who is to blame? Should it be the company that designed the car or the car's owner? According to the guidelines, it should be the human at the wheel, who is expected to take over control when such situations emerge. If the human fails to act, the vehicle should swiftly come to a stop.
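The handover rule described above can be pictured as a simple fallback loop. The sketch below is purely illustrative: the vehicle methods (request_handover, human_has_control, execute_safe_stop) and the timeout value are assumptions for the example, not part of the report or any real vehicle API.

```python
import time

HANDOVER_TIMEOUT_S = 5.0  # assumed value; the report sets no specific figure

def handle_critical_situation(vehicle):
    """Illustrative fallback: ask the human driver to take over;
    if they do not respond in time, bring the vehicle to a stop."""
    vehicle.request_handover()          # alert the driver to take the wheel
    deadline = time.monotonic() + HANDOVER_TIMEOUT_S
    while time.monotonic() < deadline:
        if vehicle.human_has_control():
            return "driver in control"
        time.sleep(0.1)
    vehicle.execute_safe_stop()         # driver failed to act: stop swiftly
    return "vehicle stopped"
```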

The guidelines also direct the car to take the best course of action in such a situation. They do not fully answer the question of blame, since some ambiguity remains over who would be held responsible, but they come closer than any other country's legislation has.

The adoption of what is essentially a code of ethics for self-driving could put Germany ahead in the global race to bring working self-driving vehicles to market.

The report says that no self-driving system is perfect, but that these systems need to be safer than human driving is today.

“In the era of the digital revolution and self-learning systems, human-machine interaction raises new ethical questions. Automated and connected driving is the most recent innovation where this interaction is to be found across the board...we are now going to implement these guidelines – and in doing so we will remain at the forefront of Mobility 4.0 worldwide,” Dobrindt said in an official press release Thursday.