More than 50 leading experts in the field of artificial intelligence have announced their intention to boycott the Korea Advanced Institute of Science and Technology (KAIST) – South Korea's prestigious leading research university – over its creation of a laboratory to develop weapons powered by artificial intelligence. They announced the boycott in an open letter.
The experts' indignation was also provoked by the agreement KAIST signed with the arms manufacturer Hanwha Systems, which, according to various sources, is already developing powerful weapons, among them cluster munitions banned under the UN Convention on Cluster Munitions (a convention South Korea has not signed).
"While the United Nations is discussing how to contain the threat that autonomous weapons pose to international security, it is regrettable that a prestigious university like KAIST is looking for ways to accelerate the arms race by developing such weapons," the AI experts write in their open letter.
"Therefore, we publicly declare that we will boycott all forms of cooperation with any part of KAIST until the president of KAIST provides assurances that the center will not develop autonomous weapons capable of operating without meaningful human control."
KAIST operates some of the world's best robotics and science laboratories, conducting research and development on a wide range of advanced technologies, from liquid batteries to medical equipment.
KAIST president Sung-Chul Shin said he was saddened by the decision to boycott, adding that the institute has no intention of creating killer robots.
"As an academic institution, we value human rights and ethical standards very highly. KAIST will not conduct any research activities counter to human dignity, including the development of autonomous weapons lacking meaningful human control," Shin said in his official statement on the boycott.
Given the rapid pace of hardware and software development, many experts worry that sooner or later we will create something we cannot control. Elon Musk, Steve Wozniak, the late Stephen Hawking, and many other prominent figures in the technology field have called on governments and the UN to impose restrictions on AI-controlled weapons systems.
Many believe that if we arrive at missiles and bombs that can function independently of human control, the consequences could be appalling, especially if such technology falls into the wrong hands.
"The creation of autonomous weapons would lead to a third revolution in warfare. Such technologies would permit faster, more destructive, and larger-scale military conflicts," the authors of the open letter note.
"These technologies have the potential to become weapons of terror. Dictators and terrorists could use them against innocent populations, ignoring any ethical considerations. This is a real Pandora's box that, once opened, will be virtually impossible to close."
UN member states will meet next week to discuss banning lethal autonomous weapons capable of killing without human involvement. It will not be the first such meeting, but the participants have yet to reach an agreement. One obstacle to any concrete decision is that some countries do not oppose the development of such technologies at all, since the technologies would increase their military capabilities. Private companies are another stumbling block, as they actively resist government oversight of the development of their own artificial intelligence systems.
Many seem to agree that killer robots are a bad idea, yet no concrete decisions on how to handle their development have been reached. It is possible that the new weapons systems KAIST planned to work on will force more parties to join the much-needed discussion of the problem.
Many scientists call for the creation of advisory guidelines governing the development and deployment of AI systems. Even with such guidelines in place, however, it would be very difficult to ensure that every party adheres to them.
"We are locked into an arms race that nobody wants. KAIST's actions will only accelerate this arms race. We cannot tolerate it. I hope this boycott adds urgency to the discussion of this issue at the UN on Monday," said Toby Walsh, a professor of artificial intelligence at the University of New South Wales in Australia, who also signed the letter.
"This is a clear appeal to the community working on artificial intelligence and robotics: do not support the development of autonomous weapons. Any other institution planning to open such a laboratory should think twice."