Dec 12 | By smartai.info

The "artificial killers" .. Will the American army give us the power to kill robots?

Wallops Island is a remote, marshy stretch of land along Virginia's eastern shore, known as a site used by the government and private companies to launch rockets; it is also an ideal proving ground for advanced weapons technologies. Had a fishing boat crossed the marsh last October, its crew would most likely have glimpsed a flotilla of rubber boats, each about seven and a half meters long, cutting through the usually shallow water. On closer inspection, the crew would have noticed the absence of any human presence: the throttles of the motors rose and fell on their own, as if the boats were inhabited by ghosts. The boats used sophisticated equipment to sense their surroundings, communicate with one another, and position themselves automatically so that their Browning .50-caliber machine guns could fire.

These secret efforts, part of the US Navy's "Sea Mob" program, aim to prove that boats equipped with advanced technology can carry out lethal attacks without a human in the command room. The experiment was a success; sources familiar with the program described it as a milestone in the development of a new wave of intelligent weapons systems that will soon make their way to the battlefield. Lethal automated weapons are not entirely new. Such systems have been in use for decades, though their role has been limited to defensive tasks, such as shooting down missiles headed toward ships and warships. But with the development of artificial intelligence, the American military is on the verge of fielding robots that can launch attacks, identify targets, and carry out lethal strikes without direct human guidance.

US military officials still refuse to give these robots full authority, and there are no confirmed plans to do so in the near future. That is only natural, since many of these officers have been trained for years on the importance of human control over the battlefield. Critics inside and outside the military point to a number of fears: the inability to predict or understand the decisions robots make, the possibility of flaws in their programming or of their being hacked, and the possibility that these machines will stray from the role assigned to them by their makers. Nor can the moral dimension be neglected: some argue that allowing this kind of weapon violates the legal and ethical rules, laid down after the atrocities of World War II, governing the use of force on the battlefield.

But the use of artificially intelligent machines in war also comes with many advantages. Humans usually need about a quarter of a second to react to something in front of them, while machines have long exceeded human ability, at least in processing speed. Early this year, for example, researchers at Nanyang Technological University in Singapore fed 1.2 million images into a computer, and it managed to identify the objects in the images within 90 seconds, or about 0.000075 seconds per image.
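For a sense of scale, here is a minimal back-of-the-envelope sketch in Python, using only the figures quoted above (the 0.25-second reaction time is the article's own benchmark, and the comparison is purely illustrative):

```python
# Illustrative arithmetic only, based on the figures quoted in the article:
# 1.2 million images classified in 90 seconds.
num_images = 1_200_000
total_seconds = 90

seconds_per_image = total_seconds / num_images
print(f"{seconds_per_image:.6f} s per image")  # 0.000075 s, i.e. 75 microseconds

# The article's benchmark for human reaction time:
human_reaction = 0.25  # seconds
print(f"Machine is ~{human_reaction / seconds_per_image:,.0f}x faster per item")
```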

The end result was not quite so impressive: despite its amazing speed, the system identified the objects correctly only 58 percent of the time, an accuracy rate that could be disastrous on a battlefield. Still, the fact that machines can act, and react, faster than humans has become more important as warfare accelerates. Over the next decade, missiles will streak through space at miles per second, too fast for humans to make decisive decisions on their own; drones will attack in self-directed swarms; and specialized computers will assault other computers at the speed of light. Humans may build the weapons and give them their initial instructions, but after that, many military officials expect, they will only be in the way.

"The problem when you deal with war on the basis of the speed of machines is the presence of a point where a person becomes an obstacle; humans will not be able to keep pace with speed, which means that you have to delegate the devices at some point," says Robert Work, the Pentagon Deputy Minister of Defense during the Obama and Tris.These days, the American army branches are trying to search for a way that enables them to achieve tremendous leaps in identifying images and processing data with the aim of creating a kind of war that is more fast, more accurate and less human.

Meanwhile, the Army is developing a new tank system that can intelligently select targets and aim at them precisely, along with a missile system called JAGM that can pick out vehicles and attack them without human instruction. Last March, the Pentagon asked Congress for funding to buy 1,051 of the missiles from Lockheed Martin, one of the largest American defense contractors, at a cost of $367.3 million. The Air Force, for its part, is working on a pilotless version of the F-16 fighter as part of its SkyBorg program, which would at some stage allow it to carry large quantities of military hardware into computer-run battles.

For now, the new war-fighting systems are still designed so that a human must authorize any lethal violence they unleash, but only a few modifications would be needed to dispense with that requirement. Pentagon rules enacted during the Obama administration do not prohibit giving computers full authority to make kill decisions; they merely require careful review of such designs by senior officials, which has sparked wide debate within the military over the circumstances in which machines would be allowed to make a decision of that kind.

The United States is not the only country headed down this road. In the early 1990s, Israel designed an artificially intelligent drone called the "Harpy," which loiters over a targeted area and attacks radar systems on its own; Israel later sold the system to China, among other countries. In the early 2000s, Britain designed the "Brimstone" missile, which can find enemy vehicles and coordinate with other missiles to decide which targets to strike, and in what order, although it is still rarely allowed to operate with that much freedom. Russian President Vladimir Putin boasted in 2018 about the deployment of a drone that he claimed was equipped with "nuclear munitions," pointing to a degree of automated control over the deadliest of human weapons. A year earlier, Putin had said that relying on artificial intelligence "comes with enormous opportunities, but also threats that are difficult to predict," adding that the countries pouring huge investments into developing artificial intelligence "will rule the world."

China has made no statements of this kind, but President Xi Jinping alarmed US officials when he declared in 2017 that his country would be a leader in this field by 2030. China appears for now to be focused on improving domestic surveillance capabilities through facial recognition and other identification technologies, but turning that technology to military use would not be difficult, according to US officials.

It is the fear that the United States will fall behind its rivals, Russia and China, that has sparked what retired US Army General David Petraeus has called a "technological cold war." The Pentagon has not disclosed the full cost of its artificial-intelligence development programs to date, although the Congressional Research Service estimates that the Department of Defense spent more than $600 million on work in this field in 2016 and more than $800 million in 2017.

Last March, the Pentagon said it was preparing to request $927 million from Congress to develop more programs next year, of which $209 million will go to the Pentagon's new artificial-intelligence office, the Joint Artificial Intelligence Center (JAIC), established in June 2018 to oversee programs with budgets of more than $15 million. Most of the center's work is classified, and the projects officials will discuss publicly focus on areas such as disaster relief. The Pentagon is guarded about its efforts in this field, but public documents, meetings with senior officials, and confidential sources indicate that the military is laying the rules and foundations that will allow artificial intelligence to take an ever larger role in military operations.

A road to war, paved with science

After a 286-million-mile journey that began in 2003, NASA's rovers reached Mars, and planetary scientists realized that rapid communication with Earth would be difficult: even the most basic command, such as one meant to keep a rover from stumbling into a hazard, took about 10 minutes to arrive, often after the stumble had already happened. So scientists developed sensors and computers that allowed the rovers to scan for dangerous terrain on Mars and steer around it automatically. The approach succeeded: rovers originally designed to last 90 days and cover about half a mile went on to cover many miles of the planet's surface over a period of six years.

This achievement caught the attention of scientists at the Naval Surface Warfare Center in Maryland, who asked the team that had helped design those sensors to help the US Navy build more capable autonomous ships and boats. But these autonomous boats operate in moving water, not on solid ground, which required improving the sensors' ability to distinguish and recognize objects. Much of this work was carried out by Michael Wolf, who joined NASA's Jet Propulsion Laboratory (JPL) as soon as he completed a doctorate at the University of California. The Navy declined to allow him, or any other official from the laboratory, to speak with us. But he recently published research on a system called "Savant," an acronym referring to its surface visual-analysis and autonomous capabilities.

When the US Navy tested the system in 2009, as an official paper and photographs show, Savant looked something like the lamp room of a lighthouse: six cameras were mounted in a ring inside a weatherproof box atop a ship's mast, and the images they captured were fed into a computer system that compared them against a library of maritime photographs, including images of a huge number of ships. The system was designed so that its intelligence would grow over time, using the kind of self-teaching algorithms that have driven the development of artificial intelligence over the past five years. By 2014, Wolf had not only obtained better imagery; he had also helped write algorithms that allow many boats to coordinate against a potential enemy, according to the US Navy.
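As a rough illustration of that matching step, here is a hypothetical Python toy (not the Navy's or JPL's code; the feature vectors and vessel names are invented): comparing what the cameras see against a photo library can be thought of as finding the closest entry to the new frame.

```python
# Hypothetical toy illustration of image-library matching, not Savant itself.
import numpy as np

rng = np.random.default_rng(1)

# Pretend library: each known vessel type is reduced to a feature vector.
# In a real system these features would come from a learned image model.
library = {
    "cargo_ship": rng.random(64),
    "patrol_boat": rng.random(64),
    "fishing_vessel": rng.random(64),
}

def identify(frame_features: np.ndarray) -> str:
    """Return the library entry whose features are closest to the new frame."""
    return min(library, key=lambda name: np.linalg.norm(library[name] - frame_features))

# A new frame from one of the mast cameras, again represented as features,
# here simulated as a noisy view of a known vessel.
new_frame = library["patrol_boat"] + rng.normal(scale=0.05, size=64)
print(identify(new_frame))  # expected: "patrol_boat"
```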

What matters here is the boats' ability to share the data their sensors collect, building a common operational picture that lets each one decide its next move on its own while still operating as part of an intelligent, autonomous group. Each boat does what it judges best for itself, yet it is also part of an organized swarm able to converge on and destroy a number of potential targets, which is why the Navy calls them "swarmboats."

Initially, the ambition was purely defensive, such as finding a way to prevent an attack like the one on the USS Cole in 2000, when two suicide bombers steered a boat loaded with explosives into an American destroyer as it refueled in the Yemeni port of Aden, killing 17 sailors, seriously wounding 37 others, and putting the destroyer out of service for a full year. But in 2014, the program proved its worth when five boats running the autonomous system surrounded other boats designated as enemy vessels, which prompted a number of Pentagon officials to push for a broader shift toward artificial-intelligence programs. As a result, the system the Jet Propulsion Laboratory created is now being adapted for drones as well, in addition to autonomous ground vehicles at the Army's research and development center.

The algorithms improve year after year. Before long, some teams were able to incorporate neural networks, computer models designed to mimic the structure of the human brain, with layers of data-processing nodes that work much like human neurons. In 2015, teams from Microsoft and Google achieved a breakthrough on the "ImageNet" benchmark, designing algorithms that exceeded human performance: when classifying images, humans err about 5.1 percent of the time, while the Microsoft and Google algorithms' error rates were below 5 percent. The stunning ImageNet results helped persuade Pentagon officials that "machines are better than humans at recognizing targets," in the words of Robert Work, then deputy secretary of defense, who said "that was a very important day for the Department of Defense."
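For readers unfamiliar with the term, the sketch below (a generic toy example in Python, unrelated to any Pentagon system) shows the layered structure the paragraph describes: data flows through successive layers of simple processing units, and the weights connecting them are what training adjusts.

```python
# A minimal sketch of a layered neural network: each layer transforms its
# input and passes the result on, loosely mimicking chained groups of neurons.
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)

# Two layers of weights: a 784-value input (e.g. a 28x28 image), a hidden
# layer, and 10 output scores, one per candidate label. Weights here are
# random; real systems learn them from millions of labeled images.
W1, b1 = rng.normal(size=(784, 128)) * 0.01, np.zeros(128)
W2, b2 = rng.normal(size=(128, 10)) * 0.01, np.zeros(10)

def classify(image_pixels):
    hidden = relu(image_pixels @ W1 + b1)   # first processing layer
    scores = hidden @ W2 + b2               # second layer produces label scores
    return int(np.argmax(scores))           # index of the most likely label

print(classify(rng.random(784)))  # untrained, so the answer is arbitrary
```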

"The Battalion" is a marine machine gun with six pipes to pump bullets, and by the firing of 75 bullets per second of huge and medium warships, it is somewhat disturbed before the launch, but it performs a number of corrections when following the upcoming threats in two miles, including missilesAnd the planes, as it monitors the path of lead so that it is sure that it is aimed at the target, all of this without any human intervention.The machine gun is not entirely, as the American warships took over for about 30 years, and the United States has sold it to many allies..

That long history is one of the many reasons officers like Admiral David Hahn are more comfortable allowing such machines to make kill decisions in battle. Hahn has a long history with automated systems and today serves as director of the Office of Naval Research (ONR), which has an annual budget of $1.7 billion and leads the Navy's effort to build artificial intelligence into more of its systems. "These systems are not completely new," he says. "They may be bigger, they may draw on more comprehensive data, but the essence is the same." His office was one of the first funders of the Sea Mob program, but, like the rest of the Navy, it declined to discuss the program or the October 2018 test.

Above the surface, beneath it, and beyond

Other autonomous naval programs are built around larger vessels, such as the 40-meter "Sea Hunter," which first put to sea in 2016. It is still unclear how the Navy will arm such ships, but anti-submarine warfare would typically require a sonar array, either on the hull or towed beneath the surface, along with torpedoes. One reason commanders consider autonomous systems so important is their fear that hacking or jamming of radio links could sever communications in a future conflict, which means these ships must be able to act on their own or they will be useless.

The Navy's work is not limited to autonomous surface ships; its experiments extend to autonomous submarines designed to help clear naval mines, freeing larger and more expensive vessels to transport troops and carry out other missions. Hahn has described these submarines, the most widely used of which is the "MK18," as the beginning of broader Navy use of autonomous systems beneath the sea. In February of this year, Boeing won a contract worth $43 million to build autonomous submarines by 2022; each will be 15 meters long and able to travel 12,070 kilometers on its own. Once they are complete, the Navy plans to test them against other submarines and ships alike, according to documents obtained by The Atlantic; company officials declined to provide any details.

The Pentagon's frantic interest in autonomous weapons is due in part to huge leaps in engineering and computing, but the biggest reason is Robert Work, who spent 27 years in the Marine Corps before becoming deputy secretary of defense. Work devoted much of his time to studying military history and strategy, particularly while at the Center for a New American Security (CNAS), a Washington think tank, in 2013, where the outcomes of war games simulating conflicts with China or Russia terrified Work and his colleagues. After the Cold War, that kind of exercise had tended to end either in an overwhelming American victory or in a nuclear exchange. But the new simulations made clear that the United States' technological edge was beginning to evaporate.

At the beginning of 2014, Work began publishing reports arguing that American forces needed cheaper, more flexible, faster, and deadlier weapons that would give them greater speed and reduce their dependence on aircraft carriers and similarly vulnerable hardware. That work prompted President Barack Obama to appoint him deputy secretary of defense in February of that year, and several months later Work launched his own technology initiative. Work and others began to identify the technologies they believed could tilt the balance of power back in favor of the United States, including cyber warfare, artificial intelligence, and hypersonic weapons, and then to shift money and people toward making those technologies ready for combat.

These steps have pushed other countries to launch their own military innovation initiatives. In 2016, China unveiled "Junweikejiwei," a new research and development agency modeled on America's DARPA. Russia, likewise, has the Skolkovo Institute of Science and Technology outside Moscow, which US Defense Department sources have described as a DARPA copy. The institute was established in partnership with MIT, which quickly withdrew after US sanctions hit Viktor Vekselberg, the Russian billionaire who funded the project and was linked to the election-interference scandal.

Work says the notion that automated systems could run amok, as they do in the "Terminator" films, is unlikely, because the offensive technologies fielded so far have very narrow applications; they "only attack the targets they are told to attack." He is echoed by Air Force General Paul Selva, who retired as vice chairman of the Joint Chiefs of Staff last July and is one of the biggest champions of innovation in artificial intelligence. But Selva has also spoken frankly about the "Terminator conundrum": how to deal with machines given the freedom to kill.

Speaking at a Washington think tank in 2016, Selva explained that the issue is not hypothetical: in the world of autonomous systems, and given what US adversaries are achieving along the same lines, the concept of a robot with the freedom to choose whom to harm is already within reach; it is no longer something being refined or improved, it is here, in our hands. In a later speech at the Brookings Institution, Selva said that with artificial intelligence it will become possible to tell robots to "learn the method," and then to say, "once you have learned the method, set the goal." In such cases, a robot's behavior is no longer just the execution of someone else's instructions; it is behavior based on cues the machine itself has formed by learning from experience, either its own or that of other robots.

"Seamob" has not yet reached this stage, but they are currently set.In the black photography of Terminator's films, a system called "Skynet" decides to wipe humanity from the face of the earth.One of the operators of the "Seamob" program had concluded one of his presentations by referring to the movie jokingly: "We build Skynet", before adding the last slide: "But our mission is to make sure that the robots will not kill us.".

—————————————————————————

Translation: Farah Essam.

This article is translated from The Atlantic and does not necessarily reflect the views of Midan.