Artificial intelligence (AI) and drones are predicted to alter the course of combat and warfare in the future. While the military plans to leverage AI's computer vision capabilities to hunt down submarines and detect enemy intrusions, and its machine learning abilities to decode messages, several countries around the world have given the nod to using drones in the name of national security. The information gathered by these military drones can help solve several problems at once and determine the best course of action. However, not everyone is on board with AI-powered drones.
Difficult geographical terrain poses a daunting challenge to accessibility, and mobility becomes harder still in regions where the army lacks authorization to operate. Against this backdrop, the military may consider drones critical for monitoring population activity and for addressing significant problems in transportation and maneuvering. Moreover, advances in military research on the use of artificial intelligence in drones could translate into global dominance in technology and weaponry.
Most recently, President Joe Biden stated that the United States will engage in “intense rivalry” with the People’s Republic of China. This means being able to confront Beijing for control of global trade, influence over trade and technology rules, and, if necessary, fight and win a war with the world’s second-biggest economy. The Pentagon is already researching war scenarios in which AI is permitted to operate on its own after receiving commands from a human. Though the Pentagon promises to build an “ethical” AI army, that will not be easy.
The drone market has witnessed huge growth thanks to commercial applications such as aerial photography and medical and grocery delivery, all at a fraction of the cost of their military counterparts. This is essentially why commercial drones have outpaced military ones in adoption.
However, arming these commercially available drones with human-like cognitive skills via artificial intelligence would transform them into potent targeted weapons available to rebel militias and terrorists at a fraction of the cost of the military drones used by the US government. Such actors could misuse the technology to collect intelligence, surmount ground-based physical barriers, and carry out highly effective airstrikes.
Meanwhile, military leaders want drones to have more autonomy, allowing warfighters to delegate crucial duties to them. Commercial providers, on the other hand, have yet to attain advanced autonomy, despite several attempts. As a result, military organizations worldwide are seeking systems that can operate autonomously in highly complicated, disputed, and congested settings, such as GPS-denied areas or military camps protected by heavy electromagnetic interference. This has pushed national defense bodies to turn to startups for technical support.
For instance, since 2018, US Special Operations Command has been using Shield AI’s autonomous technologies onboard smaller quadcopter drones. The company claims that its patented Hivemind AI technology is well suited to letting unmanned aircraft carry out a variety of duties, including “infantry clearance operations” and “breaching integrated air defense systems with unmanned aircraft.” With the Pentagon demanding VTOL drones that need a smaller footprint during vertical take-off and landing (helpful in crowded areas) and can operate efficiently in GPS-denied areas, Shield AI will be integrating its technology with the V-Bat drone.
The V-Bat drone has become a new favorite among US military personnel. With a wingspan of 10 feet and a 183cc two-stroke engine driving a ducted-fan propulsion and control system, it can reach top speeds of around 90 knots and altitudes of up to 20,000 feet. The drone was recently selected as one of the finalists in the Navy’s Mi2 Challenge, a competition that aims to “accelerate the identification and evaluation of Unmanned Aircraft Systems (UAS) capable of operating in austere deployed situations without ancillary support systems.” Meanwhile, the Army Expeditionary Warrior Experiment, which looks for “concepts and capabilities at the lowest tactical echelon in support of Multi-Domain Operations (MDO),” is investigating the V-Bat for future use.
China is not lagging behind either. This year the country unveiled a shark-shaped drone to aid in the surveillance and tracking of hostile ships and submarines. Developed independently by Beijing-based Boya Gongdao Robot Technology and revealed at the 7th China Military Intelligent Technology Expo, the stealthy shark drone can travel at speeds of six knots and will assist the country’s military with surveillance and search-and-destroy tasks. The news came after researchers from Harbin Engineering University revealed they had developed an AI-powered autonomous underwater drone capable of identifying and destroying enemy craft without human input.
This month, South Korea’s Defense Acquisition Program Administration (DAPA) announced that in 2022 the country will test grenade-launching drones that can be controlled remotely over a two-kilometer range while carrying gunpowder-filled 40mm rounds. Even Russia is ready with its Forpost drones, Kronshtadt Orion combat drones, Okhotnik-B (“Hunter”) stealth combat drones, Orlan-10 surveillance drones, and more.
This may not be the first time we are hearing about the possible use of artificial intelligence-powered military drones, as scattered incidents have already taken place. In September 2019, Iran attacked Saudi Arabia with drones and cruise missiles. Turkey has developed a drone named Kargu-2 that allegedly “hunted down” retreating soldiers loyal to Libyan General Khalifa Haftar, though much remains unknown about how autonomous the drone actually was. According to its manufacturer, Defense Technologies and Trade (STM), Kargu-2 uses machine learning-based object classification to select and engage targets, and its swarming capability allows 20 drones to work together.
While nations are busy adding killer drones to their arsenals, some are simultaneously developing anti-drone solutions too. US defense startup Epirus has developed Leonidas, a system that can disable a hostile drone while leaving a friendly drone a few feet away unharmed. Using super-dense gallium nitride power amplifiers and AI algorithms to stabilize, focus, and direct energy at precise frequencies, Leonidas can take out both large fixed-wing drones and small quadcopters.
In July this year, the Pentagon’s Defense Innovation Unit awarded Anduril Industries a five-year contract worth up to $99 million to make the company’s counter-drone artificial intelligence technology available across the military. Anduril’s autonomous c-UAS solutions ingest surveillance data to detect, monitor, and notify military users of possible threats.
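To make the detect-monitor-notify idea concrete, here is a minimal toy sketch of a generic counter-UAS alerting loop. This is purely illustrative and not Anduril's actual pipeline; the `Track` fields, range and speed thresholds, and all values are assumptions invented for the example:

```python
# Toy detect-monitor-notify loop for counter-UAS alerting.
# Illustrative only: all names, fields, and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Track:
    track_id: int
    range_m: float    # distance from the protected site, in meters
    speed_mps: float  # measured speed, in meters per second

ALERT_RANGE_M = 2000.0  # assumed alert radius
MIN_SPEED_MPS = 3.0     # assumed filter for hovering or slow-moving clutter

def threats(tracks):
    """Flag tracks close enough and fast enough to warrant notification."""
    return [t for t in tracks
            if t.range_m < ALERT_RANGE_M and t.speed_mps >= MIN_SPEED_MPS]

sensor_feed = [
    Track(1, 5000.0, 20.0),  # distant: monitored, not alerted
    Track(2, 1200.0, 15.0),  # close and fast: triggers an alert
    Track(3, 800.0, 1.0),    # close but nearly stationary clutter
]

for t in threats(sensor_feed):
    print(f"ALERT: track {t.track_id} at {t.range_m:.0f} m, {t.speed_mps:.0f} m/s")
```

A real system would fuse radar, RF, and optical sensors and track targets over time, but the same filter-and-notify shape applies.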
While leveraging artificial intelligence in military drones seems like a plausible future, there are certain caveats that need immediate attention.
Experts warn that the datasets used to train these autonomous killer robots to classify and recognize objects like buses, vehicles, and humans may not be sufficiently diverse or robust, and that the AI system may learn the wrong lessons. In addition, the black-box nature of these algorithms can make it impossible to understand why the system made a certain decision, which complicates both training and legal accountability.
Even during data transmission between drones in a swarm and to the ground control unit, slower, more segmented point-to-point links can introduce significant latency. Much research is still needed on time-sensitive targeting and real-time transmission of intelligence across a widely dispersed network of integrated combat “node points” of drones.
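The latency problem with point-to-point relaying can be sketched with a toy model: a message relayed hop by hop through a chain of drones accumulates delay linearly with swarm size, while a broadcast or mesh link keeps delay roughly flat. All figures below are illustrative assumptions, not measurements of any fielded system:

```python
# Toy latency model: point-to-point relay chain vs. one-shot broadcast.
# All delay figures are hypothetical assumptions for illustration.

HOP_LATENCY_MS = 40        # assumed per-hop transmit + processing delay
BROADCAST_LATENCY_MS = 50  # assumed single-broadcast delay

def relay_latency(num_hops: int) -> int:
    """Total delay when a message is relayed drone-to-drone, hop by hop."""
    return num_hops * HOP_LATENCY_MS

def broadcast_latency(num_drones: int) -> int:
    """Delay when every drone receives the message in one broadcast."""
    return BROADCAST_LATENCY_MS  # independent of swarm size

for n in (5, 10, 20):
    print(f"{n} drones: relay={relay_latency(n)} ms, "
          f"broadcast={broadcast_latency(n)} ms")
```

Under these assumptions a 20-drone relay chain already costs 800 ms, which is why time-sensitive targeting pushes research toward mesh networking rather than segmented point-to-point links.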
To make matters worse, artificial intelligence models still have trouble correctly identifying objects and faces in the field. When a picture is slightly perturbed with noise of almost any type, the models are easily confused. This creates problems both when deploying a drone attack and when disabling a swarm of drones. Critics also point out that the use of facial recognition in drones (especially military surveillance drones) raises ethical, moral, and legal dilemmas.
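The fragility to small perturbations can be demonstrated on a deliberately tiny model. The sketch below uses a hypothetical three-feature linear classifier (not any real target-recognition system); nudging each input feature against the sign of its weight, in the style of a gradient-sign attack, flips the model's decision even though the input barely changes:

```python
# Toy demonstration of adversarial noise flipping a classifier's decision.
# The "model" is a hypothetical linear classifier; weights and inputs are
# invented for illustration, not drawn from any real system.

def dot(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

weights = [0.5, -0.3, 0.8]        # assumed toy model weights
image_features = [1.0, 0.2, 0.1]  # a clean input the model classifies correctly

def classify(x):
    return "vehicle" if dot(weights, x) > 0 else "background"

print(classify(image_features))   # clean input: labeled "vehicle"

# Gradient-sign-style perturbation: push each feature against its weight.
epsilon = 0.4
sign = lambda v: 1.0 if v > 0 else -1.0
adversarial = [xi - epsilon * sign(wi)
               for wi, xi in zip(weights, image_features)]

print(classify(adversarial))      # same scene, slightly perturbed: "background"
```

Deep networks are harder to fool than this toy, but the same mechanism, small structured noise aligned against the model's decision boundary, is what makes field deployments of vision models risky.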
On the brighter side, however, if more countries acquire armed drones, there is a strong likelihood that international legal frameworks, along with democratic principles such as transparency, accountability, and the rule of law, will be established around their use.