Tech: Swarming at UPenn’s GRASP Lab

By Arthur Holland Michel

Some of the most exciting and advanced technological developments in the world of drones are coming out of the General Robotics, Automation, Sensing & Perception (GRASP) Lab at the University of Pennsylvania. Last year, videos began to circulate of small drones cooperating with each other, independent of human control. The drones at the GRASP Lab were seen flying in formation, carrying weighty objects, negotiating obstacle courses, assembling simple structures, and making “aggressive” moves.

As a result of this work, the director of the program, Dr. Vijay Kumar, has become something of a celebrity among those interested in the future of the drone. His research has been well publicized, and has been fuel for doomsayers all over the world. Last year, he delivered a TED Talk that translates some of the complex science involved in swarm technology into layman’s terms. An earlier TED Talk, by the Cornell mathematician Steven Strogatz, complements the work of Kumar and the GRASP Lab by shining some light on the science behind swarm behavior in nature. More recently, Ars Electronica Futurelab took some of this swarming technology into the art world and created a hypnotically beautiful drone dance, which we featured in last week’s roundup. Somehow, the drones in the Futurelab experiment seem even more autonomous than those in the GRASP videos. Because they give the impression of making art, rather than completing a simple task like the GRASP drones, we get the sense that they are serving a higher, more abstract purpose.

The work of GRASP is significant not only because of the visual impact of watching drones communicate with each other—for many critics of drones, the appearance of the GRASP videos signaled the beginning of the end (and by end, they mean the singularity)—but also because autonomous warfare will require precisely this kind of technology. Swarming is increasingly becoming a buzzword in military circles, in part thanks to the seminal RAND Corporation text Swarming and the Future of Conflict. According to the study, swarming “may eventually apply across the entire spectrum of conflict—from low to high intensity, and from civic-oriented actions to military combat operations on land, at sea, and in the air.” In short, swarms will matter.

John Arquilla and David Ronfeldt, the authors of the paper, write that while examples of military swarms can be found in history, the technological advances of the Information Age have allowed this tactic to become a doctrine in its own right. Swarming is typified by “a deliberately structured, coordinated, strategic way to strike from all directions, by means of a sustainable pulsing force and/or fire, close-in as well as from standoff positions.” It has been a tactic of protesters and soldiers alike. At the center of effective swarming is the efficient flow of information between the individual units participating in the swarm. Interest in swarming has grown in parallel with the popularity of ‘network-centric warfare,’ a form of organization that General Stanley McChrystal describes as the “new front line of warfare.” According to the authors of Swarming and the Future of Conflict, the concepts of swarming and networked warfare have the potential to drastically transform military strategy and tactics:

“For American political and military leaders, understanding the rise of swarming should lead to reappraisals of both our mass-oriented, industrial-age way of war, and of the statist focus of our diplomacy.”

Futurelab technicians prepare drones for an aerial swarming performance. (Photo credit: Gregor Hartl)


Vijay Kumar’s goal is to reduce the need for human intervention in drone operations. But Kumar doesn’t envision drones swarming for war. In his TED Talk, he lists the humanitarian and utilitarian potential of swarm technology, such as disaster relief, conservation, and construction. Military applications figure nowhere in his presentation, though it’s hard not to think about military swarming when one watches the drones dart about, forming themselves into tight rows and phalanxes. Meanwhile, the Applied Physics Laboratory at Johns Hopkins University, in collaboration with Boeing, is developing autonomous swarm technologies specifically for military applications. According to Boeing, “The technology allows UAVs to perform similarly to a swarm of insects, completing tasks more quickly and efficiently by communicating and acting together.” The goal of the program is to develop drones with the kind of swarming capabilities imagined by Arquilla and Ronfeldt in Swarming and the Future of Conflict. In response to the program, there have been numerous protests by peace activists on the Johns Hopkins campus. Presumably, the protesters at Johns Hopkins would like to see the university’s research efforts directed towards the kind of swarming applications envisioned by Kumar and the GRASP team. And yet there has also been a negative response to the GRASP research, and to swarming technologies in general, based largely on objections to the mere idea of mechanical autonomy, and, in some cases, the fundamental spookiness of drones that behave like animals.

The content of such objections rarely engages with the technical details of GRASP’s work. While many people feel more than a little uneasy at the thought of taking the human out of the equation, once the science of GRASP’s drones is decoded, and we grasp that the “autonomy” in this case is really just a series of algorithms, the drones lose a bit of their mystique.

Here, a drone learns how to explore a building that it has never encountered before.

When we talk about drones “making decisions” or “exploring environments” or “talking to each other,” it sounds as though we are talking about psychology, even though we’re really just talking about mathematical equations, binary code, and GPS signals. Though it won’t always be the case, animal-like behavior in drones is more about giving the impression of behavior than it is about actual behavior in the ethological sense. One computer scientist I know recently remarked that scientists often benefit from the misconception that their robots are fully autonomous, because the allure of an autonomous machine can prove enticing to potential sources of funding.
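To see how plainly mathematical this kind of “behavior” can be, consider a toy example. The sketch below is not the GRASP Lab’s control software; it is a minimal version of Craig Reynolds’ classic “boids” flocking model, in which each agent follows three simple local rules: move toward the group’s center, match the group’s heading, and keep a minimum distance from neighbors. All parameter values here are illustrative assumptions.

```python
def flock_step(positions, velocities, dt=0.1,
               cohesion=0.01, alignment=0.05, separation=0.1,
               min_dist=1.0):
    """Advance every 2-D agent one time step using three local rules.
    Illustrative boids-style model; gains and min_dist are arbitrary."""
    n = len(positions)
    new_vel = []
    for i in range(n):
        px, py = positions[i]
        vx, vy = velocities[i]
        # Rule 1: cohesion -- steer toward the centroid of the other agents.
        cx = sum(p[0] for j, p in enumerate(positions) if j != i) / (n - 1)
        cy = sum(p[1] for j, p in enumerate(positions) if j != i) / (n - 1)
        vx += cohesion * (cx - px)
        vy += cohesion * (cy - py)
        # Rule 2: alignment -- nudge velocity toward the others' average.
        ax = sum(v[0] for j, v in enumerate(velocities) if j != i) / (n - 1)
        ay = sum(v[1] for j, v in enumerate(velocities) if j != i) / (n - 1)
        vx += alignment * (ax - vx)
        vy += alignment * (ay - vy)
        # Rule 3: separation -- push away from agents that are too close.
        for j in range(n):
            if j != i:
                dx, dy = px - positions[j][0], py - positions[j][1]
                if (dx * dx + dy * dy) ** 0.5 < min_dist:
                    vx += separation * dx
                    vy += separation * dy
        new_vel.append((vx, vy))
    # Integrate positions with the updated velocities.
    new_pos = [(p[0] + v[0] * dt, p[1] + v[1] * dt)
               for p, v in zip(positions, new_vel)]
    return new_pos, new_vel
```

Run in a loop, these few lines of arithmetic produce the coordinated, lifelike motion we instinctively describe in psychological terms: the agents converge, align, and hold formation, yet nothing here “decides” anything.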

Even though it’s true that the science behind this technology is overwhelmingly complicated, a little technical literacy goes a long way. Without it, the discussion about drones is not only limited, but also susceptible to inaccuracy. At The Atlantic, Conor Friedersdorf expressed his horror at a simulation of swarming robots being developed by the Air Force, but he doesn’t pause to consider what might be going on technologically inside the drones imagined in the video. What is the battery life of these micro-drones? How do they explore new environments without crashing? How do they establish their position in space without the aid of the Vicon motion sensors used in all the GRASP Lab videos? Even a brief look at the science complicates the idea that these drones are truly autonomous at all. The road towards increased autonomy is fraught with technological challenges. By just watching YouTube videos, or reading sensationalist media coverage of this technology, we don’t get to see that part of the picture. The struggle to develop more autonomous drones is at the center of the whole issue, and the future of drones will be determined by scientists just as much as by lawmakers. Any debate about the moral implications of drone swarming needs to engage with at least the basic technical facts. Learning a little about the work at the GRASP Lab is a good place to start.