Talk about how drones are going to change everything, from how the climate change debate is shaped to the way data is gathered on a farm, is as exciting as it is ubiquitous. Numerous people are talking about how UAV technology will redefine various industries and professions, but that redefinition gets taken to another level when you start talking about concepts like brain-machine interfaces (BMIs) and human-swarm interaction (HSI) when it comes to controlling a swarm of drones.
These are concepts that Panagiotis Artemiadis, an Associate Professor in ASU’s Ira A. Fulton Schools of Engineering, is working to create and develop. Professor Artemiadis has previously detailed what concepts like controlling drones with your mind will mean to the industry, and it’s a topic he’s going to showcase at the Commercial UAV Expo.
To get a better understanding of how these topics are going to be explored at the Expo, we connected with Professor Artemiadis to learn what people need to know about concepts like HSIs, why it’s important for commercial operators to learn about these topics now and plenty more.
Jeremiah Karpowicz: Can you tell us a little bit about the work you’re focused on at the HORC Lab?
Panagiotis Artemiadis: The HORC Lab focuses on advanced cognitive human-robot interaction methods and architectures. Through the project on Human-Swarm Interaction (HSI) we are developing new algorithms to extract information from the brain that is related to collective behaviors of swarms. In other words, we are using the signals recorded from multiple areas of the brain to control a big team of robotic agents to act collectively. Collective behaviors are abundant in nature (for example, swarms of bees or schools of fish), but they have only recently been adopted by robots. The HORC Lab goal is to develop novel interfaces that enable humans to control swarms of robots.
At a high level, what do drone operators who are not familiar with concepts like HSIs and BMIs need to know about these terms?
Both terms have to do with how humans can interact with and control swarms of multiple agents, in this case swarms of drones. Joysticks, keyboards and other conventional control interfaces are good only for a single drone or maybe a couple of drones, i.e. they give the operator the ability to control only one drone at a time. The Human-Swarm Interfaces we develop allow a single operator to control multiple drones (theoretically as many drones as a swarm would have), and allow for more complex behaviors, e.g. expanding the swarm to cover an area or making a specific formation. Therefore, the HSI gives operators a plethora of new capabilities that can be very useful in many applications such as search and rescue, exploration, surveillance and military missions.
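The contrast Artemiadis draws here, between steering one drone at a time with a joystick and issuing a single swarm-level behavior, can be sketched in a few lines. This is a toy illustration only; the `Drone` class and `expand_coverage` function are hypothetical names invented for this example, not the HORC Lab's software:

```python
import math
from dataclasses import dataclass

@dataclass
class Drone:
    """Hypothetical drone that accepts high-level position targets."""
    x: float = 0.0
    y: float = 0.0

    def goto(self, x: float, y: float) -> None:
        self.x, self.y = x, y

def expand_coverage(swarm: list[Drone], center: tuple[float, float],
                    radius: float) -> None:
    """One collective command: spread the whole swarm evenly on a circle.

    A joystick-style interface would require steering each drone in turn;
    here a single operator intent maps to targets for every agent at once.
    """
    n = len(swarm)
    for i, drone in enumerate(swarm):
        angle = 2 * math.pi * i / n
        drone.goto(center[0] + radius * math.cos(angle),
                   center[1] + radius * math.sin(angle))

# One command repositions all eight drones to cover a circular area.
swarm = [Drone() for _ in range(8)]
expand_coverage(swarm, center=(0.0, 0.0), radius=10.0)
```

The point of the sketch is the interface shape: the operator expresses one intent ("expand to cover this area") and the mapping to individual vehicle targets happens below that level.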
Are drones particularly well suited to developing and utilizing BMI interfaces? As opposed to other devices?
All drones that can receive high-level commands (e.g. move along an axis, or go to a specific point in space) are suited for utilizing Brain-Machine Interfaces. The new interfaces provide unique ways for humans to send those high-level commands to the drones, and therefore, drones that can execute basic motion patterns are suitable. In the future, we would like to be able to work with drones that can easily “talk” to each other by exchanging information with their neighbors, so that we offload some of the safety concerns from the operator.
How have you addressed some of the concerns and fears that I imagine come up whenever you talk about concepts like controlling robots with your mind?
I personally do not see the reason for those concerns. We introduce a new, more intelligent and comprehensive interface. The new interface replaces the multiple operators, each using a conventional interface (e.g. a joystick), that are needed to control a swarm of drones, with a single operator using a BSI. What the operator will command the swarm to do might be a concern for some people, but this has nothing to do with the interface the operator uses.
You mentioned that your goal is to decode brain activity to control variables for a drone swarm. What sort of inherent advantages are associated with this type of control versus manually controlling a swarm?
To understand the value of decoding the brain to control a swarm of drones, one has to first appreciate the advantages of a swarm itself. The swarming paradigm, deriving inspiration from the behavior of natural swarms such as bird flocks and fish schools, offers myriad advantages to a team of drones. A swarm system consists of a large group of relatively inexpensive, interchangeable vehicles that execute autonomous decisions using information obtained via local sensing and communication. The redundancy in a swarm makes its operation robust to vehicle failures and disturbances, which also enables the use of sacrificial platforms, and its distributed activity can conceal the system’s mission from an opponent. Recent advances in computing, sensing, actuation and control technologies are currently enabling the development of swarms of aerial vehicles, varying in complexity, size and overall scale. The integration of very large teams of robots into comprehensive systems enables new tasks and missions ranging from search, exploration, rescue, surveillance and pursuit to deploying infrastructure. By decoding brain activity into collective behaviors for a drone swarm, we take advantage of the additional capabilities a swarm of robots or drones offers. Manual control of a swarm is currently unavailable, or very limited in scope. By decoding the brain we can extract information related to the desired collective behaviors (e.g. expand swarm coverage or make a specific formation) that is not possible with manual control interfaces.
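The mapping Artemiadis describes, from a decoded brain variable to a swarm-level behavior such as expanding coverage, might look something like the toy sketch below. The lab's actual decoding pipeline is not public and is far more involved; the scalar feature, the threshold and the scaling factor here are all invented purely for illustration:

```python
import numpy as np

def decode_spread_command(feature: float) -> str:
    """Toy stand-in for a neural decoder: threshold one decoded scalar
    into an expand/contract command for the whole swarm."""
    return "expand" if feature > 0.0 else "contract"

def apply_spread(positions: np.ndarray, command: str,
                 factor: float = 1.5) -> np.ndarray:
    """Scale all agent positions about the swarm centroid.

    One decoded variable changes a property of the *whole* swarm,
    rather than a single vehicle's pose.
    """
    centroid = positions.mean(axis=0)
    scale = factor if command == "expand" else 1.0 / factor
    return centroid + (positions - centroid) * scale

# Four agents arranged around the origin; one decoded value spreads them all.
positions = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
expanded = apply_spread(positions, decode_spread_command(0.7))
```

The design point is the dimensionality: the operator's signal is one variable, while the resulting motion involves every vehicle, which is exactly what per-vehicle manual interfaces cannot express.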
In what ways is your work different from the work Intel is doing with their drone light shows to create a model that allows a single user to control an entire fleet?
Intel’s efforts on the control of multiple UAVs should not be underestimated. Just flying a team of hundreds of robots in a coordinated way is a very challenging task. Their goal is to have one pilot (or PC) control all of them, but the control will eventually be done with a high-level control interface and the behaviors will be selected out of a finite set related to the mission. Our work on BSI allows the user to embody the swarm and derive new behaviors in a very natural and intuitive way, or allows more complex behaviors to emerge through this human-swarm interaction. Moreover, in our work on BSI we are interested in the inverse process: the perception of collective behaviors by the brain. Once we understand how the brain perceives collective behaviors, for example how the brain sees and understands hundreds of ants collectively working, we can extrapolate that knowledge to applications where the human cooperates with the machine in inferring swarm goals and future behaviors; this is another project though.
I imagine HSI interaction is the sort of thing that could be utilized with the “fleet of drones” that we’re always hearing about?
Absolutely, and this is another example where traditional manual interfaces fail to accommodate the control of a fleet of drones, which can be very helpful in applications like agriculture.
How do you envision this development will impact the commercial implementation of drones?
I strongly believe that the swarming paradigm will push commercial drone technology toward introducing capabilities that allow communication among drones. You can think of it as communication among smartphones, or among driverless cars at an intersection. This communication capability can be used for more efficient execution of collective tasks or to avoid “congestion” in some areas.
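In its simplest possible form, the congestion avoidance Artemiadis imagines could look like drones announcing intended waypoints over a shared channel and deferring when a cell is already claimed. This is a hypothetical sketch, not any real drone communication protocol:

```python
from typing import Optional

def assign_waypoints(
    requests: dict[str, tuple[int, int]]
) -> dict[str, Optional[tuple[int, int]]]:
    """First-come-first-served claims on grid cells.

    Each drone broadcasts the cell it wants next; a drone whose target
    cell is already claimed gets None and must hold position and re-plan,
    so two vehicles never converge on the same spot.
    """
    claimed: set[tuple[int, int]] = set()
    result: dict[str, Optional[tuple[int, int]]] = {}
    for drone_id, cell in requests.items():
        if cell in claimed:
            result[drone_id] = None  # congested: hold and re-plan
        else:
            claimed.add(cell)
            result[drone_id] = cell
    return result

# d1 and d2 both want cell (2, 3); d2 is told to re-plan.
plan = assign_waypoints({"d1": (2, 3), "d2": (2, 3), "d3": (4, 1)})
```

A real system would resolve claims in a distributed way rather than through one arbiter, but the principle, vehicles exchanging intent so conflicts are resolved before they happen, is the same one described for cars at an intersection.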
Do you think HSI interaction will change the way operators think about drone swarms?
If we want to take advantage of the plethora of capabilities a drone swarm provides, then we need to be able to control them. Therefore, we desperately need novel Human-Swarm Interfaces (HSI).
What sort of information will you be looking to showcase around your work at the Commercial UAV Expo?
I will first show how collective behaviors are mapped onto the brain, and then show how patterns of activation at the brain level are used to control specific collective behaviors of a small swarm of quadrotors in indoor coordinated flight. With the current commercial ways to record brain activity, the application of the BSI to commercial drone fleets is not far off.
Will you be able to talk about the next major milestone in your development process at the Expo?
The next milestone in my research is to expand this interface to multiple operators controlling multiple teams of robots that could be both ground vehicles and aerial vehicles, with application to search and rescue, area exploration etc. This is something that we are currently working on in the lab.
It’s probably going to be a little while before we see HSI interaction out in the real world, so why do current commercial operators need to know what’s going on with it?
Current commercial operators should know about the advantages of swarming behaviors and explore new ways to use them, and technology developers should work to make those behaviors available to users. It is necessary to start thinking about the network of drones now, so that it can be implemented in the near future.
See what else Professor Artemiadis has to say about creating and controlling swarms of drones at the Commercial UAV Expo.