Note: This article first appeared at In Military.
Unlike HUMINT (human intelligence) and SIGINT (signals intelligence), FICINT, or fictional intelligence, is a relative newcomer as a means of envisioning future war.
In most forecasting scenarios, wargaming and simulations can take planners only so far. The strength of FICINT, especially in the hands of a talented author, is that it can analyze and develop war scenarios from friendly, enemy, or neutral perspectives using real locations and emerging technologies combined with fictional characters.
I recently sat down with the author of a new FICINT novel called “Burn-In,” a riveting technothriller that is not only highly entertaining but meticulously researched and presents a near-future state that is closer than you think.
Written by P.W. Singer and August Cole, “Burn-In” wastes no time plunging us into a near-future version of Washington, D.C., with a fictional U.S. Marine Corps veteran-turned-FBI Special Agent, Lara Keegan, as she navigates a perilous set of circumstances and technologies from the near future.
Singer is a strategist at New America (formerly the New America Foundation), a Washington, D.C., think tank. The Smithsonian Institution named him one of the nation’s 100 leading innovators. Defense News cited him as one of the 100 most influential people in defense issues. Foreign Policy put Singer on its Top 100 Global Thinkers List and the U.S. Army’s Training and Doctrine Command called him an official “Mad Scientist.”
I first became acquainted with Singer and his co-author Cole through their 2015 novel “Ghost Fleet: A Novel of the Next World War,” which was recommended to me by my former squadron commander after I had left the Air Force.
What follows are excerpts from a thought-provoking conversation I had with Singer covering several technology topics mostly focused on national security.
Wes O’Donnell: Peter, thanks so much for taking the time to chat. Can you start by telling me a little about your background?
P.W. Singer: Thanks for chatting, Wes. I’ve always been drawn to topics that cover national security. I was a big military history buff growing up; came from a military family. I went off to college and majored in the topic and then started my career.
My first book was the first to examine the rise of private military contractors, companies like Blackwater and Halliburton. They are familiar [names] now, but they weren’t back when I started on the project. Then I wrote a book on warlords and child soldier groups. Following that, I wrote a book on the use of robotics and unmanned systems in war called “Wired for War” that became my first bestseller. After that came projects on cybersecurity and social media disinformation.
To sum up my work, I take some hardcore, complex research and try to translate it for a broader audience that needs it. Then I jumped into fiction: August Cole and I teamed up for a book called “Ghost Fleet,” a look at what a future war between the U.S. and China and Russia might look like.
It’s notable given the background of a lot of the audience [that] when we started “Ghost Fleet” in 2013, everything from Pentagon policy to U.S. military training was focused on terrorism and insurgency.
It was a novel, but ended up having an amazing amount of policy impact.
Wes: I’m familiar with “Ghost Fleet.” That was recommended to me I think back in 2017 from a friend of mine who was still active duty in the Air Force. I didn’t make the connection until I was halfway through “Burn-In” and I was like, “Oh! This is the author of ‘Ghost Fleet.'”
I think you mentioned something extremely interesting in that we are moving towards the realm of fully autonomous weapons systems. And I think there are some questionable ethics involved with that. I fear that there is this new arms race now for autonomous weapons systems that are going to have to make the kill decision.
Let me give you an example: When I was deployed to the sandbox, we were guarding Patriot missile batteries, and Bedouins with their flocks of sheep would get close to our perimeter. We would send out the QRF (Quick Reaction Force) to scare them away. Had we been replaced by an autonomous weapons system, technically by the rules of engagement it would have been allowed to open fire on those Bedouins just for getting close to our perimeter. So, how do you build morals into these robotic systems? I think that’s an interesting angle.
Singer: It’s funny because I thought you were going to go in a different direction with that example because the Patriot — it’s not Skynet making its own decisions — but it is incredibly automated to the extent that, in active conventional warfare, it’s been put on a setting where the human role is just veto-power.
It’s not the old version of humans in the loop. Because of the high-speed nature of the threat and the greater trust in the machine, it has already led to a friendly fire incident. During the invasion of Iraq, a British Tornado jet was taken out by a Patriot missile battery in just that scenario: the system misinterpreted [the jet] as an incoming Iraqi missile, the human crew had only a few seconds to react to what the machine was telling them, and they fired on the allied jet.
I think there are a couple of things there: I think the first is we don’t even have to wait for what they call LAWS, Lethal Autonomous Weapon Systems, for these kinds of dilemmas to happen. They are already playing out right now. As we explore in “Burn-In,” the real-world issue of AI and robotics is not HAL or Terminator or these kinds of killer robots, [it’s] more about the next industrial revolution, the technology being woven into almost everything.
Sometimes it will be in visible ways, an unmanned system out in the field, for instance. Other times it is going to be software guiding human behavior and taking on more and more decisions that humans would have made. In either case, what you have in the real world are these new kinds of dilemmas that are straight out of science fiction.
On law and ethics, it’s questions like machine permissibility and machine accountability. Machine permissibility asks, “What are our ever more intelligent and independent machines allowed to do on their own?” Machine accountability asks, “Who owns this ever more intelligent and independent machine?” In the good sense, who gets the fruits of all its labor, all the data it’s collecting about everyone and everything it comes into contact with? Who gets to monetize that and profit from it? But also, who owns it when things go awry? Which human do we hold responsible, if at all?
Those questions are such great sci-fi fodder, but they’re also relevant everywhere from the battlefield to the highway.
So that’s part of the point of the “Burn-In” project to explain and explore these issues that seem like they’re straight out of science fiction but apply to our real world.
Wes: Let’s talk about the privacy versus security debate, like the old Snowden argument. I think the new coronavirus contact tracing apps have revived that debate, and I have a feeling that the technologies of the near future, some of the ones in “Burn-In” especially, will make the NSA’s PRISM or XKeyscore programs look like child’s play. So, how can society balance the right to privacy against the need for robust law enforcement tools with these new technologies?
Singer: You have three parts of a triangle: security, convenience/profit and privacy. Each of them is kind of tugging at each other. Essentially, everyone from government to the private sector to the individual, is deciding where within that triangle to emphasize the most.
So you can see how something that offers greater security (security against crime, against terrorism, against disease) pulls away from privacy. In turn, something that is more convenient or more profitable will also tug away from that privacy side.
What I just said describes everything from the face recognition deployed into American cities to toys that you allow your kids to use. Where within that triangle are you most comfortable or where are you going to put the settings? Where is the company going to put the settings? What is the government allowed to do?
That’s one of those things that we wrestle with in “Burn-In”: More and more, you have to be aware of how each of these pulls against the others, and you are going to be deciding for yourself, for your family, and for your nation which one you believe matters most.
And if you’re not aware of it, you are an open target for the darker side of each one of those. You’re the one losing privacy. You’re the marketing target. You’re the least secure. It’s really about awareness.
Wes: Let’s talk about robots. We’ve been promised robots as long as we’ve been promised flying cars. I love the partnership you build in “Burn-In” between Keegan and TAMS, an AI. Why are robots more likely now, in the 2020s and beyond? And just to piggyback on that, how do you think automation is going to affect labor markets?
Singer: The irony is that we are at the 100th anniversary of the word “robot.” It was coined for the 1920 play “R.U.R.,” what we would now call early science fiction.
Now the science fiction of robots is coming true, but it’s not what sci-fi prepared us for. It’s not a guy in a metal suit who wises up and rises against us. It’s an industrial revolution. Robotics comes in all sorts of sizes, shapes, and forms, everything from the tiny handheld systems that soldiers in the field are deploying to the MQ-1B Predator drone that was deployed to monitor the protests in Minneapolis.
Wes: I have such a hard time classifying “Burn-In” as hard science fiction because it’s really not fiction. You’re referencing all of these real-world examples of the direction in which we are currently moving. You’re painting a pretty convincing landscape of the world of 2050.
That’s one of the things that fascinates me about FICINT; it’s being able to envision this future landscape, whether it’s warfare or society, through fiction. I would imagine that the intelligence you could acquire from fiction is only as good as the author. Is that your experience?
Singer: FICINT is a tool for analysis, explanation, and prediction, and, just like SIGINT and HUMINT, it’s not the only tool. It’s not just telling a story. It’s not sci-fi. There are rules to FICINT, and the most important of these is that it’s a combination of narrative world-building and real research.
FICINT takes place in the real world. It’s not set on distant planets. There has to be real technology. We call it the “no vaporware” rule. Anything in it has to be a technology that already exists and is deployed or is at least in the prototype stage.
I think it gives people that added oomph of “Wow, this is scary!” not just because it’s scary on the page, but when you check the footnote, it really could happen or it did happen somewhere else.
Wes: I might be missing the mark here but from a pure intelligence standpoint, different people learn in different ways. Not everybody can read a white paper and envision how those tools and technology are going to play out in the real world. So I think the value for something like this comes from actually putting this technology into a real-world setting that you’ve created with some compelling characters who have to now navigate through this environment. It’s almost like watching a wargame play through. People can watch all these tools and these characters use these tools to, in this case, fight terrorism.
Singer: There are three layers to it: First, research shows that narrative is, in some situations, a better means of sharing information than even the most canonical of academic sources. And it’s because the narrative is the oldest tool of communication.
We shared information through story back when we were all sitting in caves around the fire; PowerPoint is only about 30 years old. It shouldn’t surprise you that the human brain is more attuned to a story than to a PowerPoint or an academic white paper.
The human brain can’t help but feel a story as if it were an experience. So it becomes like a life experience, even if it is someone else’s, even a fictional one, because you walk in those characters’ shoes for a short time.
The second reason it has a greater impact involves not just the way your brain processes information, but your emotions. A story is more likely to put you in an emotional state where you don’t just digest the information, you act on it. One example might be fear: “Wow, I read about this nightmare scenario. What am I going to do to keep that from happening to me?” We experienced that with “Ghost Fleet.” A Navy admiral shared an example with us.
That was what drove a DoD investigation into some supply chain security issues. He read about the nightmare scenario and asked, “Can this happen?” Then he went to the footnote and said, “Okay, it can.”
The third reason is not just readability but shareability. To put it bluntly, more people are likely to read, talk about, and share a good story than a good PowerPoint or white paper. And that holds true up to the highest levels of power. No one ever said to someone else, “Man, this is such a killer PowerPoint! You ought to read it on your next beach vacation!” But they will do that with a novel.
Wes: Peter, it was fantastic speaking with you! Thanks again for sharing your insight with my audience. Best of luck on the launch of “Burn-In.”
Singer: Thanks, Wes. It was my pleasure.
“Burn-In” is available now at your favorite bookstore and, of course, on Amazon.