
How Silicon Valley Can Reduce Social Pain For Once

Too many Americans, military and civilian, are killed by friendly fire. Big Tech might be able to help with that.

Every day we read about new miracles of technology. Yet the age-old question remains: cui bono? That is, who benefits? We know that Silicon Valley tech titans have benefited, to the tune of many billions of dollars, from their innovations. So perhaps now it’s time that the rest of us became more assertive—or at least more alert—about the possibility of harnessing innovations for collective societal benefit.

Today, many of these tech miracles are coming in the realms of artificial intelligence (AI) and facial recognition. For instance, Snapchat is reportedly working on facial recognition technology to detect the mood of its users—all the better to target them with advertising. And Netflix has technology that enables its users to make a choice simply with their eye movements. This is great news for couch potatoes who are too potato-y even to move their fingers, and much greater news for those who are paralyzed or who have lost their fingers or hands altogether.

Thus are we reminded of the serendipitous value of technological spinoffs—that is, the blessings that come to us by adjacency, when somebody working on one thing stumbles onto something else, from penicillin to the internet to LED lighting for vertical agriculture. As we know, private enterprise tends to do a good job of converting spinoffs into commercial products, and yet there’s also a need to translate such advances into an improved public sphere.

A case in point is the tragic situation that unfolded in Robbins, Illinois, just outside of Chicago, on November 11. According to WGN-TV, after hearing reports of a shooting in the wee hours of Sunday morning, police raced to the scene. When they got there, they saw a man with a gun—and so they shot and killed him. The problem was that their target, Jemel Roberson, was a security guard who was, ironically enough, in the process of subduing the actual suspect. It’s hard to think of a more painful, or racially charged, incident.

Yet if we step back from the terrible details, we can observe a larger phenomenon at work here, namely “friendly fire.” That term, borrowed from the military, encapsulates any situation in which friendly forces shoot at each other by accident.

Friendly fire is actually rather common; it’s just one of those things that happen in the fog of war. In fact, according to one estimate, 21 percent of our casualties in World War II were attributable to friendly fire, including Lieutenant General Lesley J. McNair, killed, along with more than 100 other Americans, on July 25, 1944, when U.S. bombers hit the wrong target in France.

Since then, greater efforts have been made to prevent these disasters from happening. The general rubric for the technologies involved is Identification, Friend or Foe (IFF). And yet even as the technology improves, the percentage of U.S. casualties attributable to friendly fire seems actually to be rising. The best-known recent friendly fire victim was Pat Tillman, the football player turned Army Ranger who was killed in Afghanistan in 2004. Yes, IFF technology keeps improving, but the battlefield also keeps growing more complex, to say nothing of more lethal.

In the meantime, here on the home front, friendly fire is an ongoing problem. Police magazine tells us that there are a handful of cases every year in which cops are killed or wounded by fellow officers. And deaths in related circumstances, such as that of Jemel Roberson, add to the dolorous toll.

There’s also the issue of civilians, innocent as well as guilty, who get caught up in the swirl of events. Each case is different, and yet on the whole, it’s impossible not to have sympathy for the police, because they—and sometimes the rest of us—routinely get vague reports that make it difficult to know what to do. For instance, here’s a November 12 tweet from Washington, D.C.’s Metropolitan Police Department: “Robbery Armed at 1732 hrs in the 3400 block of Stanton Road SE. Lookout for Suspect-1 B/M, 5-10, black jacket, black jeans; and Suspect-2 B/M, 5-6, black puffy jacket, armed with a black handgun; last seen fleeing on foot towards the 1700 b/o Savannah.” (We can note that “B/M” stands for “black male.”)

If any of us received vague information such as this, what would we do if we saw someone who met—or seemed to meet—that description? To put it mildly, it’s a challenge, especially in situations where mistaken identity can be disastrous, even deadly.

So what to do? Perhaps we can borrow from IFF precepts. And perhaps, too, we can put some of those new technologies dreamed up by Silicon Valley brainiacs to a higher use than simply garnering more clicks.

We can start by observing that all IFF grows out of the basic idea of ascertaining identity. Going back thousands of years, armies have equipped their soldiers with at least the rudiments of a distinct identity—if not full uniforms, then at least banners or flags to rally around.

So maybe the same IFF idea can be extended to civilian uses, with high tech supplementing older forms of visual identification. Mindful of the sad fate of Jemel Roberson, we can start by thinking about a security guard at a bar. Most obviously, he could be wearing some sort of clothing—such as a fluorescent vest—that would identify him as a “good guy.” Yet that might not be satisfactory, for a number of reasons, including the possibility that a bad guy, too, could put on such garb.

This is where newer technology could come into play. It’s easy to imagine that the good guy could have a badge or device—including a smartphone—activated, say, by a fingerprint, that would communicate IFF. That signal would then tell police and other first responders who the good guy is. Such a system, simply described, would not be foolproof, but that’s where the need for brainiac inventiveness comes in. It shouldn’t be hard for techsters to work out a reliable way for cops and others to determine who’s on which side.
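To make the idea a bit more concrete, here is a minimal sketch, written in Python purely for illustration, of what such a “friend” signal might look like: a fingerprint-unlocked device signs a short, time-stamped identity claim, and a responder’s device checks both the signature and the claim’s freshness. The names, the shared-secret scheme, and the thirty-second freshness window are all assumptions made for the sake of the example; a real system would need enrolled public keys, revocation, and anti-replay protections designed by people who do this for a living.

```python
# A minimal sketch, not a product: a "good guy" beacon a registered security
# guard's phone might emit after a fingerprint unlock, and the check a responding
# officer's device might run. The registry secret, IDs, and freshness window
# are illustrative assumptions, not anyone's actual system.
import hashlib
import hmac
import json
import time

REGISTRY_SECRET = b"issued-to-device-at-enrollment"  # hypothetical per-device key


def make_iff_token(guard_id: str, unlocked_by_fingerprint: bool) -> str:
    """Build a short, signed, time-stamped identity claim."""
    if not unlocked_by_fingerprint:
        raise PermissionError("device must be unlocked by its enrolled user")
    claim = {"guard_id": guard_id, "role": "licensed_security", "ts": int(time.time())}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(REGISTRY_SECRET, payload, hashlib.sha256).hexdigest()
    return json.dumps({"claim": claim, "tag": tag})


def verify_iff_token(token: str, max_age_s: int = 30) -> bool:
    """Responder-side check: valid signature and a recent timestamp."""
    data = json.loads(token)
    payload = json.dumps(data["claim"], sort_keys=True).encode()
    expected = hmac.new(REGISTRY_SECRET, payload, hashlib.sha256).hexdigest()
    fresh = (time.time() - data["claim"]["ts"]) <= max_age_s
    return hmac.compare_digest(expected, data["tag"]) and fresh


# The guard's phone emits a token; the officer's device checks it.
token = make_iff_token("guard-0042", unlocked_by_fingerprint=True)
print("Friend" if verify_iff_token(token) else "Unknown")
```

The short freshness window is the interesting design choice here: a stolen or replayed signal goes stale within seconds, so a bad guy who grabs the device, or records its broadcast, gets little out of it unless he can also defeat the fingerprint check.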

After all, for defense contractors, IFF is a big business. Yet there doesn’t seem to be anything equivalent on the civilian side. At least, it wasn’t available to Jemel Roberson.

We could go further, as we imagine how to link up facial recognition technology, AI, drones, and even smart buildings to the cause of preventing the deaths of innocent people. And while we’re at it, we could use the same tech wizardry to cut down on crime. For instance, if a crime occurs inside a building, the building itself could detect it, and maybe a drone could arrive on the scene even before the police. Information about the perpetrator could thus be pulled together, perhaps even to the point of facial identification.
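Purely as a thought experiment, the rough flow might look like the sketch below: detect, dispatch, gather a description, attempt an identification. Every sensor type, class, and function here is a made-up assumption rather than any vendor’s actual product; the point is the sequence, not the details.

```python
# An illustrative outline of a "smart building" incident pipeline: the building
# detects a possible crime, a camera drone is dispatched, a description is
# gathered, and a tentative facial identification is attempted. All names and
# behaviors are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Incident:
    building_id: str
    location: str                       # e.g., "3rd floor, east stairwell"
    description: str = "unknown"
    suspect_face_id: Optional[str] = None
    drone_on_scene: bool = False


def detect_incident(sensor_event: dict) -> Optional[Incident]:
    """The building's own sensors flag a possible crime in progress."""
    if sensor_event.get("type") in {"gunshot_audio", "panic_button"}:
        return Incident(sensor_event["building_id"], sensor_event["location"])
    return None


def dispatch_drone(incident: Incident) -> Incident:
    """A camera drone could reach the scene before officers do."""
    incident.drone_on_scene = True
    incident.description = "adult male, dark jacket (drone footage, unconfirmed)"
    return incident


def attempt_identification(incident: Incident) -> Incident:
    """Facial recognition is attempted only if footage exists; matches stay tentative."""
    if incident.drone_on_scene:
        incident.suspect_face_id = None  # left unresolved in this toy example
    return incident


event = {"type": "gunshot_audio", "building_id": "bldg-17", "location": "lobby"}
incident = detect_incident(event)
if incident:
    incident = attempt_identification(dispatch_drone(incident))
    print(incident)
```

Even a toy version like this surfaces the real questions: who confirms the detection, how tentative matches are labeled, and at what point human judgment takes over from the machines.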

Obviously, all of these technologies would have to coordinate seamlessly, defying Murphy’s Law at least most of the time. And just as obviously, politicians, civil libertarians, and other social stakeholders would want to give any new system a thorough going-over.

Yet whether or not the ACLU approves, we’re clearly inching in this direction. For instance, individuals and their pets—and possibly even whole companies and their workforces—are experimenting with implanted microchips. These would be natural components in a comprehensive IFF system.

So here’s a chance for Silicon Valleyites to show they care about using technology to solve our problems, not just their problems. A robust IFF for crime emergencies would save lives, reduce social pain—and, yes, could even be a money-maker.

James P. Pinkerton is an author and contributing editor at The American Conservative. He served as a White House policy aide to both Presidents Ronald Reagan and George H.W. Bush.
