
The Precarious State of the Private-Sector Drone Industry

By Jay Stanley | ACLU | September 9, 2015

A lot of people are taking it as a given that drones will become ubiquitous in the coming years. But that is far from certain; there is still a lot of uncertainty over the future of this technology (and therefore over the kinds of privacy threats and free expression opportunities it will ultimately present).

This is an industry that is still in a very early stage of its development, when it is highly sensitive to shocks. Incidents and accidents that happen at this stage can have lifetime effects, lasting many decades. That is especially true with a huge media spotlight on this technology. The amount of press coverage generated by the landing of a harmless toy drone inside the White House fence is indicative here.

Imagine the uproar if we were to see somebody put a gun on a drone and start shooting people remotely. Or drones used to bypass security perimeters and deliver explosives to a high-value target such as the White House. Even if the explosives did no more than blow a small hole in the White House lawn in the middle of the night, hurting no one, that would decisively alter the course of the drone industry.

Another possibility is some kind of spectacular accident. The safety record of this new technology is not great. There has been a lot of attention paid lately to drone “near-misses” with passenger airliners. (I have heard some experts express doubt that an accidental collision between a small drone and an airliner would cause the airliner to crash—but that’s certainly not something anyone wants subjected to uncontrolled real-world tests.) Should a drone bring down an airliner, the drone industry might never recover. Even an accident in which a drone falls out of the sky could be a game-changer. If the 375-pound military drone that crashed onto an elementary school playground in Pennsylvania in April 2014 had killed children, we would likely be having a different conversation today.

Even without anything so dramatic, an accumulation of smaller accidents could shape the technology over time. Any technology that involves complex interactions with human beings will inevitably have some rocky times as we attempt to smoothly integrate it into life. If drones—even small lightweight private ones—are regularly crashing onto people’s rooftops, windshields, and heads, tolerance for the technology is likely to go down fast. If drones become popular enough that the skies over our neighborhoods are regularly criss-crossed with them, this could well happen—especially given the many unknowns such as whether territorially jealous birds will routinely attack them.

There may also be a nuisance factor. Even if large numbers of small drones constantly flying overhead turn out not to be dangerous, they may simply annoy people. To start with, there’s the buzzing noise they make, and of course there’s also privacy. At the ACLU we have been most focused on the danger that drones will be used to construct regimes of constant wide-area surveillance. And there is a very real potential that private-sector drones may also become a tool for directly harmful privacy invasions. But even without such significant invasions, private-sector drones may spark nebulous feelings of intrusion. I found it telling in this regard that the firefighters in a recently circulated video were annoyed enough by drones that they tried to blast them out of the sky with their hoses. When a drone hovered over a crowd of hockey fans after a 2014 game in Los Angeles, a “mob mentality set in,” as the LA Times put it, and “revelers were throwing everything they could to knock the drone down.”

I can’t claim to know what motivates people in incidents like these. I do know that while photography in public is a First Amendment right, as a matter of etiquette it is often unacceptable. As I’ve discussed before, training a camera on someone who does not want to be photographed may be constitutionally protected in public (as is yelling and swearing at them), but it is also perceived as rude.

These kinds of factors may add up to a general feeling by communities that they’d rather do without the putative advantages of widespread drone usage. In this, drones may prove to fall into the same category as Google Glass—a widely anticipated and talked-about technology that is naively viewed as inevitable, but ultimately one that remains confined to relatively narrow applications due to the subtleties and caprices of human etiquette.

All this makes it very hard to predict what will happen to this technology. In many ways what we’re witnessing is a race against time. If drones prove to be useful enough machines with enough practical benefits that Americans feel they can’t live without them, they’ll likely tolerate the occasional tragic accident or terrorist attack, as well as a good deal of annoyance. But if the disaster happens first, drones may never get a chance to prove themselves.


Chicago Police “Heat List” Renews Old Fears About Government Flagging and Tagging

By Jay Stanley | ACLU | February 25, 2014

The Verge had a story last week (expanding on an August report from the Chicago Tribune that I’d missed) that the Chicago police have created a list of the “400 most dangerous people in Chicago.” The Trib reported on one fellow with no criminal arrests who was surprised to receive a visit from the police and to be told he was on this list. A 17-year-old girl was likewise shocked when told she was on the list.

The database, according to the Verge, is based on historic crime information, disturbance calls, and suspicious person reports. The CPD’s list is heavily based on social network analysis (which is interesting considering the debates now swirling around the uses of metadata and the analysis such data enables). A sociologist whose work inspired the list, Andrew Papachristos, told the author of a Chicago Magazine piece (which goes into interesting depth on the theory behind the list): “It’s not just about your friends and who you’re hanging out with, it’s actually the structure of these networks that matter.”

The list was funded through a Justice Department grant known as “Two Degrees of Association.” (At least that’s one less hop than the NSA uses.)
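
The mechanics are worth pausing on. A “two degrees of association” analysis essentially walks a social graph (built, in the published research, from records such as co-arrests) out to two hops from each person and scores people by their network proximity to known victims or offenders. The sketch below is illustrative only: the CPD’s actual algorithm is not public, and the graph structure, weights, and function names here are invented for the purpose of showing the general idea.

```python
# Illustrative sketch only: the CPD's actual algorithm is not public.
# The general idea of a "two degrees of association" score: a person's
# risk rating rises with network proximity to flagged individuals.
from collections import deque

def two_hop_score(graph, person, flagged, w1=1.0, w2=0.5):
    """Score `person` by the flagged individuals within two hops.

    graph:   dict mapping each person to a set of associates
             (e.g., built from co-arrest records)
    flagged: set of people previously tied to shootings
    w1, w2:  invented weights for one-hop and two-hop ties
    """
    score = 0.0
    visited = {person}
    frontier = deque([(person, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == 2:
            continue  # stop expanding past two degrees
        for neighbor in graph.get(node, ()):
            if neighbor in visited:
                continue
            visited.add(neighbor)
            if neighbor in flagged:
                score += w1 if depth == 0 else w2
            frontier.append((neighbor, depth + 1))
    return score

# A toy network: A knows B, B knows C. If C is flagged, A gets a
# nonzero score despite having no record and no direct tie to C.
graph = {"A": {"B"}, "B": {"A", "C"}, "C": {"B"}}
print(two_hop_score(graph, "A", flagged={"C"}))  # 0.5
```

Note that this is precisely how a person with a clean record can land on such a list: the score is a property of the network around a person, not of anything the person has done.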

I’m still consistently surprised by how often things we worry about in the abstract actually show up in the real world. For years, privacy advocates have been warning about how databases might be mined by the authorities for information used to label, sort, and prejudge people. True, there are all too many precedents for this sort of thing, including the CAPPS II program proposed early in the Bush Administration, the nation’s terrorist watch lists, various police gang lists, and the Automated Targeting System. The TSA’s Pre-Check whitelist is also a cousin of this kind of program. All are based on using various information sources and grinding them through one or another logic engine to spit out a judgment about individuals and their supposed dangerousness or safeness as human beings. But still, this program amazes me in how starkly it replicates the kinds of things we have been warning about in many different contexts.

Just two weeks ago, for example, I was asked by several news outlets what we think about police officers using Google Glass. I told them that Glass is basically a body camera, and that the issues were the same as those outlined in our white paper on police use of that technology. The principal difference between Glass and the body cameras being marketed to police is that Glass can also display information. I said this shouldn’t be a problem—unless (I added almost apologetically because of the slightly fanciful nature of this point) the police started using them with face recognition to display some kind of rating or warning for individuals who have been somehow determined to be untrustworthy.

“Of course, that’s not a problem today,” I said, “it’s more of a futuristic concern.”

Ha! Barely a week later, that scenario doesn’t seem so futuristic to me anymore, especially at a time when some want to use face recognition to warn them when someone on a blacklist tries to enter a store or school. (True, Google doesn’t currently permit FaceRec apps on Glass, but it’s unclear how long that will last.)

Some further points and questions about Chicago’s heat list:

  • The principal problem with flagging suspicious individuals in this way may be the risk of guilt by association. Although we don’t know how valid, accurate, and fair the algorithm is, it’s important to note that even if its measures were statistically valid—that one particular individual really does have an increased risk of crime because of certain things about his or her life—it may still constitute guilt by association for a person who actually remains innocent. It is simply not fair for people to be subject to punishments and disadvantages because of the groups they belong to or what other people in similar circumstances tend to do. I keep going back to the example of the man whose credit rating was lowered because the other customers of a store where he shopped had poor repayment histories.
  • Why should the police restrict their heat list to 400? Why not 4,000 or 40,000? In fact, why not give every citizen a rating, between 1 and 100 say, of how “risky” they might be? Then the police could program their Google Glass to display that score hovering above the head of every person who comes into their field of vision. This is a path it’s all too easy to see the police sliding down, and one we should not take even the first steps towards.
  • Remember too the point (which I made here) that there are a vast number of laws on the books, many complicated and obscure, and anyone who is scrutinized closely enough by the authorities is far more likely to actually be found to have run afoul of some law than a person who isn’t. In that respect, inclusion on the list could become a self-fulfilling prophecy.
  • Will the Chicago police carry out any kind of analysis to measure how effective this technique is? Will they look at the success of their predictions, search for any discriminatory effects, or attempt to find out whether these rankings become a self-fulfilling prophecy? (A sketch of what such an audit might involve follows this list.) The police often have little inclination to do any such things—to adopt rigorous criteria for measuring whether their new toys and gizmos are providing a good return on investment. Purely from an oversight point of view, every aspect of this program would ideally be made public so the world could scrutinize it—certainly the algorithm. Privacy concerns, however, suggest that the names of individuals who are (quite possibly totally unfairly) flagged by these algorithms not be made public, nor any personal data that is being fed into the algorithms.
  • A Chicago police commander is quoted as saying, “If you end up on that list, there’s a reason you’re there.” This framing begs the question at the heart of this approach: is it valid and accurate? Such circular logic is genuinely frightening when it comes from a police officer talking about matters of guilt and innocence.
  • It’s true that there could be a fine line between laudable efforts to identify and help “at-risk youth,” and efforts to tag some people with labels that are used to discriminate and stigmatize. Research on the “epidemiology of violence” could be valuable if used as part of a public health approach to crime. But if it’s part of a criminal justice “pre-crime” approach, then that’s where the problems arise.
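
On the audit question raised above: even a minimal evaluation would compare the list’s predictions against observed outcomes and compare flag rates across demographic groups. The sketch below assumes a hypothetical labeled dataset—the field names and the 0.8 “80% rule” threshold are my assumptions, not anything the CPD has published—and shows how cheap such checks would be to run.

```python
# Hypothetical audit sketch; field names ("on_list", "outcome", "group")
# describe an assumed dataset, not any real CPD data release.

def precision(records):
    """Of the people flagged on the list, what fraction actually had
    the predicted outcome? Low precision means many false positives."""
    flagged = [r for r in records if r["on_list"]]
    return sum(r["outcome"] for r in flagged) / len(flagged) if flagged else 0.0

def flag_rate(records, group):
    """Fraction of a demographic group that ends up on the list."""
    members = [r for r in records if r["group"] == group]
    return sum(r["on_list"] for r in members) / len(members) if members else 0.0

def flag_rate_ratio(records, group_a, group_b):
    """Ratio of flag rates between two groups. Ratios far from 1.0
    (the "80% rule" from employment-discrimination law is one common
    benchmark) suggest a disparate impact worth investigating."""
    rate_b = flag_rate(records, group_b)
    return flag_rate(records, group_a) / rate_b if rate_b else float("inf")
```

The point is not that these particular metrics are the right ones; it is that this level of self-scrutiny is technically trivial, so the obstacle is institutional will, not technique.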

Overall, the key question is this: will being flagged by these systems lead to good things in a person’s life, like increased support, opportunities, and chances to escape crime—or bad things, such as surveillance and prejudicial encounters with the police? Unfortunately, there are all too many reasons to worry that this program will veer towards the worst nightmares of those who have been closely watching the growth of the data-based society.
