Unpiloted Aerial Systems Ethics


being open, transparent, and friendly while making, flying, and taking pictures from balloons, kites, and drones

Cover image by Christopher F. Smith.

Published in the July 2012 Grassroots Mapping Forum #2. Order online.

Informal conversations on ethics and hazards keep popping up on the Grassroots Mapping mailing list, and it seems about time to dive deeper into these issues. So I reached out to our mailing list, word got around, and a few great folks and I got together for a phone call, which we all then edited down into this. Who are we? Raymond Cha, a UX designer working on digital map interfaces and a grassroots mapper around the Gowanus Canal; Coby Leuschke, President of Metonymy and Rocketship Systems and an open source UAS developer; Cameron Hunt, Director of Bitworld, a non-profit working on data security; and Amie Stepanovich, Counsel at the Electronic Privacy Information Center (EPIC), privacy advocate, and congressional witness on surveillance issues, who also helped prepare a great note on surveillance and US law.
-Mathew Lippincott, Public Laboratory co-founder, balloon and kite developer

Mathew: Let’s start with what distinguishes the ethics of surveillance using Unpiloted Aerial Systems (UAS) from the ethics of surveillance using airplanes. Why do UAS’s change the ethics? Their low cost and accessibility? Their 24/7, ubiquitous operation? Their size and maneuverability in spaces planes can’t go? Or their automation— eliminating or minimizing human decision makers?

Amie: I’ve talked about these four points with other people, but in the surveillance field, I’ve focused on what data is collected. From our perspective there are two main differences from past aerial surveillance: UAS’s are cheaper, operate longer, and therefore bring on more surveillance, and they are potentially smaller, and can peer into windows, get into office spaces, and therefore surveil much more than, say, a helicopter.

Cameron: To add to what Amie is saying about ubiquity, from my perspective, it’s the automation driving down costs and increasing the possibility of 24/7 surveillance. The fact that I can put up multiple inexpensive planes with a low human labor burden is the central factor. At what point does ubiquity make aerial surveillance a different type of thing? At what point does a shift in scale become a shift in type? The size is significant, because law enforcement can now fly over fences and into your backyard, around obstacles, and potentially up to the window of my house— call it personal airspace. How will that be dealt with?

Coby: I don’t want a drone over my house, looking in my backyard. I have a 6 ft privacy fence— I have a reasonable expectation of privacy. And I make these things. I’d like these questions of ethics and the law answered sooner rather than later, because we’re looking at things like natural resource management and precision agriculture, and I think it’s in everyone’s interests to get these questions answered up front, and see if we can get some best practices and regulations in place that protect people. There are a lot of things I want to be able to do with these tools that will, for lack of a better word, be impacted by the more sensational use cases. I just want us to have a reasonable framework where police can do their job, we can make the tools, and they get used in the right way.

Raymond: I’m coming from a data side, so some things Cameron said stuck in my mind. Technology and behavior evolve faster than formal and informal ethical codes can develop, and we’re still in that period where we’re trying to catch up. Ubiquity and automated data analysis are changing our notions of surveillance, not just how we as citizens use it, but the way governments use it. More expansive uses of surveillance and their ubiquity are going to amplify surveillance along two vectors— we’re going to see surveillance technology used more frequently and in new kinds of situations.

Mathew: This next prompt was inspired by Coby’s analogy on our list: A UAS “is a tool; like a hammer I can use it to build a house, or hit someone over the head.” Are there ‘good’ and ‘bad’ objects? What ethics play into the design of systems? Should we seek to design objects to limit user behavior to ethical behavior? Can we reduce ethics to programmed, hard laws, or is there nuance that requires case-by-case considerations?

Raymond: I definitely agree that all technologies are tools that can be put to a number of uses, good and bad. But designs are created through the designer’s ethical lens, and he or she has a responsibility to design to limit bad outcomes— both for the user and for everyone else. We can’t control for everything, especially what happens when the tool is in the wild, but the designer can consider the worst cases and unintended uses, and apply their ethical lens to the design process.

Coby: In my role as a designer, I definitely don’t want to make any bad tech, but that’s a relative thing for most folks. We don’t purposely design in limitations, but there are aspects of the design that limit what one can do. We’re dealing with small systems and it’s very difficult for them to become weaponized. For us, and especially my company, our designs revolve around our values and ethics statements, which say that we won’t be involved with any projects that have to do with weaponized UAS’s. We won’t sell to the military directly, but if one of our contracts ends up in a military use for surveillance, I wouldn’t have a problem with that. I do have a problem with them becoming weaponized. For us it’s almost pragmatic— most of what we’re being asked to prototype right now are things that can be operated out of the back of an SUV and transported fairly easily. And when you look at what will probably come out of the FAA— and this is just a guess on our part based on what came out of the recommendations of the Aviation Rulemaking Advisory Committee that we’ve been sitting on since way back in ’09— it looks like stuff that’s less than 2 kilos (4.4 pounds) is probably going to have less regulatory burden placed on it. So for us it’s a matter of size, scale, and speed. We don’t expect to do anything more than 30 mph or build anything bigger than 2 kilos. I’m sure you’ve seen the news stories of the guy who got arrested who was supposedly trying to use RC aircraft as a weapon. Right now the technology is already there to do bad things. For us, it doesn’t make a whole lot of sense to worry about what people will do with our designs. I mean, I can go get a foam model aircraft that can do 100 mph for a few hundred bucks.

Mathew: Yeah, there are a million ways to cause problems— we have to ask, is our hardware really making it easier? Probably not. We share similar design constraints in that we have regulatory limits: 5 pounds for kites and, for balloons, 115 cubic feet of gas or 6 ft in diameter. We aren’t creating anything big enough, sharp enough, or fast enough to do real harm. We have to take precautions, but it’s hard to do a lot of damage at this scale.

Coby: Right. I can do a lot more damage with my truck. We focus on the positive use cases, natural resource management, disaster relief, humanitarian assistance. We’re trying to keep it open, transparent, and accountable, and for us that’s a good way to do business. If we’re going to design something we open source it, put it out there, let the community comment on it, improve it, maybe understand it a bit better, so we don’t have such intense fear of technology that’s actually already out there.

Mathew: I really admire your company for putting out those reference designs and open sourcing those.

Amie: I can’t officially speak for EPIC, but I am very pro open source. People can come in and see how it’s built and understand the design. The fact that someone can come in and modify it in a nefarious fashion is outweighed by the benefits of openness.

Coby: Obviously it has some ethical questions, but there is transparency to it. Can a designer design ethically and be proprietary? I don’t see why not.

Amie: I agree that there are generally not good and bad objects, and do believe that the ethics are decided by the user. I think it’s important for the designer— and that’s not myself, I’m not a designer— to look at the worst case scenario. In the realm of surveillance, the worst case would be the ubiquitous surveillance of an individual. Maybe the ethical line is adding some sort of alert, so if this tool is used around a human being, that person is alerted that the tool is there. The designer may make a decision to allow the individual user to turn the alert off, but they’ve prevented the worst case by default before turning the system out into the world.

Raymond: You could also design into a UAS a tracker, and somehow watermark who is taking pictures and when. That’s a design choice a designer can make, and it defines how that object will function.

Mathew: Our MapKnitter workflow enables a sort of watermarking, because the mapmaker, original image, and date of every image are saved with the map. That’s a level of accountability we see as crucial. I also like Amie’s point of alerting people to images taken of them, and allowing them to opt out. One thing I like about kites and balloons is that they’re fairly obvious, and people often follow the line back to the operator. That fosters interaction, and gives the surveilled more opportunities to opt out before any imagery hits a network.
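
[To make this concrete, here is a minimal sketch of the kind of provenance “watermarking” Raymond and Mathew describe: stamping an operator name, a project note, and a capture time into a photo’s EXIF metadata. The library (piexif), the field choices, and the file names are assumptions for illustration, not a description of MapKnitter’s actual implementation. -ed]

    import datetime
    import piexif  # assumption: the piexif EXIF read/write library is installed

    def stamp_provenance(jpeg_path, operator, project):
        """Write operator, project, and capture-time tags into a JPEG in place."""
        exif_dict = piexif.load(jpeg_path)  # keep whatever EXIF already exists
        now = datetime.datetime.utcnow().strftime("%Y:%m:%d %H:%M:%S")
        exif_dict["0th"][piexif.ImageIFD.Artist] = operator
        exif_dict["0th"][piexif.ImageIFD.ImageDescription] = project
        exif_dict["Exif"][piexif.ExifIFD.DateTimeOriginal] = now
        piexif.insert(piexif.dump(exif_dict), jpeg_path)

    # Hypothetical usage: tag a balloon photo before it is uploaded anywhere.
    stamp_provenance("balloon_frame_0042.jpg", "Jane Mapper", "Gowanus Canal shoreline survey")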

Amie: One of the things I have to keep clarifying for people who think that privacy advocates are anti-progress is that we just want certain protections built in; we want transparency and accountability. I personally believe that the transparency in the program you’ve just described is a great start.

Cameron: To that end, one of the things Coby and I discussed is whether we can take a page out of what Google has done, automatically blocking license plates and faces. That is something we could insert into the video stream, with the ability to remove it, but engineered into the architecture, so we have basic privacy protections for some of the most obvious things.
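
[A rough sketch of what Cameron describes, not Google’s or any vendor’s actual pipeline: detect faces in a frame with OpenCV’s bundled Haar cascade and blur them before the frame is stored or streamed. Detecting license plates would need a separately trained detector; the file names here are made up for the example. -ed]

    import cv2  # assumption: OpenCV (opencv-python) with its bundled Haar cascade data

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def redact_faces(frame):
        """Return a copy of the frame with any detected faces Gaussian-blurred."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        out = frame.copy()
        for (x, y, w, h) in faces:
            out[y:y + h, x:x + w] = cv2.GaussianBlur(out[y:y + h, x:x + w], (51, 51), 0)
        return out

    # Hypothetical usage: redact a single captured image before it leaves the camera rig.
    frame = cv2.imread("aerial_frame.jpg")
    cv2.imwrite("aerial_frame_redacted.jpg", redact_faces(frame))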

Mathew: Trying to build firm ethical laws into designs is hard. To me, tools that encourage continuous ethical dialogue are better than those that require hard rules. With surveillance technologies, that means tools where direct engagement and negotiation between observer and observed is hard to avoid. Personal cameras have this— anybody can take public photos, but they expose themselves and have to negotiate with their subjects. Satellites are the opposite— the cameras are up there snapping away, so the response is to push for hard rules, like blacking out access to certain areas in software. But I’d rather not rely on access conditions or blackouts of sensitive data; I’d like to see systems where people can pre-empt the collection of sensitive data.
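
[One way to read “blacking out access to certain areas in software” is a simple geofence check on the camera trigger: if the rig’s position falls inside a predefined sensitive area, no image is captured. The sketch below uses the shapely library and made-up coordinates purely for illustration; it is not a Public Laboratory tool. -ed]

    from shapely.geometry import Point, Polygon  # assumption: shapely is available

    # Hypothetical no-capture zone, defined by (longitude, latitude) corners.
    NO_CAPTURE_ZONE = Polygon([
        (-73.995, 40.673), (-73.990, 40.673),
        (-73.990, 40.677), (-73.995, 40.677),
    ])

    def capture_allowed(lon, lat):
        """Return False when the camera position is inside the blacked-out area."""
        return not NO_CAPTURE_ZONE.contains(Point(lon, lat))

    # Hypothetical usage: check before triggering the shutter.
    if capture_allowed(-73.992, 40.675):
        print("trigger camera")
    else:
        print("inside a no-capture zone; skipping this frame")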

That brings up our next prompt: What are the ethics of data collection? What are the rights of the observed, the obligations of the observers? When do aggregations of data cross a tipping point? Can a collection of different databases, each ethically collected, become an unethical and intrusive aggregate?

Amie: When you’re collecting aerial imagery, there’s a chance that private information is collected, because faces may be in the imagery or the land may be private. At EPIC we feel that the Code of Fair Information Practices governs all of this data collection.

Mathew: I like that. Would you say it’s important to provide people with the chance, retrospectively, to opt out?

Amie: Yes, that is incredibly important. At EPIC, we typically fight for opt-in consent for all personally identifiable information. And even if someone says that they are OK with their data being used for one purpose— say they agree to allow aerial images of their property for mapping— and the actual collection of data goes beyond that limit— for example, it collects an individual’s face— there has to be a way to correct that unless additional consent is obtained. It’s almost as if you say you’re going to take pictures, and then you take them and I see that I’m in them; I would say, “I’d really like to have my face blurred out in this picture.” Maybe even, if there is a well-known piece of land, even if it’s legal to take pictures of it, some feature that is picked up may be uniquely attributable to a person. These are all things to consider when you talk beyond the legal concerns and really dig into the ethics of what we should be collecting imagery of.

[mapping fisheries in public waters is a contentious issue for these reasons. -ed]

Raymond: While I’m sensitive to these opt-out provisions, in reality they can be very hard to implement. Say you decide to blur out a person in a photo, but by cross-referencing property boundaries and longitude/latitude information, that person is identifiable. When data is merged, those opt-out standards may not work.

Coby: From the technical side, if real estate sites like Trulia have access to all the assessor records that define my property, why can’t there be a universal opt-out, so I’d have to opt in to any data service? None of those people on Trulia opted in to a single thing other than buying a house, and it’s public record. It’s tough. I mean, I have a photography background, and I think: if I put a balloon, UAV, take your pick, up without asking my neighbors’ permission, do I have a right to take a photo of someone’s backyard even if they have a 6 ft privacy fence? I’d say no, it’s an invasion of privacy.

Mathew: Public Laboratory’s policy towards image collection is to either do it on public land, being very public while doing it, or, if we’re over private land, to get consent to photograph the space. We try to be proactive and identify ourselves. Thinking of an example to Coby’s point— my neighbor can report me if they think I’m watering my plants during a drought; would it change anything if a balloon and camera were used?

Amie: Now the ethics are fairly difficult, but if we talk about this from a legal perspective, as long as you aren’t out at night and using advanced imagery to determine how much water is being used underneath the soil, it’s perfectly legal. Even if you have a 6 or 7 foot fence, it’s legal to fly overhead and see what’s going on in someone’s backyard. That said, if you’re frequently taking pictures in your neighbor’s backyard, what story could that collection tell that your neighbor wouldn’t want it to tell? This is similar to other examples of aggregation. For instance, a GPS tracker on someone’s car, tracking their movements over a week or a month wherever they go— I’ve seen studies where someone has looked at a GPS track or cell phone location data and can tell where your house is and where your job is. Perhaps you have to go to a doctor every week; someone can tell what doctor you visit and perhaps what condition you have. More data is not an unalloyed good.

Aerial surveillance and US law

by Amie Stepanovich

Under Katz v. United States, the Fourth Amendment protects a person’s reasonable expectations of privacy. The plain view doctrine explains that there is no expectation of privacy as to things that are visible to the public. In California v. Ciraolo the Supreme Court concluded that a suspect did not have a reasonable expectation of privacy as to aerial surveillance conducted with the naked eye from an altitude of 1,000 feet. On the same day, in Dow Chemical Co. v. United States, the Supreme Court held that the EPA did not violate the Fourth Amendment when it conducted aerial surveillance of a chemical plant, because the facility was like an “open field,” and in open fields there is no privacy interest. The Court suggested that its holding was conditioned on the fact that the EPA did not use high-tech equipment to conduct its surveillance. Finally, in Florida v. Riley the Supreme Court held that there was no Fourth Amendment violation when police conducted surveillance from a helicopter flying at 400 feet. Writing for a plurality, Justice White concluded that there was no Fourth Amendment violation because any member of the public could fly at 400 feet, so the surveillance was valid under the plain view doctrine.

Recently, the Supreme Court decided US v. Jones, holding that police had to obtain a warrant before using a GPS device to track a suspect’s location every day for a month. The case was decided on property grounds, but a strong concurrence by Justice Alito indicated that the long-term monitoring violated the suspect’s reasonable expectation of privacy. Alito noted that privacy expectations were bound to change as technology evolves and that “[i]n circumstances involving dramatic technological change, the best solution to privacy concerns may be legislative.” Upcoming Supreme Court cases Florida v. Jardines and Florida v. Harris will ask whether a person’s expectation of privacy is violated if a search is designed only to detect the presence of contraband.

Code of Fair Information Practices

from U.S. Dep’t of Health, Education and Welfare, Secretary’s Advisory Committee on Automated Personal Data Systems, Records, Computers, and the Rights of Citizens viii (1973)

  • There must be no secret databases.
  • There must be a way for a person to find out what information about the person is in a record and how it is used.
  • There must be a way for a person to prevent information about the person that was obtained for one purpose from being used or made available for other purposes without the person’s consent.
  • There must be a way for a person to correct or amend a record of identifiable information about the person.
  • Any organization creating, maintaining, using, or disseminating records of identifiable personal data must assure the reliability of the data for their intended use and must take precautions to prevent misuses of the data.