_being open, transparent, and friendly while making, flying, and taking pictures from balloons, kites, and drones_

Cover image by [Christopher F. Smith](http://www.mylittledroney.com/). Published in July 2012 in [Grassroots Mapping Forum #2](http://publiclab.org/forum). [Order online.](http://store.publiclab.org/products/grassroots-mapping-forum-issue-2)

Informal conversations on ethics and hazards keep popping up on the Grassroots Mapping mailing list, and it seems about time to dive deeper into these issues. So I reached out to our mailing list, word got round, and a few great folks and I got together for a phone call, which we all then edited down into this.

Who are we? Raymond Cha, a UX designer working on digital map interfaces and a grassroots mapper around the Gowanus Canal; Coby Leuschke, President of Metonymy and Rocketship Systems and an open source UAS developer; Cameron Hunt, Director of Bitworld, a non-profit working on data security; and Amie Stepanovich, Counsel at the Electronic Privacy Information Center (EPIC), privacy advocate, and congressional witness on surveillance issues, who also helped prepare a great note on surveillance and US law.

-Mathew Lippincott, Public Laboratory co-founder, balloon and kite developer

Mathew: Let’s start with what distinguishes the ethics of surveillance using Unpiloted Aerial Systems (UAS) from that using airplanes. Why do UASs change the ethics? Their low cost and accessibility? Their 24/7, ubiquitous operation? Their size and maneuverability in spaces planes can’t go? Or their automation, eliminating or minimizing human decision makers?

Amie: I’ve talked about these four points with other people, but in the surveillance field, I’ve focused on what data is collected. From our perspective there are two main differences from past aerial surveillance: UASs are cheaper and operate longer, and therefore bring on more surveillance; and they are potentially smaller, and can peer into windows, get into office spaces, and therefore surveil much more than, say, a helicopter.

Cameron: To add to what Amie is saying about ubiquity, from my perspective, it’s the automation driving down costs and increasing the possibility of 24/7 surveillance. The fact that I can put up multiple inexpensive planes with a low human labor burden is the central factor. At what point does ubiquity make aerial surveillance a different type of thing? At what point does a shift in scale become a shift in type? The size is significant, because law enforcement can now fly over fences and into your backyard, around obstacles, and potentially up to the window of my house (call it your personal airspace). How will that be dealt with?

Coby: I don’t want a drone over my house, looking in my backyard. I have a 6 ft privacy fence; I have a reasonable expectation of privacy. And I make these things. I’d like these questions of ethics and the law answered sooner rather than later, because we’re looking at things like natural resource management and precision agriculture, and I think it’s in everyone’s interest to get these questions answered up front and see if we can get some best practices and regulations in place that protect people. There are a lot of things I want to be able to do with these tools that will, for lack of a better word, be impacted by the more sensational use cases. I just want us to have a reasonable framework where police can do their job, we can make the tools, and they get used in the right way.
Raymond: I’m coming from the data side, so some things Cameron said stuck in my mind. Technology and behavior evolve faster than formal and informal ethical codes can develop, and we’re still in that period where we’re trying to catch up. Ubiquity and automated data analysis are changing our notions of surveillance, not just how we as citizens use it, but the way governments use it. More expansive uses of surveillance and their ubiquity are going to amplify surveillance along two vectors: we’re going to see surveillance technology used more frequently and in new kinds of situations.

Mathew: This next prompt was inspired by Coby’s analogy on our list: a UAS “is a tool; like a hammer I can use it to build a house, or hit someone over the head.” Are there ‘good’ and ‘bad’ objects? What ethics play into