stories from the Public Lab community
Compressed autos at Mystic River scrap yard, Everett, Massachusetts, 1974. Spencer Grant/Getty Images. CC-NC-SA
In 2015 the New America Foundation asked @Shannon and me to write a chapter for their Drone Primer on the politics of mapping and surveillance. I worked in an example of positive citizen surveillance by the Conservation Law Foundation (CLF) that I’d heard about in a session at the 2015 Public Interest Environmental Law Conference. I’ve excerpted and adapted my writeup of the CLF case as a part of our ongoing Evidence Project series. If you know of similar cases please get in touch!
Geo-tagged aerial and street-level imagery on the web can be a boon to both environmental lawyers and the small teams of regulators tasked by US states with enforcing the Clean Water Act. Flyovers and street patrols through industrial and residential districts can be conducted rapidly and virtually, looking for clues to where the runoff in rivers is coming from. By combining aerial and street-level photographs with searchable public permitting data, advocates can make the 1972 Clean Water Act's stormwater regulations more enforceable in practice than they have ever been (Alsentzer et al., 2015).
State and federal environmental agencies often lack the time or resources to adequately enforce permits under the National Pollutant Discharge Elimination System (NPDES), which regulates construction and industrial stormwater runoff, and roughly half of facilities violate their stormwater permits every year (Russell and Duhigg, 2009). Enforcement can be picked up by third parties, however, because NPDES permits are public. Plaintiff groups and legal teams conduct third-party enforcement through warnings and lawsuit filings. Legal settlements recoup the plaintiffs' legal costs, and can also include fines whose funds are directed toward community-controlled Supplemental Environmental Projects that help improve environmental conditions in the violator's watershed. The Conservation Law Foundation (CLF), a Boston-based policy and legal non-profit, operates in precisely this manner, recouping its costs through lawsuits and directing funds to Supplemental Environmental Projects in the Mystic River Watershed.
In 2010 a neighborhood group approached the CLF about a scrap metal facility on the Mystic River. Observable runoff demonstrated that the facility had never built a stormwater system, and a quick search of US Environmental Protection Agency (EPA) NPDES permits revealed that it had never applied for or received one. The facility was flying under the EPA's enforcement radar, and so were four of its neighbors.
Between 2010 and 2015, CLF's environmental lawyers initiated 45 noncompliance cases by looking for industrial facilities along waterfronts in Google Street View, and then searching the EPA's stormwater permit database for each facility's address. Most complaints are resolved through negotiated settlement agreements, in which the facility owner or operator funds Supplemental Environmental Projects for river restoration, public education, and water quality monitoring that can catch other violators. Together, CLF and a coalition of partners such as the Mystic River Watershed Association are creating a steady stream of revenue for restoration, education, and engagement in the environmental health of one of America's earliest industrial waterways.
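The workflow described above amounts to a cross-check between a list of facility addresses observed in street-level imagery and the public permit records. Here's a minimal sketch of that idea in Python; the facility names, addresses, and permit IDs are entirely made up for illustration, and real NPDES lookups would go through the EPA's own search tools rather than a local list:

```python
# Sketch of the CLF-style cross-check: facilities spotted in street-level
# imagery vs. a list of NPDES stormwater permit records.
# All data below is invented for illustration.

def normalize(address):
    """Crudely normalize an address so minor formatting differences still match."""
    return " ".join(address.lower().replace(".", "").replace(",", "").split())

def unpermitted(observed_facilities, permit_records):
    """Return observed facility addresses with no NPDES permit on file."""
    permitted = {normalize(rec["address"]) for rec in permit_records}
    return [f for f in observed_facilities if normalize(f) not in permitted]

observed = ["12 Mystic Ave., Everett, MA", "48 River Rd, Everett, MA"]
permits = [{"permit_id": "MAR05XXXX",  # placeholder ID, not a real permit
            "address": "48 River Rd, Everett, MA"}]

print(unpermitted(observed, permits))  # → ['12 Mystic Ave., Everett, MA']
```

Facilities that turn up in this "no permit found" list are candidates for closer review, just as in the 2010 scrap yard case above.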
Regardless of their effect, legal threats are stressful, often expensive, and can take years to resolve. Even when threatened polluters are acting in good faith to clean up their systems, the process of identifying and persuading companies to comply with environmental regulations can strain relationships in communities. Non-compliant small businesses on the Mystic River that have been in operation since before the Clean Water Act was passed in 1972 may never have been alerted to their obligations under the law. Their absence from the EPA database reflects mutual ignorance: bureaucrats unaware of the businesses, and businesses unaware of the bureaucracy. However, businesses bear the direct costs of installed equipment, staff time, and facility downtime; indirect costs to professional reputation from delayed operations or identification as a polluter; and transactional costs of legal assistance and court fees. Indirect and transactional costs are hidden punishments that can accrue regardless of guilt or readiness to comply.
To combat the negative perceptions that can accrue from the use of legal threats, CLF works proactively to fit itself into a community-centered watershed management strategy. CLF and its partners run public education and outreach campaigns, and begin by issuing warnings that aren't court-filed (Alsentzer et al., 2015). Identifying and working with businesses operating in good faith is a tenet of community-based restoration efforts. By using courts as a last resort and participating in public processes where citizens can express the complexity of their landscape relationships, CLF and its partners are increasing participation in environmental decision-making and establishing the legitimacy of restoration and enforcement decisions.
Regulations and permit databases can often be tough to put to work, but the CLF's case was fairly straightforward: they simply searched for companies' addresses in a publicly available database. We would love to hear of more groups using this approach or other simple modes of regulatory engagement.
Excerpted and adapted from Mathew Lippincott with Shannon Dosemagen, "The Political Geography of Aerial Imaging," in Drones and Aerial Observation, 19-27. New America Foundation, 2015.
Alsentzer, Guy, Zak Griefen, and Jack Tuholske. 2015. "CWA Permitting & Impaired Waterways." Panel session at the Public Interest Environmental Law Conference, University of Oregon.
Conservation Law Foundation. "Coming Clean." Newsletter, Winter 2014.
Denison, D.C. "Conservation Law Foundation Suing Alleged Polluters." Boston Globe, May 10, 2012.
Russell, Karl, and Charles Duhigg. "Clean Water Laws Are Neglected, at a Cost in Suffering." The New York Times, Sept. 12, 2009. Part of the Toxic Waters series.
Follow related tags:
evidence epa blog water
Above: a sketch exploring how to organize "air" into a research area: which methods are part of the research area, and which activities would go on which grid. Photo by @nshapiro
We've been having some fun discussions over the past couple months with people on each of the topical lists about what to name the new "top-level" pages where we're organizing. That means -- when posting activities, do they end up on /wiki/balloon-mapping or /wiki/aerial-photography? Do we use the older /wiki/spectrometer page, or the new one at /wiki/spectrometry? But we're hoping for even MOAR discussion!
Let's think about:
So far we've created drafts of:
When naming new pages, some things to consider are that names should be:
Looking ahead, we have more naming to do! There are some mismatched names:
We'd really like to hear from a wide selection of voices about naming! Please pile on in the comments! Thank you!
Follow related tags:
blog with:warren with:cfastie with:nshapiro
In our continuing shift towards using the new Q&A feature and the new Activity grids as a framework for collaboration on PublicLab.org, we're encouraging people to post their work more in the spirit of Instructables.com -- "showing each other how to do something" rather than just telling people about something you've done. This shifts the emphasis from solely documenting what you've done, to helping others do it too. (image above from a Lego Technic kit)
There are several reasons we like this. A how-to guide (what we're calling Activities) must have extremely thorough and easy-to-follow steps (and may need to be revised if people get stuck). Perhaps even more importantly, its success (we hope) can be measured by how many people are able to follow the steps successfully, which exercises and fuels the power of broad communities and open science.
While there are various types of activities for various purposes, all of them ought to set out some basic information to help people get started:
Speaking of room for improvement, can folks suggest other important parts of an activity? With an eye toward making it easy for anyone to write and post activities, and for others to replicate them, what's the minimum necessary?
(IKEA Stonehenge. Justin Pollard, John Lloyd, and Stevyn Colgan designed an IKEA manual for Stonehenge, publishing it under the title HËNJ in the QI 'H' Annual)
We'd also like to suggest that people post things early -- to share ideas, solicit input, and acknowledge that most posted activities will go through some (if not many) revisions as people try them out and offer feedback. Could we even have a separate "Publish Draft" button so they're clearly marked as such, and people know they're encouraged to share early and often?
One important way to increase the chances that people will complete a replication of your activity, we think, is simply to write shorter activities -- perhaps breaking a longer set of steps into several related modules. Instead of one long and complex activity, a few shorter ones -- each with a simple way to verify that the steps so far were completed correctly -- are much more accessible, and will tend to separate distinct possible causes of failure for easier troubleshooting.
Distinct modular activities can be linked and referenced to create a larger activity that might span, for example, building a tool and verifying that it functions properly, calibrating it, and running lab or field tests of various materials with it. Even if the final activity cannot be completed without the previous ones, breaking the work into distinct activities that build on each other will help the onboarding process.
Finally, beyond this overview, what more can we do to make it easy to write good activities? Some have suggested a kind of "assistance group" who could provide helpful tips and constructive critique to people posting on Public Lab. This sounds like a great idea, and potentially extra helpful to folks who are hesitant or unsure of what makes a good and thorough post.
Would "activity templates" be useful, to the extent that they can be generalized?
We're also, of course, posting some example Activities, such as this spectrometer calibration activity, which we hope will help set some conventions.
We're also interested in how people could be introduced to other activities on a topic once they complete the current one -- maybe there's a "sequence" of activities that grow in complexity? Or we could display a mini activity grid of "related activities" at the bottom of each one?
Finally, we're trying to figure out how people can request an activity for something they want to learn to do, but for which there is not yet an activity posted. This'll be especially important as we're starting out, since we have very few complete activities posted -- but it'll also be a great starting place for people hoping to share their knowledge and expertise. Our initial stab at this is to list "limitations and goals" for a given kit, clearly explaining the problem we'd like to solve. This is actually a list of questions using our new questions system -- and we imagine people might post an activity, then link to it as a proposed answer.
This is all quite new, and we'd love to hear other ideas for how this could work. And of course, if you're interested in giving it a try and writing an Activity, please do! Activity grids are going up on many wiki pages across the site, so if you have questions about where and how to post, please leave them in the comments below. Thanks!
Follow related tags:
collaboration community leaffest blog
Asking and answering questions is at the very heart of Public Lab. It's how we get started, it's how we make progress, and it's how we get to know each other and our environmental concerns. Dedicated readers will recognize that some "getting started" exchanges have been repeated countless times on the mailing lists. (PS To those of you who are high-volume question answerers -- everyone is endlessly grateful for your responses!) While it's critical that questions from newcomers, however repetitive, always be welcome, generating a Frequently Asked Questions (FAQ) grid will lower the barrier to exchanging information.
There are two parts to the new automated FAQ system:
1) The new Question and Answer system that @Ananyo2012 built into the plots2 codebase this summer is up and running.
See it here: https://publiclab.org/questions And read more about it here: https://publiclab.org/wiki/public-lab-q-and-a
2) The FAQ Grid is a variation of the Activity Grid, in that it's also generated by a powertag and sorted by Likes.
FAQs will be on every "top-level" research page; see one here: https://publiclab.org/wiki/spectrometry#Frequently+Asked+Questions
You can add an automated FAQ grid to any wiki page by using this code:

```
## Frequently Asked Questions
```

the button where people can ask a new question:

```
<a class="btn btn-primary" href="/post?tags=question:spectrometry&template=question">Ask a question about spectrometry</a>
```

the grid itself:
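The grid powertag itself doesn't appear above; judging from the `[activities:spectrometry]` pattern used for Activity Grids elsewhere on the site, it presumably takes the form:

```
[questions:spectrometry]
```

Swap in the topic tag for your own page in both the button link and the grid powertag.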
As @mathew reported back from Write The Docs, pruning an automated system of FAQs is superior to curating a manual one. Further, linking product support directly to documentation is so important that the Kits Initiative will move their knowledge base onto the Q&A, and will interact with customers using Q&A.
Early adopters on method-specific mailing lists might consider subscribing to the relevant question:foo tag on the website (pssst: this is the start of a medium-term plan to move all mailing list interactions onto the website). For instance, spectrometry list members might want to subscribe here: https://publiclab.org/tag/question:spectrometry
Please write in with ideas and new suggestions! What do you think?
Follow related tags:
community blog with:warren with:gretchengehrke
The sheer number of posts on publiclab.org by contributors from all over the world about 1) how to build tools better, and 2) how to use them for making environmental observations is breathtaking, and at times, boggling. As Public Lab has grown, so much content has been generated that it has become unnecessarily difficult to know what the "latest and greatest" version is, what the next development challenges are, or simply where newcomers should begin.
In the past couple weeks, staff have begun "gardening" on PublicLab.org and writing some new web features.
People on the spectroscopy and near-infrared lists have been discussing how to better present the overall research areas to make it easier to get involved. For each of those two research areas, we made a new top-level page. See them at spectrometry and multispectral-imaging. On those new pages, we constructed a couple tables -- the main table organizes relevant research notes into a "ladder" of activities others can replicate. There are columns to describe what type of activity it is, the status of its documentation, and how many people have replicated it.
We made a "Request A Guide" button to capture ideas about what people would like to do but don't see listed yet:
We also drafted two other kinds of tables, one to track upgrades (additions, modifications) that people have made to particular tools (for instance, the desktop spectrometer):
...and another to hold questions related to a particular research area (for instance, spectrometry):
Check out this much easier, automated way to organize content into grids:
After creating the first grids manually, WebWorkingGroup quickly created an automated way to make the grids: a power tag that adds an Activity Grid to your wiki page with just a few characters. An automated Activity Grid fills itself in with all research notes tagged with the keyword you used. Consider the keyword "spectrometry": a grid created on a wiki page with the powertag `[activities:spectrometry]` will pull in all content (notes and questions) tagged with the powertag `activity:spectrometry`. Check it out on https://publiclab.org/wiki/sandbox: look at the tables, then click "edit" to see how they were generated. The tables have various columns, such as "difficulty" (easy, moderate, or hard), which can be filled out by adding more tags to the research notes. We're working on a tagging interface to make tagging less mysterious:
These draft "Activity Grids" are ready for you to test drive! How?
...and to pipe content into the grid, go back to your original notes and add the powertag `activity:spectrometry`. To fill out the columns for each activity, use the tagging interface to add additional powertags, or directly type tags such as `difficulty:easy`.
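Conceptually, the grid generator just selects notes carrying the `activity:<topic>` tag and reads column values (like difficulty) from their other tags. Here's a toy model of that filtering logic in Python; the note titles and data structures are invented for illustration, and the actual PublicLab.org implementation differs:

```python
# Toy model of an activity grid: notes carry tags; the grid selects notes
# tagged "activity:<topic>" and fills a "difficulty" column from other tags.
# Titles and tags below are invented for illustration.

notes = [
    {"title": "Calibrate your spectrometer",
     "tags": ["activity:spectrometry", "difficulty:easy"]},
    {"title": "Scan a CFL bulb",
     "tags": ["activity:spectrometry", "difficulty:moderate"]},
    {"title": "Balloon mapping basics",
     "tags": ["activity:balloon-mapping", "difficulty:easy"]},
]

def activity_grid(notes, topic):
    """Build rows for an [activities:<topic>] grid: (title, difficulty) pairs."""
    rows = []
    for note in notes:
        if "activity:" + topic in note["tags"]:
            # Pull the difficulty column value from a difficulty:<level> tag.
            difficulty = next((t.split(":", 1)[1] for t in note["tags"]
                               if t.startswith("difficulty:")), "unknown")
            rows.append((note["title"], difficulty))
    return rows

print(activity_grid(notes, "spectrometry"))
# → [('Calibrate your spectrometer', 'easy'), ('Scan a CFL bulb', 'moderate')]
```

This is also why adding the `activity:spectrometry` powertag to an older note makes it appear in the grid automatically: the grid is a query, not a hand-curated list.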
If you want any assistance, email email@example.com and we'll help you get it going!
Follow related tags:
collaboration blog with:warren with:gretchengehrke
We've had some tremendous work on Public Lab software this past summer through the Google Summer of Code program, in which five students and several mentors have spent innumerable hours cooking up new features and abilities, both on the PublicLab.org website and in the independent #webjack project.
Even just in the past month, we've seen (via GitHub Pulse):

> Excluding merges, 9 authors have pushed 368 commits to master. 321 files have changed and there have been 7,047 additions and 1,217 deletions.
The program wraps up this week with many of the features having gone live over the past few weeks. Our five students have written up their work in a series of notes, which I'll link to here:
Thanks to all of our mentors for their ongoing input and support, with special thanks to the Community Development team, @liz and @stevie. I'd also like to shout out to @david-days, who put an enormous amount of work into the Advanced Search project, and whose work was just merged for the first time last week in an epic rebase of hundreds of files and thousands of lines of code.
These projects, from including more languages on PublicLab.org to making it easier to find people and resources near you, all have helped to make Public Lab's collaborative model stronger, and we're eager to see how the new features promote the growth of our community.
All of our students this year were extremely productive, and we had our best-ever GSoC program, beyond all doubt. The fast pace of merging (twice weekly) was exciting and really ensured that student work tracked the master branch closely, and that new changes (with corresponding tests) were quickly and consistently integrated into production code instead of drifting off and resulting in larger, more difficult merges later. Thanks to all of our students for keeping up with this fast pace (and occasionally going faster than I could!). It was great to have students who knew how to do pull requests, write and run tests, and rebase their changes to make things efficient, so we could focus on doing great work.
One of the things which really made the difference this year was the way our #new-contributors work helped to ease students' entrance into the codebase, and we've asked the students to, in turn, produce some `help-wanted` and `first-timers-only` issues to draw yet more contributors into the project:
Amazingly, this has worked very well, and two new contributors (carolineh101 and ykl7) have committed code in the past two weeks as a direct result of these outreach efforts. With so many well-documented and welcoming issues, we hope this is just the beginning. See the screenshot below for just a portion of our open issues.
So, all in all, a fantastic summer, and thanks to all who helped out!
Follow related tags:
software gsoc web blog