Public Lab Research note


A reflection on sensor journalism

by abcieslik | October 07, 2014 16:44 | #11247

Perhaps the biggest challenge of sensor journalism is that it is such a new and emerging field. While it is exciting to be at its forefront, there are relatively few guidelines right now, so much of the work is trial and error. There are also plenty of misconceptions about sensor journalism that can only be addressed organically as the field develops.

Going into this semester, I knew very little about sensor journalism. In my mind, it was an exotic field that only well-established news organizations could participate in because of the high cost of sensors. However, as the Tow Center for Digital Journalism reports, “these ‘tools’ needn’t even be high tech” (Berret). As we found with our water testing project, sensors can be built very cheaply using simple technologies. But while sensors can be made affordably, there is the added challenge of limiting yourself to reporting only what the sensor’s findings can legitimately support.

As a journalist, an inquisitive mind and a desire for firm conclusions are in my nature. This underlying urge for clear answers is not without merit in sensor journalism. The Tow Center states, “logic and imagination have as much to do with answering that question [of what can be sensed] as do the technologies” used to do the sensing (Pitt). However, sensors detect only what we design them to detect, and we should not stretch findings to explain more than the sensor is built to report. In the case of our water testing experiment, our sensors were designed to sonically report the conductivity of different water samples from across the Boston area. They were not designed to report what was driving those conductivity levels, so assuming that a higher conductivity level directly corresponds to a higher contaminant level is unsupported. “Engineers and journalists make decisions that affect what can be measured, derived and the analysis that can be made,” and in this case, we made the decision to measure only conductivity, thereby limiting our analysis to those levels alone (Pitt).
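To make the “report sonically” idea concrete, here is a minimal Arduino-style sketch of the same principle. This is an illustration rather than our actual build; the pin assignments, the piezo buzzer, and the voltage-divider probe wiring are all assumptions.

```cpp
// Illustrative sketch (assumed wiring, not our class circuit):
// read a two-probe conductivity sensor wired as a voltage divider
// on analog pin A0 and turn the reading into an audible pitch,
// so that more conductive water produces a higher tone.

const int PROBE_PIN = A0;   // assumption: probe voltage divider on A0
const int SPEAKER_PIN = 8;  // assumption: piezo buzzer on digital pin 8

void setup() {
  pinMode(SPEAKER_PIN, OUTPUT);
}

void loop() {
  int reading = analogRead(PROBE_PIN);           // 0-1023; rises with conductivity
  int pitch = map(reading, 0, 1023, 120, 1500);  // scale into an audible range (Hz)
  tone(SPEAKER_PIN, pitch, 200);                 // sound the pitch for 200 ms
  delay(250);                                    // short pause between readings
}
```

Note what the sketch encodes: a single number, total conductivity, and nothing about which ions produced it. That is exactly the limitation discussed above.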

For instance, as Taylor and I found with our three water samples, the water from a drinking fountain in Emerson College’s Walker Building had a higher conductivity level than water from Dawson Pond in Arnold Arboretum. At first, we jumped to conclusions and were slightly disgusted, and very surprised, to think that natural pond water might be cleaner than supposedly clean drinking water. But when we discussed our results further, we realized just how out of bounds our instinctual analysis was. Additives like chlorine or fluoride that are put into drinking water during the purification process can raise conductivity levels just as pollutants do, but our simple sensors cannot distinguish between the two because that is not what we designed them to detect. So our results are limited to saying simply that Emerson College drinking water is more conductive than Dawson Pond water, not that one is more contaminated than the other.
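A little chemistry makes this limitation concrete. In dilute solutions, conductivity is approximately the sum of contributions from every dissolved ion (Kohlrausch’s law of independent migration):

$$\kappa \approx \sum_i \lambda_i c_i$$

where $c_i$ is the concentration of ion species $i$ and $\lambda_i$ is its molar conductivity. Because every ion adds to $\kappa$, a chloride ion left over from disinfection raises the reading just as a chloride ion from road-salt runoff would; a single conductivity number cannot say which ions produced it.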

The interesting thing about sensor journalism is how naturally it leads to further questions that can be answered by developing new and different sensors. Asking an initial question and creating a sensor to answer it is only the tip of the proverbial iceberg. Often, “journalists [conduct] further logical analysis and [combine] other reporting processes to derive some insight into the world” (Pitt). Just as it is bad reporting to draw unsupported conclusions from sensor results, so too is it poor journalism to limit your story to a single analysis simply because you were unwilling to pursue further results.

In the case of our water testing, perhaps the next step in sensor reporting would be to test for specific contaminants. While it can be expensive to run an extensive test for all possible additives, test kits for specific chemicals (like the fluoride and chlorine potentially in Emerson’s drinking water) can be purchased relatively cheaply. After conducting these tests, we could then produce a more conclusive report analyzing water throughout Boston. At the crux of sensor journalism is this desire “to take human observations and impressions and make them specific, so that they might be used for comparisons. Often that [means] quantifying an observation” (Pitt). It would be a rather lackluster article if we simply published our findings saying “drinking water is more conductive than pond water.” However, quantifying those initial findings and saying something like “drinking water at Emerson College contains higher levels of such-and-such a contaminant than Dawson Pond water” would have a much greater impact on readers.

This leads to one of the greater challenges of sensor journalism, one that is not necessarily obvious at first: what if your results do not produce usable information? After investing time, money, and effort into building a sensor, using it, and analyzing the results, it is only natural to want to uncover something outrageous and exciting. But sometimes your results simply state the obvious, or rather confirm something we already know. For example, if we took our water sampling experiment further and tested for specific additives, it is highly unlikely that we would find high pollution levels in Emerson’s drinking water. Results that quantify an observation will not always translate into an interesting article. The Tow Center reminds us, “We are in an era in which reporters are hungry for data, and increasingly expert in using it” (Pitt), and being an expert is the key here. It is important to remain objective throughout the entire sensor process to ensure we do not stretch results, or publish uninteresting or unimportant results just for the sake of publishing an article.

While it can be upsetting to see results that do not live up to expectations, it is not necessarily the end of the world. In our constantly developing world of open data, results that are not useful to you today can be useful to someone else one day. Plenty of open data aggregation sites (like Public Lab right here) exist for sensor journalists to publish results, regardless of whether they are extremely shocking or incredibly mundane. Just because your results are not useful to your own article now does not mean another journalist will not be able to put them to work in a future one.

This communal quality is arguably my favorite part of sensor journalism. Because the field is so new and undefined, there is a definite sense of teamwork within it. Whereas crime reporters or political reporters from different news organizations are often competing to be the first to report breaking news, sensor journalists frequently work together and use each other’s data to make further discoveries. Public Lab, for example, is “a non-profit community collective seeking to investigate environmental concerns with DIY tools and techniques” (Pitt). The site’s entire structure revolves around this notion of collective reporting, which is an extremely attractive quality of sensor journalism.

Ultimately, there are plenty of positive qualities in this new field of sensor journalism, as well as a handful of potential pitfalls. There is a definite sense of vulnerability within sensor journalism because the field lacks the established structure that comes naturally over time. It can be intimidating to enter a practice where so little is settled, but with the right mindset, this vulnerability can be translated into a certain sense of empowerment. Yes, there are not a whole lot of rules in the field yet. But that means it is our job, as journalists at the forefront of sensor journalism, to create the rules that carry on into the future.
