Near-Infrared Camera


Introduction

Vineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to bring this technology to everyday people, enabling us to monitor our environment through quantifiable data.

Background: satellite infrared imaging

The study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near infrared light. Remote sensing scientists quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists have relied on these valuable satellite-based NDVI images for decades.
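
For reference, the index is computed for every pixel from the near-infrared (NIR) and red values recorded at that spot:

NDVI = (NIR - Red) / (NIR + Red)

The result ranges from -1 to 1: healthy vegetation scores high (roughly 0.2 to 0.8), while water, bare soil, and built surfaces score much lower.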

Point & Shoot infrared imaging

The goal of Public Lab's Infragram project is to bring the power of NDVI and other infrared vegetation imaging back down to earth, so that anyone can take close-up images of plants or landscapes and instantly learn about their health and vigor.

There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and Vegscape -- but this imagery is not collected when you need it, as often as you need it, or at a scale usable by individuals managing small plots.

We are able to tweak a single camera to capture near-infrared, green, and blue light, which lets us photograph the secret life of plants. We do this by filtering out red light and reading near-infrared in its place, using a carefully chosen piece of "superblue" filter.
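
To make the channel swap concrete, here is a minimal Python sketch (using numpy and Pillow; the file name infrablue.jpg is just a placeholder) that reads a superblue photo, treats its red channel as near-infrared and its blue channel as visible light, and computes the NDVI-style index described above:

```python
import numpy as np
from PIL import Image

# In a "superblue"-converted camera, the red channel records mostly
# near-infrared light and the blue channel records mostly visible blue light.
# "infrablue.jpg" is a hypothetical example photo.
img = np.asarray(Image.open("infrablue.jpg"), dtype=float)

nir = img[:, :, 0]      # red channel: approximately near-infrared
visible = img[:, :, 2]  # blue channel: approximately visible light

# NDVI-style index, as in the formula above, using blue as the visible band.
ndvi = (nir - visible) / (nir + visible + 1e-6)  # tiny term avoids divide-by-zero

# Rescale from [-1, 1] to [0, 255] and save as a grayscale preview.
out = ((ndvi + 1) / 2 * 255).astype(np.uint8)
Image.fromarray(out).save("ndvi_gray.png")
```

Without the calibration step described below, the values from this sketch won't be directly comparable between cameras or lighting conditions, so treat it as an illustration of the arithmetic rather than a finished workflow.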

Superblue

The filter

Research by Chris Fastie has ...

The filter itself is just a piece of "superblue" filter material which you can use to turn your webcam or cheap point-and-shoot camera into an infrared camera.

How to process your images:

We're working on an easy process to generate composite infrared + visible images that will reveal new details of plant health and photosynthesis. Currently there are several approaches:

  • Ned Horning's ImageJ plugin
  • Manual processing in Photoshop or GIMP
  • Using MapKnitter.org (deprecated)
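
As a rough picture of what these tools do under the hood, the sketch below continues the Python example above, using matplotlib (an assumption on our part, not part of any Public Lab tool) to map the ndvi array onto a color ramp so areas of healthy foliage stand out:

```python
import matplotlib.pyplot as plt

# Continues from the `ndvi` array computed in the earlier sketch.
# A red-to-green colormap makes high index values (healthy foliage)
# stand out against low values (soil, water, pavement).
plt.imshow(ndvi, cmap="RdYlGn", vmin=-1, vmax=1)
plt.colorbar(label="NDVI-style index")
plt.axis("off")
plt.savefig("ndvi_color.png", dpi=150, bbox_inches="tight")
```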

Processing overview

  • 1. Calibrate. In order to get the most meaningful data possible from your plant images, it's a good idea to "calibrate" your camera, taking into account the lighting conditions (sunny vs. cloudy, indoors vs. outdoors) at the time you're taking your photos: this makes it much easier to compare "plant health" images taken at different times, in different places, and by different cameras. To make this easy, we'll likely be providing an additional "white balance card" -- simply, a card with a standard color -- in our kits. By recording an initial image that includes this card, you'll be able to use our online software to "standardize" the colors in all of your images. If you don't have a card, don't worry -- there will also be opportunities to calibrate your imagery automagically later, using our analysis software, and the results might be just as good. (A rough sketch of the white-balance idea, in code, appears after this list.)

  • 2. Take your snapshot. "Rhododendrons -- say cheese!" Using your own camera (modded with our DIY filter), the Infragram Webcam, or the Infragram Point & Shoot, you'll record the scene of your choosing -- ideally, with some vegetation-y life forms in it. Take pictures of household plants, garden vegetables, trees -- we've grabbed a lot of useful agricultural imagery from cameras dangling from kites and balloons! The Public Lab website and mailing list are already full of examples and suggestions related to infrared photography, and it's easy to start a discussion with community members about your ideas, or ask for advice.

  • 3. Upload. After you've finished an image capture session, you'll want to upload your images using the (free, open source) online software our community is developing. This will likely simply involve navigating to a particular URL and dragging-and-dropping your images onto a specified area of a webpage. Easy peasy.

  • 4. Analyze. If you thought the prior steps were fun, this step is fun + 1. We're planning to provide a suite of image analysis tools online, so that everyone from researchers to geek gardeners can analyze, tweak, modify, and re-analyze their imagery to their heart's content, extracting useful information about plant health and biomass along the way.

  • 5. Share. And perhaps the most exciting aspect of all: your imagery, your work, and your insights can easily be shared with the rest of the Public Lab community via this online service, the Public Lab mailing lists, and wikis and research notes at http://publiclab.org. Develop a kite-based aerial imagery project with your friends; get advice from NDVI researchers in the community as to the best techniques for yielding useful information from your garden photos; create and collaborate on new methods and protocols around DIY infrared photography. Join Public Lab's "share and share alike", open source learning community: http://publiclab.org/wiki/registration
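
To flesh out the calibration idea from step 1: the sketch below is only an illustration of what a white-balance card makes possible, not Public Lab's actual calibration software. It assumes you've cropped a small image containing just the card (card.jpg, a placeholder name) from the same session as your photo (photo.jpg), and it scales each channel so the card comes out at a neutral reference value.

```python
import numpy as np
from PIL import Image

# Placeholder inputs: a crop showing only the white-balance card, and a
# full photo taken under the same lighting conditions.
card = np.asarray(Image.open("card.jpg"), dtype=float)
photo = np.asarray(Image.open("photo.jpg"), dtype=float)

# The card's average value in each channel shows how this camera, under this
# lighting, rendered a known neutral reference.
card_means = card.reshape(-1, 3).mean(axis=0)

# Scale each channel so the card would come out at a common target brightness.
target = 200.0  # assumed target value for the reference card
corrected = np.clip(photo * (target / card_means), 0, 255).astype(np.uint8)

Image.fromarray(corrected).save("photo_balanced.png")
```

Standardizing against a shared reference like this is what makes "plant health" values from different cameras, days, and lighting conditions roughly comparable.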