Vineyards, large farms, and NASA all use near-infrared photography for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to bring this technology to everyday people, enabling us to monitor our environment through quantifiable data.
We are currently running a Kickstarter for a version of this camera we call the Infragram. Read more about it here »
What is it good for?
- Take pictures to examine plant health in backyard gardens, farms, parks, and nearby wetlands
- Monitor your household plants
- Teach students about plant growth and photosynthesis
- Create exciting science fair projects
- Generate verifiable, open environmental data
- Check progress of environmental restoration projects
- Document unhealthy areas of your local ecology (for instance, algal blooms)
Here's an example of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight:
Background: satellite infrared imaging
The study of Earth's environment from space got its start in 1972, when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images in both visible and near-infrared light. Remote sensing scientists quickly learned that combining visible and infrared data could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists, have relied on these valuable satellite-based NDVI images for decades.
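The index itself is a simple per-pixel calculation: NDVI = (NIR - Red) / (NIR + Red), giving values between -1 and 1, with healthy vegetation scoring well above zero. A minimal sketch in Python with NumPy (the sample reflectance values are illustrative, not measured data):

```python
import numpy as np

def ndvi(nir, red):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red), in [-1, 1]."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    denom[denom == 0] = 1e-6  # avoid division by zero in dark pixels
    return (nir - red) / denom

# A healthy leaf reflects strongly in NIR but absorbs red light
leaf = ndvi(np.array([200.0]), np.array([40.0]))
# Bare rock reflects both bands about equally
rock = ndvi(np.array([90.0]), np.array([85.0]))
print(leaf, rock)  # leaf NDVI ~0.67, rock NDVI ~0.03
```

The leaf's strong NIR/red contrast is what makes photosynthesizing vegetation stand out so clearly in NDVI images, while non-photosynthetic surfaces cluster near zero.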
There are public sources of infrared photography for the US available through the Department of Agriculture -- NAIP and VegScape -- but this imagery is not collected as often, at the times, or at a scale usable by individuals managing small plots.
Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image (bottom). The NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie; visit his gallery of high-res images.
Point & shoot infrared photography
The goal of Public Lab's Infragram project is to bring the power of NDVI and other infrared vegetation imaging down to earth, where everyone can take close-up images of plants or landscapes and instantly learn about their health and vigor.
Chris Fastie's infrared/visible camera prototype
We tweak a single camera to capture near-infrared, green, and blue light, which lets us estimate how much of the available light plants are metabolizing into sugar via photosynthesis. We do this by filtering out the red light and reading near-infrared in its place, using a piece of carefully chosen "NGB" filter. Read more about the development of this technique here. You can also learn more about how digital camera image sensors detect colors at this great tutorial by Bigshot.
How we do it
Basically, we remove the infrared blocking filter from a conventional digital camera and replace it with a carefully chosen "infrablue" filter. This lets the camera read infrared and visible light at the same time, but in different color channels.
While we used to use a two-camera system, research by Chris Fastie and other Public Lab contributors has led to a single-camera design that can image in both infrared and visible light simultaneously. The infrablue filter is just a piece of carefully chosen theater gel, which we examined using a DIY spectrometer. You can use this filter to turn most webcams or cheap point-and-shoots into an infrared/visible camera.
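In an infrablue photo, the red channel has recorded near-infrared light and the blue channel has recorded visible (blue) light, so an NDVI-style index can be computed from the channels of a single image. A rough sketch using Pillow and NumPy (the channel assignment is the common infrablue convention; verify it for your own camera and filter):

```python
import numpy as np
from PIL import Image

def infrablue_index(path):
    """Compute an NDVI-style index from a single infrablue photo.

    Assumes the red channel captured near-infrared and the blue
    channel captured visible light, as with an "infrablue" filter.
    """
    img = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    nir = img[..., 0]  # red channel -> near-infrared
    vis = img[..., 2]  # blue channel -> visible (blue) light
    return (nir - vis) / np.maximum(nir + vis, 1e-6)

# Demo on a tiny synthetic "photo": one NIR-bright, blue-dark pixel
demo = Image.fromarray(np.uint8([[[200, 80, 40]]]))
demo.save("/tmp/infrablue_demo.png")
print(infrablue_index("/tmp/infrablue_demo.png"))  # high index ~0.67
```

This is the same arithmetic as classic red/NIR NDVI, with the visible blue channel standing in for red, which is why the filter choice matters so much.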
How to process your images:
We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. Currently there are several approaches:
- Ned Horning's PhotoMonitoring plugin
- Manual processing
- Using MapKnitter.org (deprecated)
- Online image processing using our prototype webapp, here
- Source code is on GitHub as infrapix-flask
- Command-line processing of single images and rendering of movies using a Python script. Source code is here
We're working on a cleaner, easier way to process images, and hope to have a web app up soon which will work something like this:
1. Calibrate. To get the most meaningful data from your plant images, it's a good idea to "calibrate" your camera to the current lighting conditions (sunny vs. cloudy, indoors vs. outdoors) at the time you take your photos. This makes it much easier to compare "plant health" images taken at different times, in different places, and by different cameras. To make this easy, we'll likely be providing an additional "white balance card" -- simply, a card with a standard color -- in our kits. By recording an initial image that includes this card, you'll be able to use our online software to "standardize" the colors in all of your images. If you don't have a card, don't worry -- there will also be opportunities to calibrate your imagery automagically later, using our analysis software, and the results might be just as good. If your Infragram camera allows custom white balance, you can follow these instructions and learn more here.
2. Take your snapshot. "Rhododendrons -- say cheese!" Using your own camera (modded with our DIY filter), the Infragram Webcam, or the Infragram Point & Shoot, you'll record the scene of your choosing -- ideally, with some vegetation-y life forms in it. Take pictures of household plants, garden vegetables, trees -- we've grabbed a lot of useful agricultural imagery from cameras dangling from kites and balloons! The Public Lab website and mailing list are already full of examples and suggestions related to infrared photography, and it's easy to start a discussion with community members about your ideas, or ask for advice.
3. Upload. After you've finished an image capture session, you'll want to upload your images using the (free, open source) online software our community is developing. This will likely simply involve navigating to a particular URL and dragging-and-dropping your images onto a specified area of a webpage. Easy peasy.
4. Analyze. If you thought the prior steps were fun, this step is fun + 1. We're planning to provide a suite of image analysis tools online, so that everyone from researchers to geek gardeners can analyze, tweak, modify, and re-analyze their imagery to their heart's content, extracting useful information about plant health and biomass along the way.
5. Share. And perhaps the most exciting aspect of all: your imagery, your work, and your insights can easily be shared with the rest of the Public Lab community via this online service, the Public Lab mailing lists, and wikis and research notes at http://publiclab.org. Develop a kite-based aerial imagery project with your friends; get advice from NDVI researchers in the community as to the best techniques for yielding useful information from your garden photos; create and collaborate on new methods and protocols around DIY infrared photography. Join Public Lab's "share and share alike", open source learning community: http://publiclab.org/wiki/registration
Note: Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history