###Introduction

Vineyards, large farms, and NASA all use **near-infrared photography** to assess plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, we've developed a Do-It-Yourself way to bring this technology to everyday people, enabling us to monitor our environment through quantifiable data.

We are currently running a Kickstarter for a version of this camera we call the **Infragram**. [Read more about it here »](/wiki/infragram)

###What is it good for?

- Take pictures to examine plant health in backyard gardens, farms, parks, and nearby wetlands
- Monitor your household plants
- Teach students about plant growth and photosynthesis
- Create exciting science fair projects
- Generate verifiable, open environmental data
- Check progress of environmental restoration projects
- Document unhealthy areas of your local ecology (for instance, algal blooms)

**Here's an example** of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight:

[![infragram](https://i.publiclab.org/system/images/photos/000/000/424/medium/aerial-split.jpg)](https://i.publiclab.org/system/images/photos/000/000/424/original/aerial-split.jpg)

###Background: satellite infrared imaging

The study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near-infrared light. Remote sensing scientists quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists, have relied on these valuable satellite-based NDVI images for decades.

There are public sources of infrared photography for the US available through the Department of Agriculture -- [NAIP](http://datagateway.nrcs.usda.gov/) and [Vegscape](http://nassgeodata.gmu.edu/VegScape/) -- but this imagery is not collected when, as often, or at a scale usable by individuals who are managing small plots.

![PetVISNDVIcomp.jpg](https://i.publiclab.org/system/images/photos/000/000/290/large/PetVISNDVIcomp.jpg)

Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image (bottom). The NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie. Visit the [gallery of high-res images by Chris Fastie](https://plus.google.com/photos/116103622078305917397/albums/5878196749239180465/5878198341400814034).
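For reference, the index itself is just a normalized ratio of the two bands, shown here for a single pixel:

```latex
% Normalized difference vegetation index (NDVI) for a single pixel,
% where NIR and R are the near-infrared and red reflectance values
\mathrm{NDVI} = \frac{\mathrm{NIR} - \mathrm{R}}{\mathrm{NIR} + \mathrm{R}}
% NDVI ranges from -1 to +1; healthy vegetation typically falls between 0.1 and 0.9
```

Because healthy leaves reflect far more near-infrared light than red light, the numerator grows and the index approaches 1; bare soil, rock, and dead vegetation sit near zero.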

###Point & shoot infrared photography

The goal of Public Lab's Infragram project is to bring the power of NDVI and other infrared vegetation imaging back to earth, where everyone can take close-up images of plants or landscapes and instantly learn about their health and vigor.

![RoscoVis-201335838.jpg](https://i.publiclab.org/system/images/photos/000/000/280/medium/RoscoVis-201335838.jpg)

_Chris Fastie's [infrared/visible camera prototype](/notes/cfastie/04-21-2013/rosco)_

We are able to tweak a single camera to capture near-infrared, green, and blue light. This allows us to photograph the "secret life of plants". We do this by filtering out the red light and **reading infrared in its place**, using a piece of carefully chosen "NGB" filter. Read more about [the development of this technique here](http://publiclab.org/notes/cfastie/04-20-2013/superblue).

![swap.png](https://i.publiclab.org/system/images/photos/000/000/376/medium/swap.png)

##How we do it

[Research by Chris Fastie](http://publiclab.org/notes/cfastie/04-20-2013/superblue) and [other Public Lab contributors](/tag/near-infrared-camera) has led to the use of a **single camera which can image in both infrared and visible light simultaneously**. The filter is just a piece of carefully chosen theater gel which was examined using [a DIY spectrometer](/wiki/spectrometer). You can use this filter to turn your webcam or cheap point-and-shoot into an infrared camera.

##How to process your images:

We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. Currently there are several approaches:

* [Ned Horning's](/profile/nedhorning) [PhotoMonitoring plugin](/wiki/photo-monitoring-plugin)
* Manual processing
  * [in Photoshop](/notes/warren/10-25-2011/video-tutorial-creating-infrared-composites-aerial-wetlands-imagery)
  * [or GIMP](/notes/warren/10-27-2011/video-tutorial-creating-false-color-ndvi-aerial-wetlands-imagery)
* Using MapKnitter.org (deprecated)
* Online image processing using our prototype webapp, [here](http://infrapix.pvos.org/)
  * Source code is on GitHub as [infrapix-flask](https://github.com/Pioneer-Valley-Open-Science/infrapix-flask)
* Command-line processing of single images and rendering of movies using a Python script. Source code is [here](https://github.com/Pioneer-Valley-Open-Science/infrapix)
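If you'd like to experiment on your own before trying those tools, here is a minimal sketch (in Python, using numpy and Pillow) of the basic NDVI calculation for a photo from an "NGB"/"superblue"-style camera like the one described above. The filename is a placeholder, and the channel assignments (red channel recording near-infrared, blue channel recording visible light) are assumptions you may need to adjust for your own camera and filter:

```python
# Minimal NDVI sketch for a filter-modified photo: assumes the red channel
# now records near-infrared (NIR) and the blue channel records visible light.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("infragram_photo.jpg")).astype(float)  # placeholder filename
nir = img[:, :, 0]   # red channel -> near-infrared (assumption for this filter setup)
vis = img[:, :, 2]   # blue channel -> visible light

ndvi = (nir - vis) / (nir + vis + 1e-6)           # roughly -1 to +1
gray = ((ndvi + 1) / 2 * 255).astype(np.uint8)    # rescale to 0-255 for display
Image.fromarray(gray).save("ndvi_gray.png")
```

The small constant in the denominator just avoids division by zero in very dark pixels; for quantitative comparisons you'll still want to calibrate your camera, as described below.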
###Processing overview

We're working on a cleaner, easier way to process images, and hope to have a web app up soon which will work something like this:

* **1. Calibrate.** In order to get the most meaningful data possible from your plant images, it's a good idea to "calibrate" your camera, taking into account the current lighting conditions (sunny vs. cloudy, indoors vs. outdoors) at the time that you're taking your photos: this makes it much easier to compare "plant health" images taken at different times, in different places, and by different cameras. To make this easy, we'll likely be providing an additional "white balance card" -- simply, a card that has a standard color -- in our kits. By recording an initial image that includes this card, you'll be able to use our online software to "standardize" the colors in all of your images (there's a rough sketch of this idea at the bottom of this page). If you don't have a card, don't worry -- there will also be opportunities to calibrate your imagery automagically later, using our analysis software, and the results might be just as good.
* **2. Take your snapshot.** "Rhododendrons -- say cheese!" Using your own camera (modded with our DIY filter), the Infragram Webcam, or the Infragram Point & Shoot, you'll record the scene of your choosing -- ideally, with some vegetation-y life forms in it. Take pictures of household plants, garden vegetables, trees -- we've grabbed a lot of useful agricultural imagery from cameras dangling from kites and balloons! The Public Lab website and mailing list are already full of examples and suggestions related to infrared photography, and it's easy to start a discussion with community members about your ideas, or ask for advice.
* **3. Upload.** After you've finished an image capture session, you'll want to upload your images using the (free, open source) online software our community is developing. This will likely simply involve navigating to a particular URL and dragging-and-dropping your images onto a specified area of a webpage. Easy peasy.
* **4. Analyze.** If you thought the prior steps were fun, this step is fun +1. We're planning on providing a suite of image analysis tools online, so that everyone from researchers to geek gardeners can analyze, tweak, modify, and re-analyze their imagery to their heart's content, extracting useful information about plant health and biomass along the way.
* **5. Share.** And perhaps the most exciting aspect of all: your imagery, your work, and your insights can easily be shared with the rest of the Public Lab community via this online service, the Public Lab mailing lists, and wikis and research notes at http://publiclab.org. Develop a kite-based aerial imagery project with your friends; get advice from NDVI researchers in the community as to the best techniques for yielding useful information from your garden photos; create and collaborate on new methods and protocols around DIY infrared photography. Join Public Lab's "share and share alike", open source learning community: http://publiclab.org/wiki/registration

**Note:** Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history
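For the curious, here is a minimal sketch of the white-balance idea from step 1 above, written in Python with numpy and Pillow. It assumes the reference card is a neutral white/gray target; the filename and the card's pixel coordinates are placeholders, and this is only an illustration of the concept, not the calibration routine our software will use:

```python
# Rough white-balance sketch: scale each channel so a known card region
# averages to neutral gray, making photos from different lighting comparable.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("calibration_photo.jpg")).astype(float)  # placeholder filename

# Hypothetical location of the white balance card in the photo (rows, cols)
r0, r1, c0, c1 = 100, 200, 300, 400
card = img[r0:r1, c0:c1]

# One gain per channel, chosen so the card region's channels match its overall mean
gains = card.mean() / card.mean(axis=(0, 1))
balanced = np.clip(img * gains, 0, 255).astype(np.uint8)
Image.fromarray(balanced).save("balanced_photo.png")
```

Scaling each channel against a neutral reference is a crude but serviceable way to standardize images before computing NDVI or other indices.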