_The Infragram Kickstarter video, a great introduction to the project._

###Introduction

Vineyards, large farms, and NASA all use **near-infrared photography** for assessing plant health, usually by mounting expensive sensors on airplanes and satellites. At Public Lab, **we've developed a Do-It-Yourself way to take these kinds of photos**, enabling us to monitor our environment through quantifiable data.

Our technique uses a modified digital camera to capture near-infrared and blue light in the same image, but in different color channels. We then [post-process the image](#How+to+process+your+images:) (using [Infragram.org](http://infragram.org)) to attempt to infer how much the plants in it are photosynthesizing. This allows us to better understand and quantify how much of the available light plants are metabolizing into sugar via photosynthesis.

> You can do this yourself (as with all Public Lab tools), but there is also an [Infragram DIY Filter Pack](http://store.publiclab.org/products/infragram-diy-filter-pack) available in the Public Lab Store.

We [ran a Kickstarter](http://kickstarter.com/projects/publiclab/infragram-the-infrared-photography-project/) for a version of this camera we call the **Infragram**. [Read more about it here »](/wiki/infragram) Here's the video from the Kickstarter, which offers a nice visual explanation of the technique:

###What is it good for?

Multispectral or infrared/visible photography has seen a variety of applications in the decades [since it was developed](#Background:+satellite+infrared+imaging). We have focused on the following uses:

- Take pictures to examine plant health in backyard gardens, farms, parks, and nearby wetlands
- Monitor your household plants
- Teach students about plant growth and photosynthesis
- Create exciting science fair projects
- Generate verifiable, open environmental data
- Check progress of environmental restoration projects
- Document unhealthy areas of your local ecology (for instance, algal blooms)

Notable uses include [this photograph of an unidentified plume of material in the Gowanus Canal](/notes/liz/8-3-2011/infrared-balloon-image-reveals-gowanus-plume) (and [writeup by TechPresident](http://techpresident.com/blog-entry/how-diy-science-solving-ecological-mysteries-new-york-city)) and a variety of projects at a small farm in New Hampshire [at the annual iFarm event](/tag/ifarm). The [Louisiana Universities Marine Consortium](http://lumcon.edu) has also [collaborated with Public Lab contributors to measure wetlands loss](/notes/shannon/5-29-2011/plots-and-lumcon-collaboration) following the Deepwater Horizon oil disaster.

**Here's an example** of what one of our "Infragram" cameras sees (left) and the post-processing analysis which shows photosynthetic activity, or plant health (right). This picture was taken from a commercial airplane flight:

[![infragram](https://i.publiclab.org/system/images/photos/000/000/424/medium/aerial-split.jpg)](https://i.publiclab.org/system/images/photos/000/000/424/original/aerial-split.jpg)

###How does it work?

**Camera modification:** We've worked on several different techniques, from [dual camera systems](/wiki/dual-camera-kit-guide) to the current, single-camera technique. This involves removing the infrared-blocking filter from [almost any digital camera](/tag/infragram-conversion) and adding a [specific blue filter](/wiki/infragram#Filters).
![swap.png](https://i.publiclab.org/system/images/photos/000/000/376/medium/swap.png)

This filters out red light, and **measures infrared light in its place**, using a carefully chosen piece of "NGB" or "infrablue" filter. Read more about [the development of this technique here](http://publiclab.org/notes/cfastie/04-20-2013/superblue). You can also learn more about how digital camera image sensors detect colors [at this great tutorial by Bigshot](http://www.bigshotcamera.com/learn/image-sensor/index).

**Post-processing:** Once you take a multispectral photograph with a modified camera, you must [post-process it](#How+to+process+your+images:), compositing the infrared and visible data to generate a new image which (if it works) displays healthy, photosynthetically active areas as bright regions. An in-depth article on the technique by Chris Fastie (albeit using red instead of blue for visible light) [can be found here](/wiki/ndvi-plots-ir-camera-kit).

**History of the project:** While we used to use a two-camera system, [research by Chris Fastie](/notes/cfastie/04-20-2013/superblue) and [other Public Lab contributors](/tag/near-infrared-camera) has led to the use of a **single camera which can image in both infrared and visible light simultaneously**. The infrablue filter is just a piece of carefully chosen theater gel which was examined using [a DIY spectrometer](/wiki/spectrometer). You can use this filter to turn most webcams or cheap point-and-shoots into an infrared/visible camera.

###Background: satellite infrared imaging

The study of Earth's environment from space got its start in 1972 when the first Landsat satellite was launched. The multispectral scanner it carried, like the scanners on all subsequent Landsat satellites, recorded images with both visible and near-infrared light. Remote sensing scientists quickly learned that by combining visible and infrared data, they could reveal critical information about the health of vegetation. For example, the normalized difference vegetation index (NDVI) highlights the difference between the red and infrared wavelengths that are reflected from vegetation. Because red light is used by plants for photosynthesis but infrared light is not, NDVI allows scientists to estimate the amount of healthy foliage in every satellite image. Thousands of scientists, including landscape ecologists, global change biologists, and habitat specialists, have relied on these valuable satellite-based NDVI images for decades.

There are public sources of infrared photography for the US available through the Department of Agriculture -- [NAIP](http://datagateway.nrcs.usda.gov/) and [VegScape](http://nassgeodata.gmu.edu/VegScape/) -- but this imagery is not collected when, as often, or at as usable a scale as individuals managing small plots need.

![ndvi-vis-comparison.jpg](/system/images/photos/000/001/289/medium/ndvi-vis-comparison.jpg)

Caption: Normal color photo (top) and normalized difference vegetation index (NDVI) image (bottom). The NDVI image was derived from two color channels in a single photo taken with a camera modified with a special infrared filter. Note that tree trunks, brown grass, and rocks have very low NDVI values because they are not photosynthetic. Healthy plants typically have NDVI values between 0.1 and 0.9. Images by Chris Fastie.
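Written out, the index described above is just a normalized ratio of the two bands:

$$ \mathrm{NDVI} = \frac{\mathrm{NIR} - \mathrm{Red}}{\mathrm{NIR} + \mathrm{Red}} $$

As an illustration only (these reflectance numbers are hypothetical, not measurements): a healthy leaf that reflects strongly in the near-infrared (say 0.50) while absorbing most red light (say 0.08) gives NDVI = (0.50 - 0.08) / (0.50 + 0.08) ≈ 0.72, well inside the 0.1–0.9 range quoted in the caption above, while rock or bare soil reflects the two bands about equally and lands near 0. With the single-camera "infrablue" technique described earlier, the blue channel stands in for the visible band, so the same ratio is computed with blue in place of red.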
Visit the [gallery of high-res images by Chris Fastie](https://plus.google.com/photos/116103622078305917397/albums/5878196749239180465/5878198341400814034).

****

## Frequently Asked Questions

Ask a question about infrared imaging:

[notes:question:infragram]

****

## How to process your images

(This section has moved to, and is updated at, http://publiclab.org/wiki/near-infrared-imaging)

We're working on an easy process to generate composite, infrared + visible images that will reveal new details of plant health and photosynthesis. There are several approaches:

* The **easiest way** is to process your images online at the free, open source [Infragram.org](http://infragram.org)
* [Ned Horning's](/profile/nedhorning) [PhotoMonitoring plugin](/wiki/photo-monitoring-plugin)
* Manual processing
  * [in Photoshop](/notes/warren/10-25-2011/video-tutorial-creating-infrared-composites-aerial-wetlands-imagery)
  * [or GIMP](/notes/warren/10-27-2011/video-tutorial-creating-false-color-ndvi-aerial-wetlands-imagery)
* Using MapKnitter.org (deprecated)
* Command-line processing of single images and rendering of movies using a Python script; source code is [here](https://github.com/Pioneer-Valley-Open-Science/infrapix) (a minimal sketch of this kind of processing appears below)

**Note:** Older versions of this page have been kept at the following wiki page: http://publiclab.org/wiki/near-infrared-camera-history...
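If you'd like to experiment from the command line yourself, here is a minimal sketch of the kind of processing these tools perform. It is not the infrapix code linked above; it assumes the converted camera records near-infrared in the red channel and visible light in the blue channel, that Pillow and NumPy are installed, and the file names are hypothetical.

```python
# Minimal NDVI sketch for a single "infrablue" photo (illustrative only).
# Assumptions: near-infrared ends up in the red channel and visible light in
# the blue channel after the filter swap; "infrablue.jpg" is a placeholder name.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("infrablue.jpg"), dtype=float) / 255.0

nir = img[:, :, 0]   # red channel: mostly near-infrared after conversion
vis = img[:, :, 2]   # blue channel: visible light passed by the blue filter

# Normalized difference; clamp the denominator to avoid division by zero
# in very dark pixels.
ndvi = (nir - vis) / np.clip(nir + vis, 1e-6, None)

# Map NDVI from [-1, 1] to [0, 255] and save as a grayscale image, so that
# brighter pixels indicate more photosynthetically active areas.
out = ((ndvi + 1) / 2 * 255).astype(np.uint8)
Image.fromarray(out).save("ndvi.png")
```

A real workflow (as at Infragram.org or in the PhotoMonitoring plugin) adds white balancing, calibration against reflectance targets, and a color lookup table, but the core band arithmetic is the same as in this sketch.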