The Infragram project brings together a range of different efforts to make Do-It-Yourself plant health comparisons possible with infrared photography. Use the web-based app at Infragram.org to process your imagery.
This project was made possible with support from Google and the AREN Project at NASA.
Vineyards, large farms, and NASA all use near-infrared photography for assessment, usually by mounting expensive sensors on airplanes and satellites. Infrared analysis has many uses, but most focus on assessing plant health, such as:
- assessing crops and the efficacy of #agriculture practices
- improving #soil management (reducing #fertilizer and soil treatments)
- analyzing #wetlands
Just as cell phone video has become instrumental to accountability today, we aim to democratize and improve reporting on environmental impacts.
To start to do your own infrared analysis project, you'll need:
- Photos: A way to take near-infrared photos AND matching regular visible light photos from the same angle (in a single camera or two matched cameras)
- A question you'd like to answer -- look through these templates for how to design your analysis
- Software for processing and analyzing your photos
- Analysis -- tips and support to interpret and understand what your images are telling you
To get a kit with this already set up on an SD card and/or Raspberry Pi camera, see:
How it works
In 2014, we launched an early version of this project on Kickstarter, and the video is a good overview of the project, although we've come a long way since then:
The Infragram Kickstarter video, a great introduction to the project.
Infragram starter kits
Also see Getting images, below.
There are three major ways to produce multispectral "infragram" images:
- Two-camera - one near-infrared camera and one normal RGB camera
- Single camera w/ red filter - replacing blue with infrared
- Single camera w/ blue filter - replacing red with infrared
Since these can be hard to keep track of, here's a diagram showing the three main types (you can edit the diagram here):
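As a rough sketch of the three setups above, here's how each one maps camera channels to the two NDVI inputs. The setup names and the `describe` helper are illustrative only, not part of any Infragram software:

```python
# Sketch of how each capture setup maps camera channels to the NDVI inputs.
# Channel indices assume an RGB array (0 = red, 1 = green, 2 = blue);
# the setup names and this mapping table are illustrative, not an official API.

CHANNEL_MAP = {
    # Two cameras: NIR from the converted camera, visible red from the RGB camera.
    "two-camera": {"nir": ("nir_image", 0), "vis": ("rgb_image", 0)},
    # Red filter: blue light is blocked, so the blue channel records NIR.
    "red-filter": {"nir": ("image", 2), "vis": ("image", 0)},
    # Blue filter: red light is blocked, so the red channel records NIR.
    "blue-filter": {"nir": ("image", 0), "vis": ("image", 2)},
}

def describe(setup):
    """Return a human-readable summary of where NIR and visible light land."""
    names = {0: "red", 1: "green", 2: "blue"}
    m = CHANNEL_MAP[setup]
    return (f"{setup}: NIR from {m['nir'][0]} channel {names[m['nir'][1]]}, "
            f"visible from {m['vis'][0]} channel {names[m['vis'][1]]}")
```

For example, `describe("red-filter")` reports that NIR lands in the blue channel and visible light in the red channel, matching the list above.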
Comparing plant health
An important limitation of most DIY techniques is that we are using uncalibrated cameras, so the analysis works best when we compare two vegetated areas under the same conditions (light, angle, time of day) rather than simply photographing a single region. That is, the DIY approach is based on relative, or comparative, uses -- you can't learn much without a point of comparison.
An easy way to do a comparison is:
- compare two areas (with the same type of plants) within a single photograph
- compare two photographs with the same camera settings and lighting conditions (angle, brightness, color), of the same plants
Learn more at Comparing Plant Health
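The first comparison method above -- two areas within a single photograph -- can be sketched in code. This is a minimal illustration with synthetic data; the `compare_regions` helper is hypothetical, and it assumes you already have an NDVI array for the photo:

```python
import numpy as np

def compare_regions(ndvi, box_a, box_b):
    """Compare mean NDVI of two rectangular regions (row0, row1, col0, col1)
    within the same NDVI image -- a simple relative comparison."""
    r0, r1, c0, c1 = box_a
    mean_a = float(np.mean(ndvi[r0:r1, c0:c1]))
    r0, r1, c0, c1 = box_b
    mean_b = float(np.mean(ndvi[r0:r1, c0:c1]))
    return mean_a, mean_b, mean_a - mean_b

# Synthetic example: the left half of the image is "healthier"
# (higher NDVI) than the right half.
ndvi = np.zeros((100, 100))
ndvi[:, :50] = 0.6   # left half: vigorous vegetation
ndvi[:, 50:] = 0.2   # right half: stressed vegetation
a, b, diff = compare_regions(ndvi, (0, 100, 0, 50), (0, 100, 50, 100))
```

Because the two regions share the same lighting and camera settings, the *difference* between them is meaningful even though the absolute values are uncalibrated.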
Doing NDVI analysis on plants requires post-processing both infrared and visible images (or a combined image -- see Getting images) into a composite image using the NDVI equation (or a similar index). This can be done with a variety of software; see this page for more:
Post questions or troubleshooting requests here, for example about:
- making sure your camera is working (color balance, exposure, light source)
- what to compare against (see study design)
- different lighting conditions or angles
Activities should include a materials list, costs and a step-by-step guide to construction with photos. Learn what makes a good activity here.
We're working to refine and improve DIY infrared photography on a number of fronts; here, take a look at the leading challenges we're hoping to solve, and post your own. For now, we're using the Q&A feature, so just click "Ask a question" to post your own challenge.
Be sure to add:
- constraints: expense, complexity
- goals: performance, use cases
| Question | Author | Posted | Answers | Likes |
|---|---|---|---|---|
| Can we use a color calibration reference card to calculate absolute values for DIY NDVI? | @warren | about 4 years ago | 1 | 1 |
| Best low cost camera for indoor plants? | @DrC | over 3 years ago | 1 | 3 |
| I am getting very low NDVI values. | @Anice | about 3 years ago | 1 | 3 |
| What is the working principle of Blue/Red filter? | @nickyshen0306 | over 2 years ago | 0 | 2 |
| How can we stream video to SpectralWorkbench.org or Infragram.org from a Raspberry Pi camera? | @warren | about 4 years ago | 2 | 2 |
| Can we create a guide or set of guides to interpreting infrared or NDVI images? | @warren | about 4 years ago | 0 | 3 |
| Can I use a Raspberry Pi with the Pi Noir camera to make NDVI images? | @warren | about 4 years ago | 0 | 2 |
To start, you'll need near-infrared images and regular visible light images of the same scene -- or an image which combines these in different color channels.
There are sources of #remote-sensing imagery from satellites and planes you can use, but the Infragram project is about making and using low-cost converted cameras to take our own images.
There are both single camera and dual camera ways of doing this, and each has pros and cons.
Get a kit here or learn about converting a camera here:
We've learned that careful white balancing of your converted Infragram camera is essential for good NDVI images. Learn how in this short video and read in depth on the topic in research by Chris Fastie. There is also a wiki page on the subject at http://publiclab.org/wiki/infrablue-white-balance
If you're using an Infragram Point & Shoot (aka Mobius Action Cam), see this page for a guide on setting the white balance of that camera.
Should you use a RED or BLUE filter?
See Infragram filters for more on different filters and how well they work.
Early research by Public Lab contributors led to a blue filter technique for making Infragram cameras. But recent research on PublicLab.org has shown that red filters work better -- and on a wider range of cameras. Blue filters did not work on most CMOS cameras, especially cheaper webcams. Public Lab kits now ship with the red Rosco #19 "Fire" filter.
Give or get help
Here are some resources to get help converting or using your Infragram camera. Keep in mind that we are a peer-driven community, and we encourage everyone to give as well as receive assistance and support!
When describing your question or answer, please include details of the equipment and process you are using, as described here for Infragram photos.
Also see our older FAQ here: https://publiclab.org/wiki/infragram-faq
"We're excited that Public Laboratory is developing a low-cost infrared camera which will help us track the success of wetland restoration projects in the Gulf Region--as well as help us track pollution. The Gulf Restoration Network has been using the aerial monitoring techniques that Public Lab developed, so having the infrared camera available to put on the balloon and kite rig will only expand the applications of that technology as well as add value to airplane monitoring flights that help us watchdog the oil industry in the Gulf." -- Scott Eustis, M.S., Gulf Restoration Network
The Public Lab community has been building up a knowledge base in DIY infrared imaging for years. Read more about the history of this project here
Infragram instructions and graphics
Digital files for the filter pack envelope (including logo) and instructions:
Sketchup model for the "filter switch" graphic: filter-switch.skp
Datasheet for Infragram Webcam: infragram-webcam-new-old-diagram.pdf
Focal length of the camera: 3.27 mm. Sensor: OV2643, size 1/4".
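From those two numbers you can estimate the webcam's field of view. This is a rough sketch only: the 1/4" optical format is assumed here to correspond to a nominal ~3.2 mm x 2.4 mm active area, which may differ slightly from the actual OV2643 sensor dimensions:

```python
import math

# Estimate the camera's field of view from the datasheet numbers above.
# The 1/4" optical format is assumed to mean a ~3.2 mm x 2.4 mm active
# area (a nominal figure; the actual OV2643 dimensions may differ).
FOCAL_MM = 3.27
SENSOR_W_MM, SENSOR_H_MM = 3.2, 2.4

def fov_deg(dimension_mm, focal_mm=FOCAL_MM):
    """Angular field of view for one sensor dimension, in degrees."""
    return math.degrees(2 * math.atan(dimension_mm / (2 * focal_mm)))

h_fov = fov_deg(SENSOR_W_MM)   # horizontal field of view
v_fov = fov_deg(SENSOR_H_MM)   # vertical field of view
```

Under these assumptions the horizontal field of view comes out to roughly 52 degrees, which is useful when planning balloon or kite flight altitudes for a desired ground coverage.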