Jeff Warren asked me to post some images of my setup for the UV spectra I posted on Spectral Workbench. I'm having trouble getting the upload to work correctly, so I don't have a set including all the spectra from my tests. Soon I'll probably just re-do the tests, but I'm waiting to get some more samples to test first.
Last night I used this setup to test UV passing through tap water, boiled tap water, All Free and Clear laundry detergent, and Clorox bleach.
Here are the spectra:
Now that we can download the CSV file for the spectra, I've imported the data for my UV tests into Excel and have normalized each graph (each graph has the same total area under the curve, so they can be compared directly to each other).
I love how the "Free & Clear" detergent has a bluing dye in it. Kinda defeats the purpose, if you ask me.
Haha, yeah. I'd like to try to repeat some of these tests with a green laser; I've been having a lot of success with that, and it's easier to isolate the source light. But it's harder to get a good amount of light onto the sensor...
Yeah, we use cloth diapers for our kids and it's important to find detergent without any dyes or whiteners because that stuff will build up over time, so we've spent a good amount of time trying to find detergents without them. My wife was pretty impressed to see we could use this for a quick test of a company's claims ;)
By the way Jeff, I saw you've added the option to download the data in CSV format now. I tried it in Excel and it imported perfectly, thanks! I'm pretty busy with work for the next couple weeks, but I'll keep you posted with some of the things I can do with the data now.
Sorry, the captcha fooled me into posting multiple times.
Wow, what a great use for the spectrometer -- I tend to think of it as just a prototype but I love hearing how people are already putting it to use, and many are!
I'm curious -- what wavelengths are you looking at to ID the bluing agent? Do you have a control sample for comparison?
The control group is here: spectralworkbench.org/sets/show/13
If I recall, I did just the UV, UV through tap water, and UV through boiled tap water. There were no differences in those greater than experimental error. Those can be compared to the detergent spectra to see the shift from the UV region to the blue region, as a result of the brighteners used in detergent (they actually make your clothes fluoresce).
Cool - should we add a "normalize data" function to the online analysis tool?
That would definitely be a useful function! An important part of making that accurate would be to also have the ability to "trim" the data. For example, most of my spectra have a bit of the second-order spectrum showing up in the 900nm region, and I need to trim that out before normalizing.
I was definitely thinking of a trim function, so I'm glad you mentioned it! Also, I think I have the spectrum reversal issues figured out... the app can now figure out which is the red end of the spectrum and correct.
Do you think it should discard data outside the defined range? I guess you can always recover it from the image.
Yeah, I don't know a lot about how the programming works, but if you have it completely delete the data outside the specified range, we still have the "re-extract from image" button, so we can start over if necessary.
I'm creating a new Github issue for the equal-area feature... but if I understand, it would only be useful for fluorescence, is that right? Not absorbance?
Hmmm... I'm trying to remember what the differences would be with an absorption experiment. Maybe I'm missing something, but as far as I can think, the equal-area/normalization feature would be useful for any time you're comparing two or more spectra. The purpose is to compare the spectra as if the total amount of light were the same for each one, making it easier to see or calculate the differences. This is especially helpful if you move your experiment or take it apart and reassemble it between taking the spectra, or even more so for us to compare spectra with each other. I think this would be useful for absorption spectra as much as with fluorescence, it just depends on what you're looking for in your experiment.
So, for example, if you want to see which source emits or absorbs the highest amount of yellow light, you don't want to normalize the data, that would skew the results. If you want to compare which source has the highest percentage of yellow light compared to other colors in the source, regardless of the total output of the source, then you definitely want to normalize the data.
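The distinction above can be made concrete with a small sketch (made-up intensity numbers and hypothetical function names, not anything from Spectral Workbench):

```python
# Two made-up spectra with the same shape but different brightness,
# stored as {wavelength_nm: intensity}.
spectrum_a = {560: 40, 580: 60, 600: 20}   # total intensity = 120
spectrum_b = {560: 10, 580: 15, 600: 5}    # total intensity = 30

def band_total(spec, lo, hi):
    """Sum the intensity falling in [lo, hi] nm (the 'yellow' band here)."""
    return sum(v for wl, v in spec.items() if lo <= wl <= hi)

def normalized(spec):
    """Scale so all intensities sum to 1 (equal area under the curve)."""
    total = sum(spec.values())
    return {wl: v / total for wl, v in spec.items()}

# Raw comparison: A emits far more yellow light in absolute terms...
print(band_total(spectrum_a, 570, 590), band_total(spectrum_b, 570, 590))  # 60 15
# ...but after normalizing, the yellow *fraction* is identical:
print(band_total(normalized(spectrum_a), 570, 590))  # 0.5
print(band_total(normalized(spectrum_b), 570, 590))  # 0.5
```

So whether to normalize depends entirely on whether you care about absolute output or relative composition, as described above.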
It sounds like what you're describing is like a "tare" function of a scale. You want to ignore the reading of the container, and just pay attention to sample. Is that right, or am I missing something?
Very cool, but I can't tell what's what on your graph with the key so tiny.
For future reference, saving screen captures as PNG files won't lose any of the info and does a better job on graphs. JPG is good for more natural/non-linear stuff.
I want to implement this next but need to find a best-practice way to normalize the area under the graph... ideas? It would be nice if it were fairly efficient since we'll be running it live, for every frame.
Posegate - We're actually talking about a few different tasks.

One is the ability to trim the ends off the left and right of the graph, so that only the area in the visible spectrum is included. This would help eliminate error from second-order spectra or reflections inside our spectrometers.

The second is the ability to remove "noise", which could definitely be compared to the tare function on a scale. This would subtract a certain number from every point on the graph, which would help eliminate error from light entering the spectrometer other than the spectrum itself. The difference from a tare function is that it would have a minimum of zero. If you tare a scale with the container on it, then remove the container, it will probably give you a negative reading, but a negative amount of light does not make sense here.

The third task is to normalize the data, which can't really be compared to a scale. To normalize the data means we have a few lines plotted on the same graph and we want to be able to compare them to each other, so we adjust them to each have the same area under the line. This is done by adding up every point on the line, then finding a number to multiply by that will cause that total area to equal 1 (or any other arbitrary number, as long as it's the same each time). Then we multiply each data point by that number and re-plot the line. When this is done for each line, they'll all be normalized and ready to compare to each other.
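The three tasks might look something like this sketch (my own function names and made-up intensity values, operating on a plain list of readings):

```python
def trim(wavelengths, intensities, lo=400, hi=700):
    """Keep only points whose wavelength falls in [lo, hi] nm."""
    return [(wl, v) for wl, v in zip(wavelengths, intensities) if lo <= wl <= hi]

def denoise(intensities, baseline=None):
    """Subtract a baseline from every point, but never go below zero --
    unlike a scale's tare, which can read negative once the container
    is removed."""
    if baseline is None:
        baseline = min(intensities)       # automatic: lowest value in range
    return [max(v - baseline, 0) for v in intensities]

def normalize(intensities):
    """Scale so the total area under the line sums to 1."""
    total = sum(intensities)
    return [v / total for v in intensities]

readings = [5, 5, 25, 45, 15, 5]          # made-up raw intensities
cleaned = denoise(readings)               # [0, 0, 20, 40, 10, 0]
print(sum(normalize(cleaned)))            # 1.0 -- ready to compare
```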
DSuds - Sorry the key is tough to read on the graph. I used JPG because I just don't like PNG files. When I save as a PNG it so often ends up a different size than what I was looking at before, and that bugs me. I need to take a little time and figure them out, but it's easier just to grouch about it and stick to JPGs ;)
Warren - Again, I don't know what the most efficient way of writing the code will be, but I can outline what needs to happen, in what order.
1. Calibrate the spectrometer. It would be helpful if this calibration could be saved on your computer, so you didn't have to re-do it each time or wait until the spectrum was uploaded to calibrate, because this needs to be the first step each time. After the calibration is saved, it could be an option in the settings to remove the calibration or re-calibrate when necessary. If you want to get fancy, you could offer a few "device profiles" so someone using a few different spectrometers could save a calibration for each one.
2. Trim down to only the visible spectrum. It would be helpful to have a default range that could be adjusted by the user. If you're thinking of implementing this in the live capture program, it could be a checkbox so that it can be turned on or off while you capture.
3. De-noise the spectrum. Again, this could be a checkbox to turn on and off. If it doesn't take too much processing power, it could have an automatic or manual setting. The automatic setting could detect the lowest value in the range and subtract that from every point. The manual setting could be adjustable by the user, and maybe display a discreet warning if you have it too high and it's giving negative values at some points.
4. Normalize the graph. With this one it's pretty important to be able to turn it on and off easily, as it will probably only be wanted around half the time. The easiest way to normalize, like I wrote above, is to first find the sum of all the points, then divide 1 by that sum to find your normalization coefficient, then multiply each data point by that coefficient. If you want to check it, re-sum the new data points and make sure the total is 1. I don't know if this could be done live; if not, it could be an option to normalize the data after you capture the spectrum. If you do implement it in the capture program, you'll want to record that normalization coefficient somewhere so that the data for the original intensity of the source is not lost.
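The normalization step above, including recording the coefficient so the original intensities aren't lost, could be sketched like this (my own function names, made-up frame data):

```python
def normalize_with_coefficient(intensities):
    """Return the normalized spectrum *and* the coefficient used,
    so the original absolute intensities can always be recovered."""
    total = sum(intensities)
    coeff = 1.0 / total
    return [v * coeff for v in intensities], coeff

raw = [2.0, 8.0, 6.0, 4.0]                 # a made-up, already trimmed/de-noised frame
scaled, coeff = normalize_with_coefficient(raw)

# The check described above: the new points should re-sum to 1.
assert abs(sum(scaled) - 1.0) < 1e-9

# Dividing the coefficient back out recovers the original intensities.
recovered = [v / coeff for v in scaled]
```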
Thanks, the normalization steps are very helpful.
Also, i'm starting to compile use cases here: http://publiclaboratory.org/wiki/spectral-analysis
more soon! busy busy!
Hello Josh, nice UV spectra.
What a fine application this turns out to be. I think I am going to back the Kickstarter, but I will probably build my own before I get my pledge delivered.
But I would like to build a UV/visible spectrometer that would go down to at least 350nm and, if I may ask, even as low as 300nm. For this I need to figure out two issues, which you and Jeff may be able to clarify:
I will receive a 3D printer in a couple of weeks and will be printing the parts for my spectrometer. I hope to be able to upload some of my progress by then.
Will respond to some of the issues I raised in my previous post here in this thread.
Will also look in the wiki if there is a page on this subject and maybe add/centralise the info there.
There are a bunch of posts titled "Spectrometry UROP" which bring some very nice info, and I found this one particularly informative:
Basically it talks about the UV filtering properties of the polycarbonate layer of the DVD grating, which attenuates or absorbs the UV lines you want to capture. The proposed solution is splitting the DVD's layers at the middle, exposing the metal grating and using it as a grating mirror shining the light reflected into the camera, instead of shining the light through the grating into the camera.
Crucial is also the removal of any UV filter present in the camera optics.
Cool. i don't think there's a page yet but one like http://publiclaboratory.org/wiki/uv-spectrometry or similar would be great. I've started tagging UV-related posts as "ultraviolet" so that should help.
Alex did some great work -- i wonder if the "cheap" solution of using the DVD-R reflective layer could be replaced with a purchased reflective grating -- i think they're called "holographic" gratings. Because the DVD material warps a bit. But it's worth a try either way!
My guess was that the lenses will absorb UV even after we've removed the IR block filter. So maybe a pinhole could work?
Great, thanks for the wiki page! I will start gathering and publishing info there soon.
I was thinking about the reflective gratings as well, but didn't know they were called holographic. I have found a couple of websites selling them. As they are a bit more expensive ($85-105) and also delicate, this would be an application for more complex spectrometers. Maybe there are some cheaper sources.
What about the grating density -- do you think 1000 lines/mm is still good? As I understand it, to keep the angle of the first-order spectrum well separated from higher orders, it would be nice to stay close to that figure. But seeing that the geometry of the device will have to change anyway, this being a reflective grating, maybe going for 1200 lines/mm would be better? I ask because as we get into lower wavelengths, we may need higher grating densities.
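For reference, the diffraction angles follow the grating equation d·sin(θ) = m·λ. A quick sanity check of the 1000 lines/mm case (my own arithmetic, not a recommendation):

```python
import math

def diffraction_angle_deg(lines_per_mm, wavelength_nm, order=1):
    """Angle of the m-th order beam, from d * sin(theta) = m * lambda."""
    d_nm = 1e6 / lines_per_mm                # groove spacing in nm
    s = order * wavelength_nm / d_nm
    if s > 1:
        return None                          # that order doesn't exist
    return math.degrees(math.asin(s))

# With 1000 lines/mm (d = 1000 nm):
print(diffraction_angle_deg(1000, 300))      # ~17.5 deg: first-order 300 nm UV
print(diffraction_angle_deg(1000, 700))      # ~44.4 deg: first-order red
print(diffraction_angle_deg(1000, 350, 2))   # ~44.4 deg: second-order 350 nm lands on the red!
```

Note the last two lines: second-order UV at 350 nm overlaps first-order red at 700 nm regardless of grating density, which is exactly why a trim (or an order-sorting filter) matters once the range extends into the UV.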
Looked up some HD pinhole cameras; some do 1280x960 resolution video and 3264x2448 stills -- that's pretty good, right? Does the Spectral Workbench software need stills or video input?
Also, I wonder if the pinhole cameras that are being sold are really pinhole cameras or if they are just tiny, fixed lens cameras, which would kind of defeat the purpose....
I'm hoping to put together a page at http://publiclaboratory.org/wiki/diffraction-gratings to list different types and pros/cons. There have been some good mailing list discussions on this.
Do you have links for HD pinhole cameras? It's easiest if they are USB webcams. I was going to try removing a lens and building my own.
I actually only made a search of "HD pinhole camera" on eBay and found a long list of candidates:
Some have very good resolution, but the specs lack the size of the CMOS/CCD chip and whether they use lenses. Anyway, I take it for certain that all use some kind of cheap plastic lens.
I wonder if there is a way I could measure the absorption of these lenses, or find documentation on just that subject online...
double posts... gateway problem?
Hi, this is impressive. I'm wondering how you could normalize the graphs so that they have the same total area under the curve (which function in Excel can do it?)
Hi, Minh! There's a thread discussing this feature in progress here: https://github.com/jywarren/spectral-workbench/issues/108
Hey friends! I've been very busy with work lately, so it's taken some time for me to get back here.
Fernando: My webcam seems to go down to around 375 nm. AFAIK, most webcams don't include any sort of intentional UV filter, because the sensor isn't too sensitive in the UV range, though the lens may well block some of it that the sensor would otherwise pick up. A pinhole camera might be able to pick up more. It might also be worth studying the properties of the sensors used in various cameras; I know some are more sensitive to IR than others, and it may be the same case for UV.

As for the source I'm using, it's one of these: http://www.batteryjunction.com/p60-uv-buld.html. If you're interested in it, I can tell you what you'll need to set it up; it's a "P60-style dropin".

A 3D printer sounds like a lot of fun! You'll have to show us some of your creations!

As for the DVD messing with the UV, if you're worried about that, I agree with Warren that a dedicated diffraction grating is a good option, and I think you could find one relatively cheap. The Spectral Workbench software uses video input, but you also have the option to upload an image straight to the site without using the capture software (this would mean more work for you to prepare the image on your own).
Minh: AFAIK, there is no "normalize" function in Excel. You have to do it in several steps, and you can follow the outline I gave above (in the comment on Sept 4th). If you'd like some help figuring out exactly how to make Excel do that, let me know and I'd be glad to give you some tips.
I'm doing some research on Public Lab's use of spectrometers, and I was wondering if you could confirm a few details for me? firstname.lastname@example.org.