Light Intensity Measurements
Date: Spring 2013
I am asking this question for my daughter, who did an experiment testing how the lux value changes with distance from an omnidirectional light source. From the inverse square law we calculated how much the light intensity should decrease with distance. The problem we had was that when the lux meter was held directly against the light bulb and then at one foot, the measured numbers did not match this calculation. When we measured at 2, 4, and 8 feet, the plot of measured versus calculated values was almost dead on when we used the one-foot measurement as the benchmark. We must have repeated this measurement at least a half-dozen times, and I cannot figure out why the lux measurement from 0 to 1 foot does not match the calculation. Is it because I must use a unit of distance smaller than one foot when the meter is held directly against the light in order to come up with the correct divisor?
A light bulb is not a point source. Close to the light, the intensity will not vary much with distance because as you move away more of the glowing surface comes into view.
Richard E. Barrans Jr., Ph.D., M.Ed.
I think the problem you are encountering is that the inverse square law is valid only for a point source. The farther you are from the light bulb, the more like a point source it becomes. Use your 1 ft measurement as the reference value: if you go to 2 ft, you should have 1/4 the intensity; at 3 ft, 1/9 the intensity; and so on. You could also try using a smaller bulb, say a bare flashlight bulb (no reflector or lens). I suspect you would find very good correlation with the inverse square relationship.
Hope this helps.
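The reference-point arithmetic described above can be sketched in a few lines of Python. The 400 lux reference value is a made-up example for illustration, not a measured number from the experiment.

```python
# Sketch: predicted lux at various distances, scaling from a 1 ft
# reference reading via the inverse square law. The 400 lux value
# is an assumed example, not real data.

def predicted_lux(reference_lux, reference_ft, distance_ft):
    """Inverse square law: intensity scales as (r_ref / r)^2."""
    return reference_lux * (reference_ft / distance_ft) ** 2

ref = 400.0  # hypothetical reading at 1 ft
for d in (1.0, 2.0, 3.0, 4.0, 8.0):
    print(f"{d:.0f} ft: {predicted_lux(ref, 1.0, d):.1f} lux")
# 2 ft gives 1/4 of the reference, 3 ft gives 1/9, and so on.
```

Comparing this table against the measured values at 2, 4, and 8 feet is exactly the "benchmark" comparison described in the question.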
Thanks for the question. The inverse square law for light intensity is valid only when the source is a point source or, in non-technical terms, when the light behaves as if it is coming from a single point in space. When you are far away from the light bulb, it appears as a point source, and that is why your measurements are "dead on." When you are close to the bulb, it is no longer really a point source, since it occupies a finite volume. There are mathematical models (involving Bessel functions) that can account for the fact that the light bulb is not a point source; however, these models are challenging even at the college level.
I hope this helps. Please let me know if you have more questions.
The inverse square law should strictly hold at all reasonable distances from an UNOBSTRUCTED POINT light source. When the source has a finite size (as with the filament of a light bulb), or if optics are involved (such as a reflector), or if the glass bulb around the filament diffuses the light, then the light measured in the vicinity of the source can vary significantly as one moves the lux meter around at a short, fixed distance from the center of the source. However, as one moves away from the source, it increasingly appears like a distant point source, and the inverse square law is followed.
To check whether this is the correct explanation, you can repeat your experiment with the same setup but using a smaller light source. The 1/R^2 law should then hold at distances closer to the source.
There are other, less likely, possibilities for what you have observed, having to do with, for example, the lux meter and how it is held. These are worth exploring if the above explanation does not resolve the issue.
Dr. Ali Khounsary
Advanced Photon Source
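The finite-size effect described in the answers above can be illustrated numerically. This rough sketch models the bulb as many point emitters spread along a short line, with the detector on the perpendicular bisector; the 0.2 ft "filament" length is an assumed, illustrative value, not a model of any real bulb.

```python
# Sketch: compare a true point source to an extended source modeled
# as 101 point emitters along an assumed 0.2 ft "filament". Each
# emitter contributes 1/d^2; all numbers are illustrative only.

FILAMENT_FT = 0.2
N = 101
# Emitter positions along the filament, centered at zero.
xs = [-FILAMENT_FT / 2 + FILAMENT_FT * i / (N - 1) for i in range(N)]

def extended_intensity(r):
    """Average 1/d^2 contribution from all emitters at distance r."""
    return sum(1.0 / (r * r + x * x) for x in xs) / N

for r in (0.05, 0.1, 0.5, 1.0, 2.0):
    ratio = extended_intensity(r) / (1.0 / r**2)
    print(f"r = {r:4.2f} ft   extended/point ratio = {ratio:.3f}")
# Far from the bulb the ratio approaches 1 (point-like behavior);
# very close, it drops well below 1, so the reading falls short of
# an inverse-square extrapolation made from the far-field data.
```

This reproduces the observed pattern: good agreement at 1 foot and beyond, and a shortfall when the meter is held against the bulb.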
One possibility is glare. At a distance of one foot, the light reaching the lux meter is essentially perpendicular to the meter's surface, and the same is true at greater distances. When the meter is very close to the light source, however, much of the light arrives at a significant angle from the perpendicular. This glancing light glares off the surface, keeping some of it from entering the meter. Also, the distance from the light source to the meter surface is less well defined at close range. The center of the meter's surface is the part closest to the source; the edge of the sensor's surface is not quite as close. At a foot away, this variation is small enough to have no effect on the measurements. Close to the source, though, the distance from the source to the edge of the sensor is significantly greater than the distance to its center, and it is difficult to say what to use for the distance. Calculus would be needed to handle such short distances properly.
Dr. Ken Mellendorf
Illinois Central College
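The sensor-geometry effect described above can be worked out with calculus for an idealized case: a flat circular sensor facing a point source on its axis. Averaging the oblique-angle (cosine) falloff and the edge-to-center distance spread over the sensor disk gives a closed form. The 0.05 ft sensor radius below is an assumed, illustrative value.

```python
# Sketch: average illuminance over a flat circular sensor of radius
# a, facing a point source at axial distance z. The closed form is
# the disk average of cos(theta)/d^2; a = 0.05 ft is assumed.
import math

a = 0.05  # assumed sensor radius in feet

def avg_illuminance(z):
    """Disk-averaged cos(theta)/d^2 for a point source at height z."""
    return (2.0 * z / a**2) * (1.0 / z - 1.0 / math.sqrt(z**2 + a**2))

for z in (0.02, 0.1, 1.0):
    ratio = avg_illuminance(z) / (1.0 / z**2)
    print(f"z = {z:4.2f} ft   measured / point-law ratio = {ratio:.3f}")
# At 1 ft the ratio is essentially 1; very close to the source the
# oblique angles and edge distances pull the average well below 1/z^2.
```

This is the quantitative version of the point made above: the geometric corrections are negligible at one foot but dominate when the meter touches the bulb.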
Not knowing exactly what kind of test meter you have, this is only an "educated guess." Every instrument has a range of operation. If the light source is either too bright or too dim, the detector may be "saturating," resulting in a measurement error. I suspect that is what is happening in your case, but I cannot be sure.
A common "trick" for extending the range of a meter is to measure the light intensity, say at 2 feet and/or 4 feet, with and without a light filter. I would suggest a sheet of Polaroid plastic; you can find this in a hobby store or a camera store, or, if desperate, use a pair of neutral sunglasses. This gives you a conversion between the meter readings with and without the filter. Then use readings taken with the filter in place to obtain values at distances less than 2 feet. You will have "tricked" the meter into "thinking" the light is dimmer than it actually is, so that the amount of light reaching the meter is within the range of the instrument. You can use ratios, or a plot of filtered versus unfiltered measurements, to convert the filtered data into unfiltered values.
Even if it does not work, which I doubt, you will have eliminated an instrumental error as the source of the problem.
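The filter-ratio trick above can be sketched as a short calculation. All the lux values here are invented example numbers, purely to show the arithmetic.

```python
# Sketch: estimate a filter's transmission ratio from paired
# readings taken where both are in range, then use it to recover
# close-in readings. All lux values are invented examples.

# Paired (unfiltered, filtered) readings at, say, 2 ft and 4 ft:
pairs = [(400.0, 100.0), (100.0, 25.0)]

# Average transmission ratio of the filter from the in-range pairs.
ratio = sum(f / u for u, f in pairs) / len(pairs)  # 0.25 here

def unfiltered_estimate(filtered_lux):
    """Convert a filtered reading back to an unfiltered value."""
    return filtered_lux / ratio

# A filtered reading of 350 lux taken very close to the bulb
# corresponds to an unfiltered value of:
print(unfiltered_estimate(350.0))  # -> 1400.0
```

In practice one would plot filtered versus unfiltered readings, as suggested above, rather than rely on a single averaged ratio.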
Update: November 2011