Shot Noise in CCD Cameras

By Jo on

Following on from read noise, we’d like to take a look at the second flavour of noise that affects CCD cameras – shot noise. Steve Chambers explains what shot noise is and what we can do to lessen its effects on our astrophotography.


Transcript – Shot Noise with Steve Chambers

Hello. What I’d like to talk about this time is shot noise and how it affects our astronomical images. Shot noise is the uncertainty associated with measuring pretty much anything with a CCD camera. It arises from the quantum processes involved both in the generation of the photons and in their conversion into electrons within the camera.

So if we have a part of the sky that we are expecting to get 50 photons from, those 50 photons might generate 40 electrons. In one given period of time we might get those 40 electrons, the next time we measure we might get 35 electrons, and the next time it might be 45 electrons – there’s always that kind of uncertainty associated with it.
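A minimal sketch of that uncertainty (my own illustration, not from the talk): photon and electron counts follow Poisson statistics, so repeated exposures of a patch averaging 40 electrons scatter around 40 rather than hitting it every time.

```python
import numpy as np

# Simulate repeated measurements of a pixel that averages 40
# photoelectrons per exposure. Photon arrival is a Poisson process,
# so each measurement fluctuates around the mean.
rng = np.random.default_rng(42)
mean_electrons = 40
exposures = rng.poisson(mean_electrons, size=5)
print(exposures)  # five measurements, each a little above or below 40
```

With many such measurements, the scatter settles to the square root of the mean – about 6.3 electrons here – which is the point of the next slide.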

The slide we’ve got here shows a flat field, where all the pixels are illuminated evenly. We might then expect all the pixels to have the same value, but they don’t – they follow this histogram, and the histogram actually gets wider as the illumination gets higher. This is shot noise.

Shot noise is equal to the square root of the signal level itself. So although shot noise increases with increasing signal, it increases more slowly than the signal does. That means the signal to noise ratio due to shot noise increases – gets better – with increasing signal.
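Put as a quick sketch (again, my own numbers): if shot noise is the square root of the signal, then the signal to noise ratio is signal divided by its square root, which is itself the square root of the signal – so a hundred times more signal only makes the noise ten times larger, and the SNR ten times better.

```python
import math

# SNR in the shot-noise-limited case:
# noise = sqrt(signal), so SNR = signal / sqrt(signal) = sqrt(signal).
def shot_noise_snr(signal_electrons: float) -> float:
    return signal_electrons / math.sqrt(signal_electrons)

print(shot_noise_snr(100))     # 10.0
print(shot_noise_snr(10_000))  # 100.0 -> 100x the signal, 10x the SNR
```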

One idea related to this, which pretty much every astronomer imaging from a light polluted area will have had at one time, is ‘why don’t we just subtract the light pollution from the image?’ So if we’re imaging a piece of sky from a light polluted area, all the pixels might have around 10,000 worth of background counts.

It would be easy just to subtract 10,000 from all those pixels and expect we’d then have an image corresponding to a dark sky location. It doesn’t work out like that, and it doesn’t work because of shot noise. The next couple of pictures demonstrate that.

These are two pictures of the same piece of sky, taken about thirty minutes apart. The intensity of the stars is pretty much identical in the two images, but the background is different: in the second image the moon was coming up and the sky was becoming brighter. So the one on the left has a background of around 8,000 counts, and the one on the right about 11,000 counts.

So following this through, all we have to do is subtract 8,000 from every pixel on the left and 11,000 from every pixel on the right, and then we should have pretty much the same image. If we do that, and then apply a linear stretch to about 3,500 counts to start to reveal the nebula underneath, what we see is that the image on the left, taken with the darker sky, actually shows quite a lot more detail – fainter parts of the nebula are visible – while the image on the right just has more noise in the background. We can see that on the histograms underneath, too, where the shot noise produces a bigger, broader background peak for the right hand image.
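A minimal simulation in the spirit of that example (my own made-up numbers): the same faint 100-electron signal sits under an 8,000-count sky and an 11,000-count sky. Subtracting the mean background removes the offset in both cases, but not the shot noise the background contributed.

```python
import numpy as np

# Same faint signal under a dark and a moonlit sky, background
# subtracted. The offsets cancel; the extra scatter does not.
rng = np.random.default_rng(0)
signal, n = 100, 100_000
dark_sky = rng.poisson(signal + 8_000, size=n) - 8_000
bright_sky = rng.poisson(signal + 11_000, size=n) - 11_000

# Both means land near 100: the background offset subtracts out cleanly.
print(round(dark_sky.mean()), round(bright_sky.mean()))
# But the residual scatter is sqrt(8,100) ~ 90 counts on the left
# versus sqrt(11,100) ~ 105 counts on the right: the brighter sky
# leaves more noise behind even after perfect subtraction.
print(round(dark_sky.std()), round(bright_sky.std()))
```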

So that’s shot noise. The only real way of avoiding the detrimental effects of background shot noise is to make the background darker by travelling to a dark sky location. If we are imaging from a light polluted area, narrowband or nebula filters can help, and light pollution filters can be useful as well.

If we’re imaging brighter objects and we want the best possible signal to noise ratio, then we want to take exposures that are as long as possible, collecting as many photons from that area as we can before we saturate the sensor, because we know that as the signal increases, the shot noise increases only as the square root of the signal – a slower rate than the signal itself. That means cameras with a bigger full well depth are more able to take advantage of these longer exposures.
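A final sketch, with made-up full well depths: since the shot-noise-limited SNR of a pixel is the square root of the collected signal, the best SNR a single exposure can reach is set by how many electrons the well can hold before saturating.

```python
import math

# Peak shot-noise-limited SNR for a pixel filled to full well capacity:
# SNR = S / sqrt(S) = sqrt(S), so deeper wells reward longer exposures.
for full_well in (20_000, 50_000, 100_000):
    max_snr = math.sqrt(full_well)
    print(f"full well {full_well:>7} e-  ->  peak shot-noise SNR ~{max_snr:.0f}")
```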

Okay, I hope that’s been helpful, thank you for watching.
