ASPH 611 Term Project
A University of Calgary Department of Physics and Astronomy Graduate Course in Radio Astronomy

Noise Investigation of the Spectrometer

Christy Bredeson, December 12, 2005

To better understand the effects of noise on spectral data, we can examine the noise properties of our spectrometer. In particular, we want to understand the relationship between the noise level and the integration time, and how to choose a suitable integration time for the project. The integration time must not only raise the signal-to-noise ratio of the output to an acceptable level, but also give adequate sky coverage. Because our observing strategy lets the sky drift past the telescope as it acquires data, the integration time cannot be too long, or each time point would integrate over more than one object on the sky.

The strategy for investigating how the noise varies with integration time was to make observations over a range of integration times and compare the root-mean-square (rms) amplitude of the noise in each case. Four integration times were chosen: 10 ms, 100 ms, 1000 ms, and 10000 ms. The hot load was placed in front of the receiver for each measurement, and about five time points were taken at each integration time. To determine the noise level in each data set, a running 31-point boxcar average was computed across the spectrum. This smooths the data and gives a running average that follows the band shape. Two examples can be seen below: the first for a 10 ms integration time, the second for a 10000 ms integration time.
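The 31-point boxcar average can be sketched as follows. This is a minimal illustration in numpy (the report's analysis was done in IDL); the synthetic band shape, channel count, and noise level are assumptions for demonstration only, not the spectrometer's actual output.

```python
import numpy as np

def boxcar_smooth(spectrum, width=31):
    """Running boxcar (moving) average across a 1-D spectrum.

    mode="same" keeps the output the same length as the input; the
    first and last width//2 channels average in zero-padding, which is
    harmless here because the band edges are discarded later anyway.
    """
    kernel = np.ones(width) / width
    return np.convolve(spectrum, kernel, mode="same")

# Illustrative synthetic spectrum: a smooth band shape plus Gaussian
# channel noise (both hypothetical stand-ins for real data).
rng = np.random.default_rng(0)
channels = np.arange(1024)
band = 5.0 * np.exp(-((channels - 512) / 300.0) ** 2)
spectrum = band + rng.normal(0.0, 0.2, size=channels.size)
smooth = boxcar_smooth(spectrum, width=31)
```

The smoothed array follows the band shape while averaging down the channel-to-channel noise, which is what lets it serve as the reference level for the noise measurement below.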
This image shows the spectrum for the first time point at the 10 ms integration time: amplitude (voltage squared, in arbitrary units) versus channel frequency, with the smoothed line plotted on top in light grey. The noise is clearly visible as a messy oscillation about the average line.
This image shows the spectrum for the first time point at the 10000 ms integration time: amplitude (voltage squared, in arbitrary units) versus channel frequency, with the smoothed line plotted on top in light grey. This plot also shows noise, but it is greatly reduced from the 10 ms case.

The reduction in noise level between these two examples is drastic. In fact, the radiometer equation, which describes the sensitivity of our telescope, indicates that for a longer integration time the sensitivity should increase or, equivalently, the noise should decrease as 1/sqrt(tau), where tau is the integration time. Given our algorithm for finding the noise (the mean squared difference between the data points and the running average), one does not expect strictly 1/sqrt(tau) behaviour, because the running average itself contains noise: the measured noise combines the noise in the data with the noise in the running average. To quantify the noise in each of the four data sets (all similar to the examples shown above), we take the difference between each point and the value of the running average at that point. Since this quantity is negative wherever the data fall below the average, we square it. Two example plots can be seen below.
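A small Monte-Carlo sketch of this effect, assuming independent Gaussian channel noise: when each point is included in its own 31-point boxcar average (as in a standard convolution), the correlation between a point and its own average makes the measured rms come out a factor sqrt(1 - 1/31) (about 1.6%) below the true channel noise, so the strict 1/sqrt(tau) law is only approximate. All quantities here are simulated, not measured values from the report.

```python
import numpy as np

# Simulate pure Gaussian channel noise and measure it the same way the
# report does: rms difference between the data and an N-point boxcar.
rng = np.random.default_rng(1)
sigma_true = 1.0
n_avg = 31
data = rng.normal(0.0, sigma_true, size=200_000)
kernel = np.ones(n_avg) / n_avg
running = np.convolve(data, kernel, mode="same")

diff = (data - running)[n_avg:-n_avg]        # drop zero-padded edges
sigma_meas = np.sqrt(np.mean(diff ** 2))

# Because each point contributes to its own average, the expected
# measured variance is sigma_true**2 * (1 - 1/n_avg): a ~1.6% bias
# for n_avg = 31, small but nonzero.
expected = sigma_true * np.sqrt(1.0 - 1.0 / n_avg)
```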
This plot shows the squared difference between each point and its running average, again in arbitrary units, at each frequency channel. The integration time here is 10 ms. As the scale of the y-axis shows, there is much more deviation from the average than in the next example.
This plot shows the squared difference between each point and its running average, again in arbitrary units, at each frequency channel. The integration time here is 10000 ms. This plot is much less noisy than the previous one; for the longer integration time we again see much less noise, as expected.

Once we have these squared differences, we can calculate the root-mean-square amplitude of the noise,

A_rms = sqrt((d_1^2 + d_2^2 + ... + d_N^2) / N).

In other words, if we sum the squared differences, divide by the total number of points, and take the square root of the result, we get the rms amplitude of the noise. From the radiometer equation, we expect that plotting A_rms against the integration time tau should follow A_rms ∝ tau^(-1/2). A_rms was computed for each integration time using all the spectral channels except a few hundred cut off each end to account for the roll-off of the band. For easy comparison, the results can be seen below, plotted on a log-log plot.
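The A_rms calculation above is a one-liner in practice. A minimal sketch, where the flat synthetic band, the noise level, and the 200-channels-per-end edge cut are assumed stand-ins (the report only says "a few hundred" channels were trimmed):

```python
import numpy as np

def rms_noise(spectrum, running_avg, edge_cut=200):
    """A_rms = sqrt((d_1**2 + ... + d_N**2) / N), where d_i is the
    difference between each channel and the running average there.

    edge_cut channels are dropped from each end of the band to avoid
    the filter roll-off; 200 per end is an assumed value.
    """
    d = (spectrum - running_avg)[edge_cut:-edge_cut]
    return np.sqrt(np.mean(d ** 2))

# Demonstration on a flat synthetic band with known noise level.
rng = np.random.default_rng(2)
band = np.full(4096, 10.0)                   # idealized band shape
spectrum = band + rng.normal(0.0, 0.5, size=band.size)
a_rms = rms_noise(spectrum, band)            # recovers roughly 0.5
```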
This is a plot of A_rms versus integration time on log-log axes, showing all four integration times. On a log-log plot the relationship is approximately linear, so we can use IDL's linear regression routine, linfit, to return the slope and intercept along with their errors. For this graph, a slope of -0.8 ± 0.2 and an intercept of 22.3 ± 0.5 were found. We do not expect a physically sensible intercept: as the integration time approaches zero, the noise approaches infinity, and linfit is merely extrapolating back to find the linear intercept rather than the physical limit of the plot. The slope, however, follows a relationship similar to the expected one.

In general, this exercise has confirmed that increasing the integration time reduces the noise, with diminishing returns for longer and longer integration times. This led us to choose an integration time of 1000 ms for the majority of our observations, which increased our signal-to-noise ratio to an acceptable level while still sampling the sky adequately.
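The log-log fit can be reproduced with numpy.polyfit playing the role of IDL's linfit. The A_rms values below are illustrative stand-ins constructed to follow the ideal radiometer law A_rms ∝ tau^(-1/2), not the measured numbers from the report:

```python
import numpy as np

# Fit log(A_rms) against log(tau) with a first-degree polynomial to
# recover the power-law index from the slope.
tau_ms = np.array([10.0, 100.0, 1000.0, 10000.0])
a_rms = 3.0e9 * tau_ms ** -0.5               # hypothetical amplitudes

slope, intercept = np.polyfit(np.log10(tau_ms), np.log10(a_rms), 1)
# For this ideal data the fitted slope is exactly -0.5; real data
# scatter about the line, which is what the quoted error reflects.
```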
