# Decrease in noise versus integration time

#### Sep 2002

Data was taken for 100 minutes (6000 seconds) on 05sep02
with the IF connected to the downstairs noise source. A 1.5625 MHz bandwidth
and 1024 channels were used. Each record was integrated for 6 seconds,
giving a total of 1000 records. A second order baseline was removed from
all of the spectra, and then they were summed for 6, 12, 24, 48, 96, 192,
384, 768, 1536, 3072, and 6000 seconds. The plots
show the rms noise (deltaT/Tsys) versus integration time. The axes use a
log scale. Black is polA and red is polB. The 4 boards are plotted one
after another.
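The doubling of the integration times and the expected falloff of the rms can be sketched with simulated white noise. This is a minimal illustration, not the actual x101 processing; the channel bandwidth, record length, and record count match the numbers above, and the noise-source power is normalized to Tsys = 1.

```python
import numpy as np

B = 1.5625e6 / 1024   # channel bandwidth in Hz
rec_tau = 6.0         # seconds per record
n_rec = 1000          # 1000 records -> 6000 s total

rng = np.random.default_rng(0)
# each record's power fluctuates about 1 with rms Tsys/sqrt(B*tau)
sigma_rec = 1.0 / np.sqrt(B * rec_tau)   # in units of Tsys
records = 1.0 + rng.normal(0.0, sigma_rec, n_rec)

for tau in [6, 12, 24, 48, 96, 192, 384, 768, 1536, 3072, 6000]:
    m = int(round(tau / rec_tau))        # records averaged per sum
    n_sum = n_rec // m                   # how many sums of this length fit
    sums = records[:n_sum * m].reshape(n_sum, m).mean(axis=1)
    # rms of the summed spectra about the normalized power level
    rms = sums.std(ddof=1) if n_sum > 1 else abs(sums[0] - 1.0)
    print(f"tau={tau:5d}s  rms={rms:.6f}  expected={sigma_rec/np.sqrt(m):.6f}")
```

Each doubling of tau halves the number of independent sums while reducing the rms by sqrt(2), which is why the scatter on the measured points grows at the longest integrations.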
A linear fit was made to the log of the radiometer equation:

dT = Tsys/sqrt(B*Tau) --> log(dT) = -.5*log(Tau) + C

The fit is the green line overplotted on the data. The slope should
be -.5. The fitted value of the slope is printed on each plot.
The expected value for the rms after 6000 seconds
should have been:

1.04/sqrt(1.5625e6/1024 * 1.208*6000) = .000312 dT/Tsys

(the factor of 1.04 comes from 9-level sampling, and 1.208 from the
filter shape). The measured values were .00035 dT/Tsys. The larger measured
value probably comes from residual filter shape left after the 2nd order
baseline removal.
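The expected-value arithmetic above can be verified directly:

```python
import math

chan_bw = 1.5625e6 / 1024   # Hz per channel
tau = 6000.0                # total integration, seconds
quant = 1.04                # degradation factor for 9-level sampling
filt = 1.208                # effective bandwidth factor for the filter shape

expected = quant / math.sqrt(chan_bw * filt * tau)
print(f"{expected:.3e}")    # 3.127e-04 dT/Tsys
```

The measured .00035 is about 12% above this, consistent with a small residual spectral shape inflating the channel-to-channel rms.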
This shows that the correlator and downstairs IF/LO
are integrating properly for at least 100 minutes.

processing: x101/020905/.pro