Simulate a Gaussian distribution of width sigma. Randomly draw N values from this distribution n times. For each sample of N values, calculate the mean and the median. Estimate the standard deviation of the mean and of the median from the n measurements. Calculate this for a range of N reasonable for combining astronomical data, e.g. N = 3-100. Use a value of n large enough to assess the standard deviations of the mean and median accurately, e.g. n = 100. Compare the ratio of the standard deviation of the median to that of the mean as a function of N.
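A minimal sketch of one way to run this simulation with NumPy (the variable names, the choice of n = 10000, and the printed summary are illustrative, not part of the problem statement):

```python
import numpy as np

rng = np.random.default_rng(0)

sigma = 1.0      # width of the parent Gaussian
n = 10000        # number of trials; large for a stable estimate

results = {}
for N in (3, 10, 30, 100):                      # representative sample sizes
    samples = rng.normal(0.0, sigma, size=(n, N))
    std_mean = samples.mean(axis=1).std()       # scatter of the sample mean
    std_median = np.median(samples, axis=1).std()  # scatter of the sample median
    results[N] = std_median / std_mean
    print(f"N={N:3d}  sd(mean)={std_mean:.4f}  "
          f"sd(median)={std_median:.4f}  ratio={results[N]:.3f}")
```

As a sanity check, for large N the ratio of the two standard deviations is expected to approach sqrt(pi/2) ~ 1.25.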
Calculate the effects of detector non-linearity on the measured source counts:
Consider that you are given a set of dome flats consisting of exposures of different lengths of the same diffuse illuminating source. You plot the mean counts per pixel per second vs. the mean counts per pixel for the total exposure. The plot (figure 1) shows a systematic decrease in the counts/pix/sec as the total counts/pix increases; in other words, the detector has a non-linear response. Normalizing the counts/pix/sec (y) to unity at counts/pix (x) = 0, the relation is well described, up to half-well, by
y = 1 + epsilon*x
where epsilon = -5e-6 for a typical near-IR InSb detector. The linearization factor G is then
G = 1 + epsilon*x,
such that if you divide each pixel of a raw image by this factor (where x is the raw pixel value), you will "linearize" the data.
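A minimal sketch of this linearization step (the function name `linearize` and the use of NumPy are assumptions, not part of the problem statement):

```python
import numpy as np

EPSILON = -5e-6  # non-linearity coefficient for a typical near-IR InSb detector

def linearize(raw):
    """Divide each raw pixel value x by the factor G = 1 + epsilon*x."""
    raw = np.asarray(raw, dtype=float)
    return raw / (1.0 + EPSILON * raw)
```

Note that at x = 8200 counts, G = 1 - 0.041, so the linearized value is about 4% higher than the raw one.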
Assume that image processing consists of: (i) subtracting a median-filtered combination of temporally adjacent target frames, each with the same exposure length and roughly the same background; (ii) flattening the image with a flat field built from a large set of target frames, also all with roughly the same background. This second step is a division, after which the resultant image is renormalized (multiplied) by the average value of the flat-field image.
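The two reduction steps above can be sketched as follows (function and variable names are illustrative; a median combine stands in for the median filtering):

```python
import numpy as np

def reduce_frame(target, adjacent_frames, flat_frames):
    """Sketch of the reduction: (i) subtract the median combination of
    temporally adjacent frames; (ii) divide by a flat built from many
    target frames, then renormalize by the flat's average value."""
    sky = np.median(adjacent_frames, axis=0)       # step (i)
    flat = np.median(flat_frames, axis=0)
    reduced = (target - sky) / flat * flat.mean()  # step (ii)
    return reduced
```

With a perfectly uniform flat, this reduces to a simple background subtraction, which is a convenient consistency check.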
Derive a general relation, to first order in epsilon, that predicts how much the source flux will change between the raw and linearized frames (otherwise reduced identically), for both background-limited and source-limited images. Assume a typical background-limited target exposure in the K band has a background level B of 8200 counts. Estimate the percentage change in flux at this level between the raw and linearized images. Does this agree with your intuition? Does it agree with the observations in figure 2?
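As a starting point for the derivation (a hint, not the full answer), note that to first order in epsilon the linearization inverts as

1/G = 1/(1 + epsilon*x) ~ 1 - epsilon*x,

so a pixel at the background level x = B is rescaled by approximately a factor (1 - epsilon*B) upon linearization; the question is how much of this rescaling survives the background subtraction and flat-field renormalization.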