Timescale for averaging: the solution interval

The solution interval, i.e. the timescale over which the gain solutions are averaged, results from a tradeoff between the timescale of the true gain variations (e.g., changes in the atmospheric conditions or in the electronics) and the data averaging needed to reach a minimum SNR on the visibilities. In principle, this timescale should be shorter than the coherence time of the atmospheric phase fluctuations, typically one minute, for the phase solution. It can in general be longer for the amplitude gains, since what matters there are changes in the atmospheric transparency or in the antenna gains. It is therefore recommended to start the iterations with a relatively long averaging time, and to decrease it in a second step if the self-calibration was successful. It is also recommended to use the same solution interval for the last two iterations, in order to ease the interpretation of the results and of the SELFCAL SHOW display.
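As an illustration of the SNR side of this tradeoff, the sketch below (plain Python, not part of the GILDAS software; the function name, the SNR target of 3 and the 60 s coherence time are assumptions chosen for the example) estimates the shortest solution interval that reaches a given per-baseline SNR, assuming the SNR grows as the square root of the number of averaged dumps.

    import math

    def minimum_solint(snr_per_dump, dump_time, snr_target=3.0, coherence_time=60.0):
        """Return a solution interval (s) reaching 'snr_target' per baseline,
        assuming the SNR grows as the square root of the number of averaged dumps."""
        n_dumps = math.ceil((snr_target / snr_per_dump) ** 2)
        solint = n_dumps * dump_time
        if solint > coherence_time:
            # The data are too noisy to self-calibrate the phases on a timescale
            # shorter than the atmospheric coherence time.
            raise ValueError("required averaging exceeds the phase coherence time")
        return solint

    # Example: 10 s dumps with a per-dump, per-baseline SNR of 1.5
    # -> 4 dumps, i.e. a 40 s solution interval.
    print(minimum_solint(snr_per_dump=1.5, dump_time=10.0))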

The current self-calibration method computes a baseline-based gain (the observed complex visibility divided by the model prediction) for each visibility, and performs the time averaging on these baseline-based gains. Antenna-based gains are then derived from the time-averaged baseline-based gains. This is in principle an optimal method, since the model is not noisy: the linearity of the first step guarantees that the noise on the averaged gains decreases as the inverse square root of the averaging time.
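A minimal numpy sketch of this two-step scheme is given below. It is not the GILDAS implementation: the data layout, the weighting, and the simple damped fixed-point solver used to extract antenna-based gains from the averaged baseline-based gains are assumptions made for illustration.

    import numpy as np

    def average_baseline_gains(vis_obs, vis_model, weights):
        """Time-average the baseline-based gains V_obs / V_model.
        vis_obs, vis_model, weights: arrays of shape (n_times, n_baselines)."""
        g_bl = vis_obs / vis_model                        # baseline-based gains
        return np.average(g_bl, axis=0, weights=weights)  # one gain per baseline

    def solve_antenna_gains(g_bl, baselines, n_ant, n_iter=50):
        """Solve G_ij ~ g_i * conj(g_j) for antenna gains g_i with a simple,
        damped fixed-point iteration; the global phase is referenced to antenna 0."""
        g = np.ones(n_ant, dtype=complex)
        for _ in range(n_iter):
            num = np.zeros(n_ant, dtype=complex)
            den = np.zeros(n_ant)
            for (i, j), G in zip(baselines, g_bl):
                num[i] += G * g[j]             # from G_ij ~ g_i conj(g_j)
                den[i] += abs(g[j]) ** 2
                num[j] += np.conj(G) * g[i]    # from conj(G_ij) ~ g_j conj(g_i)
                den[j] += abs(g[i]) ** 2
            g = 0.5 * (g + num / den)          # damping for stability
        return g * np.exp(-1j * np.angle(g[0]))

    # Example with 3 antennas and noiseless baseline gains built from known
    # antenna gains; the solver approximately recovers them.
    true = np.array([1.0, 0.8 * np.exp(0.3j), 1.2 * np.exp(-0.5j)])
    bl = [(0, 1), (0, 2), (1, 2)]
    g_bl = np.array([true[i] * np.conj(true[j]) for i, j in bl])
    print(solve_antenna_gains(g_bl, bl, 3))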

The method also has the advantage that it applies equally well to single fields and to mosaics: the time averaging of baseline-based gains can be done even if these gains were derived from different pointing directions.
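The sketch below illustrates this point under an assumed data layout (one pointing index per visibility record): since each model value already corresponds to the pointing of its visibility, the division by the model removes the pointing dependence, and the baseline-based gains can be binned by baseline and time only.

    import numpy as np

    def mosaic_baseline_gains(vis_obs, vis_model, time, baseline, field, solint):
        """Average V_obs / V_model per (baseline, time bin).
        'field' (the pointing index of each visibility) is deliberately unused:
        the model value already corresponds to the pointing of each visibility,
        so dividing by it removes the pointing dependence before averaging."""
        g_bl = vis_obs / vis_model
        tbin = np.floor(np.asarray(time) / solint).astype(int)
        gains = {}
        for key in set(zip(baseline, tbin)):
            sel = (np.asarray(baseline) == key[0]) & (tbin == key[1])
            gains[key] = g_bl[sel].mean()
        return gains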