Knowledgebase: ALMA Observing Tool (OT)
Why does the OT time estimate stop decreasing as I ask for progressively poorer sensitivity?
Posted by Sarah Wood, Last modified by Andy Biggs on 11 March 2019 01:55 PM

The Observing Tool (OT) time estimate has various minima as described in the OT Users Manual, available from the Science Portal under 'Documents & Tools':

"Broadly speaking, the OT time estimates are arrived at by (silently and invisibly) converting the information in the Science Goal into the Scheduling Blocks that will be executed at the telescope. It does this by taking the required on-source times, the default times for the various calibrations (both of which are frequency- and array-dependent), the best estimate of the current overheads and latencies, as well as a standard time for how much on-source time an SB will typically contain (50~minutes for a 12-m Array SB). Required integration times that are longer than this will therefore require multiple executions of the same SB (each of which is called an Execution Block, or EB). On the other hand, the requested sensitivity can imply a very short amount of observing time and therefore the OT enforces a minimum amount of time that can be spent observing a single pointing (10~seconds). In addition, the total time for all sources combined cannot be less than 5~minutes (or 50~per~cent of the calibration time, whichever is largest)."

Therefore, as you ask for a progressively poorer sensitivity (i.e. increase the value of the "desired sensitivity per pointing") in a Science Goal, you will eventually hit one of these limits, and from that point on the total estimated time will stop decreasing.
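The effect of these minima can be illustrated with a minimal sketch. This is not the OT's actual code; the function name, its parameters and the simplified structure are assumptions for illustration, but the numerical floors (10 seconds per pointing, and 5 minutes or 50 per cent of the calibration time for all sources combined) come from the manual excerpt above.

```python
def clamped_time_estimate(on_source_s, n_pointings, calibration_s):
    """Illustrative sketch (NOT the OT's real algorithm) of how the
    documented minima clamp a requested on-source time, in seconds."""
    MIN_PER_POINTING_S = 10.0       # minimum time on a single pointing
    MIN_TOTAL_S = 5 * 60.0          # 5 minutes for all sources combined

    # Each pointing cannot be observed for less than 10 seconds.
    per_pointing = max(on_source_s / n_pointings, MIN_PER_POINTING_S)
    total = per_pointing * n_pointings

    # The combined total cannot fall below 5 minutes or half the
    # calibration time, whichever is larger.
    floor = max(MIN_TOTAL_S, 0.5 * calibration_s)
    return max(total, floor)
```

With these floors, requesting an ever poorer sensitivity (i.e. an ever smaller `on_source_s`) eventually leaves the returned estimate unchanged, which is the behaviour described above.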
