The OT total time estimate does not scale linearly with the theoretical "time on source". For short observations, this is because certain minimum times are imposed (see KB article "Why does the OT time estimate stop decreasing as I ask for progressively poorer sensitivity?"). For longer observations, it is because the total time request depends on the number of individual SB executions needed, and each new execution requires its own set of initial calibrations (e.g. bandpass and amplitude calibration). As a result, the OT total time estimate does not increase linearly but exhibits "jumps" each time the threshold for adding another execution is crossed. This threshold is usually reached when the total estimated time on source for an execution (summed over all sources, offset pointings and tunings) exceeds ~50 min.
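The stepwise behaviour described above can be illustrated with a minimal sketch. This is not the actual OT algorithm: the ~50 min per-execution threshold comes from the text, but the calibration overhead value and the simple ceiling formula are assumptions chosen purely to show why the total estimate "jumps".

```python
import math

# Illustrative sketch only -- not the real OT time estimator.
MAX_ON_SOURCE_PER_EXECUTION = 50.0  # minutes; approximate threshold from the text
CALIBRATION_OVERHEAD = 20.0         # minutes per execution (hypothetical value)

def total_time_estimate(time_on_source):
    """Toy total-time model for a given on-source time in minutes,
    summed over all sources, offset pointings and tunings."""
    # Each execution holds at most ~50 min of on-source time, so the
    # number of executions jumps at each multiple of the threshold.
    n_executions = max(1, math.ceil(time_on_source / MAX_ON_SOURCE_PER_EXECUTION))
    # Every execution repeats its own initial calibrations (e.g. bandpass
    # and amplitude), so the overhead scales with the execution count.
    return time_on_source + n_executions * CALIBRATION_OVERHEAD

# Crossing the threshold adds a whole extra set of calibrations:
# total_time_estimate(50) -> 70.0, but total_time_estimate(51) -> 91.0
```

In this toy model, one extra minute of on-source time near the threshold adds an entire execution's worth of calibration overhead, which is the origin of the non-linear jumps in the OT estimate.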
The estimated number of SB executions for the largest configuration required is reported directly by the OT's "Time Estimate" button on the "Control & Performance" node of each Science Goal.
Related KB article: How do I see what the expected calibration overheads are for my proposal?