A device configured to use realistic timing selects its tick counts to encompass the same number of machine
instructions as would be executed in hardware. In real-time mode, an operation taking ten milliseconds in hardware
will complete after 15,772 machine instructions have executed, based on the average 1000 E-Series instruction
execution time of 0.634 microseconds. In this mode, a software program will execute approximately the same
number of instructions during a device operation as it would on a real machine. Host machine speed and the
execution of concurrent host programs will not affect the number of simulated instructions executed for a given
operation.
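As an illustration of the real-time calculation, an operation's tick count is simply the hardware operation time
divided by the average instruction time. The following C sketch (the names are illustrative, not the simulator's
internal ones) reproduces the 15,772-instruction figure:

    #include <stdio.h>

    #define USEC_PER_INSTR 0.634            /* average 1000 E-Series instruction time */

    /* Convert a hardware operation time in microseconds to a REALTIME
       event tick count (an illustrative helper, not simulator code). */
    static int realtime_ticks (double op_usec)
    {
        return (int) (op_usec / USEC_PER_INSTR);
    }

    int main (void)
    {
        printf ("%d\n", realtime_ticks (10000.0));   /* prints 15772 */
        return 0;
    }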
A device configured to use calibrated timing selects its tick counts to align the simulated operation periods with the
corresponding time periods on the host system. In calibrated-time mode, an operation taking ten milliseconds in
hardware will complete after ten milliseconds have elapsed on the host system. Because the simulator is generally
one or two orders of magnitude faster than the hardware, a software program will execute far more code during a
device operation than it would on a real machine. In this mode, the amount of code that executes will vary with
the speed of the host machine and the load placed on that machine by other concurrent processes, as the
simulator continually adjusts the tick counts up and down to maintain synchronization with the host time.
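The continual adjustment can be pictured as a feedback loop: after each calibration interval, the tick count is
rescaled by the ratio of the intended period to the time actually measured on the host. The C sketch below shows
the idea only; the simulator's real calibration algorithm also smooths and bounds the correction:

    #include <stdio.h>

    /* Rescale a device's tick count so that 'goal_ms' of simulated time
       tracks 'elapsed_ms' of measured host time (illustrative only). */
    static int calibrate_ticks (int ticks, double goal_ms, double elapsed_ms)
    {
        if (elapsed_ms <= 0.0)                        /* guard the division */
            return ticks;
        return (int) (ticks * goal_ms / elapsed_ms);  /* host fast: raise; host slow: lower */
    }

    int main (void)
    {
        int ticks = 15772;                            /* start from the 10 ms figure */
        ticks = calibrate_ticks (ticks, 10.0, 0.8);   /* host ran the ticks in 0.8 ms */
        printf ("%d\n", ticks);                       /* 197150: stretched to match */
        return 0;
    }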
A device configured to use optimized timing selects its tick counts to minimize the operation delays. In fast-time
mode, an operation taking ten milliseconds in hardware will complete after the minimum amount of time acceptable
to the executing software has elapsed. In this mode, device operations complete far more quickly than they do in
hardware. There are limits, however, to how fast I/O operations may occur without causing software malfunctions.
In practice, system software often contains assumptions regarding the time certain operations take, so event timing
may not be arbitrarily reduced. For instance, an I/O driver may “know” that a line printer takes 50 milliseconds to
print a line, and therefore it can ignore interrupts safely for several milliseconds after initiating the print cycle. If the
printing time is reduced below that threshold, the driver may fail to operate correctly. The default optimized-timing
settings have been empirically determined to work with the supported operating systems listed above, and each
device simulator allows the user to modify those settings via registers if needed.
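As a sketch of how such user-modifiable settings might appear inside a device simulator, the line-printer example
above could carry its delays as per-mode values exposed through registers. The names and values below are
hypothetical; consult the individual device descriptions for the actual registers and defaults:

    #include <stdio.h>
    #include <stdint.h>

    /* Hypothetical per-mode delays for a line-printer print cycle. */
    typedef struct {
        int32_t real_time;                  /* REALTIME delay, in event ticks */
        int32_t fast_time;                  /* FASTTIME delay, in event ticks */
    } DELAYS;

    static DELAYS lp_delays = {
        78864,                              /* 50 ms print cycle / 0.634 us per tick */
        200                                 /* assumed empirically safe minimum */
    };

    /* Select the delay for the configured timing mode; depositing a new
       value into the fast-time field would mimic adjusting the register. */
    static int32_t lp_delay (int fast_mode)
    {
        return fast_mode ? lp_delays.fast_time : lp_delays.real_time;
    }

    int main (void)
    {
        printf ("REALTIME: %d ticks, FASTTIME: %d ticks\n",
                (int) lp_delay (0), (int) lp_delay (1));
        return 0;
    }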
To illustrate how the modes affect timing, consider a simulation of a Teletype terminal that operates at 10
characters per second. If the simulator runs 15 times faster than a real machine, then a user would observe that
printing 100 characters takes:
• 10 seconds in CALTIME mode
(100 characters × n event ticks per character, adjusted to take exactly 100 ms each on the host system)
• 667 milliseconds in REALTIME mode
(100 characters × 157,729 event ticks per character × 0.634 µs per tick ÷ 15 times hardware speed)
• 845 microseconds in FASTTIME mode
(100 characters × 200 event ticks per character × 0.634 µs per tick ÷ 15 times hardware speed)
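The arithmetic behind these three figures can be checked directly; the short C program below simply evaluates the
expressions given in parentheses above:

    #include <stdio.h>

    int main (void)
    {
        const double chars   = 100.0;
        const double tick_us = 0.634;       /* average instruction time, microseconds */
        const double speedup = 15.0;        /* simulator speed relative to hardware */

        double caltime  = chars * 100000.0;                     /* 100 ms per character */
        double realtime = chars * 157729.0 * tick_us / speedup;
        double fasttime = chars * 200.0 * tick_us / speedup;

        printf ("CALTIME:  %.0f seconds\n",      caltime / 1.0e6);   /* 10  */
        printf ("REALTIME: %.0f milliseconds\n", realtime / 1.0e3);  /* 667 */
        printf ("FASTTIME: %.0f microseconds\n", fasttime);          /* 845 */
        return 0;
    }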
If the SCP SET THROTTLE command is used to reduce the speed of the simulator, CALTIME operations will not
be affected, but REALTIME and FASTTIME operations will slow proportionally. Reducing simulator speed to that of
the original hardware will cause REALTIME operation times to equal CALTIME times.
Devices offer only those modes that are generally useful. For example, the HP 12539C Time Base Generator may
be configured to use REALTIME or CALTIME modes. The real-time mode will satisfy the expectations of the TBG
diagnostic that checks the timing of operations via delay loops, whereas the calibrated mode will update the DOS,
RTE, and TSB time-of-day clocks as expected by users of the simulated system. FASTTIME mode is not offered,
as it makes no sense to ignore the programmed time period settings and use a fixed arbitrary period instead.
Devices performing input or output typically offer a REALTIME mode for use when running the diagnostics and a
FASTTIME mode for use when running operating systems. In general, software running under simulation will run
faster when devices are configured for optimized-time mode, and this is the default for all peripheral devices.