Computers use a job control system to manage their tasks, letting one process use one resource, such as the central processor, while another uses a different one, such as a peripheral device.
Job control systems developed early in the design of modern computers to alleviate the speed mismatch between fast processors and much slower peripheral devices.
Additions to peripheral hardware, such as buffers and cache systems, only made certain processes faster and didn’t solve the problem.
When the hardware was able to take a new process, the scheduler would make the highest-priority waiting process active and hand it to that hardware.
A single queue held all of the active and inactive processes, while information about the hardware's status came from the hardware itself.
The driver for the hardware would report to the scheduler whether the device was available; if a process entered an error state while still holding the hardware, the device would never be reported as free. In addition, a process that needed only the central processor or a single peripheral could still tie up the entire system until it finished.
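As a rough sketch of the dispatch step described above, the following assumes a single priority queue keyed on a numeric priority and a driver flag for availability; the Process, Driver, and Scheduler names are illustrative, not drawn from any particular operating system.

    import heapq
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class Process:
        priority: int                   # lower number = higher priority
        name: str = field(compare=False)

    class Driver:
        """Stands in for a peripheral driver that reports availability."""
        def __init__(self):
            self.available = True

    class Scheduler:
        def __init__(self, driver):
            self.driver = driver
            self.queue = []             # one queue holds every waiting process

        def submit(self, process):
            heapq.heappush(self.queue, process)

        def dispatch(self):
            # Activate a process only when the driver reports the hardware free.
            if self.driver.available and self.queue:
                active = heapq.heappop(self.queue)
                self.driver.available = False   # the hardware is now tied up
                return active
            return None

    driver = Driver()
    sched = Scheduler(driver)
    sched.submit(Process(5, "print_report"))
    sched.submit(Process(1, "payroll"))
    print(sched.dispatch().name)   # "payroll": the highest-priority process runs
    print(sched.dispatch())        # None: hardware stays busy until the driver frees it

The sketch also shows the weakness noted above: if the active process never releases the device, the driver never reports it available again and nothing else can be dispatched.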
When a process arrives that has a higher priority than the one currently active, the scheduler suspends the active process and the new process takes over.
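A minimal sketch of that preemption rule, again with illustrative names and the same priority convention as the previous example, might look like this:

    import heapq
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class Process:
        priority: int                   # lower number = higher priority
        name: str = field(compare=False)

    class PreemptiveScheduler:
        def __init__(self):
            self.queue = []      # waiting processes, kept as a priority heap
            self.active = None   # process currently using the hardware

        def submit(self, process):
            if self.active is not None and process.priority < self.active.priority:
                # Newcomer outranks the active process: suspend and requeue it.
                heapq.heappush(self.queue, self.active)
                self.active = process
            else:
                heapq.heappush(self.queue, process)

        def dispatch(self):
            # Activate the highest-priority waiting process if nothing is running.
            if self.active is None and self.queue:
                self.active = heapq.heappop(self.queue)
            return self.active

    sched = PreemptiveScheduler()
    sched.submit(Process(5, "backup"))
    print(sched.dispatch().name)   # "backup" becomes active
    sched.submit(Process(0, "interrupt_handler"))
    print(sched.dispatch().name)   # "interrupt_handler" has preempted it

Here the suspended process is simply pushed back onto the queue, so it resumes once nothing more important is waiting.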