Parallelism




1. See parallel processing.

2. <parallel> The maximum number of independent subtasks in a given task at a given point in its execution.

E.g. in computing the expression (a + b) * (c + d), the expressions a, b, c and d can all be calculated in parallel, giving a degree of parallelism of (at least) four. Once they have been evaluated, the expressions a + b and c + d can be calculated as two independent parallel processes.
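
As a rough sketch of this example (the thread pool, helper function and operand values below are illustrative assumptions, not part of the entry), the four operands can be produced by four independent tasks, after which the two sums run as two parallel tasks:

    from concurrent.futures import ThreadPoolExecutor

    def produce(value):
        # Stand-in for whatever independent work yields each operand.
        return value

    with ThreadPoolExecutor(max_workers=4) as pool:
        # Degree of parallelism 4: a, b, c and d are mutually independent.
        fa, fb, fc, fd = (pool.submit(produce, v) for v in (1, 2, 3, 4))
        a, b, c, d = fa.result(), fb.result(), fc.result(), fd.result()

        # Degree of parallelism 2: the two sums are independent of each other.
        left = pool.submit(lambda: a + b)
        right = pool.submit(lambda: c + d)
        result = left.result() * right.result()

    print(result)  # (1 + 2) * (3 + 4) == 21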

The Bernstein condition states that processes P and Q can be executed in parallel (or in either sequential order) only if:

(i) there is no overlap between the inputs of P and the outputs of Q, or between the inputs of Q and the outputs of P, and

(ii) no variable written by both P and Q is also read by any other task.

If process P outputs a value v which process Q reads, then P must be executed before Q.

If both processes write to some variable, its final value will depend on their execution order, so they cannot be executed in parallel if any other process depends on that variable's value.
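
Expressed as set operations over each process's read and write sets, conditions (i) and (ii) as stated above might be checked roughly as follows (the function name and the set-based representation are assumptions for illustration only):

    def can_run_in_parallel(p_in, p_out, q_in, q_out, other_in=frozenset()):
        """Check the Bernstein condition for processes P and Q.

        p_in/q_in are the variables each process reads, p_out/q_out the
        variables it writes, and other_in the variables read by any other task.
        """
        # Condition (i): neither process reads a variable the other writes.
        no_flow_dependence = p_in.isdisjoint(q_out) and q_in.isdisjoint(p_out)
        # Condition (ii): no variable written by both P and Q is read elsewhere.
        no_output_conflict = not (p_out & q_out & set(other_in))
        return no_flow_dependence and no_output_conflict

    # P computes x = a + b and Q computes y = c + d: independent, so parallel.
    print(can_run_in_parallel({"a", "b"}, {"x"}, {"c", "d"}, {"y"}))   # True

    # P writes v and Q reads v: P must run before Q, so not parallel.
    print(can_run_in_parallel({"a"}, {"v"}, {"v"}, {"y"}))             # False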


