What is concurrency?
Concurrency describes the ability of a system to execute several processes independently of one another. The term is often equated with parallelism. While this is not entirely wrong, it is not entirely right either.
IT systems must be able to carry out numerous processes, for example calculations, commands or instructions. If a system is able to handle two (or more) such processes separately from each other, it is said to be concurrent. Basically, a distinction is made between "apparent" and "real" concurrency.
Apparent vs. real concurrency
With "preemptive multitasking", a system only appears to work concurrently. This approach was the standard for a long time and still is in many places. After a certain time, a running process is interrupted and "put to sleep". The system then looks for another process that is ready to run and starts it; this can even be the process that was just interrupted.
Each process is given a so-called "time slice": the length of time it is allowed to run before it is interrupted. The operating system's scheduler decides this duration. Processes that are ready to run wait in a so-called ready queue.
If a process requires a service of the operating system during its execution, for example waiting for input or output, it is interrupted immediately and marked as "not ready to run". It does not resume until that service has been provided.
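The time-slice mechanism described above can be sketched as a simple round-robin simulation. The function and task names below are illustrative, and real schedulers are far more sophisticated, but the core idea is the same: run each process for at most one time slice, then requeue it at the back of the ready queue.

```python
from collections import deque

def round_robin(tasks, time_slice):
    """Simulate round-robin scheduling: each task is a (name, remaining_time)
    pair; the scheduler runs each one for at most `time_slice` units, then
    moves it to the back of the ready queue until all tasks finish."""
    ready = deque(tasks)   # the ready queue of runnable processes
    order = []             # record of execution bursts
    while ready:
        name, remaining = ready.popleft()
        burst = min(time_slice, remaining)
        order.append((name, burst))
        remaining -= burst
        if remaining > 0:  # time slice expired: requeue the process
            ready.append((name, remaining))
    return order

# Two processes: A needs 5 time units, B needs 3; time slice of 2.
print(round_robin([("A", 5), ("B", 3)], 2))
# [('A', 2), ('B', 2), ('A', 2), ('B', 1), ('A', 1)]
```

The output shows the apparent concurrency: A and B alternate on a single processor, even though only one of them runs at any given moment.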
Multi-core processors or physically separate CPUs work genuinely concurrently. Here, processes can actually be executed at the same time, separately from one another. Networks of several computers, for example, therefore always work genuinely concurrently.
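In Python, this kind of real concurrency across cores can be sketched with worker processes. The helper names below are illustrative; on a multi-core machine, the pool may distribute the calls across separate CPU cores rather than time-slicing one core.

```python
from concurrent.futures import ProcessPoolExecutor

def square(n):
    return n * n

def parallel_squares(numbers):
    # Each worker is a separate OS process, so on a multi-core CPU the
    # calls can run genuinely in parallel instead of sharing one core
    # by time slices.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(square, numbers))

if __name__ == "__main__":
    print(parallel_squares([1, 2, 3, 4]))  # [1, 4, 9, 16]
```

The result is the same as a sequential loop would produce; only the execution model changes, which is exactly the point made in the next section.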
Concurrency and parallelism
Concurrency as a term does not specify when a process is executed. For two processes A and B, the following models are conceivable:
- sequential: first A and then B (or vice versa)
- interleaved: first part of A, then part of B, and so on
- parallel: A and B at the same time
Only in the last case does one speak of parallelism, since the processes actually run simultaneously and independently of one another. Parallelism is therefore not a direct synonym for concurrency; rather, concurrency is the generic term, of which parallelism is a special case. Incidentally, it makes no difference for the definition whether the processes serve the same task or separate ones.
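The interleaved model can be shown deterministically with Python generators, where each `next()` call executes one step of a task before control switches to the next one. The function names are illustrative.

```python
def interleave(*tasks):
    """Run generator-based tasks in an interleaved fashion: one step of
    each task per round, until all tasks are exhausted."""
    log = []
    pending = list(tasks)
    while pending:
        still_running = []
        for task in pending:
            try:
                log.append(next(task))   # execute one step of this task
                still_running.append(task)
            except StopIteration:        # task finished: drop it
                pass
        pending = still_running
    return log

def steps(name, n):
    """A toy task that performs n labeled steps."""
    for i in range(n):
        yield f"{name}{i}"

# Sequential would give A0 A1 A2 B0 B1; interleaving alternates instead.
print(interleave(steps("A", 3), steps("B", 2)))
# ['A0', 'B0', 'A1', 'B1', 'A2']
```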
Concurrent systems seem advantageous at first glance: the ability to work on processes separately from one another is generally desirable. However, there are two problems that need to be addressed: deadlock and starvation. Both occur predominantly in parallel processing.
In a deadlock, several processes require the same resources at the same time and block one another; imagine an intersection where cars arrive from all directions simultaneously, each waiting for another to move first. In starvation, a process waits indefinitely for a resource that is never released to it. Concurrent processes are therefore scheduled with priorities and processed in the form of preemptive multitasking.
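A common way to avoid such a deadlock is to make every process acquire shared resources in the same global order. A minimal Python sketch, with illustrative lock and counter names:

```python
import threading

lock_a = threading.Lock()
lock_b = threading.Lock()
counter = 0

def update():
    """Acquire both locks in a fixed global order (a before b).

    If one thread took a then b while another took b then a, each could
    end up holding one lock while waiting forever for the other: a
    deadlock. A consistent acquisition order rules this out."""
    global counter
    with lock_a:
        with lock_b:
            counter += 1

threads = [threading.Thread(target=update) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 4
```

Because every thread takes the locks in the same order, all four updates complete and none of them can block another indefinitely.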