Concurrency in programming refers to the ability of different parts or units of a program, algorithm, or problem to be executed out of order or in partial order without affecting the final outcome. It is the composition of independently executing processes, and it allows the concurrent units to run in parallel, which can significantly improve overall execution speed on multi-processor and multi-core systems. Concurrency is essential in modern programming: web sites must handle multiple simultaneous users, mobile apps need to offload some of their processing to servers, and graphical user interfaces almost always require background work that does not interrupt the user.
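As a minimal sketch of this idea in Go (the two task bodies are hypothetical, chosen only for illustration), two independent units of work can execute in any order or interleaving and still produce the same result:

```go
package main

import (
	"fmt"
	"sync"
)

// Two independent units of work: because neither depends on the other's
// result, they can run out of order or interleaved without affecting
// the final outcome.
func main() {
	var wg sync.WaitGroup
	results := make([]int, 2)

	wg.Add(2)
	go func() { // hypothetical unit A: sum of 1..100
		defer wg.Done()
		sum := 0
		for i := 1; i <= 100; i++ {
			sum += i
		}
		results[0] = sum
	}()
	go func() { // hypothetical unit B: product of 1..5
		defer wg.Done()
		prod := 1
		for i := 1; i <= 5; i++ {
			prod *= i
		}
		results[1] = prod
	}()

	wg.Wait()
	fmt.Println(results[0], results[1]) // always prints 5050 120, regardless of scheduling order
}
```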
Concurrent programming means that multiple sequences of operations run in overlapping periods of time. It should not be confused with parallelism, which is the simultaneous execution of multiple processes on separate CPU cores. In concurrent programming, two or more processes start, run interleaved through context switching, and complete within an overlapping time period while managing access to shared resources, even on a single CPU core.
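A small Go sketch can make the distinction concrete: restricting the scheduler to a single OS thread with runtime.GOMAXPROCS(1) removes parallelism, yet the two goroutines still make progress concurrently by interleaving through the scheduler's context switches (runtime.Gosched yields explicitly to make the interleaving visible; the task names are placeholders).

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

func main() {
	// Restrict the Go scheduler to one OS thread: the goroutines are
	// concurrent (interleaved), not parallel.
	runtime.GOMAXPROCS(1)

	var wg sync.WaitGroup
	wg.Add(2)
	for _, name := range []string{"task-A", "task-B"} {
		name := name
		go func() {
			defer wg.Done()
			for i := 0; i < 3; i++ {
				fmt.Println(name, "step", i)
				// Yield so the scheduler can switch to the other goroutine,
				// producing interleaved output on a single core.
				runtime.Gosched()
			}
		}()
	}
	wg.Wait()
}
```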
Dealing with constructs such as threads and locks, and avoiding issues such as race conditions and deadlocks, can be quite cumbersome, which makes concurrent programs difficult to write. There are two common models for concurrent programming, message passing and shared memory, with processes and threads as the typical units of execution. The main challenge in designing concurrent programs is concurrency control: ensuring the correct sequencing of the interactions or communications between different computational executions, and coordinating access to the resources that are shared among them. Both models are sketched below.
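The following Go sketch contrasts the two models under the same task of counting events (the function names are hypothetical): the shared-memory version relies on a mutex for concurrency control so that increments do not race, while the message-passing version avoids shared state entirely by sending values over a channel to a single receiver.

```go
package main

import (
	"fmt"
	"sync"
)

// Shared-memory style: goroutines update one counter, and a mutex
// enforces concurrency control so that increments do not race.
func sharedMemoryCount(n int) int {
	var (
		mu      sync.Mutex
		counter int
		wg      sync.WaitGroup
	)
	wg.Add(n)
	for i := 0; i < n; i++ {
		go func() {
			defer wg.Done()
			mu.Lock()
			counter++
			mu.Unlock()
		}()
	}
	wg.Wait()
	return counter
}

// Message-passing style: workers send results over a channel instead of
// touching shared state; the receiving goroutine alone owns the total.
func messagePassingCount(n int) int {
	ch := make(chan int)
	for i := 0; i < n; i++ {
		go func() { ch <- 1 }()
	}
	total := 0
	for i := 0; i < n; i++ {
		total += <-ch
	}
	return total
}

func main() {
	fmt.Println(sharedMemoryCount(1000))   // 1000
	fmt.Println(messagePassingCount(1000)) // 1000
}
```

Without the mutex in the shared-memory version, the unsynchronized counter++ would be a race condition and the final count could be lost; the message-passing version sidesteps that class of bug by design, at the cost of channel communication overhead.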